Human Generated Data

Title

Untitled (bride and groom departing, East Coast)

Date

1960s, printed later

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.974

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 89.7
Human 89.7
Paper 78.6
Person 68.9
Confetti 62
Tile 56.3

Clarifai
created on 2023-10-26

street 99.3
monochrome 99.2
art 98.5
city 97.5
sepia 97.1
wedding 97.1
girl 97
abstract 96.8
vintage 96.5
people 96.4
black and white 96.3
urban 96.2
old 96.1
flower 95.1
desktop 94.6
couple 94.4
winter 94.3
square 93.9
design 93.8
light 92.4

Imagga
created on 2022-01-23

architecture 32.7
old 29.9
puzzle 29.7
building 26.8
grunge 26.4
city 23.3
crossword puzzle 22.4
game 21.5
window 20.1
house 19.3
wall 19.1
urban 18.3
design 18
texture 16.7
structure 16.3
art 16.2
map 15.9
vintage 15.7
mosaic 15.5
travel 15.5
aged 14.5
jigsaw puzzle 14.1
retro 13.9
town 13.9
stone 13.2
negative 13.1
modern 12.6
pattern 12.3
exterior 12
style 11.9
rough 11.8
paint 11.8
paper 11.8
roof 11.7
graphic 10.9
film 10.9
dirty 10.8
tourism 10.7
grungy 10.4
buildings 10.4
home 10.4
decoration 10.3
landmark 9.9
tower 9.8
textured 9.6
artistic 9.6
rusty 9.5
brick 9.5
color 9.5
construction 9.4
transducer 9
surface 8.8
device 8.7
ancient 8.6
business 7.9
balcony 7.9
frame 7.8
houses 7.7
tile 7.7
medieval 7.7
residential 7.7
north 7.6
damaged 7.6
weathered 7.6
street 7.4
historic 7.3
digital 7.3
world 7.3
black 7.2
history 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.7
black and white 59.6

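The label lists above are per-service confidence scores on a 0-100 scale. As a minimal sketch of how the Amazon rows could be reproduced with the public AWS Rekognition API via boto3 (the local file name image.jpg and the 50-point confidence floor are illustrative assumptions, not part of the record):

import boto3

# Rekognition label detection; AWS credentials come from the usual environment/config.
client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Prints "Name Confidence" pairs, e.g. "Person 89.7", in the same shape as the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
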
Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.7%
Happy 50%
Sad 28%
Calm 14.6%
Fear 3%
Confused 1.2%
Disgusted 1.1%
Angry 1.1%
Surprised 1%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 98.9%
Sad 0.6%
Angry 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%
Happy 0%

AWS Rekognition

Age 48-56
Gender Male, 100%
Sad 40.5%
Calm 32.9%
Angry 22.5%
Confused 2.7%
Disgusted 0.5%
Happy 0.3%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 20-28
Gender Male, 82.9%
Calm 49.6%
Happy 28.1%
Sad 16.5%
Surprised 1.7%
Confused 1.5%
Fear 1.1%
Disgusted 1.1%
Angry 0.4%

AWS Rekognition

Age 23-33
Gender Female, 78.9%
Happy 99.6%
Sad 0.2%
Surprised 0.1%
Calm 0.1%
Angry 0%
Fear 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 27-37
Gender Female, 87.8%
Sad 61.9%
Angry 12.2%
Happy 10%
Calm 7.7%
Surprised 3%
Fear 2.2%
Disgusted 1.8%
Confused 1.2%

AWS Rekognition

Age 6-16
Gender Female, 99.9%
Sad 98.2%
Calm 1%
Happy 0.4%
Fear 0.2%
Angry 0%
Surprised 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 18-26
Gender Female, 77.8%
Happy 89.3%
Calm 4.7%
Fear 1.9%
Sad 1.6%
Surprised 1.2%
Angry 0.6%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 27-37
Gender Female, 71.7%
Sad 100%
Calm 0%
Happy 0%
Fear 0%
Disgusted 0%
Angry 0%
Confused 0%
Surprised 0%

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

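Each AWS Rekognition block above reports, for one detected face, an estimated age range, a gender guess with its confidence, and a confidence score per emotion label; the Google Vision blocks report the same faces as coarse likelihood buckets from its face-detection endpoint. A sketch of how the Rekognition readings could be produced with boto3 (the file name is illustrative):

import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # {"Low": ..., "High": ...}
    gender = face["Gender"]       # {"Value": ..., "Confidence": ...}
    # One confidence per emotion label, highest first, as in the blocks above.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for e in emotions:
        print(f"{e['Type'].title()} {e['Confidence']:.1f}%")
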
Feature analysis

Amazon

Person 89.7%

Categories

Captions

Microsoft
created on 2022-01-23

map 42.6%

Text analysis

Amazon

128
P141
90%

Google

(128) 9070 P141
(128)
9070
P141
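
The strings above are text detected in the print itself (order or negative markings such as "128", "9070", and "P141"); the Google results list the full line first and then its individual tokens. A sketch of how the Amazon results could be reproduced with Rekognition text detection (file name illustrative):

import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole strings; WORD detections give the individual tokens.
for det in response["TextDetections"]:
    print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))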