Human Generated Data

Title

Untitled (men in suits behind large desk, Federal Trade Commission)

Date

1937

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22242

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.5
Human 99.5
Person 99.4
Person 99.2
Person 99.2
Person 98.2
Accessory 97.1
Tie 97.1
Accessories 97.1
Indoors 95.4
Room 95.3
Tie 94.4
Court 89.6
Jury 87.2
Fireplace 81.8
Sitting 70.2
People 64.8
Crowd 60
Judge 56
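
The Amazon tags above are the kind of label output returned by AWS Rekognition's DetectLabels operation, where each confidence score is a percentage. A minimal sketch of such a call, assuming boto3 credentials are already configured and using a hypothetical local file name for the photograph:

    import boto3

    # Hypothetical file name; AWS credentials are assumed to be configured.
    rekognition = bototo = boto3.client("rekognition")
    with open("ftc_1937.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,
        )
    # Each label carries a name and a 0-100 confidence, as listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')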

Imagga
created on 2022-03-11

classroom 24.5
room 21.5
architecture 20.7
building 20.5
man 18.2
sculpture 16.5
statue 16.4
old 15.3
travel 14.8
water 14.7
monument 14
fountain 13.6
history 13.4
people 13.4
stringed instrument 13.4
tourism 13.2
male 12
church 12
historic 11.9
landmark 11.7
city 11.6
interior 11.5
historical 11.3
famous 11.1
art 11.1
adult 11
musical instrument 10.7
happiness 10.2
stone 10.1
religion 9.8
lifestyle 9.4
black 9
night 8.9
marble 8.8
couple 8.7
love 8.7
ancient 8.6
god 8.6
person 8.6
sitting 8.6
culture 8.5
house 8.5
hall 8.4
grand piano 8.3
column 8.1
device 8
portrait 7.8
men 7.7
vintage 7.4
tourist 7.3
home 7.2
women 7.1
river 7.1
indoors 7
sky 7
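
The Imagga tags above come from its image-tagging REST endpoint. A hedged sketch using the requests library, with placeholder credentials and a placeholder image URL:

    import requests

    # Placeholder key, secret, and image URL; Imagga uses HTTP Basic auth.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
    image_url = "https://example.org/ftc_1937.jpg"
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=auth,
    )
    # Tags are returned with an English label and a 0-100 confidence.
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))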

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

person 94.2
man 86.6
window 86.5
text 80.3
black and white 79.2
clothing 54.7
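
The Microsoft tags above match the output of Azure Computer Vision's image-tagging operation. A sketch using the Python SDK, with a placeholder endpoint, key, and image URL (the SDK reports confidence on a 0-1 scale, so it is scaled to percentages here to match the listing):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint, key, and image URL.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )
    result = client.tag_image("https://example.org/ftc_1937.jpg")
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))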

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Happy 73.8%
Sad 12.4%
Confused 4.1%
Calm 2.7%
Disgusted 2.5%
Surprised 2.5%
Angry 1%
Fear 0.9%

AWS Rekognition

Age 43-51
Gender Male, 100%
Calm 94.6%
Sad 3.2%
Disgusted 0.9%
Angry 0.4%
Confused 0.3%
Surprised 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Male, 100%
Confused 59.1%
Calm 30%
Surprised 7.6%
Sad 2.3%
Disgusted 0.3%
Angry 0.3%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.5%
Calm 93.8%
Fear 2.6%
Disgusted 1%
Happy 0.9%
Confused 0.6%
Surprised 0.4%
Angry 0.4%
Sad 0.3%

AWS Rekognition

Age 36-44
Gender Male, 95.7%
Calm 88.5%
Sad 4.2%
Angry 2%
Surprised 1.9%
Confused 1.7%
Happy 1%
Disgusted 0.4%
Fear 0.2%
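
The age ranges, gender estimates, and emotion scores above are what Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch, reusing the same hypothetical file name as the tagging example:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("ftc_1937.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions are reported as a list of type/confidence pairs.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')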

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
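
The Google Vision results above report likelihood buckets (such as "Very unlikely") rather than numeric scores. A sketch using the google-cloud-vision client library, with a placeholder image URI and assuming application default credentials:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/ftc_1937.jpg"  # placeholder URI
    response = client.face_detection(image=image)
    # Each face annotation exposes likelihood enums for these attributes.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)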

Feature analysis

Amazon

Person 99.5%
Tie 97.1%
Fireplace 81.8%

Captions

Microsoft

a group of people sitting in front of a window 71.5%
a group of people sitting in front of a building 71.4%
a group of people sitting on a bench in front of a window 54.8%
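
The captions above read like output from Azure Computer Vision's describe operation, which can return several candidate captions, each with a confidence. A sketch reusing the hypothetical client and image URL from the Microsoft tagging example:

    # Reuses the hypothetical `client` from the tagging sketch above.
    description = client.describe_image(
        "https://example.org/ftc_1937.jpg",
        max_candidates=3,
    )
    for caption in description.captions:
        print(caption.text, f"{caption.confidence * 100:.1f}%")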