Human Generated Data

Title

Untitled (two photographs: portrait of smiling baby on shag rug on chair; portrait of little boy in short pants standing precariously in front of fireplace)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6095

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Interior Design 99.3%
Indoors 99.3%
Human 98.8%
Person 98.8%
Apparel 96.7%
Clothing 96.7%
Person 92.8%
Person 89.4%
Room 87.5%
Face 76.1%
Monitor 68.9%
Electronics 68.9%
Display 68.9%
Screen 68.9%
Costume 67.3%
Furniture 65.7%
Baby 62.1%
Coat 61.1%
Overcoat 61.1%
Kid 55.6%
Child 55.6%
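Tag lists like the one above can repeat a label when a provider detects it more than once (here, Person appears three times at different confidences). A minimal sketch of collapsing such a list to the strongest score per label; the sample data is a subset of the Amazon tags listed above:

```python
# Collapse a provider tag list to the highest confidence per label.
# Sample data: a subset of the Amazon tags listed above.
tags = [
    ("Interior Design", 99.3), ("Indoors", 99.3),
    ("Human", 98.8), ("Person", 98.8), ("Person", 92.8),
    ("Person", 89.4), ("Baby", 62.1), ("Child", 55.6),
]

best = {}
for label, score in tags:
    if score > best.get(label, 0.0):
        best[label] = score

print(best["Person"])  # 98.8 -- the strongest of the three detections
```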

Clarifai
created on 2019-11-16

people 100%
two 98.7%
group 98.2%
child 98.1%
adult 97.7%
man 95.9%
woman 94.9%
family 94%
movie 94%
three 93.6%
furniture 92.5%
portrait 92.4%
group together 92.2%
baby 90.9%
room 90.4%
war 90.3%
wear 89.9%
one 87.8%
actor 87.1%
actress 86.9%

Imagga
created on 2019-11-16

world 30.9%
newspaper 21.5%
product 16.6%
black 16.4%
man 16.1%
people 15.6%
person 14.5%
creation 13.2%
statue 12.4%
portrait 11.6%
male 11.4%
old 11.1%
art 11.1%
sculpture 10.5%
adult 9.8%
daily 9.6%
bride 9.6%
couple 9.6%
religion 9%
sexy 8.8%
happy 8.8%
happiness 8.6%
clothing 8.6%
face 8.5%
dark 8.3%
one 8.2%
kin 8.1%
symbol 8.1%
detail 8%
family 8%
love 7.9%
architecture 7.8%
culture 7.7%
faith 7.7%
head 7.6%
fashion 7.5%
religious 7.5%
vintage 7.4%
famous 7.4%
color 7.2%
dress 7.2%

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 97.4%
person 95.9%
human face 94.1%
baby 93.3%
child 92.8%
toddler 92.4%
text 91%
black and white 90.3%
boy 79.8%
monochrome 59.7%
fireplace 21.8%

Face analysis

AWS Rekognition

Age 1-5
Gender Female, 52%
Calm 46.2%
Angry 46%
Disgusted 45%
Happy 45%
Sad 52.4%
Confused 45.1%
Fear 45.3%
Surprised 45%

AWS Rekognition

Age 2-8
Gender Female, 70.2%
Sad 0.1%
Disgusted 0.1%
Surprised 0.3%
Happy 70.8%
Angry 0.1%
Fear 0%
Confused 0.1%
Calm 28.4%
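In the first face record above the per-emotion confidences do not sum to 100, so the usual way to read such a record is to take the emotion with the highest score. A minimal sketch, using the first face's scores as listed above:

```python
# Pick the dominant emotion from a per-emotion confidence record.
# Scores copied from the first AWS Rekognition face analysis above.
emotions = {
    "Calm": 46.2, "Angry": 46.0, "Disgusted": 45.0, "Happy": 45.0,
    "Sad": 52.4, "Confused": 45.1, "Fear": 45.3, "Surprised": 45.0,
}
dominant = max(emotions, key=emotions.get)
print(dominant)  # Sad
```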

Microsoft Cognitive Services

Age 4
Gender Male

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
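Unlike the other providers, Google Vision reports face attributes as likelihood buckets rather than numeric scores. A minimal sketch of mapping those buckets to ordinals so they can be thresholded like the numeric tags; the face data is copied from the first Google Vision record above:

```python
# Google Vision face attributes use likelihood buckets, not scores.
# Rank the buckets so they can be thresholded like numeric confidences.
LIKELIHOODS = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]
RANK = {name: i for i, name in enumerate(LIKELIHOODS)}

# First face record above: only Headwear rises above "Very unlikely".
face = {"Surprise": "Very unlikely", "Anger": "Very unlikely",
        "Sorrow": "Very unlikely", "Joy": "Very unlikely",
        "Headwear": "Possible", "Blurred": "Very unlikely"}

flagged = [k for k, v in face.items() if RANK[v] >= RANK["Possible"]]
print(flagged)  # ['Headwear']
```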

Feature analysis

Amazon

Person 98.8%
Monitor 68.9%

Captions

Microsoft

a person standing in front of a window 64.7%
a person standing in front of a window 55.1%
a person standing next to a fireplace 55%
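The captioning service returns several candidate descriptions with confidences; a record display would typically keep only the highest-scoring one. A minimal sketch over the three captions listed above:

```python
# Keep the highest-confidence caption from a list of candidates.
# Captions and scores copied from the Microsoft list above.
captions = [
    ("a person standing in front of a window", 64.7),
    ("a person standing in front of a window", 55.1),
    ("a person standing next to a fireplace", 55.0),
]
best_caption = max(captions, key=lambda c: c[1])
print(best_caption[0])  # a person standing in front of a window
```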