Human Generated Data

Title

Untitled (portrait of couple holding two children in corner of room)

Date

c. 1931, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5805

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2019-11-16

Human 97.3
Person 97.3
Person 97.1
Person 95
Clothing 90.5
Apparel 90.5
Sitting 87.9
Tie 87.1
Accessory 87.1
Accessories 87.1
Flooring 86.9
Floor 82.9
Furniture 79.4
Couch 79.4
Indoors 71
Room 70
Suit 69.1
Overcoat 69.1
Coat 69.1
Living Room 63.8
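
Labels like these can be reproduced with the DetectLabels operation of AWS Rekognition. A minimal boto3 sketch follows; the region, file name, and thresholds are illustrative assumptions, not values known to be used by the museum's pipeline.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

# "photograph.jpg" is a hypothetical local copy of the image
with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the list above holds 20 labels
    MinConfidence=60.0,  # the lowest score above is 63.8
)

# Each label carries a name and a confidence score in percent
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')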

Clarifai
created on 2019-11-16

people 99.9
group 99.2
adult 97.7
woman 97.1
child 96.5
man 96
room 92.7
family 91.3
portrait 90.2
leader 89.9
group together 88.9
two 88
indoors 83.8
furniture 83.7
music 82.5
wear 81.9
boy 80.1
administration 78.7
many 78.6
art 77.7
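
Comparable concepts come from Clarifai's general image recognition model. A hedged sketch against the v2 REST API is below; the API key, model alias, and image URL are placeholders.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                 # placeholder credential
MODEL_ID = "general-image-recognition"            # public general model alias (assumed)
IMAGE_URL = "https://example.org/photograph.jpg"  # placeholder image location

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concept values come back in 0-1; scale by 100 to match the list above
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')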

Imagga
created on 2019-11-16

wall 23.1
man 20.2
grunge 19.6
old 19.5
kin 18.4
person 16.4
people 16.2
cell 16
child 14.9
male 14.7
antique 13.8
ancient 13.8
portrait 13.6
aged 13.6
dirty 13.5
black 13.3
vintage 13.2
room 12.7
adult 11.8
texture 11.8
silhouette 11.6
retro 11.5
couple 11.3
barbershop 11.1
dress 10.8
happy 10.6
building 10.5
attractive 9.8
urban 9.6
love 9.5
teenager 9.1
shop 9
world 9
family 8.9
brick 8.6
grungy 8.5
face 8.5
art 8.5
youth 8.5
city 8.3
style 8.2
window 8
posing 8
body 8
happiness 7.8
architecture 7.8
dark 7.5
mother 7.4
light 7.3
alone 7.3
paint 7.2
mercantile establishment 7.2
sibling 7.1
businessman 7.1
decoration 7
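
Imagga exposes tagging through a REST endpoint with HTTP basic authentication. A sketch under assumed credentials and a placeholder image URL:

import requests

# Placeholder credentials; Imagga uses an API key/secret pair as basic auth
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photograph.jpg"},  # placeholder
    auth=auth,
)
resp.raise_for_status()

# Tags arrive sorted by confidence (percent), mirroring the list above
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')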

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 97.9
text 97.4
clothing 97
person 95.3
indoor 88.1
black and white 87.2
man 83.1
suit 75.2
woman 71.5
gallery 67
human face 56.3
room 46.1
old 45.7
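
Microsoft's tags correspond to the Computer Vision "analyze" endpoint. A sketch follows; the resource endpoint, key, API version path, and image URL are assumptions.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",  # version path is an assumption
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photograph.jpg"},  # placeholder
)
resp.raise_for_status()

# Confidences come back in 0-1; scale by 100 to match the list above
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')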

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 54.1%
Fear 45%
Happy 53.4%
Confused 45.1%
Calm 46.1%
Disgusted 45.1%
Sad 45%
Angry 45.1%
Surprised 45.1%

AWS Rekognition

Age 4-12
Gender Female, 50.7%
Angry 45%
Calm 45%
Fear 45.3%
Disgusted 45%
Surprised 54.7%
Happy 45%
Sad 45%
Confused 45%

AWS Rekognition

Age 37-55
Gender Male, 54.6%
Calm 53%
Confused 45.1%
Fear 45%
Sad 46%
Angry 45.5%
Surprised 45.1%
Disgusted 45.1%
Happy 45%

AWS Rekognition

Age 12-22
Gender Male, 50.5%
Fear 49.5%
Angry 49.5%
Calm 49.6%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 50.4%
Disgusted 49.5%
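
The four face records above match the shape of AWS Rekognition's DetectFaces output with full attributes requested. A boto3 sketch, with the file name and region as assumptions:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # One confidence per emotion type, as in the blocks above
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')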

Microsoft Cognitive Services

Age 37
Gender Male
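
The single age/gender estimate matches the Azure Face API's detect call as it worked around when this data was generated in 2019 (Microsoft has since retired the age and gender attributes). A sketch of the v1.0 REST form; endpoint, key, and image URL are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},  # attributes since retired by Microsoft
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photograph.jpg"},  # placeholder
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')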

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
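
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch with the google-cloud-vision client library (v2+ of the library is assumed for the Likelihood enum location):

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses application default credentials

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    for label, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # e.g. VERY_UNLIKELY corresponds to "Very unlikely" above
        print(label, vision.Likelihood(value).name)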

Feature analysis

Amazon

Person 97.3%
Tie 87.1%
Suit 69.1%
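
These feature-analysis scores appear to correspond to the subset of Rekognition labels that carry localized instances with bounding boxes. A self-contained sketch reusing DetectLabels (file name and region again assumed):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("photograph.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=60.0)

for label in response["Labels"]:
    # Only some labels (e.g. Person, Tie) come with localized instances
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at {box}')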