Human Generated Data

Title

Untitled (studio portrait of woman with large bow in hair seated at side table)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6082

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Clothing 99.7
Apparel 99.7
Person 99.5
Human 99.5
Furniture 97.8
Chair 95.2
Person 95
Sitting 93.2
Hat 75
Shoe 73.9
Footwear 73.9
Chair 69.5
Coat 68.9
Overcoat 65.9
Photo 61.3
Face 61.3
Photography 61.3
Portrait 61.3
Sun Hat 59.1
Pants 58.1
Suit 57.5
Couch 56.4
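The label/confidence pairs above are the raw output of each vision service. As a hypothetical illustration (not the pipeline actually used to build this record), here is how such pairs might be retrieved and filtered with AWS Rekognition's DetectLabels via boto3; the live API call is commented out because it requires AWS credentials, and `sample_response` mimics the documented response shape so the parsing step can run standalone.

```python
# Sketch: producing and filtering label/confidence pairs in the style of
# AWS Rekognition's DetectLabels. The real call (commented out) needs
# AWS credentials; sample_response mirrors the documented response shape.

# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:
#     response = client.detect_labels(
#         Image={"Bytes": f.read()}, MinConfidence=55
#     )

# Mocked response, shaped like Rekognition's output, with values
# echoing the tags listed above.
sample_response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.7},
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Chair", "Confidence": 95.2},
        {"Name": "Hat", "Confidence": 75.0},
    ]
}

def labels_above(response, min_confidence):
    """Return (name, confidence) pairs at or above the threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

for name, conf in labels_above(sample_response, 90):
    print(f"{name} {conf:.1f}")
```

Filtering on a minimum confidence (90 here) is why low-scoring guesses such as "Hat 75" can drop out of a published tag list while high-confidence labels survive.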

Clarifai
created on 2019-11-16

people 99.7
woman 98.6
chair 98
furniture 97.2
adult 96.4
movie 95.7
seat 94.1
indoors 93.8
room 93.6
group 93.2
sit 92.6
wear 92.6
man 91.8
two 91.3
outfit 91.3
actor 90.7
actress 89.5
sitting 89
child 88.2
portrait 87.5

Imagga
created on 2019-11-16

chair 35
man 21.5
person 19.9
electric chair 19.4
seat 19.1
people 18.4
device 18.2
sexy 16.9
dark 16.7
adult 16.4
black 15.9
instrument of execution 15.7
male 14.9
attractive 14.7
lady 13.8
fashion 13.6
body 12.8
style 12.6
window 12.6
model 12.4
silhouette 12.4
interior 12.4
elegant 12
dress 11.7
light 11.4
instrument 11.2
support 11.1
suit 10.9
business 10.9
sensual 10.9
old 10.4
room 10.4
portrait 10.3
sitting 10.3
hair 10.3
love 10.3
elegance 10.1
pretty 9.8
posing 9.8
businessman 9.7
one 9.7
looking 9.6
couple 9.6
women 9.5
wall 9.4
passion 9.4
musical instrument 9.3
luxury 8.6
art 8.3
vintage 8.3
clothing 8.3
indoor 8.2
furniture 8.1
shadow 8.1
brunette 7.8
military uniform 7.8
men 7.7
erotic 7.6
human 7.5
sculpture 7.4
sensuality 7.3
office 7.2
romance 7.1
architecture 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

furniture 96.7
chair 90.9
clothing 87.6
text 86.1
indoor 85.2
black and white 82.8
person 73.7
hat 54

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 79%
Surprised 0%
Sad 98.9%
Calm 0.8%
Happy 0%
Confused 0.1%
Disgusted 0%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 9-19
Gender Male, 53%
Angry 45.1%
Disgusted 45.1%
Surprised 45.1%
Fear 45%
Confused 45.1%
Happy 45.1%
Sad 45.1%
Calm 54.4%

Microsoft Cognitive Services

Age 3
Gender Female

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 95.2%
Shoe 73.9%

Captions

Microsoft

a person sitting on a chair in front of a window 72.9%
a person sitting in a chair in front of a window 72.8%
a person sitting in a chair 72.7%