Human Generated Data

Title

Untitled (two photographs: studio portrait of woman standing with dress, hat, and hand bag; woman seated with crossed eyes)

Date

c. 1905-1915, printed c. 1970

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5983

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Person 97.6
Apparel 97.4
Clothing 97.4
Female 94.6
Woman 84.1
Sitting 74
Sleeve 68.6
Photo 65.3
Photography 65.3
Portrait 65.2
Face 65.2
Overcoat 63.1
Suit 63.1
Coat 63.1
Girl 62.1
Performer 60
Military 59.7
Military Uniform 59.7
Poster 56.6
Advertisement 56.6
Long Sleeve 56.2
Officer 56
Blonde 55.3
Teen 55.3
Kid 55.3
Child 55.3

Clarifai
created on 2019-11-16

people 99.9
adult 98.8
woman 98.8
two 97.7
monochrome 95.8
actress 95.7
wear 95.1
music 94.8
furniture 94.6
man 94.4
movie 94
portrait 93.8
room 93.7
actor 93.5
theater 92.9
street 92.2
child 91.6
three 91.5
one 91
opera 90.2

Imagga
created on 2019-11-16

person 35.5
people 21.7
adult 21
man 20.8
male 20.6
kin 18.4
portrait 17.5
lady 17
model 16.3
attractive 16.1
fashion 15.8
sexy 15.3
body 15.2
posing 15.1
dark 15
human 15
love 14.2
clothing 14
style 13.3
pretty 13.3
one 12.7
dress 12.6
black 12.3
silhouette 11.6
couple 11.3
patient 11.2
world 11.1
happy 10.6
passion 10.3
hair 10.3
case 10.2
lifestyle 10.1
sensuality 10
pose 10
sport 9.9
vintage 9.9
athlete 9.6
performance 9.6
erotic 9.4
elegance 9.2
sick person 9.2
sensual 9.1
old 9.1
suit 9
ballplayer 8.9
cool 8.9
player 8.9
performer 8.6
men 8.6
studio 8.4
exercise 8.2
interior 8
boy 7.9
happiness 7.8
face 7.8
uniform 7.7
sitting 7.7
wife 7.6
pleasure 7.5
makeup 7.3
dirty 7.2
looking 7.2
sunset 7.2
home 7.2
family 7.1
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 97.4
clothing 96.8
wall 96.2
text 93.6
human face 78
dress 74.9
woman 74.1
black 67.4
old 44.8

Color Analysis

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 50.8%
Happy 45.1%
Sad 45.1%
Disgusted 45%
Surprised 45.1%
Fear 45%
Angry 45.1%
Confused 45.1%
Calm 54.6%

AWS Rekognition

Age 23-37
Gender Female, 89.7%
Disgusted 0.1%
Angry 0.2%
Fear 0.1%
Happy 0.7%
Confused 0.2%
Sad 0.1%
Calm 98%
Surprised 0.6%

Microsoft Cognitive Services

Age 42
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%