Human Generated Data

Title

Untitled (studio portrait of infant with gown and cap in chair)

Date

c. 1905-1910, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5973

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.2
Human 99.2
Person 98.9
Dance 95.1
Person 94.4
Performer 87.7
Leisure Activities 81.3
Dance Pose 81.3
Stage 79.6
Ballet 77.9
Ballerina 59.4
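
A minimal sketch of how label tags like those above are typically produced with AWS Rekognition's DetectLabels API via boto3; the file name and the confidence threshold are illustrative assumptions, not part of the source record.

```python
# Hedged sketch: retrieve image labels with AWS Rekognition via boto3.
# "photo.jpg" and MinConfidence=50 are assumptions for illustration.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50% or higher
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "Person 99.2" style pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```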

Clarifai
created on 2019-11-16

people 100
woman 98.8
adult 98.5
group 97.6
group together 95.8
man 95.3
child 94.8
furniture 94.6
wear 94.3
room 93.9
music 92.8
actress 91.4
theater 91.3
portrait 88.2
dancer 87.9
seat 87.5
three 86.7
musician 85
two 84.4
sit 84.2
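
A hypothetical sketch of a Clarifai general-model tagging call using the legacy clarifai 2.x Python client (the API generation in use around 2019); the API key and image URL are placeholders, and the client version is an assumption.

```python
# Hedged sketch: tag an image with Clarifai's general model using the
# legacy clarifai 2.x client. Key and URL below are placeholders.
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")
model = app.public_models.general_model  # Clarifai's general tagging model

response = model.predict_by_url("https://example.org/photo.jpg")

# Concepts come back with a name and a 0-1 confidence value,
# which scales to the 0-100 scores shown above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```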

Imagga
created on 2019-11-16

musical instrument 33.3
man 27.5
kin 24.8
people 23.4
adult 19.6
person 19.4
male 18.6
keyboard instrument 17.9
stringed instrument 17.4
fashion 17.3
wind instrument 16.6
style 16.3
black 15.2
accordion 15.2
silhouette 14.9
couple 13.9
chair 13.5
window 13.2
business 12.8
dress 12.6
posing 12.4
room 12.4
sexy 12
body 12
attractive 11.9
love 11.8
businessman 11.5
passion 11.3
men 11.2
sensuality 10.9
dark 10.9
interior 10.6
human 10.5
portrait 10.4
hair 10.3
women 10.3
model 10.1
suit 9.9
romance 9.8
one 9.7
world 9.6
sitting 9.4
vintage 9.1
lady 8.9
piano 8.6
wall 8.6
future 8.4
city 8.3
percussion instrument 8.3
grand piano 8.3
fun 8.2
indoor 8.2
dirty 8.1
office 8
romantic 8
looking 8
together 7.9
standing 7.8
boy 7.8
pretty 7.7
expression 7.7
old 7.7
performer 7.5
light 7.4
sensual 7.3
bowed stringed instrument 7.3
home 7.2
happiness 7.1
indoors 7
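
A hedged sketch of the Imagga v2 tagging endpoint, which returns tag/confidence pairs like those above; the API key, secret, and image URL are placeholders.

```python
# Hedged sketch: request tags from the Imagga v2 REST API.
# Credentials and the image URL are placeholders.
import requests

auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=auth,
)

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} records,
# matching the "musical instrument 33.3" style pairs above.
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```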

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 92.3
person 88.4
text 75.2
black 72.3
white 62.3
man 52.9
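
A hedged sketch using the Azure Computer Vision Python SDK's tag_image call, which yields tag/confidence pairs like those above; the endpoint, key, and image URL are placeholders.

```python
# Hedged sketch: tag an image with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")

# Each tag has a name and a 0-1 confidence, corresponding to the
# "clothing 92.3" style pairs above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```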

Color Analysis

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 89.1%
Calm 2.6%
Happy 0.1%
Sad 2.1%
Angry 0.3%
Fear 92.2%
Surprised 1.6%
Confused 0.8%
Disgusted 0.2%

AWS Rekognition

Age 12-22
Gender Female, 54.5%
Calm 54.1%
Happy 45%
Sad 45.4%
Angry 45.4%
Fear 45.1%
Surprised 45%
Confused 45.1%
Disgusted 45%

AWS Rekognition

Age 19-31
Gender Female, 54.9%
Calm 47.1%
Happy 45.1%
Sad 45.4%
Angry 51.2%
Fear 45.1%
Surprised 45.1%
Confused 45.5%
Disgusted 45.5%
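
A minimal sketch of how age, gender, and emotion estimates like the three face records above are obtained from AWS Rekognition's DetectFaces API via boto3; the file name is an illustrative assumption.

```python
# Hedged sketch: detect faces and their attributes with AWS Rekognition.
# "photo.jpg" is a placeholder for the image being analyzed.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotions, not just boxes
    )

# Each detected face reports an age range, a gender guess with confidence,
# and a per-emotion confidence, as in the records above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```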

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
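
A hedged sketch using the google-cloud-vision client, whose face detection reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, matching the entries above; the file name is a placeholder.

```python
# Hedged sketch: face detection with the google-cloud-vision client.
# "photo.jpg" is a placeholder for the image being analyzed.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes likelihood enums for expressions and
# image properties, printed here as bucket names like "VERY_UNLIKELY".
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```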

Feature analysis

Amazon

Person 99.2%