Human Generated Data

Title

Untitled (two girls in matching dresses seated in studio for portrait, legs crossed)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12851

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 98.3
Person 98.3
Person 97.1
Apparel 96.1
Clothing 96.1
Costume 89.9
Evening Dress 87.6
Robe 87.6
Fashion 87.6
Gown 87.6
Dress 85.4
Female 81.7
Leisure Activities 80
Face 77.8
Performer 72.4
Dance Pose 68.2
Photo 67.9
Photography 67.9
Portrait 67.9
People 65.1
Hair 64.8
Woman 64.2
Girl 62.7
Dance 56.8
Lace 56
Finger 56

Clarifai
created on 2019-11-16

people 99.8
wear 99.1
portrait 98.9
two 98.8
child 98.6
adult 97.7
woman 97.3
costume 96.7
dancer 96.5
outfit 96.2
dress 95.5
wedding 94.7
veil 94.4
group 93.9
sit 93.5
facial expression 93.2
dancing 92.3
retro 92.2
one 92.1
actress 91.3

Imagga
created on 2019-11-16

child 25.9
person 24.6
people 24.5
portrait 20.7
adult 19.5
male 18.4
black 18.4
fashion 18.1
happy 17.5
man 17.5
parent 17.2
love 15
happiness 14.9
lady 14.6
mother 14.6
youth 14.5
dress 14.4
suit 14.4
dad 14.2
style 14.1
pretty 14
business 14
boy 13.9
face 13.5
attractive 13.3
world 13.2
father 13.1
couple 13.1
dark 12.5
model 12.4
smiling 12.3
clothing 12.1
sexy 12
hair 11.9
pose 11.8
smile 11.4
studio 11.4
cheerful 11.4
sitting 11.2
professional 11.1
expression 11.1
kin 11
cute 10.8
family 10.7
kid 10.6
brunette 10.5
two 10.2
elegance 10.1
indoor 10
posing 9.8
women 9.5
makeup 9.1
sport 9.1
sibling 9
fun 9
costume 8.8
looking 8.8
lifestyle 8.7
men 8.6
brother 8.5
one 8.2
plaything 8.1
corporate 7.7
bride 7.7
casual 7.6
tie 7.6
career 7.6
clothes 7.5
vintage 7.4
holding 7.4
executive 7.4
teen 7.3
domestic 7.3
teenager 7.3
make 7.3
celebration 7.2
romantic 7.1
work 7.1
businessman 7.1
look 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 99.1
text 97.4
human face 96.7
person 94.8
dress 92.7
clothing 92.7
dance 87.6
indoor 87.1
smile 80.4
woman 55.7
picture frame 9.7

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 8-18
Gender Female, 98.1%
Confused 0%
Surprised 0%
Happy 98.9%
Calm 0.7%
Disgusted 0%
Sad 0.2%
Angry 0%
Fear 0.1%

AWS Rekognition

Age 5-15
Gender Female, 96.9%
Sad 0.6%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
Confused 0.1%
Calm 0.2%
Happy 98.8%
Disgusted 0%

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Categories

Captions

Microsoft
created on 2019-11-16

a person sitting on a bed 30.6%
a person sitting in a room 30.5%
a person sitting on a bed 25.4%