Human Generated Data

Title

Untitled (school girl)

Date

c. 1920

People

Artist: Lewis Wickes Hine, American, 1874–1940

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.811

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.5
Apparel 99.5
Person 98.3
Human 98.3
Face 91.7
Head 81.7
Advertisement 75.3
Poster 72.5
Photography 69.7
Photo 69.7
Portrait 69.7
Hood 69.5
Fashion 60.7
Cloak 58.3
Hat 58
Collage 58
Blanket 56
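
The tag/score pairs in the Amazon section above have the shape of Amazon Rekognition DetectLabels output (label name plus a 0-100 confidence). A minimal sketch of how such tags could be retrieved, assuming boto3 with configured AWS credentials and a hypothetical local copy of the photograph:

```python
# Sketch: label tags like those above via Amazon Rekognition's DetectLabels API.
# The file name is a placeholder, not the museum's actual source file.
import boto3

client = boto3.client("rekognition")  # uses your configured AWS credentials/region

with open("untitled_school_girl.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,      # the list above shows roughly this many tags
    MinConfidence=50,  # drop low-confidence labels
)

# Each label carries a name and a 0-100 confidence, matching the
# "Clothing 99.5", "Apparel 99.5", ... pairs above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```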

Clarifai
created on 2023-10-25

portrait 99.9
people 99.5
one 99.1
child 98.8
girl 98.1
monochrome 96.7
baby 95.9
smile 94.5
face 93.7
adult 93.4
son 93.2
wear 92.3
woman 91.9
art 91.4
person 91.2
model 88.9
beautiful 88.4
display 87.4
man 87.4
vintage 87.3
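
The Clarifai list above pairs concept names with scores that read as percentages. A hedged sketch of a comparable request, assuming Clarifai's v2 REST predict endpoint and its general image-recognition model; the API key, model id, and image URL are placeholders:

```python
# Sketch: requesting concept tags from Clarifai's v2 predict REST endpoint.
# API key, model id, and image URL below are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"     # assumed id of Clarifai's general model
IMAGE_URL = "https://example.org/untitled_school_girl.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 score; scaling by 100 gives values
# in the same form as the "portrait 99.9", "people 99.5", ... list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')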

Imagga
created on 2021-12-14

portrait 29.8
person 29.6
sexy 26.5
people 24.5
face 23.4
child 23.1
attractive 23.1
adult 22
fashion 21.1
model 21
black 20.6
pretty 20.3
blond 18.5
body 18.4
hair 17.4
lady 17
looking 16
cute 15.8
happy 15.7
skin 15.2
smile 15
sensuality 14.5
smiling 14.5
human 14.2
women 14.2
baby 13.9
eyes 13.8
elegance 13.4
love 13.4
man 13.2
male 13
studio 12.9
posing 12.4
lifestyle 12.3
hand 11.6
couple 11.3
one 11.2
lips 11.1
expression 11.1
youth 11.1
gorgeous 10.9
holding 10.7
lovely 10.7
erotic 10.5
brunette 10.5
hands 10.4
mother 10.4
closeup 10.1
computer 10
make 10
antique 9.7
style 9.6
laptop 9.6
wife 9.5
happiness 9.4
head 9.2
wedding 9.2
dress 9
bride 8.9
home 8.8
boy 8.7
two 8.5
modern 8.4
feminine 8.4
makeup 8.3
vintage 8.3
representation 8.3
girls 8.2
creation 8.1
family 8
clothing 7.9
look 7.9
sexual 7.7
reflection 7.7
old 7.7
marriage 7.6
thinking 7.6
relationship 7.5
cosmetics 7.5
20s 7.3
cheerful 7.3
student 7.2
kid 7.1
little 7.1
work 7.1
snapshot 7
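
The Imagga list above has the same tag-plus-confidence form. A hedged sketch, assuming Imagga's /v2/tags REST endpoint with HTTP Basic authentication; the credentials and image URL are placeholders:

```python
# Sketch: fetching tags from Imagga's /v2/tags endpoint (assumed REST shape).
# API key/secret and image URL below are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/untitled_school_girl.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
    timeout=30,
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, the form of the
# "portrait 29.8", "person 29.6", ... list above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```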

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

human face 98.4
wall 98.1
person 97.7
clothing 97.5
gallery 94.2
indoor 91.9
smile 91.7
toddler 86.5
text 84.8
baby 82.8
room 76.7
black and white 54.2
picture frame 28.5
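
The Microsoft tags above (name plus confidence) match the output shape of Azure Computer Vision's Analyze Image operation. A hedged sketch, assuming the v3.2 REST endpoint with the Tags visual feature; the endpoint, key, and image URL are placeholders:

```python
# Sketch: requesting image tags from Azure Computer Vision's Analyze Image
# REST endpoint (v3.2 assumed). Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/untitled_school_girl.jpg"      # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Tags carry a name and a 0-1 confidence; scaled by 100 they match the
# "human face 98.4", "wall 98.1", ... values above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```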

Color Analysis

Face analysis

AWS Rekognition

Age 13-25
Gender Female, 98.8%
Happy 78.1%
Calm 19.6%
Sad 0.7%
Surprised 0.4%
Fear 0.4%
Disgusted 0.3%
Confused 0.3%
Angry 0.2%

AWS Rekognition

Age 17-29
Gender Female, 66.7%
Calm 88.8%
Happy 6.4%
Sad 1.4%
Angry 1.4%
Surprised 0.7%
Confused 0.5%
Disgusted 0.4%
Fear 0.4%

AWS Rekognition

Age 19-31
Gender Female, 69%
Disgusted 43.4%
Calm 28.9%
Surprised 8%
Angry 6.9%
Happy 6.2%
Confused 5.5%
Sad 0.6%
Fear 0.5%
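
The three AWS Rekognition blocks above (age range, gender with confidence, and emotion percentages) have the shape of Amazon Rekognition DetectFaces output with all face attributes requested. A minimal sketch, assuming boto3 and a hypothetical local copy of the photograph:

```python
# Sketch: face attributes like those above via Amazon Rekognition's DetectFaces API.
# The file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("untitled_school_girl.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotion confidences are reported per emotion and need not sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```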

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
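
The Google Vision rows above report likelihood buckets (Very unlikely through Very likely) for joy, sorrow, anger, surprise, headwear, and blur, which is the shape of the face_detection call in the google-cloud-vision client library. A hedged sketch, assuming a recent version of that library and a hypothetical local copy of the photograph:

```python
# Sketch: face likelihood buckets like those above via google-cloud-vision.
# The file name is a placeholder; credentials come from your environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_school_girl.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # which the page renders as "Very unlikely" / "Very likely".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```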

Feature analysis

Amazon

Person 98.3%