Human Generated Data

Title

Untitled (parents holding children who are dressed up as king and queen)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15556

Machine Generated Data

Tags

Amazon
created on 2022-03-19

Human 98.8
Person 98.8
Accessories 98.6
Accessory 98.6
Jewelry 98.3
Person 98.1
Person 88.4
Crown 81.8
Tiara 76.2
Painting 76
Art 76
Clothing 70.6
Apparel 70.6

Imagga
created on 2022-03-19

child 37.6
portrait 29.8
adult 25.9
people 24.6
black 23.6
person 22.8
man 21.5
mother 21.5
love 21.3
sibling 21
happiness 20.4
sexy 20.1
happy 20.1
attractive 19.6
family 19.6
lady 19.5
male 19.4
baby 18.6
youth 17.9
couple 17.4
fashion 17.4
bow tie 17.2
model 17.1
face 17.1
neonate 15.8
girls 15.5
parent 15.1
women 15
smile 15
pretty 14.7
daughter 14.5
cute 14.4
hair 14.3
brother 14.2
body 13.6
kin 13.4
posing 13.3
brunette 13.1
boy 13
necktie 13
human 12.8
two 12.7
skin 12.7
dress 12.7
dark 12.5
father 12.5
together 12.3
clothing 12.3
hand 12.2
studio 11.4
expression 11.1
world 11
elegance 10.9
sensuality 10.9
smiling 10.9
one 10.5
eyes 10.3
relationship 10.3
lips 10.2
lifestyle 10.1
holding 9.9
blond 9.9
cheerful 9.8
kid 9.8
style 9.7
passion 9.4
suit 9
husband 8.7
erotic 8.7
lingerie 8.2
sensual 8.2
pose 8.2
childhood 8.1
garment 8
romantic 8
dad 8
looking 8
night 8
look 7.9
girlfriend 7.7
wife 7.6
fun 7.5
evening 7.5
retro 7.4
lovely 7.1

Google
created on 2022-03-19

Microsoft
created on 2022-03-19

person 99.1
human face 98.3
clothing 96.7
text 94.1
posing 93
old 85.5
standing 81.7
baby 79.3
toddler 78.6
smile 73.2
woman 68.3
black 65.9
dress 64.1
white 61.4
girl 60.8
clothes 18.1

Face analysis

AWS Rekognition

Age 0-3
Gender Male, 99.6%
Calm 84.8%
Sad 11.9%
Angry 1.5%
Disgusted 0.6%
Confused 0.6%
Happy 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Female, 99.9%
Disgusted 77.9%
Calm 18.1%
Sad 1.4%
Surprised 1.1%
Fear 0.6%
Angry 0.4%
Confused 0.3%
Happy 0.2%

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Painting 76%

Captions

Microsoft

a man and a woman posing for a photo 88.1%
a man and woman posing for a photo 84.9%
a person standing in front of a window posing for the camera 84.8%