Human Generated Data

Title

Catherine Wells Oppel and William Wells Oppel

Date

1904

People

Artist: F. W. Curtiss, American, 19th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1802

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 98.5
Person 98.2
Face 97.5
Person 97.5
Baby 87.7
Newborn 84.5
Female 83.1
Dress 82.8
Clothing 82.8
Apparel 82.8
Portrait 78.1
Photography 78.1
Photo 78.1
Art 76.4
Drawing 72.8
Floor 70.6
Girl 69.4
Smile 65.7
Kid 64.6
Child 64.6
Woman 63.9
People 61.9
Sketch 58.7
Wood 56.8
Head 56.3

Imagga
created on 2021-12-14

portrait 36.9
happy 35.7
child 35.3
people 30.7
baby 29.1
senior 29
smile 28.5
mother 28.4
person 28
family 26.7
face 26.3
smiling 24.6
cute 24.4
adult 23.9
little 23.8
hair 23.8
blond 23.3
love 22.1
elderly 22
old 21.6
childhood 21.5
retired 21.3
retirement 21.1
fun 20.9
male 20.7
happiness 19.6
mature 19.5
grandma 19.3
couple 17.4
human 17.2
kid 16.8
home 16.7
pretty 16.1
together 15.8
eyes 15.5
man 15.5
innocent 15
boy 14.8
infant 14.5
health 13.9
adorable 13.8
cheerful 13.8
grandmother 13.7
toddler 13.4
day 13.3
parent 13.2
care 13.2
daughter 13
life 12.6
healthy 12.6
joy 12.5
lifestyle 12.3
bathroom 11.7
clean 11.7
newborn 11.7
innocence 11.5
bath 11.4
looking 11.2
women 11.1
gray 10.8
hand 10.6
husband 10.5
one 10.4
sweet 10.3
expression 10.2
tub 10.1
lady 9.7
older 9.7
look 9.6
body 9.6
wife 9.5
sitting 9.4
bathtub 9.3
two 9.3
head 9.2
close 9.1
children 9.1
active 9
grandparent 8.9
affection 8.7
married 8.6
laugh 8.6
holiday 8.6
loving 8.6
laughing 8.5
attractive 8.4
grandfather 8.4
camera 8.3
holding 8.3
girls 8.2
aged 8.1
dress 8.1
romance 8
water 8
soap 7.8
play 7.8
affectionate 7.7
vessel 7.6
casual 7.6
skin 7.6
studio 7.6
bubble 7.5
leisure 7.5
park 7.4
father 7.3
wet 7.2
romantic 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.6
human face 99.2
person 98.9
clothing 98.1
baby 97.8
toddler 96.2
smile 95.5
child 94.5
old 71.2
posing 65.4
vintage 52.1

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 78.2%
Calm 55.2%
Happy 39.5%
Confused 1.9%
Sad 1.2%
Disgusted 0.7%
Angry 0.6%
Surprised 0.6%
Fear 0.2%

AWS Rekognition

Age 1-7
Gender Female, 69%
Calm 81.3%
Happy 15.3%
Confused 1.5%
Surprised 1.2%
Sad 0.3%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

Microsoft Cognitive Services

Age 1
Gender Male

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a vintage photo of a person 54%
a vintage photo of a person 53.9%
a vintage photo of a person holding a sign 25.9%