Human Generated Data

Title

Untitled (two girls)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1250

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Art 96.3
Person 95.8
Human 95.8
Painting 83.1
Portrait 56.9
Face 56.9
Photography 56.9
Photo 56.9
Poster 55.1
Advertisement 55.1
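
The label scores above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such tags might be generated, assuming boto3 is installed with AWS credentials and a region configured; the file name used here is hypothetical:

# Minimal sketch: generate label tags with Amazon Rekognition DetectLabels.
# Assumes boto3 and configured AWS credentials/region; the file name
# "untitled_two_girls.jpg" is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_two_girls.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

# Each label carries a name and a confidence score (0-100),
# matching the "Art 96.3", "Person 95.8", ... entries above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')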

Imagga
created on 2022-01-23

child 82
family 48.1
father 43.4
kid 40.8
kin 39.3
happy 38.9
mother 37.6
parent 36.5
male 35
happiness 34.5
sibling 34.5
dad 34.1
boy 33.9
love 33.2
cute 32.3
childhood 32.3
portrait 31.7
smiling 29
son 28.6
smile 27.8
brother 27.7
little 27.4
cheerful 26.9
people 26.8
children 26.5
fun 26.2
baby 25.9
bow tie 24
together 23.7
man 23.6
casual 22.9
couple 20.9
joy 19.2
home 19.2
necktie 19
kids 18.9
daughter 18.5
face 18.5
adorable 18.5
adult 18.1
mom 17.5
youth 17.1
lifestyle 16.6
husband 16.2
two 16.1
person 15.9
toddler 15.3
holding 14.9
studio 14.5
clothing 14.2
girls 13.7
infant 13.5
attractive 13.3
togetherness 13.2
innocent 13
wife 12.3
group 12.1
looking 12
color 11.7
care 11.5
expression 11.1
boys 10.7
hug 10.7
innocence 10.6
indoors 10.6
loving 10.5
sit 10.4
black 10.3
sitting 10.3
relationship 10.3
parenthood 9.8
two people 9.7
garment 9.5
playful 9.5
men 9.5
life 9.4
offspring 9.3
indoor 9.1
human 9
parents 8.8
brunette 8.7
laugh 8.6
laughing 8.5
playing 8.2
women 7.9
sweet 7.9
mommy 7.9
standing 7.8
play 7.8
pretty 7.7
30s 7.7
generation 7.7
four 7.7
hand 7.6
horizontal 7.5
enjoy 7.5
friendship 7.5
leisure 7.5
mature 7.4
hold 7.4
joyful 7.4
handsome 7.1
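
The Imagga tags above come from an automated image-tagging service. A rough sketch of how such tags might be requested, assuming Imagga's v2 tagging endpoint with HTTP Basic authentication; the API credentials and image URL below are placeholders:

# Rough sketch: request tags from the Imagga v2 tagging endpoint.
# Assumes the requests library; the credentials and image URL are placeholders.
import requests

IMAGGA_API_KEY = "your_api_key"        # placeholder
IMAGGA_API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/untitled_two_girls.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
)
response.raise_for_status()

# Each tag has a confidence score (0-100), as in "child 82", "family 48.1", ...
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')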

Google
created on 2022-01-23

(no tags recorded)

Microsoft
created on 2022-01-23

human face 96.9
text 96.2
clothing 94.6
painting 94.3
drawing 91.1
indoor 90.5
person 88.6
sketch 86.8
portrait 68.4
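
The Microsoft tags above (human face, text, clothing, painting, ...) are the kind of output Azure's Computer Vision image-analysis service returns. A rough sketch using the v3.2 Analyze Image REST endpoint, assuming an Azure Computer Vision resource; the endpoint, subscription key, and image URL are placeholders:

# Rough sketch: image tags from the Azure Computer Vision "Analyze Image"
# REST endpoint. The endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "your_subscription_key"                        # placeholder
IMAGE_URL = "https://example.org/untitled_two_girls.jpg"          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    },
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Each tag has a name and a confidence in [0, 1]; the listing above shows
# the same kind of values expressed as percentages (e.g. "human face 96.9").
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')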

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 97.8%
Calm 74.9%
Sad 18.6%
Fear 2%
Confused 2%
Angry 1.3%
Surprised 0.5%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 4-12
Gender Female, 99.4%
Sad 52.4%
Calm 44.7%
Angry 0.8%
Confused 0.8%
Fear 0.5%
Surprised 0.3%
Disgusted 0.2%
Happy 0.1%
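
The two AWS Rekognition records above (an age range, a gender estimate, and emotion scores for each detected face) have the shape of output returned by Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming boto3 with configured AWS credentials and the same hypothetical local file as before:

# Minimal sketch: per-face age range, gender, and emotion estimates with
# Amazon Rekognition DetectFaces. Assumes boto3 and AWS credentials; the
# file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_two_girls.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned with confidence scores, as in "Calm 74.9%", "Sad 18.6%", ...
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')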

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 6
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
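
The Google Vision entries above report likelihood buckets (Very unlikely, Unlikely, ...) rather than numeric scores; that is how the Cloud Vision face detection API expresses joy, sorrow, anger, surprise, headwear, and blur. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up; the file name is again a placeholder:

# Minimal sketch: per-face likelihood buckets with Google Cloud Vision
# face detection. Assumes google-cloud-vision is installed and
# GOOGLE_APPLICATION_CREDENTIALS is configured; the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_two_girls.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY, UNLIKELY, POSSIBLE,
    # LIKELY, VERY_LIKELY), i.e. the "Very unlikely" / "Unlikely" buckets above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)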

Feature analysis

Amazon

Person 95.8%
Painting 83.1%