Human Generated Data

Title

Untitled (seated girl with baby)

Date

1930s

People

Artist: Jack Delano, American, 1914-1997

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.347

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Furniture 100
Person 98.8
Human 98.8
Chair 98.7
Person 97.2
Bar Stool 89.4
Person 82.1
Sitting 75.3
Text 58.6
Art 57.3
Drawing 56.2
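
The Amazon numbers above are label-detection confidence scores (0-100), presumably from AWS Rekognition's DetectLabels operation. A minimal boto3 sketch is below; the file name, label limit, and confidence threshold are assumptions, not the settings used to generate the tags above.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("delano_seated_girl_with_baby.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # the tag list above bottoms out in the mid-50s
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")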

Clarifai
created on 2023-10-15

people 99.9
child 99.4
chair 98.7
furniture 98.4
seat 97.6
portrait 97.5
monochrome 95.6
two 95.2
art 94.8
sit 93.8
wear 92.6
son 92.1
adult 91.8
family 91.6
one 91.5
documentary 91.5
vintage 91.2
easel 91
retro 89.9
group 89.8
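
The Clarifai tags appear to come from a general image-recognition model. A hedged sketch of a prediction request against Clarifai's v2 REST API follows; the access token, model identifier, and image URL are placeholders, and the exact endpoint path should be checked against Clarifai's current documentation.

import requests

CLARIFAI_PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
# Assumed public general model; verify the model identifier in Clarifai's docs.
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.org/delano_seated_girl_with_baby.jpg"}}}
    ]
}
resp = requests.post(
    url,
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json=payload,
)
# Clarifai reports concept values on a 0-1 scale; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")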

Imagga
created on 2021-12-14

mother 27.7
child 26.4
people 23.4
happy 23.2
family 22.2
portrait 22
adult 21.4
chair 20.5
person 20.2
sitting 19.8
male 19.3
man 18.1
happiness 18
parent 17.9
casual 17.8
sibling 17.2
couple 16.5
kin 16.5
attractive 16.1
home 16
smiling 15.9
dress 15.4
interior 15
lifestyle 14.5
rocking chair 14.4
cute 13.6
fashion 13.6
smile 13.5
father 13.5
love 13.4
kid 13.3
room 13
cheerful 13
two 12.7
model 12.4
boy 12.2
seat 12
fun 12
women 11.9
life 11.6
brother 11.4
sexy 11.2
old 11.1
clothing 11.1
girls 10.9
posing 10.7
couch 10.6
together 10.5
youth 10.2
furniture 10.2
indoor 10
house 10
holding 9.9
style 9.6
husband 9.5
daughter 9.4
son 9.2
playing 9.1
children 9.1
pretty 9.1
black 9
one 9
handsome 8.9
call 8.9
looking 8.8
body 8.8
indoors 8.8
brunette 8.7
cell 8.6
face 8.5
elegance 8.4
vintage 8.3
20s 8.2
lady 8.1
childhood 8.1
hair 7.9
domestic 7.6
wife 7.6
laughing 7.6
window 7.6
relaxation 7.5
enjoyment 7.5
human 7.5
sensuality 7.3
stylish 7.2
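
Tags like the Imagga list above can be requested from Imagga's v2 tagging endpoint, which uses HTTP Basic authentication with an API key and secret. The credentials and image URL below are placeholders.

import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
image_url = "https://example.org/delano_seated_girl_with_baby.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
# Imagga already reports confidence on a 0-100 scale.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")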

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 96.8
drawing 89.5
chair 83.9
clothing 83.3
furniture 83.3
child 82
person 80.8
black and white 79.8
sketch 74.7
toddler 65.3
posing 64.9
cartoon 60.2
footwear 56.3
human face 54
old 46.7
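
The Microsoft tags match the output of Azure Computer Vision's image-tagging operation. A hedged sketch against the REST endpoint is below; the resource endpoint, key, API version, and image URL are placeholders and should be checked against the Azure documentation for the deployed version.

import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",  # assumed API version
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/delano_seated_girl_with_baby.jpg"},
)
# Azure reports confidence on a 0-1 scale; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")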

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 75.7%
Calm 57.3%
Happy 28%
Disgusted 9.8%
Surprised 2.8%
Confused 0.9%
Angry 0.7%
Sad 0.4%
Fear 0.1%

AWS Rekognition

Age 0-3
Gender Female, 63%
Calm 93.1%
Surprised 5.2%
Sad 0.7%
Confused 0.5%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%
Fear 0.1%
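
The two blocks above (age range, gender, and emotion probabilities) correspond to AWS Rekognition's DetectFaces operation with all attributes requested. A minimal boto3 sketch with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("delano_seated_girl_with_baby.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")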

Microsoft Cognitive Services

Age 1
Gender Male

Microsoft Cognitive Services

Age 11
Gender Female
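
The Microsoft Cognitive Services estimates correspond to the Azure Face API with age and gender attributes requested. A hedged sketch with the legacy Python SDK follows; endpoint, key, and file name are placeholders, and note that Microsoft has since retired age and gender inference, so this reflects the API as it existed when these annotations were made.

from azure.cognitiveservices.vision.face import FaceClient
from azure.cognitiveservices.vision.face.models import FaceAttributeType
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_FACE_KEY"),          # placeholder key
)

with open("delano_seated_girl_with_baby.jpg", "rb") as image:
    faces = face_client.face.detect_with_stream(
        image,
        return_face_attributes=[FaceAttributeType.age, FaceAttributeType.gender],
    )

for face in faces:
    attrs = face.face_attributes
    print("Age", round(attrs.age), "Gender", attrs.gender)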

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
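
The Google Vision rows are face-detection likelihoods (surprise, anger, sorrow, joy, headwear, blur) reported as categorical values rather than scores. A minimal sketch with the google-cloud-vision client and a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("delano_seated_girl_with_baby.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values map to these labels (index 0 is UNKNOWN).
likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])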

Feature analysis

Amazon

Person 98.8%
Chair 98.7%

Text analysis

Amazon

BASTER
il
PROGUET
THE EMOILEST PROGUET
BASKZTZ
THE EMOILEST
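
The strings above are raw OCR output, presumably from AWS Rekognition's DetectText operation on the photograph. A minimal boto3 sketch with a hypothetical file name:

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("delano_seated_girl_with_baby.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries are whole detected lines; WORD entries are individual tokens.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))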