Human Generated Data

Title

Untitled (woman in long striped dress seated with two children in armchair by window and table)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12935

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-11-18

Human 99.2
Person 99.2
Chair 98.4
Furniture 98.4
Person 95.3
Sitting 93.6
Couch 89.7
Apparel 88.6
Clothing 88.6
Room 88.4
Indoors 88.4
Home Decor 88.4
Flooring 83.9
Floor 75.8
Living Room 74.7
Footwear 70.8
Shoe 70.8
Shoe 66.3
People 63.7
Wood 56.4
Curtain 56.4
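
The Amazon tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 is shown below; the file name and thresholds are illustrative, not taken from this record.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# "photo.jpg", MaxLabels, and MinConfidence are illustrative placeholders.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50.0,
    )

for label in response["Labels"]:
    # e.g. "Human 99.2", "Chair 98.4", ...
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```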

Clarifai
created on 2019-11-18

people 99.8
woman 96.3
adult 95.3
man 93.4
wear 92.7
two 92.2
group 90.2
monochrome 89.1
administration 88.2
furniture 87.8
dress 86.2
one 85.4
wedding 85
group together 84.7
chair 84.3
actress 83.6
child 83.5
three 81
room 79.6
street 78.1
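
The Clarifai tags above correspond to concept predictions from Clarifai's v2 predict endpoint. A hedged sketch follows; the API key, model ID, and image URL are placeholders, so consult Clarifai's documentation for current values.

```python
# Sketch of a Clarifai v2 "predict" call returning concept tags.
# API_KEY, MODEL_ID, and the image URL are placeholders (assumptions).
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Values are 0-1; scaled to match the list above, e.g. "people 99.8"
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```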

Imagga
created on 2019-11-18

musical instrument 60
accordion 33.9
wind instrument 29.5
people 27.9
keyboard instrument 27.2
man 25.5
male 23.4
person 22.6
couple 22.6
family 22.2
adult 21.3
happiness 20.4
home 19.1
happy 18.8
lifestyle 17.3
percussion instrument 17.3
love 16.6
child 16.4
portrait 16.2
smiling 15.9
room 15.2
teacher 15.1
women 15
pretty 14.7
concertina 14.4
two 14.4
kin 13.8
holding 13.2
indoors 13.2
boy 13
cheerful 13
together 12.3
lady 12.2
mother 12.1
fashion 12.1
men 12
free-reed instrument 12
attractive 11.9
husband 11.4
device 10.9
interior 10.6
human 10.5
wife 10.4
dress 9.9
kid 9.7
fun 9.7
clothing 9.7
loving 9.5
sitting 9.4
stringed instrument 9.4
senior 9.4
youth 9.4
chair 9.3
face 9.2
old 9.1
color 8.9
bride 8.6
smile 8.5
togetherness 8.5
mature 8.4
joy 8.3
style 8.2
father 8.1
romance 8
to 8
cute 7.9
standing 7.8
black 7.8
couch 7.7
educator 7.7
married 7.7
house 7.5
professional 7.5
domestic 7.3
back 7.3
business 7.3
new 7.3
aged 7.2
sexy 7.2
looking 7.2
holiday 7.2
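
Imagga's tag/confidence pairs come from its /v2/tags endpoint. A sketch under assumed credentials follows; the key, secret, and image URL are placeholders.

```python
# Sketch of an Imagga /v2/tags request yielding tag/confidence pairs.
# API_KEY, API_SECRET, and the image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # e.g. "musical instrument 60", "accordion 33.9", ...
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```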

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

wall 98.2
clothing 96
furniture 92.9
floor 92
indoor 91.9
person 90.2
dress 83.1
woman 75.7
chair 73.5
text 72.5
table 69.2
black and white 66
footwear 63.8
smile 52.5
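
The Microsoft tags above match the output of the Azure Computer Vision tag operation. A sketch follows; the endpoint, key, and image URL are placeholders, and the API version shown (v3.2) postdates this record's 2019 run, which would have used v2.0.

```python
# Sketch of the Azure Computer Vision "tag" operation.
# ENDPOINT, KEY, and the image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is 0-1; scaled to match the list above, e.g. "wall 98.2"
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```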

Color Analysis

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 98.9%
Surprised 0.1%
Angry 0.1%
Happy 97.3%
Fear 0%
Calm 1.9%
Disgusted 0.1%
Sad 0%
Confused 0.4%

AWS Rekognition

Age 0-3
Gender Female, 51.1%
Confused 45.3%
Angry 45.5%
Surprised 45.1%
Calm 46.8%
Disgusted 46.4%
Fear 45.4%
Sad 46%
Happy 49.6%

AWS Rekognition

Age 20-32
Gender Female, 54.6%
Disgusted 45%
Surprised 45%
Happy 45.1%
Confused 45%
Sad 45%
Fear 45%
Angry 45%
Calm 54.9%
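
The three AWS Rekognition results above (age range, gender, emotions) have the shape of a DetectFaces response with all attributes requested. A minimal boto3 sketch follows; the file name is a placeholder.

```python
# Sketch: face attribute estimation with Amazon Rekognition DetectFaces.
# "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # e.g. "Happy 97.3%", "Calm 1.9%", ...
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```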

Microsoft Cognitive Services

Age 1
Gender Female

Microsoft Cognitive Services

Age 4
Gender Male
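
The Microsoft age/gender estimates above come from the Azure Face API's detect operation. A sketch of the 2019-era call follows; the endpoint, key, and image URL are placeholders, and Microsoft has since restricted access to these facial attributes.

```python
# Sketch of an Azure Face API detect call requesting age and gender,
# as available circa 2019. ENDPOINT, KEY, and the URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    # e.g. "Age 1, Gender Female"
    print(f'Age {attrs["age"]:.0f}, Gender {attrs["gender"].title()}')
```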

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
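
The Google Vision likelihood ratings above (Joy, Sorrow, Anger, Surprise, Headwear, Blurred) map to the face-detection annotations in the Cloud Vision client library. A sketch follows, assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS; the file name is a placeholder.

```python
# Sketch: face detection with the Google Cloud Vision client library.
# "photo.jpg" is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enums: VERY_UNLIKELY ... VERY_LIKELY
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```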

Feature analysis

Amazon

Person 99.2%
Shoe 70.8%
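
The feature-analysis entries above (Person 99.2%, Shoe 70.8%) plausibly correspond to per-instance detections in Rekognition's DetectLabels response, which includes bounding boxes for certain labels. A sketch follows; the file name is a placeholder.

```python
# Sketch: per-instance label detections (with bounding boxes) from
# Amazon Rekognition DetectLabels. "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at ({box["Left"]:.2f}, {box["Top"]:.2f})')
```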

Captions