Human Generated Data

Title

Untitled (portrait of woman with three girls on living room couch, Philadelphia)

Date

c. 1940, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12210

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.4
Person 99.4
Person 99.4
Furniture 99.3
Person 98.8
Couch 95.5
Clothing 95.4
Apparel 95.4
Person 94.4
People 93.5
Living Room 86.3
Indoors 86.3
Room 86.3
Family 84.3
Female 72.5
Sitting 63.9
Girl 61
Shorts 59.8
Flooring 56.7
Shoe 55.4
Footwear 55.4
Woman 55.2

Clarifai
created on 2019-11-16

people 99.8
woman 98.5
group 97.8
adult 97.5
portrait 97.3
child 95.8
two 94.8
family 91.7
girl 90.9
wear 90.4
man 90.2
one 89.5
music 89.2
furniture 86.6
room 85.7
seat 85.4
street 85.2
movie 85.2
group together 85.2
offspring 83.9

Imagga
created on 2019-11-16

kin 82.4
mother 51.1
family 40.9
couple 39.2
man 35.6
home 35.1
male 35.1
happy 33.2
together 31.6
people 31.3
child 29.4
father 28.7
adult 27.5
happiness 27.4
love 26.8
smiling 26.8
parent 26.3
room 26
couch 24.2
portrait 24
interior 23.9
lifestyle 22.4
sitting 21.5
son 21.4
daughter 21.1
attractive 20.3
boy 19.1
casual 17.8
indoors 17.6
cheerful 17.1
indoor 16.4
smile 16.4
sofa 16.3
husband 15.4
loving 15.3
two 15.3
togetherness 15.1
dad 15
brother 14.8
children 14.6
30s 14.4
fun 14.2
dress 13.6
kid 13.3
holding 13.2
person 12.4
cute 12.2
women 11.9
parents 11.7
house 11.7
chair 11.5
wife 11.4
living 11.4
group 11.3
looking 11.2
playing 10.9
face 10.7
fashion 10.6
jeans 10.5
relationship 10.3
camera 10.2
domestic 10.1
relaxing 10
clothing 10
childhood 9.9
handsome 9.8
pretty 9.8
old 9.8
affectionate 9.7
aged 9.1
romance 8.9
color 8.9
mom 8.7
hug 8.7
affection 8.7
comfort 8.7
youth 8.5
sit 8.5
horizontal 8.4
joy 8.4
girls 8.2
lady 8.1
sexy 8
grandfather 8
holiday 7.9
living room 7.8
their 7.8
bonding 7.8
play 7.8
men 7.7
married 7.7
laugh 7.7
laughing 7.6
dark 7.5
enjoyment 7.5
vintage 7.4
mature 7.4
teen 7.4
20s 7.3
teenager 7.3
romantic 7.1
model 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 98.9
clothing 98.1
human face 97.6
smile 97.2
text 91.2
family 86
woman 84.9
furniture 52.7
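Each service above returns a flat list of label/confidence pairs, printed here as "label score" lines. A minimal sketch of parsing such lines into structured pairs and filtering by a confidence threshold (the threshold value and the whitespace-separated line format are assumptions about how one might process this listing, not part of the source record):

```python
def parse_tags(lines, threshold=90.0):
    """Parse 'label confidence' lines into (label, score) pairs.

    Labels may contain spaces (e.g. 'Living Room 86.3'), so the
    confidence is taken from the last whitespace-separated token.
    Lines that do not end in a number are skipped.
    """
    tags = []
    for line in lines:
        parts = line.rsplit(None, 1)
        if len(parts) != 2:
            continue
        label, score = parts
        try:
            score = float(score)
        except ValueError:
            continue
        if score >= threshold:
            tags.append((label, score))
    return tags

# Example using tags from the Amazon list above:
sample = ["Human 99.4", "Living Room 86.3", "Couch 95.5"]
print(parse_tags(sample))  # → [('Human', 99.4), ('Couch', 95.5)]
```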

Color Analysis

Face analysis

AWS Rekognition

Age 32-48
Gender Female, 99.5%
Fear 0.1%
Confused 0.3%
Calm 96.5%
Sad 1%
Disgusted 0.7%
Happy 0.7%
Surprised 0.2%
Angry 0.6%

AWS Rekognition

Age 4-12
Gender Female, 55%
Surprised 45%
Calm 52.2%
Fear 45%
Disgusted 45%
Happy 45%
Angry 47.4%
Sad 45.3%
Confused 45%

AWS Rekognition

Age 4-12
Gender Female, 54.6%
Angry 45.1%
Calm 52.2%
Sad 47.6%
Happy 45%
Fear 45%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 2-8
Gender Female, 52.3%
Confused 45%
Disgusted 45%
Surprised 45%
Calm 45%
Angry 45%
Sad 55%
Happy 45%
Fear 45%
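Each AWS Rekognition face block above reports one confidence value per emotion; the predicted emotion for a face is simply the highest-scoring entry. A minimal sketch (the dictionary below copies the values from the first face block above):

```python
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Values from the first AWS Rekognition face block above:
face1 = {"Fear": 0.1, "Confused": 0.3, "Calm": 96.5, "Sad": 1.0,
         "Disgusted": 0.7, "Happy": 0.7, "Surprised": 0.2, "Angry": 0.6}
print(dominant_emotion(face1))  # → ('Calm', 96.5)
```

Note that the three child faces cluster near a 45% floor across all emotions, so their dominant emotions (Calm, Calm, Sad) are much less decisive than the adult's.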

Microsoft Cognitive Services

Age 40
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Couch 95.5%
Shoe 55.4%

Text analysis

Amazon

672)