Human Generated Data

Title

Untitled (young girl giving plate of food to man seated on couch)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10548

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.4
Apparel 99.4
Human 99
Person 99
Person 96.4
Furniture 86
Couch 86
Hat 76.5
Sun Hat 70.8
Female 60.2
Wood 59.3
Food 57.4
Meal 57.4
Indoors 56.9
Living Room 56.9
Room 56.9

Imagga
created on 2022-01-09

person 37.9
home 35.1
people 31.8
man 30.9
indoors 28.1
senior 27.2
male 26.2
adult 25.3
room 25.2
happy 23.2
patient 22.8
sitting 20.6
salon 19.8
lifestyle 19.5
casual 17.8
smiling 16.6
elderly 16.3
old 15.3
portrait 14.9
happiness 14.9
mature 14.9
indoor 14.6
couple 13.9
camera 13.9
retired 13.6
case 13.4
family 13.3
interior 13.3
sick person 13.1
cheerful 13
men 12.9
house 12.5
retirement 12.5
office 12.2
chair 12.1
computer 12.1
smile 11.4
living 11.4
nurse 10.9
health 10.4
business 10.3
mother 10.3
horizontal 10
holding 9.9
70s 9.8
60s 9.8
lady 9.7
table 9.7
looking 9.6
day 9.4
clothing 9.3
laptop 9.2
joy 9.2
grandfather 9.1
kin 9.1
working 8.8
together 8.8
two people 8.7
women 8.7
couch 8.7
husband 8.6
face 8.5
adults 8.5
modern 8.4
life 8.4
child 8.4
alone 8.2
facing camera 7.9
work 7.8
sixties 7.8
grandmother 7.8
desk 7.8
pretty 7.7
30s 7.7
newspaper 7.7
two 7.6
females 7.6
togetherness 7.5
leisure 7.5
one 7.5
vintage 7.4
phone 7.4
inside 7.4
school 7.3
kitchen 7.2
handsome 7.1
love 7.1
kid 7.1
businessman 7.1
antique 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.6
person 90
outdoor 86.6
clothing 85.4
furniture 69.9

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 51.8%
Happy 40.3%
Calm 25.9%
Sad 9.2%
Confused 7.5%
Surprised 5.2%
Fear 5.1%
Angry 3.5%
Disgusted 3.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a man sitting in front of a window 77.2%
a man sitting in front of a building 77.1%
a man sitting in front of a store window 72%

Text analysis

Amazon

20
20 530.
20330.
530.

Google

Lo330. 20 530.
20
Lo330.
530.