Human Generated Data

Title

Untitled (young girls sitting on grass eating lunch)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7852

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.7
Human 98.7
Person 98.1
Person 97.6
Person 96.6
Person 94.9
Reading 92.7
Blonde 90.2
Woman 90.2
Female 90.2
Girl 90.2
Teen 90.2
Kid 90.2
Child 90.2
Outdoors 72.3
Furniture 69
People 66.9
Room 62.2
Indoors 62.2
Classroom 59.5
School 59.5
Sitting 59.3
Couch 59
Countryside 58.5
Nature 58.5
Apparel 57.4
Clothing 57.4

Imagga
created on 2022-01-09

people 32.9
man 32.2
male 30.5
person 28.4
happy 22.5
senior 19.7
together 19.3
adult 19.1
work 18.8
laptop 18.4
outdoor 18.3
men 18
working 17.7
smile 17.1
worker 17.1
computer 16.9
couple 16.5
business 16.4
home 15.9
lifestyle 15.9
park 15.6
outdoors 15.3
smiling 15.2
table 14.7
portrait 14.2
spectator 14.2
job 14.1
businessman 14.1
room 13.8
sitting 13.7
elderly 13.4
office 13
retired 12.6
old 12.5
teenager 11.8
businesswoman 11.8
happiness 11.7
professional 11.7
talking 11.4
mature 11.2
outside 11.1
women 11.1
two 11
kin 11
communication 10.9
family 10.7
group 10.5
newspaper 10.4
meeting 10.4
grass 10.3
teamwork 10.2
team 9.8
retirement 9.6
relax 9.3
leisure 9.1
fun 9
technology 8.9
mother 8.8
indoors 8.8
student 8.8
love 8.7
corporate 8.6
casual 8.5
study 8.4
phone 8.3
teen 8.3
successful 8.2
care 8.2
teacher 8.1
executive 8.1
uniform 8.1
success 8
day 7.8
child 7.8
discussion 7.8
color 7.8
blond 7.8
colleagues 7.8
older 7.8
summer 7.7
reading 7.6
relaxed 7.5
glasses 7.4
relaxing 7.3
product 7.3
building 7.3
classroom 7.2
suit 7.2
looking 7.2
engineer 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 91.2
text 85.3
clothing 85
black and white 69.4
man 62.8

Face analysis

AWS Rekognition

Age 12-20
Gender Female, 88.2%
Happy 89.8%
Sad 2.9%
Calm 2.7%
Confused 1.9%
Disgusted 0.9%
Fear 0.7%
Angry 0.7%
Surprised 0.5%

AWS Rekognition

Age 25-35
Gender Female, 74.9%
Happy 82.3%
Surprised 7%
Calm 4.9%
Sad 2%
Confused 1.7%
Disgusted 0.8%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 33-41
Gender Female, 98.5%
Calm 96.2%
Sad 1.8%
Happy 1%
Confused 0.4%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Male, 86.6%
Fear 93.1%
Happy 2.7%
Calm 2.3%
Sad 1.1%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a group of people around each other 77%
a group of people sitting on the ground 76.9%
a group of people sitting at a table 70.5%

Text analysis

Amazon

УТЭЗА-О