Human Generated Data

Title

Untitled (group of children playing on living room floor)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10511

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 98.8
Person 98.3
Person 98.2
Clinic 97.1
Person 92.3
Person 86.1
Person 82
Indoors 80.9
Room 80.9
People 69
Furniture 62.6
Clothing 59.9
Apparel 59.9
Hospital 58.1
Operating Theatre 57.9
Doctor 57.2
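
The label/score pairs above are the kind of output the AWS Rekognition DetectLabels API produces. A minimal sketch of how such tags could be generated, assuming a local copy of the image (the file name and MinConfidence threshold below are placeholders, not values from the museum's actual pipeline):

```python
# Sketch only: fetch labels for a local image with AWS Rekognition DetectLabels.
# The file name and MinConfidence value are assumptions for illustration.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.10511.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # roughly matches the lowest scores listed above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```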

Imagga
created on 2022-01-09

sax 44.7
wind instrument 24.4
man 21.7
glass 20.8
people 20.1
brass 19.9
table 19.1
person 17.4
male 16.3
cornet 14.9
home 14.3
room 14.3
wedding 13.8
laboratory 13.5
musical instrument 13.3
interior 13.3
medical 13.2
men 12.9
chemistry 12.5
business 12.1
party 12
work 11.8
lab 11.7
research 11.4
cheerful 11.4
event 11.1
team 10.7
chair 10.7
chemical 10.6
working 10.6
setting 10.6
businessman 10.6
office 10.5
biology 10.4
wine 10.4
instrument 10.3
biotechnology 9.8
experiment 9.7
medicine 9.7
group 9.7
sitting 9.4
dinner 9.4
smiling 9.4
happy 9.4
worker 9.2
indoors 8.8
device 8.7
happiness 8.6
music 8.6
dining 8.6
development 8.5
food 8.5
black 8.4
modern 8.4
drink 8.3
groom 8.1
decoration 8
water 8
family 8
adult 8
celebration 8
restaurant 7.9
banquet 7.8
scene 7.8
elegant 7.7
test 7.7
flower 7.7
bride 7.7
equipment 7.6
sign 7.5
house 7.5
fun 7.5
silhouette 7.4
teamwork 7.4
desk 7.4
light 7.3
life 7.3
lifestyle 7.2
professional 7.2
romance 7.1
love 7.1
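
Tags like the Imagga list above can be requested from Imagga's public tagging endpoint. A minimal sketch, assuming HTTP Basic auth with an API key/secret pair and a publicly reachable image URL (the credentials and URL below are placeholders; the response fields follow Imagga's documented v2 format):

```python
# Sketch only: request image tags from the Imagga /v2/tags endpoint.
# API_KEY, API_SECRET, and image_url are placeholders.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
image_url = "https://example.org/image.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```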

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99
person 69
cartoon 61.6
clothing 56.3
drawing 54.2
room 40.6

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 63.2%
Calm 47.4%
Sad 22%
Angry 17%
Confused 7.8%
Disgusted 3.3%
Surprised 1.4%
Happy 0.5%
Fear 0.5%

AWS Rekognition

Age 41-49
Gender Male, 98.7%
Calm 78.8%
Sad 13.2%
Angry 2.5%
Happy 1.4%
Confused 1.3%
Surprised 1.2%
Fear 0.9%
Disgusted 0.7%

AWS Rekognition

Age 39-47
Gender Male, 56.1%
Calm 99.8%
Sad 0.1%
Happy 0%
Angry 0%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 66.7%
Calm 78.5%
Happy 5.6%
Angry 4.2%
Surprised 3.8%
Disgusted 3.6%
Sad 3%
Fear 0.7%
Confused 0.6%
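
The age ranges, gender estimates, and emotion percentages in the blocks above correspond to the face attributes AWS Rekognition returns from DetectFaces when called with Attributes=["ALL"]. A minimal sketch (the file name is a placeholder):

```python
# Sketch only: per-face age range, gender, and emotion scores via DetectFaces.
# The file name is a placeholder for the catalogued image.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.10511.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```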

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
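
The Google Vision rows above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred, each rated as a likelihood) map to fields on the FaceAnnotation objects returned by the Cloud Vision face detection method. A minimal sketch using the google-cloud-vision client (the file name is a placeholder):

```python
# Sketch only: per-face likelihood ratings via the Cloud Vision API.
# The file name is a placeholder for the catalogued image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.10511.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())
```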

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people in a room 81.8%
a group of people standing in a room 71.9%
a group of people sitting in a room 65.8%
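
Captions with confidence scores like the three above are what the Azure Computer Vision "describe" operation returns. A minimal sketch against the REST endpoint, assuming API version v3.2 (the endpoint, key, and image URL are placeholders):

```python
# Sketch only: request image captions from Azure Computer Vision's describe API.
# endpoint, key, and image_url are placeholders; v3.2 is an assumed API version.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "your_subscription_key"
image_url = "https://example.org/image.jpg"

resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": image_url},
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```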

Text analysis

Amazon

44971
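
Detected strings such as "44971" above are the kind of result the AWS Rekognition DetectText API returns. A minimal sketch (the file name is a placeholder):

```python
# Sketch only: read text lines from an image with AWS Rekognition DetectText.
# The file name is a placeholder for the catalogued image.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.10511.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```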