Human Generated Data

Title

Untitled (two couples seated in chairs on slate patio)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10640

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Clothing 99.8
Apparel 99.8
Person 99.5
Human 99.5
Person 98.8
Person 96.6
Bonnet 93.8
Hat 93.8
Person 92.1
Person 86.5
Face 81.9
Furniture 79.6
People 73.1
Person 71.2
Chair 66.2
Photography 65.4
Photo 65.4
Portrait 65.4
Female 59.2
Girl 59.2
Coat 56.7
Room 56.3
Indoors 56.3
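
These Amazon entries pair a detected label with a confidence score on a 0-100 scale; the repeated Person lines most likely reflect the per-instance detections the service reports alongside the aggregate label. A minimal sketch of how comparable output could be produced with the AWS Rekognition DetectLabels API via boto3 follows; the local filename photo.jpg and the MinConfidence threshold are assumptions, not values from this record.

```python
# Sketch: label detection with AWS Rekognition (assumes configured
# AWS credentials; "photo.jpg" is a hypothetical local filename).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # cap on the number of labels returned
        MinConfidence=50.0,  # drop labels scored below 50%
    )

# Each label carries a name and a 0-100 confidence score,
# matching entries such as "Clothing 99.8" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```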

Imagga
created on 2022-01-15

person 28.5
people 24.5
man 21.5
sport 20.7
adult 19.3
male 19.1
silhouette 18.2
teacher 18
player 17.9
black 17.4
musical instrument 17.4
event 16.6
blackboard 16.3
wind instrument 15.8
athlete 15.8
training 15.7
competition 14.6
men 14.6
muscular 14.3
stadium 13.6
skill 13.5
professional 13.4
symbol 12.8
field 12.5
crowd 12.5
park 12.4
brass 12.3
lights 12
flag 12
cheering 11.7
championship 11.7
educator 11.6
match 11.6
dancer 11.5
active 10.9
lifestyle 10.8
leisure 10.8
nighttime 10.8
audience 10.7
happy 10.6
patriotic 10.5
performer 10.4
nation 10.4
sitting 10.3
model 10.1
exercise 10
versus 9.8
shorts 9.8
lady 9.7
portrait 9.7
body 9.6
design 9.6
icon 9.5
art 9.5
youth 9.4
bright 9.3
smile 9.3
attractive 9.1
fashion 9
human 9
style 8.9
business 8.5
modern 8.4
power 8.4
dark 8.3
entertainment 8.3
fun 8.2
cornet 8.2
activity 8.1
sax 7.9
world 7.9
love 7.9
vibrant 7.9
serve 7.8
tennis 7.8
court 7.8
play 7.8
room 7.5
glowing 7.4
group 7.2
team 7.2
shiny 7.1
women 7.1
businessman 7.1
happiness 7
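
The Imagga scores above are likewise tag confidences on a 0-100 scale. A minimal sketch of requesting comparable output from the Imagga v2 tagging endpoint follows; the API key, secret, and image URL are placeholders, not values from this record.

```python
# Sketch: tagging with the Imagga v2 REST API (key, secret, and
# image URL below are hypothetical placeholders).
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
image_url = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
    timeout=30,
)
response.raise_for_status()

# Tags arrive with confidence scores, matching "person 28.5" above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```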

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 96.5
outdoor 89.9
person 88
clothing 82.3
drawing 69.7
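
The Microsoft entries are tags with confidences that the service reports on a 0-1 scale, shown here scaled to percentages. A minimal sketch against the Azure Computer Vision analyze REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders.

```python
# Sketch: image tagging with the Azure Computer Vision v3.2
# "analyze" endpoint (endpoint host, key, and URL are placeholders).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"
image_url = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": image_url},
    timeout=30,
)
response.raise_for_status()

# Confidences come back in 0-1; scale to match "text 96.5" above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```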

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 98.1%
Surprised 49.6%
Happy 33%
Confused 5.5%
Calm 3.8%
Sad 3.4%
Disgusted 2.3%
Fear 1.2%
Angry 1.2%

AWS Rekognition

Age 36-44
Gender Male, 92.8%
Surprised 59.7%
Fear 18.7%
Sad 9.5%
Confused 4.6%
Calm 2.8%
Disgusted 1.9%
Angry 1.7%
Happy 1.1%

AWS Rekognition

Age 38-46
Gender Male, 99.7%
Happy 54.2%
Surprised 43.3%
Sad 0.8%
Confused 0.7%
Angry 0.4%
Calm 0.4%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 28.6%
Confused 26.4%
Happy 23.1%
Sad 9.5%
Disgusted 4.9%
Surprised 4%
Angry 2.1%
Fear 1.5%
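
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender call with its confidence, and eight emotion scores ranked from most to least confident. A minimal sketch of producing this with the DetectFaces API and full attributes follows; credentials and the filename photo.jpg are assumptions.

```python
# Sketch: face analysis with AWS Rekognition DetectFaces, requesting
# all attributes (assumes AWS credentials; filename is hypothetical).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently; list them from most to
    # least confident, as in the record above.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```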

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
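
Unlike Rekognition's percentage scores, Google Vision reports each face attribute as a likelihood bucket (Very unlikely through Very likely). A minimal sketch with the google-cloud-vision client follows; application credentials and the local filename are assumptions.

```python
# Sketch: face detection with Google Cloud Vision (assumes Google
# application credentials; "photo.jpg" is a hypothetical filename).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a numeric score.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```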

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people standing in front of a window 69.8%
a group of people posing for a photo 69.7%
a group of people posing for a photo in front of a window 65.5%
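
The three captions above are alternative candidates ranked by confidence. A minimal sketch of requesting several caption candidates from the Azure Computer Vision describe endpoint follows; as before, the endpoint, key, and image URL are placeholders.

```python
# Sketch: captioning with the Azure Computer Vision v3.2 "describe"
# endpoint, asking for multiple candidates (placeholders as before).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"
image_url = "https://example.com/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # return up to three captions
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": image_url},
    timeout=30,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```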

Text analysis

Amazon

34965

Google

349
65
349 65
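
The differing results above come from granularity: Amazon reads the digits as the single string 34965, while Google Vision returns word-level pieces (349, 65) alongside the combined line. A minimal sketch of text detection with AWS Rekognition DetectText, which similarly distinguishes LINE and WORD detections, follows; credentials and the filename are assumptions.

```python
# Sketch: text detection with AWS Rekognition DetectText (assumes
# AWS credentials; "photo.jpg" is a hypothetical local filename).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is typed as a whole LINE or an individual WORD,
# which accounts for split versus combined readings of "349 65".
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')
```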