Human Generated Data

Title

Untitled (woman holding child and bag of groceries)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8787

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.3
Person 98.3
Apparel 97.1
Clothing 97.1
Plant 92.6
Tree 92.6
Arecaceae 84.9
Palm Tree 84.9
Face 84.7
Female 83.4
Leisure Activities 78.6
Portrait 72.7
Photo 72.7
Photography 72.7
Outdoors 69.4
Shorts 68.4
Woman 67.6
Girl 61.5
Tarmac 59.3
Asphalt 59.3
Dress 58.2
Footwear 56.7
Shoe 56.7
Musical Instrument 55.8
Suit 55.5
Coat 55.5
Overcoat 55.5
Musician 55.1
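
The tag/score pairs above follow the shape of Amazon Rekognition's DetectLabels response: paired entries such as Human/Person are labels returned together with their parent categories, and the scores are confidence percentages. A minimal sketch of how such tags could be reproduced with boto3; the filename is a hypothetical stand-in for the digitized print:

```python
import boto3

# Minimal sketch: reproduce tag/confidence pairs like the ones above
# with Amazon Rekognition's DetectLabels. The filename is hypothetical.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.8787.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the listed tags all score above ~55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```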

Imagga
created on 2022-01-09

sax 100
wind instrument 36.2
man 29.6
person 20.8
music 19.9
male 19.1
black 18
musician 17
silhouette 16.6
people 15.6
guitar 15.5
adult 14.9
style 14.8
performer 14.8
rock 14.8
studio 14.4
musical 14.4
instrument 14
hair 12.7
concert 12.6
singer 11.9
holding 11.6
performance 11.5
musical instrument 11.4
happy 11.3
sexy 11.2
smoke 11.2
entertainment 11
guitarist 10.8
attractive 10.5
portrait 10.4
play 10.3
youth 10.2
bass 10.1
microphone 9.9
hand 9.9
fashion 9.8
song 9.8
together 9.6
looking 9.6
couple 9.6
sky 9.6
men 9.4
happiness 9.4
professional 9.3
dark 9.2
danger 9.1
suit 9
sunset 9
businessman 8.8
light 8.7
model 8.6
art 8.5
electric 8.4
outdoor 8.4
power 8.4
hot 8.4
guy 8.3
outdoors 8.2
industrial 8.2
lifestyle 7.9
women 7.9
love 7.9
standing 7.8
player 7.5
fun 7.5
business 7.3
protection 7.3
metal 7.2
stage 7.2
smile 7.1
working 7.1
day 7.1
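
The Imagga tags carry the same tag/score shape. A minimal sketch against Imagga's v2 tagging endpoint, assuming its documented JSON response layout; the API key, secret, and image URL are placeholders:

```python
import requests

# Minimal sketch: fetch tag/confidence pairs from Imagga's v2 tags
# endpoint. Credentials and image URL are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    timeout=30,
)
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```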

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.2
tree 85.3
outdoor 85.2
black and white 66.1

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 96.6%
Calm 43.2%
Sad 34.3%
Happy 9.6%
Confused 5.8%
Disgusted 2.1%
Surprised 1.8%
Fear 1.7%
Angry 1.3%

AWS Rekognition

Age 42-50
Gender Male, 99%
Surprised 51.5%
Calm 42.3%
Sad 2%
Angry 1.2%
Happy 1.2%
Disgusted 1%
Fear 0.5%
Confused 0.3%
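
Both blocks above match the shape of Rekognition's DetectFaces response when all facial attributes are requested: one entry per detected face, each with an estimated age range, a gender call, and confidence scores across the emotion categories. A minimal sketch, again with a hypothetical filename:

```python
import boto3

# Minimal sketch: per-face age range, gender, and emotion scores as in
# the two AWS Rekognition blocks above. The filename is hypothetical.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.8787.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # default response omits age/gender/emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```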

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
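
Google Vision reports face attributes as bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every entry above reads "Very unlikely". A minimal sketch with the google-cloud-vision client, assuming application credentials are configured in the environment; the filename is hypothetical:

```python
from google.cloud import vision

# Minimal sketch: bucketed per-face likelihoods as in the Google Vision
# blocks above. Assumes credentials are set up in the environment.
client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.8787.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```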

Feature analysis

Amazon

Person 98.3%
Shoe 56.7%

Captions

Microsoft

a person standing in front of a sign 51.1%
a person holding a sign 39.9%
a person standing in front of a sign 30.3%
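
The scored caption candidates above match the shape of the Azure Computer Vision describe operation, which returns up to a requested number of caption strings with confidences. A minimal sketch; the endpoint, key, and filename are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: scored caption candidates as in the Microsoft list
# above. Endpoint, key, and filename are placeholders.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("steinmetz_4.2002.8787.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```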

Text analysis

Amazon

39297.
8SA
YT37A°-

Google

8SA
39297. 8SA
39297.
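
The fragments in both lists are line-level OCR output, plausibly readings of stamps and handwritten notations on the print. A minimal sketch of the Amazon side with Rekognition's DetectText; the filename is again hypothetical:

```python
import boto3

# Minimal sketch: line-level OCR results as in the Amazon list above.
# The filename is hypothetical.
rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.8787.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"])
```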