Human Generated Data

Title

Untitled (workers unloading bananas from conveyor belt)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7531

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 99.4
Person 99.1
Person 99.1
Person 99
Person 98.7
Clothing 97
Apparel 97
Shoe 88.7
Footwear 88.7
Shoe 83.1
Pants 78.7
Outdoors 76.2
Shorts 67.2
Nature 66.8
Coat 63.2
People 62
Photography 60.4
Photo 60.4
Overcoat 60
Suit 60
Shop 59.5
Market 59.5
Bazaar 59.5
Building 57.3
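The machine-generated tags above are flat "label confidence" lines. A minimal, illustrative sketch of how such lines can be parsed into structured (label, score) pairs and filtered by a confidence threshold; the sample data and the 90% cutoff are assumptions for demonstration, not part of the record:

```python
# Parse "Label confidence" lines like those in the tag listings above.
# Labels may contain spaces (e.g. "grocery store"), so split on the LAST space.
raw = """Person 99.8
Human 99.8
Shoe 88.7
Pants 78.7
Bazaar 59.5"""

def parse_tags(text):
    """Return a list of (label, confidence) tuples from tag lines."""
    tags = []
    for line in text.splitlines():
        label, _, conf = line.rpartition(" ")
        tags.append((label, float(conf)))
    return tags

tags = parse_tags(raw)
# Keep only high-confidence labels (threshold chosen arbitrarily here).
high = [label for label, conf in tags if conf >= 90]
print(high)  # ['Person', 'Human']
```

Splitting on the last space rather than the first keeps multi-word labels such as "grocery store" or "mercantile establishment" intact.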

Imagga
created on 2022-01-08

seller 36.9
stall 28
man 25.5
person 23.4
people 20.6
male 19.8
business 17.6
adult 17
work 15.2
businessman 15
sky 12.7
job 12.4
supermarket 12.3
human 12
technology 11.9
portrait 11.6
professional 11.4
men 11.2
building 10.9
city 10.8
outdoor 10.7
hand 10.6
chart 10.5
life 10.4
love 10.2
lifestyle 10.1
power 10.1
patient 9.8
case 9.8
construction 9.4
smile 9.3
grocery store 9.2
dress 9
health 9
new 8.9
success 8.8
together 8.8
couple 8.7
architecture 8.6
world 8.4
one 8.2
mercantile establishment 8.2
happy 8.1
engineer 8
looking 8
smiling 8
urban 7.9
stage 7.8
corporate 7.7
bride 7.7
customer 7.6
engineering 7.6
worker 7.5
manager 7.4
wedding 7.4
occupation 7.3
successful 7.3
design 7.3
office 7.2
transportation 7.2
women 7.1
day 7.1
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.8
clothing 97.3
person 92.5
man 92.1
outdoor 91.4
black and white 89.8

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 99.7%
Calm 72.1%
Angry 12.4%
Disgusted 5.8%
Sad 5.1%
Happy 2.2%
Surprised 1%
Confused 0.9%
Fear 0.6%

AWS Rekognition

Age 28-38
Gender Male, 75.7%
Calm 60.3%
Fear 18.8%
Sad 7.2%
Disgusted 6.6%
Happy 3.2%
Confused 2.1%
Angry 1%
Surprised 0.8%

AWS Rekognition

Age 54-64
Gender Female, 68.8%
Calm 77.7%
Sad 5.3%
Disgusted 4.9%
Surprised 4.4%
Confused 3.2%
Angry 2%
Fear 1.5%
Happy 1.1%
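The per-face emotion scores above appear to be percentage confidences that sum to roughly 100% across the eight emotion classes. A small sanity check on the third face's reported values (the slight overshoot is rounding in the report):

```python
# Emotion scores for the third detected face, as listed in the record.
emotions = {
    "Calm": 77.7, "Sad": 5.3, "Disgusted": 4.9, "Surprised": 4.4,
    "Confused": 3.2, "Angry": 2.0, "Fear": 1.5, "Happy": 1.1,
}

total = sum(emotions.values())          # should be ~100 if scores are percentages
dominant = max(emotions, key=emotions.get)  # highest-confidence emotion
print(dominant, round(total, 1))        # Calm 100.1
```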

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 88.7%

Captions

Microsoft

a group of people standing outside of a building 85.8%
a group of people standing in front of a building 84.6%
a group of people standing next to a building 84.5%

Text analysis

Amazon

28739.
KOPELK

Google

28739.
28739.