Human Generated Data

Title

Untitled (workers unloading bananas from conveyor belt)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7530

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 99.6
Person 99.2
Person 98.9
Person 98.3
Person 97.3
Clothing 93.1
Apparel 93.1
Person 90.5
Shorts 69.4
People 63.1
Person 62.5
Coat 61.3
Text 60.2
Suit 60
Overcoat 60
Clinic 59.4
Amusement Park 59
Theme Park 58.7
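
These label-and-confidence pairs have the shape of Amazon Rekognition's DetectLabels output. A minimal boto3 sketch of such a call, assuming AWS credentials are configured; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not part of this record:

    # Sketch: image labels via Amazon Rekognition's DetectLabels API.
    # "photo.jpg", MaxLabels, and MinConfidence are illustrative assumptions.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Print "Name Confidence" pairs like the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))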

Clarifai
created on 2023-10-25

people 99.9
many 98
group 97.7
group together 97.1
man 96.9
adult 96.4
woman 93.8
vehicle 92.6
several 90.3
monochrome 89
administration 85.8
child 85.3
war 83
commerce 82.6
watercraft 81
three 80.2
wear 79.5
recreation 79
military 78.8
merchant 77

Imagga
created on 2022-01-08

man 22.8
person 21.7
people 19.5
building 17.9
business 16.4
male 16.3
sky 15.9
work 15.6
city 14.1
adult 13
men 12.9
architecture 12.5
uniform 12.3
urban 12.2
construction 12
stage 11.8
industrial 11.8
industry 11.1
power 10.9
life 10.7
factory 10.6
travel 10.6
technology 10.4
women 10.3
worker 10.2
clothing 9.9
businessman 9.7
equipment 9.7
metal 9.6
engineer 9.2
old 9
human 9
transportation 9
job 8.8
seller 8.6
military uniform 8.6
leisure 8.3
transport 8.2
sport 7.9
patient 7.8
portrait 7.8
modern 7.7
mask 7.7
outdoor 7.6
two 7.6
hand 7.6
plan 7.6
symbol 7.4
office 7.2
lifestyle 7.2
activity 7.2
tower 7.2
structure 7.1
interior 7.1
steel 7.1
working 7.1
case 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.2
clothing 95.8
person 92.7
outdoor 90.4
man 86.3
woman 78.1
black and white 53

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99%
Calm 99.7%
Sad 0.1%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 92.2%
Calm 99.1%
Sad 0.2%
Confused 0.2%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 53-61
Gender Male, 89.7%
Surprised 29.7%
Fear 28.3%
Confused 15.8%
Calm 14.2%
Disgusted 5%
Sad 2.8%
Happy 2.5%
Angry 1.8%
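
The age ranges, gender guesses, and per-emotion percentages above match the shape of Rekognition's DetectFaces output. A hedged sketch of that call, again with a placeholder file name:

    # Sketch: per-face age range, gender, and emotion scores via
    # Rekognition's DetectFaces API; "photo.jpg" is an assumed placeholder.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # types come back as CALM, SAD, ...
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")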

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
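
The "Very unlikely"/"Unlikely" ratings above are the likelihood buckets Google Cloud Vision reports per detected face. A minimal sketch with the google-cloud-vision client, assuming the library is installed and credentials are configured; the file name is a placeholder:

    # Sketch: face-likelihood ratings with the google-cloud-vision client.
    from google.cloud import vision

    def likelihood(value) -> str:
        # Render the enum (e.g. VERY_UNLIKELY) the way the record above does.
        return vision.Likelihood(value).name.replace("_", " ").capitalize()

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # assumed placeholder image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", likelihood(face.surprise_likelihood))
        print("Anger", likelihood(face.anger_likelihood))
        print("Sorrow", likelihood(face.sorrow_likelihood))
        print("Joy", likelihood(face.joy_likelihood))
        print("Headwear", likelihood(face.headwear_likelihood))
        print("Blurred", likelihood(face.blurred_likelihood))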

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

SE
28739A.
KODAK-EVEETA

Google

28739 A.
28739
A.
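
Both services read the negative's edge markings as text, and the Google results list the full line "28739 A." alongside its word fragments. A minimal boto3 sketch of the corresponding Rekognition DetectText call, with a placeholder file name:

    # Sketch: reading printed text / edge markings via Rekognition's
    # DetectText API; "photo.jpg" is an assumed placeholder.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Results include both full lines (Type == "LINE") and individual
    # words, which is why fragments can appear next to the full line.
    for det in response["TextDetections"]:
        print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))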