Human Generated Data

Title

Untitled (man standing at workbench, Broomall, PA)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11966

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.8
Human 99.8
Restaurant 88.7
Shelf 74.2
Meal 73.6
Food 73.6
Helicopter 68.8
Transportation 68.8
Vehicle 68.8
Aircraft 68.8
Cafe 65.1
Cafeteria 63.4
Shop 62.6
Chef 58.1
Bakery 55

Clarifai
created on 2023-10-26

people 99.6
adult 98
monochrome 97.9
man 96.3
one 94.6
vehicle 94.4
furniture 92.2
room 92.1
two 87.2
indoors 86.6
transportation system 84.5
wear 81.7
employee 81.6
military 80.6
three 79.4
group 77.7
chalk out 75
veil 74.7
print 73.1
woman 72.2

Imagga
created on 2022-01-15

brass 44.5
wind instrument 36.4
musical instrument 23.4
bass 21.5
sax 18.9
work 14.9
machinist 14.7
man 14.1
metal 13.7
people 13.4
equipment 13
technology 12.6
machine 11.8
adult 11.6
hand 11.4
computer 11.2
worker 10.9
industrial 10.9
mechanic 10.7
working 10.6
person 10.6
device 10.2
power 10.1
military 9.7
engine 9.6
black 9.6
repair 9.6
music 9.2
male 9.2
protection 9.1
industry 8.5
professional 8.4
cornet 8.3
horn 8.1
light 8
job 8
steel 8
business 7.9
war 7.7
old 7.7
vintage 7.5
safety 7.4
danger 7.3
baritone 7.1
modern 7

Microsoft
created on 2022-01-15

text 98.9
man 95.6
clothing 92.1
person 91.2
black and white 89.4
table 62.5

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 52%
Calm 88.8%
Sad 6.7%
Happy 1.8%
Surprised 0.9%
Confused 0.8%
Disgusted 0.5%
Fear 0.3%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Helicopter 68.8%

Text analysis

Amazon

15810
15810.
T3T

Google

15810. 15810. IE HAGON-YT3JA2-NAMTZA3
15810.
IE
HAGON-YT3JA2-NAMTZA3