Human Generated Data

Title

Untitled (mill worker using machinery)

Date

1951

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18433

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 98.2
Person 98.2
Wheel 94.2
Machine 94.2
Weapon 81.2
Weaponry 81.2
Gun 73.7
Apparel 73.1
Helmet 73.1
Clothing 73.1
Military 72.6
Military Uniform 69.4
Person 64.1
People 60.7
Armored 59.6
Army 59.6

Imagga
created on 2022-03-04

machine 37.8
industry 32.4
industrial 29
building 28
steel 26.8
construction 26.5
device 26.3
power 25.2
heavy 22.9
equipment 22.4
factory 21
engineering 20
work 19.6
machinery 17.3
pipe 17.3
tool 16.6
metal 16.1
gas 15.4
transportation 15.2
energy 15.1
structure 14.5
artillery 14.5
environment 14
vehicle 14
station 13.5
fuel 13.5
sky 12.7
field artillery 12.7
waste 12.6
pollution 12.5
valve 11.9
architecture 11.7
dirt 11.4
plant 11.4
urban 11.4
modern 11.2
pipeline 10.8
production 10.7
armament 10.7
science 10.7
steam 10.7
tube 10.6
supply 10.6
truck 10.3
heat 10.2
man 10.1
transport 10
global 10
pump 10
piping 9.9
refinery 9.8
complex 9.7
chemical 9.6
engineer 9.5
wheel 9.4
environmental 9.4
site 9.4
oil 9.3
economy 9.3
outdoor 9.2
pipes 8.9
electricity 8.5
iron 8.4
turbine 8.2
technology 8.2
plow 8.1
facility 8.1
cannon 8
working 7.9
combustible 7.9
petrol 7.8
manufacturing 7.8
burn 7.7
system 7.6
ground 7.6
hot 7.5
house 7.5
outdoors 7.5
water 7.3
earth 7.3
business 7.3
danger 7.3
dirty 7.2
crane 7.1
worker 7.1
backhoe 7.1
high-angle gun 7
travel 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 98.2
text 94.9
person 93.4
man 92.4
black and white 90.7
old 41.9

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 99.1%
Calm 68%
Sad 15%
Fear 4.9%
Disgusted 3.3%
Confused 3.2%
Surprised 2.3%
Happy 1.7%
Angry 1.6%

Feature analysis

Amazon

Person 98.2%
Wheel 94.2%
Helmet 73.1%

Captions

Microsoft

a man standing in front of a building 73.4%
a man that is standing in front of a building 67.2%
a man standing next to a building 67.1%

Text analysis

Amazon

as
YE3
KAGOX