Human Generated Data

Title

Untitled (man in large steamboat engine room)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14346

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Building 99.9
Factory 99.9
Assembly Line 98
Person 91.7
Human 91.7
Manufacturing 83.8
Machine 69.6

Imagga
created on 2022-01-29

industry 50.4
industrial 49.9
factory 41.8
power 38.6
pipe 36
engineering 32.4
steel 30.6
turbine 30.5
energy 30.3
pollution 29.8
equipment 29.7
gas 28.9
fuel 27.9
plant 27.8
oil 26.9
production 26.3
machine 24.9
technology 24.5
pump 24.5
heavy 23.9
device 22.8
building 22.8
station 22.2
water 22
pipeline 21.7
pipes 21.7
tube 21.2
structure 21.2
cyclotron 21.2
piping 20.8
environment 20.6
metal 20.1
valve 19.9
accelerator 19.6
waste 19.4
steam 19.4
refinery 18.7
complex 18.4
machinery 18.1
transportation 17.9
ship 17.4
chemical 17.4
work 17.3
modern 16.8
chimney 16.7
vessel 16.5
supply 16.4
science 16
facility 15.8
business 15.8
heat 15.7
urban 15.7
manufacturing 15.6
construction 15.4
electricity 15.1
economy 14.8
petrol 14.7
scientific instrument 14.7
sea 14.1
sky 14
system 13.3
boat 13.2
park 13
global 12.8
architecture 12.7
warming 12.7
mechanical 12.6
port 12.5
burn 12.5
environmental 12.2
combustible 11.9
harbor 11.6
concrete 11.5
high 11.3
boiler 11.2
transport 11
hot 10.9
mechanic 10.7
generator 10.6
travel 10.6
instrument 10.4
tract 10.1
ocean 10
tower 9.8
deck 9.8
manufacture 9.8
reflection 9.7
iron 9.3
inside 9.2
oil industry 8.9
tubes 8.9
storage 8.6
control 8.1
gear 8.1
smokestack 7.9
apparatus 7.9
barrel 7.8
nautical 7.8
technical 7.7
engine 7.7
craft 7.4
vacation 7.4
container 7.3
engineer 7.3
lines 7.2
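The tag lists above pair each label with a confidence score. As a minimal sketch of how such output is typically consumed, the snippet below filters tags by a confidence threshold; the sample pairs are copied from the Imagga list above, and the threshold value is an arbitrary illustration, not part of the record.

```python
# Sample (tag, confidence) pairs taken from the Imagga tag list above.
TAGS = [
    ("industry", 50.4),
    ("industrial", 49.9),
    ("factory", 41.8),
    ("steam", 19.4),
    ("ship", 17.4),
    ("engine", 7.7),
]

def filter_tags(tags, threshold):
    """Keep only tags whose confidence meets the threshold, highest first."""
    kept = [(name, conf) for name, conf in tags if conf >= threshold]
    return sorted(kept, key=lambda t: t[1], reverse=True)

print(filter_tags(TAGS, 40.0))  # -> [('industry', 50.4), ('industrial', 49.9), ('factory', 41.8)]
```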

Google
created on 2022-01-29

(no tags listed)

Microsoft
created on 2022-01-29

text 95
black and white 89.9
outdoor 89.4
ship 82.5
white 61.1

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 90.6%
Calm 97.5%
Sad 1%
Surprised 0.5%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%
Happy 0.1%
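The emotion scores above sum to roughly 100%, and the dominant emotion is simply the highest-scoring entry. A minimal sketch, using the values from the Rekognition output above (the real service returns them as a list of type/confidence structures, not a flat dict):

```python
# Emotion confidence scores copied from the AWS Rekognition output above.
EMOTIONS = {
    "Calm": 97.5, "Sad": 1.0, "Surprised": 0.5, "Confused": 0.4,
    "Disgusted": 0.2, "Angry": 0.2, "Fear": 0.1, "Happy": 0.1,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(EMOTIONS))  # -> Calm
```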

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.7%

Captions

Microsoft

a person standing in front of a building 82.8%
a person sitting in front of a building 72.6%
a person that is standing in front of a building 72.5%

Text analysis

Amazon

MJIR
MJIR ACHAA
SYPHON
A SYPHON
A
ACHAA

Google

A
MJI7
0
MJI7 YT3RA2 0 A
YT3RA2