Human Generated Data

Title

Untitled (circus workers standing on covered train car containing wild animals)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5138

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 99.5
Person 99.4
Person 99.3
Person 99
Person 98.4
Person 94.6
Truck 94
Vehicle 94
Transportation 94
Person 85.8
Nature 85.3
Person 79.5
Road 75.2
Smoke 65.6
Bus 62.7
Pedestrian 59.7
Fog 57.1
Person 56.7
Spoke 55.2
Machine 55.2

Imagga
created on 2022-01-23

ship 100
cargo ship 92.7
container ship 90.4
vessel 70.8
shipping 44.5
port 37.6
sea 36.8
harbor 34.7
boat 34.3
industry 30.8
water 30.1
drilling platform 29.3
transport 27.4
oil tanker 26.4
dock 26.3
ocean 25.7
craft 25.5
industrial 25.4
drill rig 23.4
cargo 23.3
rig 22.5
transportation 22.4
crane 20.9
sky 20.4
freight 18.6
river 17.8
power 16.8
wharf 16.7
export 15.8
boats 15.5
travel 14.8
gas 14.5
trade 14.4
heavy 14.3
commerce 14
oil 14
business 13.4
bulk 12.8
loading 12.8
pier 12.6
nautical 12.6
fuel 12.5
architecture 12.5
container 12.2
shore 12.1
building 12
island 11.9
cranes 11.9
quay 11.8
maritime 11.8
coast 11.7
city 11.7
equipment 11.5
gear 11.4
clouds 11
carrier 10.8
factory 10.6
steel 10.6
pollution 10.6
international 10.5
bridge 10.4
tourism 9.9
tower 9.9
logistics 9.9
ships 9.8
platform 9.7
energy 9.3
tourist 9.1
old 9.1
terminal 8.9
chimney 8.8
goods 8.8
cruise 8.8
storage 8.6
town 8.4
vacation 8.2
work 7.9
marine 7.6
cityscape 7.6
commercial 7.5
smoke 7.4
landscape 7.4
global 7.3
vehicle 7.2
holiday 7.2
world 7.1
summer 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
ship 92.3
vehicle 84.5
black and white 84.4
sky 66.6
people 59.4
old 43.6

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 99.3%
Confused 34.2%
Angry 25.4%
Fear 17.3%
Disgusted 6.2%
Happy 5.4%
Sad 4.1%
Surprised 3.8%
Calm 3.6%

Feature analysis

Amazon

Person 99.8%
Truck 94%
Bus 62.7%

Captions

Microsoft

a group of people standing in front of a crowd 68.9%
a group of people standing in front of a crowd of people 65.6%
a group of people standing in front of a building 65.5%

Text analysis

Amazon

BROS.
ARN
BROS. AND
175
AND
SH
RINGLING
5571
LEY
SHOW
RINGLING BROS. AND BARNOMS BAREY
MBINED SHOW
2
CO
MBINED
BAREY
BARNOMS
125
15591.
BANDER
1620
S

Google

BROS.
S
LEY
BROS,
175
BuREY
SON
AND.
15577
CO
ARN
RINGLING
BARNGM
175 RINGLING BROS. AN BARNGM S BuREY BINED SON 125 LEY CO ARN BROS, AND. 15577
AN
BINED
125