Human Generated Data

Title

Untitled (men sitting and lying on the deck of a ship)

Date

c. 1937

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4786

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 97.6
Human 97.6
Person 92.8
Person 92
Boat 85.3
Vehicle 85.3
Transportation 85.3
Person 84.3
Amusement Park 83.3
Theme Park 83.3
Person 78.4
Porch 74
Clothing 66.8
Apparel 66.8
Person 64.8
Shorts 62.4
Carousel 59.9
Sailor Suit 58
Bridge 57.3
Building 57.3
Boardwalk 57.3
Deck 55.1
Person 52.2
Person 50.5

Imagga
created on 2022-01-29

sky 39.5
wire 38.6
electricity 37.8
power 34.4
high 31.2
energy 31.1
industry 30.7
tower 30.4
voltage 30.3
battleship 28.7
cable 28.4
electric 28.1
structure 24.2
warship 23.5
chairlift 23.2
line 22.6
supply 22.2
shop 21.5
ship 21
industrial 20.9
technology 20.8
electrical 20.1
shoe shop 19.9
steel 19.4
construction 18.8
ski tow 18.7
military vehicle 17.9
cables 17.6
mercantile establishment 17.2
pylon 16.8
building 15.3
metal 15.3
network 14.8
distribution 14.7
conveyance 14.7
current 14.7
equipment 14.6
engineering 14.3
tall 14.1
architecture 14.1
station 13.7
clouds 13.5
facility 13.3
frame 12.9
business 12.8
urban 12.2
cloud 12
volts 11.8
wires 11.8
pole 11.7
city 11.6
place of business 11.3
vessel 11
transmission 10.8
landscape 10.4
water 10
transformer 9.9
environment 9.9
mast 9.8
silhouette 9.1
danger 9.1
work 8.6
modern 8.4
crane 8.4
gymnasium 8.3
sea 8.1
lines 8.1
utility 7.9
clear 7.8
engineer 7.7
generation 7.7
cityscape 7.6
outdoors 7.5
drawing 7.4
river 7.1
travel 7

Microsoft
created on 2022-01-29

text 97.1
ship 95.6
outdoor 86
black and white 76.1
several 10.2

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 94.8%
Calm 78.9%
Happy 5%
Surprised 4.6%
Sad 4.3%
Angry 3.6%
Fear 1.7%
Disgusted 1%
Confused 0.9%

AWS Rekognition

Age 24-34
Gender Female, 55.1%
Disgusted 75%
Sad 8.8%
Happy 5.7%
Angry 3.6%
Calm 2.7%
Fear 1.7%
Confused 1.2%
Surprised 1.2%

AWS Rekognition

Age 23-33
Gender Female, 77.4%
Surprised 40.7%
Confused 19.7%
Sad 18.6%
Calm 11.7%
Fear 3.6%
Angry 2.7%
Happy 1.8%
Disgusted 1.3%

Feature analysis

Amazon

Person 97.6%
Boat 85.3%

Captions

Microsoft

a group of people riding on the back of a boat 26.9%
a group of people on a boat 26.8%
an old photo of a person 26.7%

Text analysis

Amazon

247

Google

247.
247.