Human Generated Data

Title

Untitled (three men seated outside of a trailer)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10653

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99
Human 99
Nature 96.8
Outdoors 95.1
Person 94.9
Vehicle 86.2
Transportation 86.2
Ice 83.9
Snow 81
Car 76.5
Automobile 76.5
Helicopter 76.1
Aircraft 76.1
Wheel 74.1
Machine 74.1
Building 73.9
Boat 70.4
Watercraft 66.6
Vessel 66.6
Person 66
Ship 65.4
Car 63.3
Sand 56.3

Imagga
created on 2022-01-15

ship 63.5
craft 49.9
liner 45.9
vessel 45.6
hovercraft 39.2
passenger ship 35.4
vehicle 35.3
sea 33.8
water 32.7
ocean 27.4
warship 26.2
sky 24.9
travel 23.9
tourism 23.1
boat 22.2
city 21.6
port 20.2
transport 19.2
harbor 18.3
bay 18.2
transportation 17.9
architecture 17.2
aircraft carrier 16.9
military vehicle 16.6
tourist 16.3
bridge 16.1
night 16
conveyance 15.3
river 15.1
landscape 14.9
building 14.4
coast 14.4
urban 14
vacation 13.9
tower 13.4
cityscape 13.2
scene 13
beach 12.8
industry 12.8
pier 12.2
battleship 11.8
dock 11.7
cruise 11.7
downtown 11.5
panorama 11.4
old 11.1
construction 11.1
winter 11.1
summer 10.9
landmark 10.8
nautical 10.7
skyline 10.5
town 10.2
roll-on roll-off 10.1
shipping 9.8
sailing 9.7
pacific 9.7
black 9.6
clouds 9.3
island 9.2
business 9.1
sunset 9
military 8.7
light 8.7
structure 8.6
holiday 8.6
snow 8.5
buildings 8.5
famous 8.4
shore 8.4
ice 8.3
day 7.8
boats 7.8
cargo 7.8
cold 7.7
seascape 7.7
marine 7.6
war 7.5
evening 7.5
commerce 7.5
industrial 7.3
sand 7.1
marina 7.1
season 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.9
ship 99.3
black and white 92.2
outdoor 92.1
watercraft 90
boat 86.1
vehicle 81.3
white 79.5
black 79.2
monochrome 65.8
old 51.4
vintage 29.4

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Happy 60.8%
Calm 23.7%
Surprised 6%
Fear 3.3%
Angry 2.4%
Disgusted 2.3%
Sad 1%
Confused 0.5%

AWS Rekognition

Age 13-21
Gender Male, 98%
Fear 91.3%
Sad 5.2%
Calm 1.7%
Surprised 0.6%
Angry 0.5%
Confused 0.3%
Happy 0.2%
Disgusted 0.2%

Feature analysis

Amazon

Person 99%
Car 76.5%
Helicopter 76.1%
Wheel 74.1%
Boat 70.4%

Captions

Microsoft

a vintage photo of a ship 67.7%
a vintage photo of a plane 60.6%
a vintage photo of a large ship in the background 58.9%

Text analysis

Amazon

35115
SDVR-COVEEIA

Google

3S115
3S115