Human Generated Data

Title

Untitled (three men seated outside of a trailer)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10652

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Nature 98.4
Aircraft 96.8
Transportation 96.8
Vehicle 96.8
Helicopter 96.8
Outdoors 94.6
Person 91.2
Human 91.2
Wheel 85.5
Machine 85.5
Building 81.4
Weather 77.9
Person 69.4
Sea 69.3
Ocean 69.3
Water 69.3
Snow 64.8
Airplane 64.5
Person 62
Urban 61.2
Ice 60.7
Architecture 59
Road 55.1
Tower 55.1

Imagga
created on 2022-01-15

ship 32.3
craft 31.7
hovercraft 27.6
vehicle 26.9
sea 26.7
sky 26.2
travel 25.4
vessel 25.2
water 24
landscape 20.1
city 20
ocean 19.3
tourism 19
boat 17
car 16.5
port 16.4
urban 15.7
transport 15.5
structure 15.3
transportation 15.2
architecture 14.8
town 14.8
harbor 14.4
night 14.2
scene 13.8
road 13.6
coast 13.5
bay 13.4
oil tanker 13.3
river 13.3
vacation 13.1
cargo ship 12.6
tower 12.5
bridge 12.4
light 12
tourist 12
industry 12
negative 11.8
building 11.8
dock 11.7
outdoor 11.5
snow 11.3
beach 11.1
day 11
liner 10.8
pier 10.7
shipping 10.5
outdoors 10.5
cloud 10.3
winter 10.2
street 10.1
dark 10
conveyance 9.5
cityscape 9.5
famous 9.3
clouds 9.3
film 9.3
old 9.1
summer 9
season 8.6
construction 8.6
landmark 8.1
holiday 7.9
rock 7.8
wave 7.8
automobile 7.7
traffic 7.6
lake 7.6
commerce 7.5
motor vehicle 7.4
park 7.4
ice 7.4
speed 7.3
device 7.2
sunset 7.2
photographic paper 7.2
trees 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.4
ship 99.2
outdoor 98
black and white 85.8
watercraft 82.3
white 75
boat 72.9
vehicle 69.3

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 62%
Calm 64%
Sad 15.1%
Happy 4.5%
Fear 4.4%
Disgusted 4.2%
Angry 2.8%
Confused 2.6%
Surprised 2.4%

Feature analysis

Amazon

Helicopter 96.8%
Person 91.2%
Wheel 85.5%
Airplane 64.5%

Captions

Microsoft

a vintage photo of a truck 67%
a vintage photo of a man 66.9%
a vintage photo of a man riding on the back of a truck 26.4%

Text analysis

Amazon

35113

Google

3S113
3S113