Human Generated Data

Title

Untitled (two men in cargo door of plane)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7161

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Hangar 99.8
Building 99.8
Person 98.5
Human 98.5
Person 97.9
Airplane 94
Aircraft 94
Vehicle 94
Transportation 94
Person 84.2
Train 78.2
Person 60.4
Person 50.7
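
The Amazon tags above are label detections of the kind returned by the Rekognition DetectLabels API, each paired with a 0-100 confidence score. A minimal sketch using boto3, assuming AWS credentials are already configured; the file path and thresholds are illustrative placeholders, not part of this record:

```python
# Minimal sketch: produce Rekognition-style label tags for a local image.
# "photo.jpg" is a hypothetical path; credentials come from the environment.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence labels, as in the list above
    )

# Each label carries a name and a 0-100 confidence score,
# matching entries such as "Hangar 99.8" in this record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```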

Imagga
created on 2021-12-15

submarine 99
submersible 79.8
warship 60.5
military vehicle 59.5
vessel 56.9
ship 50.8
vehicle 39.4
boiler 38.6
container 32.8
transportation 32.3
industry 30.7
aircraft 28.8
industrial 28.1
transport 26.5
warplane 25.3
steel 22.1
power 19.3
sky 17.9
tank 17.8
oil 17.7
cargo 17.5
business 16.4
engine 16.4
travel 16.2
metal 16.1
old 16
train 15.5
rail 14.7
station 14.5
plane 14.5
factory 14.5
technology 14.1
railway 13.7
track 13.5
work 13.3
car 13.3
craft 13.1
airplane 13
air 12.9
railroad 12.8
energy 12.6
fuel 11.6
storage 11.4
delivery 10.7
pipe 10.7
building 10.3
architecture 10.2
horizontal 10
water 10
equipment 10
wagon 9.9
shipment 9.9
logistics 9.9
petrol 9.8
barrel 9.8
goods 9.8
production 9.7
plant 9.7
chemical 9.7
supply 9.7
gas 9.6
jet 9.6
flight 9.6
wheel 9.4
iron 9.3
boat 9.3
locomotive 9.2
freight 8.8
aviation 8.8
port 8.7
move 8.6
heavy 8.6
construction 8.6
grunge 8.5
airport 8.4
fly 8.4
vintage 8.3
tourism 8.3
machine 7.9
tanker 7.9
urban 7.9
structure 7.8
shipping 7.8
military 7.7
modern 7.7
war 7.7
winter 7.7
concrete 7.7
trade 7.7
engineering 7.6
clouds 7.6
journey 7.5
speed 7.3
road 7.2
snow 7.2
device 7.1
cockpit 7.1
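
The Imagga tags come from its hosted tagging endpoint. A hedged sketch against what I take to be the v2 REST API, using HTTP Basic auth; the key, secret, and image URL are placeholder assumptions:

```python
# Minimal sketch of Imagga's v2 /tags endpoint (assumed request/response shape).
# API key/secret and the image URL are placeholders, not real values.
import requests

API_KEY = "your_api_key"        # assumption: issued via the Imagga dashboard
API_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Tags arrive as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```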

Microsoft
created on 2021-12-15

black and white 97.6
text 93.6
person 89.4
clothing 80.6
footwear 78.5
monochrome 73.3
airplane 69.3
aircraft 66.7
man 52.1
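
The Microsoft tags are the output of Azure Computer Vision's image-tagging operation. A minimal sketch with the Python SDK; the endpoint, subscription key, and file path are placeholders:

```python
# Minimal sketch: Azure Computer Vision image tagging.
# Endpoint, key, and file path are assumptions, not values from this record.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

with open("photo.jpg", "rb") as image:
    result = client.tag_image_in_stream(image)

# The SDK reports confidence on a 0-1 scale; scale by 100 to match
# entries such as "black and white 97.6" above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```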

Face analysis

AWS Rekognition

Age 26-40
Gender Male, 95.9%
Calm 92.6%
Sad 3.2%
Angry 2.3%
Happy 1.4%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 42-60
Gender Male, 63.8%
Sad 59.8%
Calm 37.2%
Fear 1.6%
Angry 0.4%
Happy 0.4%
Confused 0.3%
Surprised 0.2%
Disgusted 0%
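
The two AWS Rekognition blocks above, one per detected face, come from the DetectFaces API with full attributes requested. A minimal boto3 sketch; the image path is a hypothetical placeholder:

```python
# Minimal sketch: Rekognition face analysis yielding the age range, gender,
# and emotion fields shown above. "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```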

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
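
The Google Vision blocks, again one per detected face, report bucketed likelihoods rather than numeric scores. A minimal sketch with the google-cloud-vision client, assuming credentials are configured in the environment; the file path is a placeholder:

```python
# Minimal sketch: Google Cloud Vision face detection, which reports the
# likelihood buckets ("Very unlikely", ...) listed above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enum names map to the labels above, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```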

Feature analysis

Amazon

Person 98.5%
Airplane 94%
Train 78.2%

Captions

Microsoft

a group of people standing around a plane 63.8%
a person standing next to a plane 53.9%
a group of people standing next to a plane 53.8%
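
The ranked captions come from Azure Computer Vision's describe operation, which returns several candidate sentences with confidences. A minimal sketch; endpoint, key, and path are again placeholders:

```python
# Minimal sketch: Azure Computer Vision caption generation ("describe"),
# which yields ranked captions like those above. Placeholders throughout.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

with open("photo.jpg", "rb") as image:
    description = client.describe_image_in_stream(image, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```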

Text analysis

Amazon

LIFE
U.S.
20499.
20499
PRESERVES
PRESERVER
NAMTRAR
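
The Amazon text readings above are what the Rekognition DetectText API returns; it reports both whole lines and individual words, which is why near-duplicates such as "20499." and "20499" both appear. A minimal boto3 sketch with a placeholder path:

```python
# Minimal sketch: Rekognition text detection, the source of the
# LIFE / U.S. / PRESERVER readings above. "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is typed LINE or WORD, with a 0-100 confidence.
for detection in response["TextDetections"]:
    print(detection["DetectedText"],
          detection["Type"],
          f'{detection["Confidence"]:.1f}')
```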

Google

20499. PRESERV PRESEVE 20499.
20499.
PRESERV
PRESEVE
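
The Google text results follow the Cloud Vision convention: the first annotation is the full recovered text ("20499. PRESERV PRESEVE 20499."), and the remaining annotations are the individual words. A minimal sketch, with the file path as a placeholder:

```python
# Minimal sketch: Google Cloud Vision text detection. The first annotation
# is the full text block; subsequent annotations are per-word, as above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```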