Human Generated Data

Title

[Workers and machinery at the Panama Canal, viewed from above]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.154.23

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Nature 89.8
Landscape 83.5
Outdoors 83.5
Road 83
Transportation 82.9
Vehicle 82.3
Scenery 71.6
Freeway 69
Military 68.6
Military Uniform 68.6
Aerial View 67.6
Animal 67.4
Bird 67.4
Army 63.4
Armored 63.4
Aircraft 60.3
Airplane 60.3

Clarifai
created on 2021-04-04

vehicle 98.3
people 96.3
transportation system 95.2
no person 94.9
art 92.5
waste 92.2
dust 90.9
industry 90.4
monochrome 89.9
street 88
aircraft 87.9
trash 87.7
abandoned 87.6
grinder 87.4
retro 86.6
broken 85.4
car 84.3
one 84.2
watercraft 82.5
war 82.5

Imagga
created on 2021-04-04

cockpit 58.8
seat 46.5
car 41.2
vehicle 36
support 35.1
device 32.3
transportation 24.2
automobile 22
auto 21
equipment 18.8
motor 18.6
engine 18.3
part 18
repair 17.2
industry 17.1
metal 16.9
transport 16.5
black 16.3
old 14.6
speed 13.7
mechanical 13.6
machine 13.4
wheel 13.3
rubbish 13.1
industrial 11.8
work 11.8
broken 11.6
steel 11.5
tool 11
metallic 11
road 10.8
shop 10.8
wreckage 10.2
safety 10.1
closeup 10.1
travel 9.8
hood 9.8
automotive 9.8
drive 9.4
grand piano 9.1
wreck 9
dirty 9
technology 8.9
shiny 8.7
damaged 8.6
modern 8.4
power 8.4
wheeled vehicle 8.3
detail 8
job 8
design 7.9
parts 7.8
crash 7.8
mechanic 7.8
newspaper 7.7
driving 7.7
construction 7.7
expensive 7.7
tools 7.6
piano 7.5
iron 7.5
classic 7.4
close 7.4
sport 7.4
man 7.4
tire 7.4
light 7.3
new 7.3
garage 7.2
product 7

Microsoft
created on 2021-04-04

text 96.8
black and white 80.6
sketch 69.8
drawing 64.6
several 17.6
raft 12.9

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Male, 99.6%
Angry 55%
Calm 44.2%
Sad 0.4%
Fear 0.2%
Confused 0.1%
Happy 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 36-52
Gender Female, 77.8%
Sad 76.3%
Fear 6.6%
Happy 6.6%
Calm 4.7%
Angry 1.8%
Disgusted 1.7%
Surprised 1.3%
Confused 0.8%

Feature analysis

Amazon

Bird 67.4%
Airplane 60.3%

Captions

Microsoft

a close up of a suitcase 33.8%

Text analysis

Amazon

Rnee