Human Generated Data

Title

[Workers and machinery at the Panama Canal, viewed from above]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.154.11

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Machine 94.9
Aircraft 79.8
Transportation 79.8
Airplane 79.8
Vehicle 79.8
Motor 64
Person 60.6
Human 60.6
Tire 59.9
Engine 58.7
Wheel 55.4

Clarifai
created on 2021-04-04

vehicle 98.8
people 98.6
war 96.7
transportation system 96.6
military 95.4
no person 95
group 94
waste 91.8
one 91
two 88.9
many 88.7
group together 88.5
adult 87.7
weapon 87.1
industry 86.4
watercraft 85.9
print 85.8
military vehicle 84.2
administration 83.6
container 81.8

Imagga
created on 2021-04-04

car 43.3
vehicle 37
gondola 31.9
boat 25.9
vessel 20.8
old 20.2
industry 17.9
automobile 17.2
transportation 17
device 16.6
repair 16.3
auto 16.3
craft 16.2
black 15.8
wheel 13.9
transport 13.7
metal 13.7
machine 12.9
speed 12.8
industrial 12.7
equipment 11.6
steel 11.6
travel 11.3
safety 11
job 10.6
broken 10.6
drive 10.4
seat 10.4
road 9.9
accident 9.8
part 9.7
engine 9.6
shop 9.5
wheeled vehicle 9.1
cockpit 9
motor 9
tool 8.9
support 8.9
crash 8.8
driving 8.7
rubbish 8.7
damaged 8.6
construction 8.5
tire 8.2
open 8.1
man 8.1
work 7.8
motor vehicle 7.8
mechanical 7.8
box 7.7
iron 7.6
wreck 7.4
detail 7.2
dirty 7.2
worker 7.1
working 7.1

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

black and white 90.6
airplane 89.5
text 85.3
aircraft 84.1
vehicle 63

Face analysis

Amazon

AWS Rekognition

Age 31-47
Gender Female, 69.2%
Sad 98.1%
Calm 1.2%
Fear 0.2%
Confused 0.1%
Angry 0.1%
Happy 0.1%
Surprised 0%
Disgusted 0%

Feature analysis

Amazon

Airplane 79.8%
Person 60.6%

Captions

Microsoft

an old photo of a vehicle 56.7%
an old photo of a truck 56.6%
an old photo of a car 50.7%