Human Generated Data

Title

[Station stairs]

Date

late 1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.518.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 96.6
Person 96.6
Person 87.9
Person 85.2
City 84.4
Urban 84.4
Town 84.4
Building 84.4
Metropolis 84.4
Person 83.8
Person 82.6
Person 66.3
Person 65.2
People 62.3
Transportation 60.4
Vehicle 60.4
Motorcycle 60.4

Clarifai
created on 2019-11-19

people 99.9
group 98.7
many 98.3
vehicle 98.2
adult 97.8
group together 97.7
one 96.8
transportation system 94.7
man 94.3
war 93.3
military 92.7
administration 92.5
two 92.2
wear 91.4
indoors 89.1
aircraft 88.8
no person 87.8
watercraft 86.8
outfit 84.5
furniture 83.6

Imagga
created on 2019-11-19

tank 23.6
industrial 23.6
vehicle 23
old 22.3
uniform 20.9
industry 20.5
military uniform 19.7
gun 19.6
architecture 19.5
device 18.4
weapon 18.3
city 17.5
iron lung 17.3
metal 16.9
steel 16.8
factory 16.5
machine 16.4
military vehicle 16.4
cannon 15.4
building 15.3
clothing 15.2
respirator 14.8
wreckage 14.6
part 14.6
military 14.5
tracked vehicle 14.3
war 13.4
destruction 12.7
artillery 12.2
armored vehicle 12.1
danger 11.8
power 11.8
man 11.4
smoke 11.2
conveyance 10.9
history 10.7
breathing device 10.5
iron 10.3
camouflage 10.2
protection 10
dirty 9.9
travel 9.9
engine 9.6
black 9.6
work 9.4
culture 9.4
light 9.4
equipment 9.3
covering 9.3
weaponry 9.1
environment 9
transportation 9
sky 8.9
soldier 8.8
mask 8.8
urban 8.7
structure 8.7
pollution 8.7
male 8.5
energy 8.4
monument 8.4
vintage 8.3
tourism 8.2
wheeled vehicle 8.2
missile 8.2
machinery 8
consumer goods 7.9
rock 7.8
art 7.8
rust 7.7
construction 7.7
statue 7.6
stone 7.6
wheel 7.5
house 7.5
fire 7.5
famous 7.4
technology 7.4
town 7.4
inside 7.4
rifle 7.3
design 7.3
transport 7.3
armament 7.1
night 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

black and white 90.2
text 87
old 45.4
clothes 31.1
cluttered 12.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Male, 50.1%
Disgusted 49.5%
Confused 49.5%
Fear 49.5%
Happy 49.5%
Calm 49.6%
Angry 49.5%
Surprised 49.5%
Sad 50.3%

AWS Rekognition

Age 26-42
Gender Female, 50.2%
Sad 50.4%
Disgusted 49.5%
Fear 49.5%
Happy 49.5%
Calm 49.5%
Surprised 49.5%
Angry 49.5%
Confused 49.5%

AWS Rekognition

Age 18-30
Gender Male, 50.5%
Happy 49.5%
Sad 49.6%
Disgusted 49.5%
Surprised 49.5%
Confused 50.4%
Calm 49.5%
Angry 49.5%
Fear 49.5%

Feature analysis

Amazon

Person 96.6%
Motorcycle 60.4%

Categories