Human Generated Data

Title

[View of train in station]

Date

1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.521.8

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 99.6
Human 99.6
Train Track 99.5
Rail 99.5
Railway 99.5
Transportation 99.5
Person 99.4
Train 90.9
Vehicle 90.9
Train 86.3
Asphalt 66.1
Tarmac 66.1
Road 58.8
Pedestrian 55.8
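
The Amazon labels above are the kind of output returned by the Rekognition DetectLabels operation: each label name paired with a confidence score. A minimal sketch of how such tags could be retrieved with boto3 is shown below; the image file name and confidence threshold are illustrative assumptions, not part of the original record.

```python
# Hypothetical sketch: fetching image labels with AWS Rekognition DetectLabels.
# The file name and threshold below are assumptions for illustration only.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each label carries a name and a confidence score, e.g. "Train Track 99.5".
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("view_of_train_in_station.jpg"):
        print(f"{name} {confidence:.1f}")
```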

Clarifai
created on 2019-11-19

monochrome 99.8
street 99.6
rain 98.2
transportation system 97.9
railway 97.9
locomotive 97.6
people 97.2
umbrella 97.1
light 96.6
train 96.5
road 95.6
city 95.6
black and white 95.3
vehicle 94.6
travel 94.3
car 93.8
sepia 91.8
tree 91.2
no person 90.6
man 89.9

Imagga
created on 2019-11-19

tunnel 100
passageway 100
passage 83.3
way 61
old 24.4
travel 22.5
architecture 22.2
sky 20.4
landscape 18.6
track 18.4
train 17.5
building 16.7
city 16.6
urban 16.6
light 15.4
house 14.3
dark 14.2
structure 13.8
railway 13.7
transportation 13.4
water 13.3
road 12.6
tourism 12.4
railroad 11.8
history 11.6
night 11.5
stone 11.2
construction 11.1
car 10.9
wall 10.7
river 10.7
rural 10.6
black 10.2
clouds 10.1
station 9.4
town 9.3
country 8.8
antique 8.7
scene 8.7
tree 8.5
street 8.3
vintage 8.3
transport 8.2
danger 8.2
industrial 8.2
home 8
corridor 7.9
rail 7.9
ancient 7.8
empty 7.7
culture 7.7
industry 7.7
outdoors 7.5
environment 7.4
vacation 7.4
historic 7.3
new 7.3
tourist 7.3
religion 7.2
tower 7.2
trees 7.1
grass 7.1
scenic 7
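
The Imagga tags above come from Imagga's image-tagging service. A rough sketch of such a request follows, assuming Imagga's v2 tags endpoint with HTTP basic authentication; the endpoint, parameters, and response layout are assumptions based on Imagga's public REST API and should be checked against its documentation.

```python
# Hypothetical sketch: requesting tags from Imagga's REST tagging endpoint.
# Endpoint, auth scheme, and response layout are assumptions; consult Imagga's docs.
import requests

def imagga_tags(image_url: str, api_key: str, api_secret: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
        timeout=30,
    )
    response.raise_for_status()
    # Assumed shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
    return [
        (t["tag"]["en"], t["confidence"])
        for t in response.json()["result"]["tags"]
    ]
```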

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 96.9
black and white 88.7
building 86.9
monochrome 73.4
sky 56.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-32
Gender Female, 50.4%
Sad 49.7%
Confused 49.6%
Happy 49.5%
Surprised 49.6%
Disgusted 49.5%
Angry 49.8%
Calm 49.7%
Fear 49.5%
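
The face analysis block above reports an estimated age range, gender, and per-emotion confidences from AWS Rekognition. A minimal sketch of how those attributes could be requested with boto3 is shown below; the file name is an illustrative assumption.

```python
# Hypothetical sketch: face attributes (age range, gender, emotions) via
# AWS Rekognition DetectFaces. The file name is an assumption for illustration.
import boto3

def detect_face_attributes(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )
    results = []
    for face in response["FaceDetails"]:
        results.append({
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
        })
    return results
```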

Feature analysis

Amazon

Person 99.6%
Train 90.9%

Text analysis

Amazon

TRACK
TRACK TRACK
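
The detected words above ("TRACK") are typical output of OCR-style text detection. A minimal sketch using AWS Rekognition DetectText is shown below; it returns each detected line or word with a confidence score, and the file name is an illustrative assumption.

```python
# Hypothetical sketch: reading text in the photograph with AWS Rekognition
# DetectText. The file name is an assumption for illustration only.
import boto3

def detect_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_text(Image={"Bytes": image_bytes})
    # Each detection is either a LINE or a WORD; "TRACK" above would appear as a WORD.
    return [
        (d["Type"], d["DetectedText"], d["Confidence"])
        for d in response["TextDetections"]
    ]
```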

Google

TRACK TRACK
TRACK