Human Generated Data

Title

[Train tracks with bicyclers and pedestrian in foreground]

Date

1936-1937

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.212.18

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.6
Person 99.6
Person 99.6
Person 97.8
Bike 97.8
Vehicle 97.8
Transportation 97.8
Bicycle 97.8
Train Track 97.4
Rail 97.4
Railway 97.4
Person 90.4
Asphalt 62.9
Tarmac 62.9
Road 59.4
Person 44.8
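
The label/score pairs above have the shape of Amazon Rekognition DetectLabels output: a label name plus a 0-100 confidence. A minimal sketch of how such tags could be generated with boto3 follows; the image filename and region are illustrative assumptions, not values taken from this record.

import boto3

def detect_labels(image_path, min_confidence=40.0):
    # Assumed region; the record does not say where the analysis ran.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each entry carries a Name and a Confidence score, e.g. "Train Track 97.4".
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    # Hypothetical filename for the photograph.
    for name, confidence in detect_labels("feininger_photo.jpg"):
        print(f"{name} {confidence:.1f}")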

Clarifai
created on 2019-11-19

people 99.8
group together 99
monochrome 98.8
adult 98.1
street 98
group 97.7
vehicle 96.2
transportation system 96.1
man 95.7
many 95.7
one 94.7
two 94
woman 91.6
road 90.5
black and white 87.7
four 86.8
no person 86.5
wear 86.2
three 85.3
home 85.2

Imagga
created on 2019-11-19

city 36.6
architecture 35.6
tunnel 29
old 27.2
building 26.1
travel 25.3
history 23.2
urban 21.8
passageway 20.6
structure 20.6
stone 20.1
tourism 18.1
sky 17.9
wall 17.9
town 16.7
passage 15.8
tourist 15.2
famous 14.9
cityscape 14.2
track 13.7
landmark 13.5
street 12.9
way 12.7
tower 12.5
fountain 12.5
ancient 12.1
culture 12
river 10.7
water 10.7
bridge 10.5
landscape 10.4
historic 10.1
light 10
palace 9.5
buildings 9.5
day 9.4
historical 9.4
monument 9.3
church 9.2
dark 9.2
arch 9.1
religion 9
outdoors 9
color 8.9
column 8.7
medieval 8.6
art 8.5
traditional 8.3
vintage 8.3
fortress 7.9
cobblestone 7.9
scene 7.8
construction 7.7
device 7.7
outdoor 7.6
hall 7.6
path 7.6
destination 7.5
brick 7.4
house 7.3
temple 7.3
industrial 7.3
castle 7.1
sea 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 98.5
black and white 93.7
monochrome 72.1
black 65.4
text 58.6
street 50.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 47-65
Gender Female, 50.1%
Disgusted 49.5%
Fear 49.7%
Happy 49.5%
Confused 49.5%
Sad 50.3%
Angry 49.5%
Surprised 49.5%
Calm 49.6%

AWS Rekognition

Age 22-34
Gender Male, 51.1%
Confused 45.6%
Disgusted 45.2%
Surprised 45.4%
Sad 49%
Angry 48.1%
Fear 45.2%
Happy 45.2%
Calm 46.3%
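
The age range, gender estimate, and per-emotion scores above match the shape of Amazon Rekognition DetectFaces output when all facial attributes are requested. A minimal sketch with boto3, assuming a local image file and an arbitrary region:

import boto3

def analyze_faces(image_path):
    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Emotion types come back uppercase, e.g. "CALM", "SAD".
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("feininger_photo.jpg")  # hypothetical filename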

Feature analysis

Amazon

Person 99.6%
Bicycle 97.8%