Human Generated Data

Title

[Woman balancing on train rail and group looking on]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.349.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Train Track 100
Rail 100
Railway 100
Transportation 100
Human 99.7
Person 99.7
Person 99.7
Person 99.6
Person 97.8
Clothing 97.2
Apparel 97.2
Vehicle 92.4
Train 92.4
Aircraft 89.6
Helicopter 89.6
Sleeve 80.1
Female 72.5
People 70.1
Animal 60.4
Bird 60.4
Flying 60.4
Photography 60.1
Photo 60.1
Face 59.8
Portrait 59.8
Woman 58.4

Clarifai
created on 2019-05-29

people 99.6
railway 98.2
adult 97.8
transportation system 97.7
man 97.7
train 97
group together 96.3
vehicle 96.2
two 93.8
locomotive 93.7
group 93.6
track 91.9
one 90.9
woman 90.1
war 85.7
three 81.9
leader 81.6
wear 81.4
child 81.1
street 79.1

Imagga
created on 2019-05-29

track 68.9
travel 26.7
tie 26.3
transportation 24.2
railway 23.5
train 22.3
water 21.3
railroad 20.6
brace 20.1
sea 18.7
beach 18.5
sky 18.5
ocean 18.4
sunset 18
landscape 17.8
rail 17.7
device 16.9
tourism 16.5
silhouette 15.7
transport 15.5
strengthener 15.1
vacation 14.7
outdoor 14.5
steel 14.2
shore 13.9
sun 13.8
outdoors 13.6
people 13.4
old 13.2
bridge 13
journey 12.2
evening 12.1
pier 11.9
summer 11.6
river 11.6
man 11.4
boat 11.4
building 11.1
alone 11
industrial 10.9
dusk 10.5
city 10
road 9.9
person 9.9
coast 9.9
sand 9.6
walk 9.5
support 9.4
industry 9.4
architecture 9.4
light 9.4
lake 9.1
line 9
horizon 9
fisherman 8.9
urban 8.7
male 8.6
way 8.6
construction 8.6
outside 8.6
walking 8.5
trip 8.5
tourist 8.5
danger 8.2
mountain 8
structure 7.8
dock 7.8
station 7.7
direction 7.6
ship 7.5
dark 7.5
sunrise 7.5
life 7.2
canal 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

sky 99.9
outdoor 99.8
person 97.1
clothing 91.4
black and white 86.8
man 86.1
people 73.6
train 71.1
vehicle 59.1

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 54.3%
Angry 46.7%
Happy 45.1%
Disgusted 45.1%
Confused 45.1%
Sad 47.7%
Surprised 45.1%
Calm 50.2%

AWS Rekognition

Age 38-59
Gender Male, 54.6%
Angry 45.6%
Calm 49.1%
Disgusted 45.4%
Sad 48.3%
Surprised 45.4%
Confused 46.1%
Happy 45.1%

AWS Rekognition

Age 38-59
Gender Male, 50.4%
Angry 49.6%
Confused 49.6%
Disgusted 49.8%
Surprised 49.5%
Happy 49.6%
Calm 49.9%
Sad 49.6%

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Train 92.4%
Helicopter 89.6%