Human Generated Data

Title

[Men working on train, Julia Feininger in foreground]

Date

1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.521.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (model confidence, %)

Amazon
created on 2019-11-19

Person 99.4
Human 99.4
Train 95.7
Vehicle 95.7
Transportation 95.7
Person 95
Leisure Activities 94
Person 87.2
Performer 84.5
Musical Instrument 83.9
Musician 83.9
Piano 78.8
Pianist 75.2
Person 64.8
Machine 58.3
Wheel 58.3
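
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. The following is a minimal, hypothetical sketch of such a run; the bucket name, object key, and thresholds are placeholders, not the values used to produce this record.

```python
# Hypothetical reproduction of the Amazon label run above; the bucket and
# object key are placeholders, not the museum's actual storage locations.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.521.12.jpg"}},
    MaxLabels=20,
    MinConfidence=55,
)

# Each label carries a name and a confidence score (0-100), matching the
# "Person 99.4", "Train 95.7", ... pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```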

Clarifai
created on 2019-11-19

people 99.7
adult 97.8
transportation system 95.8
group 95.4
man 95
vehicle 95
group together 94.9
one 93.7
woman 93.5
two 92.7
wear 91.6
administration 89.6
train 87.3
aircraft 87.3
war 87.2
military 87.1
railway 86.4
three 86.3
several 86.2
many 82.7

Imagga
created on 2019-11-19

passenger car 52.6
passenger 50.3
car 48.6
vehicle 37.9
wheeled vehicle 37.6
transportation 33.2
train 28.3
conveyance 23.3
travel 23.2
transport 21
urban 20.1
city 19.1
building 14.5
industry 12.8
business 12.7
male 12.1
station 11.6
traffic 11.4
architecture 11.4
journey 11.3
boat 11.2
technology 11.1
public transport 10.7
tourism 10.7
vacation 10.6
working 10.6
tourist 10.6
old 10.4
street 10.1
man 10.1
people 10
road 9.9
ship 9.8
railway 9.8
public 9.7
work 9.4
machine 9.2
office 9.1
subway 8.9
departure 8.9
railroad 8.8
subway train 8.8
cargo 8.7
water 8
holiday 7.9
attendant 7.7
port 7.7
roof 7.6
commerce 7.5
structure 7.4
town 7.4
truck 7.4
bullet train 7.2
sky 7
modern 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

transport 95.3
clothing 90.4
black and white 88.3
standing 87.7
person 85.2
train 82.1
bus 27.2

Face analysis

Amazon

AWS Rekognition

Age 48-66
Gender Male, 50%
Calm 54%
Confused 45.1%
Disgusted 45%
Angry 45.1%
Happy 45.2%
Surprised 45.1%
Sad 45.5%
Fear 45.1%

AWS Rekognition

Age 49-67
Gender Male, 51.9%
Angry 45.1%
Surprised 45%
Happy 45%
Disgusted 45%
Calm 45.1%
Confused 45.1%
Fear 45.2%
Sad 54.4%

AWS Rekognition

Age 35-51
Gender Female, 50.1%
Surprised 49.6%
Disgusted 49.5%
Happy 50.1%
Sad 49.6%
Calm 49.5%
Confused 49.5%
Fear 49.6%
Angry 49.5%

AWS Rekognition

Age 29-45
Gender Male, 50%
Angry 49.6%
Surprised 49.5%
Calm 49.6%
Sad 50%
Happy 49.5%
Disgusted 49.6%
Fear 49.5%
Confused 49.6%
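
The four blocks above are per-face estimates of the kind returned by Rekognition's DetectFaces operation when all facial attributes are requested. The sketch below assumes the photograph were re-submitted to the API; the bucket and key are placeholders.

```python
# Hypothetical sketch of an AWS Rekognition face-analysis call; the image
# location is a placeholder, not the source of the record above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.521.12.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

# Each detected face yields an estimated age range, a gender guess with its
# confidence, and a confidence score per emotion, as in the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```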

Feature analysis

Amazon

Person 99.4%
Train 95.7%
Wheel 58.3%
