Human Generated Data

Title

[New York World's Fair exhibit of trains]

Date

1940

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.527.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2019-11-19

Human 99.5
Person 99.5
Transportation 86.1
Vehicle 85.6
Machine 82.8
Spoke 81.8
Person 77.3
Helicopter 75.2
Aircraft 75.2
Airplane 74.1
Wheel 73.9
Outdoors 57.7
Weather 56.2
Nature 56.2
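
The Amazon labels above are the kind of object and scene tags returned by the AWS Rekognition DetectLabels operation, each paired with a confidence score. The following is a minimal sketch of how comparable labels could be retrieved; the AWS credentials setup and the local file name feininger_worlds_fair.jpg are assumptions, not part of the record.

    # Sketch: object/scene labels via AWS Rekognition DetectLabels (boto3).
    # Assumes AWS credentials are already configured; the image path is a
    # hypothetical placeholder for a local copy of this photograph.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("feininger_worlds_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

    # Print each label with its confidence, mirroring the "label score" list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")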

Clarifai
created on 2019-11-19

people 98.9
vehicle 97.9
transportation system 97.4
group 96
monochrome 93
street 92.9
adult 91.8
watercraft 91.7
group together 90.9
man 90.3
war 83
city 82.6
travel 81.6
aircraft 81.4
car 81.2
train 80.2
many 80
industry 79.6
railway 79.3
no person 79

Imagga
created on 2019-11-19

shopping cart 33.7
handcart 25.4
container 22.5
architecture 21.2
wheeled vehicle 21
house 20
building 17.3
white goods 17
dishwasher 15.6
structure 14.1
equipment 13.5
home appliance 13.4
city 13.3
metal 12.9
business 12.7
negative 12.2
construction 12
technology 11.9
black 11.4
old 11.1
chair 11
film 11
industrial 10.9
transportation 10.8
steel 10.6
apartment 10.5
modern 10.5
urban 10.5
home 10.4
industry 10.2
station 10.2
appliance 9.8
sky 9.6
furniture 9.5
empty 9.5
roof 9.5
window 9.5
light 9.4
water 9.3
travel 9.1
finance 8.4
street 8.3
landscape 8.2
design 7.9
sea 7.8
grunge 7.7
house of cards 7.6
ocean 7.5
conveyance 7.3
transport 7.3
glass 7.1
interior 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

ship 97.9
text 93.9
black and white 77.1
watercraft 69.3
street 64.7
boat 62.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 51-69
Gender Male, 51.4%
Sad 46.1%
Fear 45%
Happy 45%
Angry 45.3%
Calm 45.2%
Surprised 45%
Disgusted 45%
Confused 53.3%
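
The age range, gender, and emotion estimates above correspond to the face attributes reported by the AWS Rekognition DetectFaces operation. A hedged sketch of retrieving such attributes follows; the local image path is again a hypothetical placeholder.

    # Sketch: face attributes (age range, gender, emotions) via AWS Rekognition
    # DetectFaces (boto3). Credentials and the local image path are assumptions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("feininger_worlds_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")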

Feature analysis

Amazon

Person 99.5%
Helicopter 75.2%
Airplane 74.1%

Categories

Imagga

interior objects 95.9%
cars vehicles 2.5%

Text analysis

Amazon

RESS
RA
RA TA RESS
TA
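
The text fragments above are OCR detections of the kind produced by the AWS Rekognition DetectText operation, which reports both grouped lines and individual words. A minimal sketch is shown below; the local image path is a hypothetical placeholder.

    # Sketch: OCR text detection via AWS Rekognition DetectText (boto3).
    # Credentials and the local image path are assumptions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("feininger_worlds_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections roughly correspond to the grouped strings above,
    # WORD detections to the individual fragments.
    for detection in response["TextDetections"]:
        print(f"{detection['Type']}: {detection['DetectedText']} "
              f"({detection['Confidence']:.1f}%)")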

Google

RESS RAI MA RA
RESS
RAI
MA
RA