Human Generated Data

Title

[Locomotive]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.448.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Person 77.7
Human 77.7
Clothing 70.9
Apparel 70.9
Piano 56.5
Musical Instrument 56.5
Leisure Activities 56.5
Wood 55.4

Clarifai
created on 2021-04-05

monochrome 99.2
no person 98.7
one 94.8
indoors 94.4
still life 94.3
business 92.9
desk 91.5
people 91
paper 90.6
technology 90.3
office 89.6
blur 88.7
analogue 88.2
monochromatic 87.8
medicine 87.4
furniture 87.3
adult 86.5
black and white 85.8
telephone 84.9
electronics 84.9

Imagga
created on 2021-04-05

business 20
plan 19.8
glass 19.3
perfume 18.9
design 16.4
blueprint 15.7
3d 15.5
drawing 15.2
sketch 14.7
table 14.7
paper 14.5
construction 13.7
architecture 13.3
bottle 12.8
technology 12.6
project 12.5
toiletry 12.3
liquid 12.2
render 12.1
drafting 11.8
negative 11.7
equipment 11.3
office 11.2
modern 11.2
container 11.2
clean 10.9
transparent 10.7
work 10.2
device 10.1
interior 9.7
success 9.7
architect 9.7
engineer 9.6
graphic 9.5
closeup 9.4
designer 8.7
pencil 8.6
close 8.6
finance 8.4
professional 8.4
film 8.4
house 8.4
freshness 8.3
home 8.2
symbol 8.1
idea 8
silver 8
objects 7.8
paperwork 7.8
luxury 7.7
industry 7.7
health 7.6
engineering 7.6
power 7.6
drink 7.5
care 7.4
furniture 7.4
water 7.3
metal 7.2
detail 7.2
color 7.2
computer 7.2
celebration 7.2
businessman 7.1
medicine 7

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

ship 94.6
indoor 85.1
fog 71.1
text 68.9
black and white 65.6

Color Analysis

Feature analysis

Amazon

Person 77.7%
Piano 56.5%

Categories

Imagga

interior objects 99.8%

Captions

Microsoft
created on 2021-04-05

a person sitting on a table 26.2%
a person sitting at a table 26.1%