Human Generated Data

Title

[View out shop window of women on sidewalk, San Francisco, California?]

Date

1936

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.468.15

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Scores are confidence values, expressed as percentages.

Amazon
created on 2019-11-19

Person 99.2
Human 99.2
Automobile 95.6
Car 95.6
Transportation 95.6
Vehicle 95.6
Person 91.9
Apparel 89.3
Clothing 89.3
Lighting 86.1
Furniture 85.5
Wheel 83.9
Machine 83.9
Indoors 79.8
Room 79.8
Shelf 70.2
Coat 63.2
Overcoat 63.2
Meal 58.5
Food 58.5
Bookcase 57.8
Suit 56.5
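
The Amazon labels above are the kind of output returned by Amazon Rekognition's label-detection API. As a rough sketch only (the file name "photo.jpg", the label cap, and the confidence threshold are placeholder assumptions, not part of this record), labels with confidence scores in this style could be retrieved with boto3:

    import boto3

    # Minimal sketch: ask Amazon Rekognition for labels on a local image file.
    # "photo.jpg" and the thresholds below are illustrative assumptions.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,        # cap the number of returned labels
            MinConfidence=50.0,  # drop low-confidence guesses
        )

    # Each label carries a name and a confidence score in percent,
    # matching the "Person 99.2" style of the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')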

Clarifai
created on 2019-11-19

no person 97.6
people 96.8
monochrome 94.9
street 92.7
architecture 91.6
room 90.9
group 90.2
city 89.8
travel 89.3
vehicle 88.5
transportation system 87.5
indoors 87.1
technology 86.8
business 86.6
building 85.9
reflection 84.1
adult 83.3
one 82.9
light 82.4
furniture 82.4

Imagga
created on 2019-11-19

library 73.7
building 66.6
structure 42.9
city 27.4
architecture 23.4
urban 20.1
shop 19.6
black 16.8
old 15.3
business 15.2
bookshop 14.9
music 13.5
people 13.4
office 13
travel 12
technology 11.9
mercantile establishment 11.8
work 11.8
room 11.3
sound 11.2
modern 11.2
window 11.1
man 11
industry 10.2
indoor 10
house 10
tourism 9.9
equipment 9.7
interior 8.8
piano 8.8
home 8.8
musical 8.6
art 8.5
street 8.3
inside 8.3
light 8
instrument 8
place of business 7.9
sitting 7.7
books 7.7
wall 7.7
horizontal 7.5
computer 7.4
university 7.2
history 7.2
steel 7.1
indoors 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

black and white 94.9
book 92.1
street 77.2
text 76.2
monochrome 74.9
person 69.3
clothing 65.5

Color Analysis

Feature analysis

Amazon

Person 99.2%
Car 95.6%
Wheel 83.9%

Categories

Imagga

interior objects 85.5%
paintings art 12.6%

Captions

Microsoft
created on 2019-11-19

a close up of a library 58.6%
a person in a library 27.5%

Text analysis

Amazon

G
A RII
F EE A RII
OMITO3D
F EE
OZHW564
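
The fragments above are OCR detections read from the photograph itself, which is why they are not legible words. A comparable result could be produced with Rekognition's text-detection call; this is a hedged sketch with the same placeholder file name as above:

    import boto3

    # Sketch only: run Amazon Rekognition text detection on a placeholder image.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Print each detected line or word with its confidence score in percent.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))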

Google

2/02 164
2/02
164