Human Generated Data

Title

[Unidentified man, San Francisco, California]

Date

1936-1937

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.584.30

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Human 91.1
Person 91.1
Home Decor 88
Person 83.1
Smoke 78.2
Nature 75.7
Meal 73.1
Food 73.1
Leisure Activities 63.4
Clothing 63.3
Apparel 63.3
Crypt 58.4
Female 57.9
Dish 57.7
People 57.3
Building 55.4
Architecture 55

Clarifai
created on 2019-11-20

people 99.7
adult 98.5
monochrome 96.7
one 96.3
man 96.2
vehicle 92
war 90.2
smoke 88.8
portrait 88.5
grinder 87.9
wear 85.7
uniform 84.3
military 84.1
two 82.4
concentration 80.7
street 80.6
offense 80
transportation system 79.2
woman 78
skirmish 76.4

Imagga
created on 2019-11-20

architecture 46.2
ancient 38.9
stone 37.8
old 35.5
building 28.9
history 27.7
structure 24.9
travel 24.6
column 24.6
religion 23.3
tourism 23.1
culture 21.4
city 20
antique 19.9
monument 18.7
historic 18.3
sculpture 18.3
wall 17.9
temple 17.8
art 17
house 15.2
ruin 14.6
historical 14.1
church 13.9
arch 13.8
support 13.6
tourist 13.6
columns 12.7
statue 11.9
landmark 11.7
dark 11.7
traditional 11.6
brick 11.5
cemetery 11.3
device 11.3
exterior 11.1
chair 10.9
palace 10.8
marble 10.8
god 10.5
famous 10.2
town 10.2
light 10
memorial 10
anvil 9.9
civilization 9.8
sky 9.6
window 9.3
vintage 9.1
carving 8.9
urban 8.7
past 8.7
architectural 8.7
construction 8.6
religious 8.4
cell 8.4
place 8.4
inside 8.3
gravestone 8.2
home 8.2
aged 8.1
fountain 8
night 8
seat 8
corridor 7.9
prayer 7.7
medieval 7.7
block 7.6
design 7.3
dirty 7.2
black 7.2
tower 7.2
grave 7.2

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

Feature analysis

Amazon

Person 91.1%

Captions

Microsoft

an old photo of a person 57.2%
old photo of a person 55.5%
a black and white photo of a person 43.5%