Human Generated Data

Title

[People watching ships in harbor]

Date

20th century

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.220.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Apparel 100
Clothing 100
Person 99.4
Human 99.4
Person 98.5
Person 96.6
Person 92.9
Hat 85
Sun Hat 85
Coat 71.9
Overcoat 71.9
Sleeve 62
Face 58.2

Clarifai
created on 2019-11-19

people 100
adult 98.3
group 97.8
man 95.9
group together 95.6
woman 94.4
street 94.1
monochrome 94
child 93.6
wear 93.4
two 92.5
one 89.9
portrait 88.7
recreation 88.4
many 82.5
war 82
room 81.7
four 81.4
boy 80.5
several 78.8

Imagga
created on 2019-11-19

world 27.4
person 22.1
man 19.6
people 19.5
old 18.8
dark 18.4
spectator 15.5
religion 15.2
adult 15.2
male 13.5
silhouette 13.2
black 12.8
statue 12.5
night 12.4
religious 11.2
architecture 10.9
one 10.4
portrait 10.4
model 10.1
symbol 10.1
dress 9.9
body 9.6
god 9.6
sculpture 9.6
light 9.5
stone 9.3
human 9
history 8.9
love 8.7
ancient 8.6
art 8.6
wall 8.5
culture 8.5
clothing 8.4
famous 8.4
musical instrument 8.1
sexy 8
lifestyle 7.9
hair 7.9
antique 7.8
sitting 7.7
grunge 7.7
fashion 7.5
serene 7.5
relaxation 7.5
passion 7.5
city 7.5
monument 7.5
vintage 7.4
historic 7.3
lady 7.3
sensual 7.3
pose 7.2
dirty 7.2
building 7.2
sunset 7.2
face 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

clothing 98
black and white 93.6
text 93.4
man 90.9
monochrome 89
person 88.1
street 57.6
hat 51.5

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft
created on 2019-11-19

a man sitting on a bed 27%
a man sitting in a room 26.9%
a man sitting in a chair 26.8%