Human Generated Data

Title

[Two uniformed men and woman leaning against ship railing]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.467.28

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 98.6
Person 98.6
Person 97.9
Banister 87.1
Handrail 87.1
Food 70.9
Meal 70.9
Finger 70.3
Apparel 63
Clothing 63
Silhouette 58.6
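
Note: The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. The museum's actual tagging pipeline is not documented on this page; the following is a minimal sketch, assuming a local copy of the image and configured AWS credentials, of how such labels and confidence scores could be retrieved with boto3. The file name used is hypothetical.

# Minimal sketch (assumption): retrieving label/confidence pairs similar to the
# Amazon tags listed above, via the Rekognition DetectLabels API.
import boto3

def detect_labels(image_path, min_confidence=55.0):
    """Return (label, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")  # assumes AWS credentials are configured
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    # Hypothetical file name; the photograph itself is not distributed with this record.
    for name, confidence in detect_labels("BRLF.467.28.jpg"):
        print(f"{name} {confidence:.1f}")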

Clarifai
created on 2019-11-19

people 99.6
man 98.4
light 98.3
monochrome 97.7
adult 96.8
street 95.3
city 95.2
portrait 94.6
shadow 93.6
one 91
silhouette 90.8
woman 90.7
landscape 89.7
group 89.5
girl 88.9
backlit 88.5
music 88.5
room 87.9
two 86.9
wear 86.9

Imagga
created on 2019-11-19

man 29.6
business 27.3
office 25.7
person 25.6
male 25.5
people 24
businessman 21.2
adult 19.4
work 18
black 18
piano 17.8
corporate 17.2
grand piano 17.2
silhouette 16.6
professional 16.4
computer 16.3
laptop 15.8
job 15
executive 14.9
percussion instrument 14.2
suit 13.7
light 12.7
keyboard instrument 12.6
stringed instrument 12.3
success 12.1
sitting 12
dark 11.7
working 11.5
couple 11.3
musical instrument 11.3
manager 11.2
men 11.2
portrait 10.3
modern 9.8
looking 9.6
happy 9.4
lifestyle 9.4
sky 9
human 9
night 8.9
happiness 8.6
boss 8.6
building 8.5
worker 8.5
finance 8.4
clothing 8.4
successful 8.2
alone 8.2
businesswoman 8.2
spectator 7.9
smile 7.8
face 7.8
chair 7.8
life 7.6
necktie 7.6
senior 7.5
confident 7.3
smiling 7.2
room 7.2
shadow 7.2
women 7.1
love 7.1
baron 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 94.9
black and white 88
monochrome 84.6
text 69.9
clothing 52.9
man 50.3

Feature analysis

Amazon

Person 98.6%
