Human Generated Data

Title

[Outdoor sculpture on building at New York World's Fair]

Date

1940

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.527.23

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 95.3
Acrobatic 95.1
Leisure Activities 91.9
Circus 73.1
Person 65.8
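
A minimal sketch of how comparable label-and-confidence tags could be retrieved, assuming the "Amazon" tags above came from Amazon Rekognition's DetectLabels API (the record only says "Amazon"; the service choice and the local file name below are assumptions, not stated by the source):

    import boto3

    # Assumption: hypothetical file name; the record does not identify the image file used.
    rekognition = boto3.client("rekognition")

    with open("feininger_worlds_fair.jpg", "rb") as image_file:
        response = rekognition.detect_labels(
            Image={"Bytes": image_file.read()},
            MaxLabels=10,
            MinConfidence=60,
        )

    # Print each label with its confidence score, mirroring the "Human 95.3" format above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')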

Clarifai
created on 2019-11-19

people 98
man 94.9
performance 92.9
adult 91.4
one 90.9
music 88.1
woman 87.6
business 86
monochrome 85.7
fashion 84.8
indoors 82.4
musician 81.7
stage 80.7
two 80.7
concert 80.6
wear 79.5
dress 79
light 78.5
shadow 76.6
love 74.3

Imagga
created on 2019-11-19

black 32.5
man 24.2
person 23.2
portrait 20.1
male 19.9
dark 19.2
sax 18.9
sexy 18.5
people 17.3
lady 17
style 14.8
fashion 14.3
adult 14.3
smoke 13.9
hand 13.7
sensual 13.6
body 13.6
attractive 13.3
elegant 12.8
music 12.7
device 12.1
men 12
hair 11.9
model 11.7
face 11.4
musician 11.2
holding 10.7
love 10.3
danger 10
studio 9.9
instrument 9.8
art 9.8
human 9.7
hands 9.6
performer 9.5
women 9.5
light 9.4
emotion 9.2
metal 8.9
mystery 8.6
robe 8.6
performance 8.6
erotic 8.5
suit 8.4
hat 8.4
entertainment 8.3
clothing 8.2
dress 8.1
night 8
wine 7.9
wind instrument 7.8
cigarette 7.8
pretty 7.7
expression 7.7
musical 7.7
skin 7.6
garment 7.6
elegance 7.6
life 7.2
looking 7.2
handsome 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

concert 97.4
person 92.7
guitar 87.2
musical instrument 85.9
microphone 80
text 73.6
clothing 72.4
black and white 61.5
hand 46.4

Feature analysis

Amazon

Person 65.8%
