Human Generated Data

Title

[Man looking at ship]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.218.23

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Head 96.5
Person 79.3
Human 79.3
Skin 76.5
Art 62
Back 61.9
Sculpture 60.6

Clarifai
created on 2019-11-19

monochrome 99.9
people 99.4
portrait 99.2
girl 98.8
black and white 97.9
shadow 97.3
silhouette 97.3
light 97.2
art 97.2
analogue 96.5
beach 96.3
man 95.3
sea 94.6
mono 94.5
baby 93.6
street 93.5
studio 93.3
nude 93.2
noir 93
adult 92.4

Imagga
created on 2019-11-19

world 33.2
black 32.8
person 29
adult 25.2
human 24.8
man 24.2
face 23.4
dark 23.4
portrait 23.3
male 19.7
sexy 19.3
attractive 18.9
skin 18.5
people 18.4
model 17.9
hair 15.9
head 15.1
expression 14.5
hand 14.4
looking 14.4
one 14.2
hat 14
eyes 13.8
close 13.7
body 13.6
child 13.3
guy 12.5
clothing 11.5
serious 11.4
light 11.4
cowboy hat 11.3
eye 10.7
look 10.5
pretty 10.5
love 10.3
emotion 10.1
handsome 9.8
fashion 9.8
lady 9.7
erotic 9.6
studio 9.1
style 8.9
beard 8.9
baby 8.8
healthy 8.8
lighting 8.7
headdress 8.6
mouth 8.5
lips 8.3
makeup 8.2
macho 8.2
lifestyle 8
women 7.9
masculine 7.8
cigarette 7.8
silhouette 7.5
smoke 7.4
fit 7.4
water 7.3
sensual 7.3
sensuality 7.3
sunset 7.2

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 88.1
black and white 85.4
statue 72.3
monochrome 62.5
dark 52.9
wedding dress 50.4

Feature analysis

Amazon

Person 79.3%
