Human Generated Data

Title

[Young man with ship model]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.454.8

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Clarifai
created on 2019-05-30

people 99.5
monochrome 99.4
portrait 98.7
adult 96.8
music 95.9
one 95.8
light 94.5
shadow 94.3
black and white 94
man 93.9
studio 93.7
dark 93
concert 91.9
art 91.2
musician 90.9
girl 88.4
model 88.4
woman 88.3
jazz 86.6
singer 83.6

Imagga
created on 2019-05-30

body 44
sexy 39.4
black 36.1
person 35.3
model 34.3
nude 31.1
attractive 30.1
skin 28.4
studio 28.1
adult 27.6
posing 26.7
naked 26.1
erotic 24
hair 23.8
performer 23.5
sensuality 22.7
dancer 22.6
people 22.3
lady 21.1
portrait 20.7
fashion 20.4
dark 20.1
pose 20
legs 17.9
one 17.2
pretty 16.8
human 15.8
man 15.7
face 15.6
sensual 15.5
stage 15.4
women 14.2
slim 13.8
figure 13.7
entertainer 13.5
style 13.4
musical instrument 13.3
passion 13.2
cute 12.9
male 12.8
elegance 12.6
sitting 12
looking 12
sexual 11.6
desire 11.6
seductive 11.5
brunette 11.3
back 11.1
fitness 10.9
blond 10.8
art 10.6
percussion instrument 10.2
expression 10.2
lifestyle 10.1
exercise 10
dance 10
torso 9.8
musician 9.5
healthy 9.5
strength 9.4
platform 9.3
singer 9.3
device 9.2
bare 8.8
hands 8.7
love 8.7
gorgeous 8.2
dress 8.1
vibraphone 8.1
guy 8
active 7.8
elegant 7.7
health 7.7
serious 7.6
arm 7.6
belly 7.6
strong 7.5
sport 7.5
silhouette 7.5
macho 7.4
training 7.4
fit 7.4
lighting 7.3
shadow 7.2
wet 7.2

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

man 92.9
person 89.4
clothing 83.8
human face 77.3
black and white 76.4
dark 59.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-44
Gender Male, 87.2%
Angry 1.9%
Surprised 1.8%
Confused 1.3%
Calm 88.7%
Happy 1.8%
Disgusted 2.2%
Sad 2.2%

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft
created on 2019-05-30

a man in a dark room 77.8%
a man sitting in a dark room 62.2%
a man standing in a dark room 62.1%