Human Generated Data

Title

[Young man with ship model]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.454.6

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Human 99.8
Person 99.8
Musical Instrument 91.8
Musician 91.8
Leisure Activities 81.6
Pianist 72.5
Piano 72.5
Performer 72.5
Clothing 68.9
Apparel 68.9
Finger 62.6
Undershirt 58
Man 57.5

Clarifai
created on 2019-05-30

people 99.7
one 97.7
music 97.6
monochrome 97.1
portrait 96.5
adult 96.5
musician 96.1
man 96
instrument 92
piano 91.7
concert 91
singer 89.3
wear 88.6
jazz 87.9
shadow 86.8
stage 85.8
woman 85.5
dark 84.5
light 81.2
black and white 80.9

Imagga
created on 2019-05-30

stringed instrument 65
musical instrument 49
bowed stringed instrument 43.3
piano 43.2
grand piano 41
black 36.7
keyboard instrument 32.2
person 31.7
percussion instrument 31.4
body 29.6
people 29
adult 27.6
man 26.2
model 25.7
hair 25.4
portrait 25.2
sexy 24.1
male 23.4
attractive 23.1
violin 23
studio 22.8
dark 21.7
skin 21.3
sensuality 20.9
nude 20.4
fashion 18.9
human 18
one 17.9
naked 17.4
face 16.3
erotic 16.1
cello 15.5
posing 15.1
looking 14.4
expression 13.7
style 13.4
passion 13.2
hand 12.9
serious 12.4
musician 12.1
handsome 11.6
pretty 11.2
sensual 10.9
lifestyle 10.8
upright 10.8
torso 10.7
sexual 10.6
desire 10.6
lady 10.6
singer 10.3
sitting 10.3
art 9.8
viol 9.8
couple 9.6
love 9.5
men 9.4
holding 9.1
healthy 8.8
performer 8.7
women 8.7
light 8.7
hands 8.7
arm 8.5
health 8.3
slim 8.3
suit 8.2
blond 8.1
scholar 8
cute 7.9
bare 7.8
elegant 7.7
glamor 7.7
seductive 7.7
power 7.6
legs 7.6
silhouette 7.5
pose 7.3
music 7.2
businessman 7.1
device 7.1

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

person 93.4
clothing 77.6
black and white 72.5
man 69.4
human face 66.7
dark 44.2
bowed instrument 28.1

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 72.3%
Confused 2.7%
Happy 13.3%
Angry 31.6%
Surprised 4%
Calm 31.6%
Disgusted 2.4%
Sad 14.4%

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a man holding a guitar 66.2%
a man sitting in front of a instrument 62%
a man holding a guitar in front of a instrument 52.5%