Human Generated Data

Title

[Woman with black cat]

Date

1932

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.225.7

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 97.1
Human 97.1
Outdoors 87.1
Clothing 86.6
Apparel 86.6
Nature 84.6
Finger 82.8
Sand 75.2
People 66
Water 65.8
Snow 56.1

Clarifai
created on 2019-11-19

monochrome 99.6
people 98.2
one 96.5
adult 96.1
black and white 95.7
action 93.4
man 93.1
sport 90.6
winter 89.3
competition 89.3
woman 88.1
girl 87.6
snow 87.5
street 85.9
shadow 85.5
outdoors 85.4
nature 85
beach 85
sports equipment 84.7
portrait 84.4

Imagga
created on 2019-11-19

stingray 35.5
sexy 31.3
ray 27.3
adult 24.7
fashion 24.1
body 24
attractive 23.1
person 21.9
posing 20.4
people 19.5
model 19.4
pretty 18.9
lady 18.7
beach 18.5
sensuality 18.2
fitness 18.1
lifestyle 18.1
water 16.7
sport 16.6
slim 16.6
man 16.2
portrait 16.2
hair 15.1
style 14.8
black 13.8
nude 13.6
face 13.5
male 13.5
elegance 13.4
erotic 13.2
wall 12.8
human 12.7
exercise 12.7
dress 12.6
healthy 12.6
sand 12.3
happy 11.3
athlete 11.1
health 11.1
fit 11
bikini 11
dance 11
cute 10.8
brunette 10.5
summer 10.3
sea 10.2
ocean 10.1
sensual 10
pose 10
one 9.7
naked 9.6
skin 9.5
women 9.5
action 9.3
weight 9.2
modern 9.1
gorgeous 9.1
swimsuit 9
outdoors 9
clothing 8.9
cool 8.9
performer 8.6
sitting 8.6
device 8.5
strength 8.4
relaxation 8.4
guy 8.4
dark 8.4
training 8.3
blond 8.2
vacation 8.2
lovely 8
dancer 7.7
elegant 7.7
motion 7.7
hand 7.7
muscular 7.6
relax 7.6
tattoo 7.6
leisure 7.5
waves 7.4
20s 7.3
teenager 7.3
shower 7.2
smiling 7.2
active 7.2
travel 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 65.2%
Disgusted 2.2%
Surprised 14.4%
Calm 3.1%
Happy 1.3%
Angry 44.7%
Fear 26.1%
Sad 0.9%
Confused 7.3%

Feature analysis

Amazon

Person 97.1%

Captions

Microsoft

a man holding a gun 37%
a man flying through the air 35.2%
a man flying through the sky 33.4%