Human Generated Data

Title

[Werner Jackson with cat]

Date

1932

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.225.27

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Person 98.4
Human 98.4
Pet 90.2
Animal 90.2
Mammal 87.7
Cat 83.5
Manx 73.8
Canine 69.3
Face 59.3
Female 56.3
Kitten 56.2
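The machine-generated tags above pair each label with a confidence score (a percentage). As an illustration only, a minimal Python sketch of filtering such a list by a confidence threshold might look like this; the tag data is copied from the Amazon list above, and the function name and threshold are assumptions for the example:

```python
# Tag data copied from the Amazon (Rekognition) list above: label -> confidence (%).
amazon_tags = {
    "Person": 98.4, "Human": 98.4, "Pet": 90.2, "Animal": 90.2,
    "Mammal": 87.7, "Cat": 83.5, "Manx": 73.8, "Canine": 69.3,
    "Face": 59.3, "Female": 56.3, "Kitten": 56.2,
}

def confident_tags(tags, threshold=80.0):
    """Return labels whose confidence meets the threshold, highest score first."""
    ranked = sorted(tags.items(), key=lambda kv: -kv[1])
    return [label for label, score in ranked if score >= threshold]

print(confident_tags(amazon_tags))
# ['Person', 'Human', 'Pet', 'Animal', 'Mammal', 'Cat']
```

With the default 80% cutoff, low-confidence guesses such as "Manx" and "Canine" drop out while the labels a human would agree with survive.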

Clarifai
created on 2019-11-18

people 99.9
adult 99.2
one 99
man 97.5
wear 96.7
two 96.7
administration 95.8
group together 94.1
group 94
vehicle 93.1
leader 93
portrait 90.3
outfit 89.4
war 88.7
military 87
woman 82.8
chair 81.1
facial expression 80.7
veil 80.4
music 80.4

Imagga
created on 2019-11-18

man 31.6
male 27.7
people 26.2
person 25.7
adult 23.3
happy 18.2
portrait 17.5
lifestyle 16.6
conch 14.6
outdoors 14.2
business 14
human 13.5
job 13.3
couple 13.1
outside 12.8
pretty 12.6
attractive 12.6
happiness 12.5
businessman 12.3
outdoor 12.2
corporate 12
men 12
groom 11.9
women 11.9
love 11.8
gastropod 11.7
model 11.7
smile 11.4
one 11.2
day 11
suit 11
hands 10.4
hair 10.3
expression 10.2
smiling 10.1
wedding 10.1
successful 10.1
dress 9.9
modern 9.8
cheerful 9.7
bride 9.7
sport 9.6
professional 9.3
executive 9.2
clothing 9.2
building 9
success 8.8
looking 8.8
body 8.8
mollusk 8.8
together 8.8
standing 8.7
water 8.7
sitting 8.6
youth 8.5
relaxation 8.4
hand 8.3
fashion 8.3
teenager 8.2
worker 8
brunette 7.8
face 7.8
life 7.7
guy 7.5
park 7.4
girls 7.3
sensuality 7.3
exercise 7.3
office 7.2
sexy 7.2
work 7.1

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

person 98.9
man 95.8
text 94.4
drawing 85.8
black and white 82.2
old 80.5
sketch 66.2
white 60.5
human face 59.2
clothing 54.3
posing 38.9
bowed instrument 6.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 55.2%
Sad 4.5%
Confused 1.6%
Fear 2.4%
Angry 1%
Disgusted 0.3%
Surprised 55.3%
Calm 31.4%
Happy 3.6%
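The emotion scores above are per-label confidences rather than a single prediction, so a downstream consumer typically picks the highest-scoring label. A minimal sketch, using the values listed above (variable names are assumptions for the example):

```python
# Emotion confidences copied from the AWS Rekognition face analysis above.
emotions = {
    "Sad": 4.5, "Confused": 1.6, "Fear": 2.4, "Angry": 1.0,
    "Disgusted": 0.3, "Surprised": 55.3, "Calm": 31.4, "Happy": 3.6,
}

# The most likely emotion is simply the label with the maximum confidence.
top_emotion = max(emotions, key=emotions.get)
print(top_emotion)
# Surprised
```

Here the model leans toward "Surprised" (55.3%) with "Calm" (31.4%) a distant second; the remaining labels are noise-level.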

Feature analysis

Amazon

Person
Cat
Person 98.4%

Categories

Imagga

paintings art 99.2%

Captions