Human Generated Data

Title

[Lux Feininger by Christmas tree]

Date

1948-1955

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.539.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 98.2
Person 98.2
Furniture 98
Couch 98
Chair 97.6
Sitting 94.3
Cushion 87.1
Apparel 74.7
Clothing 74.7
Face 74.2
Plant 69.8
Indoors 65.7
Room 65.7
Living Room 65.7
Display 64.6
Monitor 64.6
Screen 64.6
Electronics 64.6
Head 62.6
Suit 61
Coat 61
Overcoat 61
Photo 60.6
Photography 60.6
Skin 59.1

Clarifai
created on 2019-11-19

people 99.9
adult 98.5
one 97.9
man 97
two 93.6
wear 92.7
portrait 92.7
group 91.6
furniture 91.5
war 87.1
administration 86
room 84.2
chair 83.7
music 83.4
indoors 82.6
group together 82.2
leader 82.2
military 81.9
seat 80.7
facial expression 79.8

Imagga
created on 2019-11-19

man 33.6
person 31.7
male 31.5
adult 28.8
people 27.9
portrait 27.2
bow tie 23.2
suit 22.8
hair 20.6
face 19.9
necktie 19.7
businessman 18.5
business 17.6
human 17.3
sitting 17.2
attractive 16.8
garment 16.5
room 16.3
looking 16
clothing 15.9
black 15.5
expression 15.4
pretty 14.7
model 14
men 13.7
fashion 13.6
handsome 13.4
couple 13.1
lifestyle 13
home 12.8
guy 12.5
smile 12.1
sexy 12.1
serious 11.4
elegant 11.1
casual 11
work 11
alone 11
dress 10.8
happy 10.7
lady 10.6
corporate 10.3
love 10.3
happiness 10.2
office 10.1
indoor 10
hand 9.9
eye 9.8
world 9.8
interior 9.7
hairdresser 9.4
confident 9.1
old 9.1
one 9
romance 8.9
success 8.9
covering 8.8
indoors 8.8
look 8.8
executive 8.7
couch 8.7
professional 8.4
relationship 8.4
mature 8.4
cheerful 8.1
life 7.7
sofa 7.7
head 7.6
manager 7.5
inside 7.4
sensual 7.3
family 7.1
posing 7.1
modern 7
together 7
newspaper 7

Google
created on 2019-11-19

(no tags returned)
Microsoft
created on 2019-11-19

person 95.8
human face 95
black and white 89.3
man 85.5
text 85.1
statue 84.8
clothing 84.2

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 86.9%
Angry 0.7%
Surprised 13.1%
Happy 0.2%
Disgusted 0.2%
Calm 83.8%
Confused 0.3%
Fear 0.5%
Sad 1.2%

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft

a man standing in a room 76.6%
a man sitting in a room 68.5%
a man in a room 68.4%