Human Generated Data

Title

[Feininger-Hägg Family in Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.190.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Human 99.3
Person 99.3
Person 98.7
Accessory 97.8
Accessories 97.8
Tie 97.8
Person 97
Sitting 96.8
Person 95.3
Person 92.3
Person 81.2
Urban 78.1
People 75.8
Face 75.6
Person 67.8
Apparel 67.4
Clothing 67.4
Hat 67.4
Nature 65.6
Undershirt 60.7
Building 60.6
Outdoors 60.4
Crowd 58.6
Restaurant 55.9
Cafeteria 55.8
Suit 55.8
Overcoat 55.8
Coat 55.8
Pottery 55.2
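
The Amazon tag list above has the shape of AWS Rekognition label-detection output (label name plus confidence). A minimal sketch of how such a list could be produced and flattened, assuming boto3 and configured AWS credentials; the `format_labels` helper and sample dict are illustrative, not part of the museum's actual pipeline:

```python
def format_labels(response):
    """Render a Rekognition DetectLabels response as 'Name Confidence' lines."""
    return [f"{lbl['Name']} {lbl['Confidence']:.1f}" for lbl in response["Labels"]]

def detect_labels(image_bytes, min_confidence=55.0):
    """Call AWS Rekognition (requires AWS credentials); import deferred so the
    rest of this sketch runs without boto3 installed."""
    import boto3
    client = boto3.client("rekognition")
    return client.detect_labels(Image={"Bytes": image_bytes},
                                MinConfidence=min_confidence)

# A trimmed sample response in the shape Rekognition returns:
sample = {"Labels": [{"Name": "Person", "Confidence": 99.3},
                     {"Name": "Tie", "Confidence": 97.8}]}
print(format_labels(sample))  # → ['Person 99.3', 'Tie 97.8']
```

The near-duplicate pairs in the record (e.g. Accessory/Accessories, Apparel/Clothing) are consistent with Rekognition returning both a label and its parent category at the same confidence.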

Clarifai
created on 2019-11-18

people 100
group 99.4
adult 99.2
furniture 97.6
man 97
many 96.8
group together 96.6
woman 96.1
administration 96
sit 95.6
one 94.5
wear 94
leader 93.9
war 93.8
room 93.4
two 91.8
child 88.7
military 88.5
home 87.6
elderly 86.9

Imagga
created on 2019-11-18

musical instrument 32.2
stringed instrument 23.5
bowed stringed instrument 21.4
man 20.8
sax 20.3
people 17.8
male 17.7
black 17.4
violin 16.3
person 15.6
wind instrument 15.5
room 14.5
barbershop 14.2
silhouette 14.1
vintage 12.4
sexy 12
dark 11.7
shop 11.4
grunge 11.1
adult 10.8
play 10.3
business 10.3
men 10.3
music 9.9
night 9.8
old 9.7
style 9.6
body 9.6
brass 9.4
dirty 9
mercantile establishment 9
retro 9
businessman 8.8
urban 8.7
couple 8.7
women 8.7
youth 8.5
art 8.5
portrait 8.4
studio 8.4
sport 8.2
percussion instrument 8.2
suit 8.1
keyboard instrument 8
hair 7.9
design 7.9
classroom 7.9
wall 7.7
cello 7.6
power 7.6
poster 7.5
window 7.5
fun 7.5
city 7.5
light 7.3
back 7.3
accordion 7.2
aged 7.2
team 7.2

Google
created on 2019-11-18

(no tags recorded for this service)

Microsoft
created on 2019-11-18

text 97.9
black and white 97.8
person 96.5
street 78.5
clothing 77.5
monochrome 76.5
people 55.1
man 55

Face analysis

Amazon

AWS Rekognition

Age 36-52
Gender Male, 50.5%
Disgusted 45%
Fear 45.3%
Happy 45%
Surprised 45%
Angry 45.1%
Sad 53.9%
Calm 45.6%
Confused 45%

AWS Rekognition

Age 33-49
Gender Male, 50.1%
Surprised 45%
Happy 45%
Sad 48.7%
Angry 45.1%
Calm 45.3%
Fear 50.8%
Disgusted 45%
Confused 45%
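
The two face-analysis entries above match the per-face output of Rekognition's DetectFaces API with `Attributes=['ALL']`: an age range, a gender estimate with confidence, and a confidence score per emotion. A hedged sketch of flattening one such `FaceDetail` into the lines shown; the `summarize_face` helper and sample dict are illustrative assumptions:

```python
def summarize_face(detail):
    """Flatten one Rekognition FaceDetail into 'Age / Gender / emotion' lines
    like those in the record above."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {detail['Gender']['Confidence']:.1f}%",
    ]
    # Each emotion arrives as {'Type': 'SAD', 'Confidence': 53.9}, etc.
    lines += [f"{e['Type'].capitalize()} {e['Confidence']:g}%"
              for e in detail["Emotions"]]
    return lines

# Trimmed sample in the shape DetectFaces returns for one face:
sample = {"AgeRange": {"Low": 36, "High": 52},
          "Gender": {"Value": "Male", "Confidence": 50.5},
          "Emotions": [{"Type": "SAD", "Confidence": 53.9},
                       {"Type": "CALM", "Confidence": 45.6}]}
print(summarize_face(sample))
# → ['Age 36-52', 'Gender Male, 50.5%', 'Sad 53.9%', 'Calm 45.6%']
```

Note that the emotion scores here cluster near 45%, with only one value clearly above the rest (Sad, Fear), so only that dominant emotion is meaningfully detected.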

Feature analysis

Amazon

Person 99.3%
Tie 97.8%
Hat 67.4%

Captions

Microsoft

a group of people sitting in front of a building 75.4%
a group of people sitting at a table 68.2%
a group of people sitting in chairs 68.1%
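
The Microsoft captions above resemble the output of Azure Computer Vision's Describe Image operation, which returns candidate captions with confidences in [0, 1]. A minimal sketch of rendering such a response as the 'caption confidence%' lines shown; the response shape is the documented v3 `description.captions` structure, and the helper name is an assumption:

```python
def format_captions(describe_response):
    """Render Azure Computer Vision 'describe' captions as 'text confidence%' lines."""
    captions = describe_response["description"]["captions"]
    return [f"{c['text']} {c['confidence'] * 100:.1f}%" for c in captions]

# Trimmed sample in the shape the Describe Image API returns:
sample = {"description": {"captions": [
    {"text": "a group of people sitting in front of a building",
     "confidence": 0.754}]}}
print(format_captions(sample))
# → ['a group of people sitting in front of a building 75.4%']
```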