Human Generated Data

Title

[Feininger-Hägg Family in Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.190.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Human 96.2
Person 96.2
Person 95.8
Person 94.8
Person 85.7
Cheetah 76.8
Animal 76.8
Mammal 76.8
Wildlife 76.8
Face 76.5
People 73.3
Photography 65.5
Portrait 65.5
Photo 65.5
Furniture 64.8
Indoors 58.8
Room 58.8
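
The label/confidence pairs above are typical output from Amazon Rekognition's DetectLabels operation. As a rough sketch of how such tags could be produced (the S3 bucket, object key, and thresholds below are placeholders, not the museum's actual pipeline):

```python
import boto3

# Minimal sketch: request image labels from AWS Rekognition.
# Bucket and object key are placeholders, not the museum's storage.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.190.3.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Print each label with its confidence, mirroring the "Label NN.N" rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```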

Clarifai
created on 2019-11-18

people 99.9
group 99.3
adult 99
group together 97.8
man 97.3
woman 96.7
wear 94.2
sit 93.6
child 92.7
two 92.1
several 90.7
four 90.4
recreation 89.9
war 88.8
military 88.4
furniture 86.8
many 84.6
three 84.3
administration 84.1
leader 83.8

Imagga
created on 2019-11-18

people 28.4
man 25.5
person 24.5
lifestyle 20.2
couple 17.4
love 17.4
newspaper 16.7
room 16.5
male 16.5
adult 16.2
kin 15.7
portrait 13.6
women 13.4
product 12.8
fashion 12.8
world 12.8
black 12.6
sexy 12
mother 12
outdoors 11.9
attractive 11.9
happy 11.3
classroom 11.2
holding 10.7
romance 10.7
lady 10.5
together 10.5
old 10.4
boy 10.4
style 10.4
men 10.3
youth 10.2
day 10.2
two 10.2
model 10.1
creation 10
child 9.9
one 9.7
hair 9.5
happiness 9.4
indoor 9.1
pretty 9.1
computer 8.8
life 8.7
parent 8.5
leisure 8.3
vintage 8.3
human 8.2
dress 8.1
romantic 8
interior 8
face 7.8
studio 7.6
passion 7.5
retro 7.4
business 7.3
aged 7.2
smiling 7.2
body 7.2
indoors 7
modern 7
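
The Imagga tags above resemble responses from Imagga's v2 tagging endpoint. A minimal sketch, assuming basic-auth API credentials and a publicly reachable image URL (all placeholders):

```python
import requests

# Minimal sketch: query Imagga's v2 tagging endpoint.
# API key, secret, and image URL are placeholders.
API_KEY = "your_imagga_key"
API_SECRET = "your_imagga_secret"
IMAGE_URL = "https://example.org/BRLF.190.3.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score,
# mirroring the "tag NN.N" rows above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```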

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

text 95.4
person 95.3
clothing 92.8
drawing 86.9
black and white 79.5
man 78.2
sketch 53.2
human face 51.1
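
The Microsoft tags above are the kind of result returned by the Azure Computer Vision tagging operation. A hedged sketch using the Python SDK; the endpoint and subscription key are placeholders, and the SDK's 0-1 confidences are scaled to percentages to match the rows above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: tag a remote image with Azure Computer Vision.
# Endpoint and subscription key are placeholders.
client = ComputerVisionClient(
    "https://example-region.api.cognitive.microsoft.com",
    CognitiveServicesCredentials("your_subscription_key"),
)

result = client.tag_image("https://example.org/BRLF.190.3.jpg")

# Convert the 0-1 confidence scale to percent for display.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```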

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-47
Gender Female, 51.4%
Fear 45.5%
Sad 50.2%
Angry 46.2%
Calm 46.3%
Happy 45.1%
Surprised 45.2%
Confused 45.5%
Disgusted 45.9%

AWS Rekognition

Age 25-39
Gender Male, 69.7%
Fear 2.9%
Confused 0.2%
Angry 1.4%
Happy 0.5%
Sad 76%
Disgusted 0.1%
Calm 18.6%
Surprised 0.2%

AWS Rekognition

Age 29-45
Gender Female, 51.8%
Fear 45.1%
Surprised 45%
Disgusted 45%
Happy 45%
Sad 51.8%
Angry 45.2%
Confused 45%
Calm 47.8%
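
Age ranges, gender estimates, and per-emotion scores like the three blocks above are what AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch, with placeholder bucket and object key names:

```python
import boto3

# Minimal sketch: request full facial attributes from AWS Rekognition.
# Bucket and object key are placeholders, not the museum's storage.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.190.3.jpg"}},
    Attributes=["ALL"],
)

# One block per detected face: age range, gender, then emotion confidences.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```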

Feature analysis

Amazon

Person 96.2%

Categories

Imagga

people portraits 94.1%
paintings art 3.4%