Human Generated Data

Title

[Feininger-Hägg Family in Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.190.14

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Person 98.7
Human 98.7
Person 98.7
Person 96.1
Person 96
Apparel 89.4
Clothing 89.4
Person 88.8
Face 88.3
Sitting 87.6
Furniture 78.9
People 76.7
Female 68.1
Outdoors 67.6
Photo 64.9
Photography 64.9
Portrait 64.9
Nature 60
Plant 59.7
Girl 56.6
Countryside 56.4
Pants 56.3
Funeral 56.2
Musician 55.6
Musical Instrument 55.6
Indoors 55.4

Clarifai
created on 2019-11-18

people 100
adult 99.7
group 98.9
wear 97.6
two 97.6
man 97.6
woman 97.2
furniture 96.6
group together 95
sit 94.2
child 93.6
military 91.1
administration 91
one 90.2
seat 87.8
leader 87.4
war 87.1
portrait 85.8
recreation 85.7
four 84.2

Imagga
created on 2019-11-18

grunge 31.5
old 21.6
antique 19.9
texture 19.4
vintage 19
newspaper 18.8
dirty 18.1
wall 17.9
device 17.4
art 17
classroom 16.4
aged 16.3
grungy 16.1
pattern 15.7
person 15
man 14.8
product 14.5
frame 14.1
black 13.9
retro 13.9
washboard 13.4
room 13.4
paint 12.7
border 12.7
dark 12.5
people 12.3
rough 11.8
danger 11.8
space 11.6
creation 11.4
structure 11.2
design 10.7
male 10.6
world 10.5
urban 10.5
ancient 10.4
building 10.4
textured 9.6
text 9.6
damaged 9.5
graphic 9.5
gun 9.4
architecture 9.4
light 9.4
silhouette 9.1
material 8.9
burnt 8.7
paper 8.6
adult 8.6
travel 8.4
color 8.3
mask 8
decay 7.7
dirt 7.6
old fashioned 7.6
outdoors 7.5
element 7.4
park 7.4
brown 7.4
protection 7.3
rifle 7.2
decoration 7.1
sky 7

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

text 96.8
sitting 96.1
black and white 93.8
person 93.4
indoor 85.3
clothing 79.9
musical instrument 69.1
drawing 67.7
concert 64.8
man 57.8
monochrome 52.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Male, 53.7%
Happy 45%
Sad 46.9%
Fear 45.6%
Angry 45.2%
Surprised 45.1%
Calm 52%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 11-21
Gender Male, 53.5%
Happy 45%
Angry 45.2%
Surprised 48.1%
Disgusted 45%
Fear 47%
Sad 46.3%
Confused 45.1%
Calm 48.2%

AWS Rekognition

Age 21-33
Gender Male, 52.5%
Fear 46.4%
Calm 47%
Disgusted 45.3%
Angry 45.8%
Surprised 45.5%
Confused 45.4%
Sad 48.8%
Happy 45.9%

AWS Rekognition

Age 26-40
Gender Male, 53.6%
Angry 45.1%
Confused 45.1%
Fear 48.1%
Surprised 47.2%
Disgusted 46.3%
Calm 45.7%
Sad 45.1%
Happy 47.3%

Feature analysis

Amazon

Person 98.7%