Human Generated Data

Title

[Sculptures near the Town Hall in Stockholm, 1936]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.145

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Art 98
Sculpture 98
Statue 94.9
Person 94.8
Human 94.8
Person 94.1
Person 90.3
Figurine 89.8
Person 88
Person 86.8
Person 82.7
Person 80
Symbol 78.9
Cross 73.2
Person 71.8
Crucifix 70.8
Person 66.4
Person 56.4

Clarifai
created on 2019-11-16

people 99.9
group 99.3
group together 98.5
adult 97.7
music 97.4
many 97.1
man 96.4
one 95.1
woman 95
two 94.6
wear 91.2
several 88.4
musician 86.8
three 86.1
recreation 84.6
dancing 84.3
administration 83.8
theater 82.6
crowd 80.3
dancer 79.3

Imagga
created on 2019-11-16

statue 30.9
sculpture 29.2
art 20.2
architecture 18.7
old 16.7
city 15
travel 14.8
building 14.7
tourism 14
monument 14
person 13.7
man 13
landmark 12.6
religion 12.5
history 12.5
ancient 12.1
performer 11.7
fountain 11.7
black 11.5
fashion 11.3
stone 11
dark 10.9
dress 10.8
god 10.5
historical 10.3
people 10
dirty 9.9
dancer 9.7
grunge 9.4
light 9.4
wind instrument 9.2
adult 9.1
dance 9.1
musical instrument 8.9
posing 8.9
marble 8.7
scene 8.7
famous 8.4
symbol 8.1
brass 8
women 7.9
horror 7.8
stage 7.8
motion 7.7
figure 7.6
elegance 7.6
boutique 7.5
silhouette 7.4
style 7.4
water 7.3
cornet 7.3
body 7.2
male 7.1
device 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

statue 99.4
text 98.9
sculpture 96.5
art 81.6
black 70.4
person 68.1
white 67.7
museum 67.2
black and white 56.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Male, 50%
Disgusted 49.7%
Happy 49.5%
Fear 49.5%
Confused 49.6%
Surprised 49.5%
Calm 49.8%
Angry 49.7%
Sad 49.6%

AWS Rekognition

Age 24-38
Gender Female, 50.1%
Fear 49.6%
Calm 49.6%
Disgusted 49.5%
Happy 49.5%
Sad 49.6%
Confused 49.5%
Surprised 49.5%
Angry 50.1%

AWS Rekognition

Age 18-30
Gender Male, 50.2%
Calm 49.5%
Angry 50.4%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Fear 49.5%
Confused 49.5%

AWS Rekognition

Age 6-16
Gender Male, 50.2%
Disgusted 49.7%
Sad 49.6%
Confused 49.7%
Happy 49.5%
Fear 49.6%
Surprised 49.5%
Calm 49.5%
Angry 49.9%

Feature analysis

Amazon

Person 94.8%

Categories

Imagga

interior objects 56.2%
paintings art 43.1%

Text analysis

Amazon

WN
V