Human Generated Data

Title

[Ladies seated near monument, Stockholm, 1936]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.100

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 97.5
Human 97.5
Person 96.9
Person 95.4
Monument 92.3
Building 87.7
Architecture 87.7
Person 86.1
Urban 80.8
Art 80
Sculpture 80
Statue 80
City 79.5
Town 79.5
Downtown 79.5
Person 76
Plant 63.8
Tree 63.8
Person 62.9
Pillar 58.4
Column 58.4
Tower 56.7
Steeple 56.5
Spire 56.5
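
The Amazon rows above pair a detected label with a confidence score. The following is a minimal sketch, assuming a boto3 setup, of how label tags of this kind could be requested from AWS Rekognition; the file name, region, and thresholds are illustrative placeholders, not the configuration actually used for this record.

    # Minimal sketch: label detection with AWS Rekognition (boto3).
    # The image path, region, and thresholds below are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feininger_stockholm_1936.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,        # assumption: cap on returned labels
        MinConfidence=50.0,  # assumption: confidence cutoff
    )

    # Each label carries a name and a confidence, matching rows such as
    # "Person 97.5" and "Monument 92.3" above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")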

Clarifai
created on 2019-11-16

people 99.9
one 98.3
adult 97.4
man 95.2
art 94
group 93.2
two 92.4
street 90.9
woman 90.3
no person 86.9
music 86.7
monochrome 85
leader 84.9
administration 84.2
portrait 83.4
wear 83.4
home 81.4
group together 78.3
vehicle 78
sculpture 77.8
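
The Clarifai concepts above carry the same kind of 0-100 confidence scores. Below is a minimal sketch, assuming the Clarifai v2 REST API and a general-purpose tagging model; the API key, model ID, and image URL are placeholders, and the exact model version behind these 2019 tags is not stated in the record.

    # Minimal sketch: concept tagging via the Clarifai v2 REST API.
    # Key, model ID, and image URL are placeholders.
    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"                                # placeholder
    MODEL_ID = "GENERAL_MODEL_ID"                                    # placeholder
    IMAGE_URL = "https://example.org/feininger_stockholm_1936.jpg"   # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={
            "Authorization": f"Key {CLARIFAI_API_KEY}",
            "Content-Type": "application/json",
        },
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts come back with a 0-1 value; scaled by 100 they resemble
    # rows such as "people 99.9" and "adult 97.4" above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")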

Imagga
created on 2019-11-16

architecture 35.1
building 32.4
old 29.3
city 25.8
ancient 22.5
window 19.3
historic 17.4
travel 16.9
house 16.8
street 16.6
landmark 15.3
stone 15.2
wall 14.6
structure 14.5
device 14.3
urban 14
religion 13.4
history 13.4
tourism 12.4
town 12.1
light 11.4
art 11.2
exterior 11.1
office 11
vintage 10.8
lamp 10.7
balcony 10.7
windows 10.6
buildings 10.4
sky 10.2
church 10.2
roof 9.6
home 9.6
construction 9.4
religious 9.4
facade 9
interior 8.8
high 8.7
culture 8.5
historical 8.5
famous 8.4
room 8.4
dark 8.3
column 8.1
man 8.1
shadow 8.1
temple 8
antique 7.8
musical instrument 7.7
holy 7.7
guillotine 7.7
instrument of execution 7.6
monument 7.5
percussion instrument 7.4
tourist 7.2
dirty 7.2
statue 7.2
door 7.2
marble 7.1
modern 7
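
The Imagga list above follows the same tag-plus-confidence pattern. A minimal sketch, assuming the Imagga v2 tagging endpoint with HTTP Basic authentication, is shown below; the credentials and image URL are placeholders rather than the setup used to produce these scores.

    # Minimal sketch: tagging via the Imagga v2 API.
    # Credentials and image URL are placeholders.
    import requests

    IMAGGA_KEY = "YOUR_API_KEY"                                      # placeholder
    IMAGGA_SECRET = "YOUR_API_SECRET"                                # placeholder
    IMAGE_URL = "https://example.org/feininger_stockholm_1936.jpg"   # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence, e.g. "architecture 35.1".
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")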

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

statue 98.1
white 88.4
text 82.7
black and white 81.5
black 77.1
old 66.9
building 64.6
window 53.6
sculpture 53.6
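
The Microsoft tags above come from a computer vision tagging service. Below is a minimal sketch, assuming the Azure Computer Vision "analyze" endpoint with the Tags feature; the resource endpoint, key, API version, and image URL are placeholders, and the service version that produced the 2019 tags is unknown.

    # Minimal sketch: tagging via Azure Computer Vision (analyze, Tags feature).
    # Endpoint, key, API version, and image URL are placeholders.
    import requests

    AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
    IMAGE_URL = "https://example.org/feininger_stockholm_1936.jpg"        # placeholder

    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": AZURE_KEY,
            "Content-Type": "application/json",
        },
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Tags arrive with 0-1 confidences; scaled by 100 they resemble
    # rows such as "statue 98.1" and "white 88.4" above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")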

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-50
Gender Male, 50.3%
Fear 49.6%
Angry 49.8%
Sad 49.8%
Happy 49.5%
Calm 49.6%
Confused 49.6%
Surprised 49.6%
Disgusted 49.5%

AWS Rekognition

Age 17-29
Gender Male, 50.2%
Fear 49.5%
Surprised 49.6%
Confused 49.5%
Sad 49.6%
Disgusted 49.5%
Happy 49.9%
Calm 49.7%
Angry 49.6%

AWS Rekognition

Age 31-47
Gender Female, 50.1%
Angry 49.6%
Fear 45.2%
Calm 45.2%
Sad 49.2%
Disgusted 45.1%
Happy 45.2%
Confused 45.1%
Surprised 45.2%
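
The three blocks above are per-face estimates (age range, gender, emotion scores) from AWS Rekognition. A minimal sketch, assuming boto3, of requesting the same attributes follows; the file name and region are placeholders, and the emotion values are model estimates of facial expression, not facts about the sitters.

    # Minimal sketch: face attribute estimation with AWS Rekognition (boto3).
    # Image path and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feininger_stockholm_1936.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    # One FaceDetails entry per detected face, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")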

Feature analysis

Amazon

Person 97.5%

Categories

Text analysis

Amazon

GUTAE

Google

ATIV OINI GUSTAVO
ATIV
OINI
GUSTAVO
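
The strings above are OCR results; fragments such as "GUTAE" and "GUSTAVO" suggest partial reads of the monument's carved inscription. A minimal sketch, assuming boto3, of the AWS Rekognition text-detection call is shown below; the file name and region are placeholders.

    # Minimal sketch: text detection with AWS Rekognition (boto3).
    # Image path and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feininger_stockholm_1936.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE entries are whole detected lines; WORD entries are their components.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")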