Human Generated Data

Title

[People near the Orpheus Fountain, Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.199.21

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Human 98.6
Person 98.6
Person 97
Person 91.9
Clothing 91.1
Apparel 91.1
Person 90
Person 88.3
Person 84
Person 83.1
Architecture 82.5
Building 82.5
Sculpture 77.7
Art 77.7
Handrail 72.6
Banister 72.6
Person 71.2
People 63.7
Column 63.3
Pillar 63.3
Performer 61.8
Overcoat 60.3
Coat 60.3
Figurine 59.3
Statue 58.2
Officer 56.9
Military Uniform 56.9
Military 56.9
Archaeology 55.3
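
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such output could be reproduced with boto3 follows; the image filename and the 55% confidence floor are assumptions for illustration, not part of the museum record.

```python
# Hypothetical sketch: label/confidence pairs like the Amazon tags above,
# using AWS Rekognition via boto3. The image path and confidence floor
# are assumptions, not values taken from the museum record.
import boto3

def detect_labels(image_path, min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a 0-100 confidence score,
    # matching the "Human 98.6", "Person 98.6", ... lines above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("orpheus_fountain.jpg"):
        print(f"{name} {confidence:.1f}")
```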

Clarifai
created on 2019-11-18

people 99.6
monochrome 97.8
music 97.7
man 96.1
adult 95.4
group 94.9
woman 94.3
one 92.7
group together 92.3
stage 91.5
light 90.4
street 90
wear 88.3
theater 88.1
shadow 87.3
art 87.1
musician 86.7
dancer 86.7
dancing 85.3
administration 85.3

Imagga
created on 2019-11-18

stringed instrument 46.9
piano 45.1
grand piano 43.9
percussion instrument 43.7
musical instrument 42.4
keyboard instrument 34.5
room 21.1
chair 19.6
device 16.7
interior 15.9
people 15.6
indoors 14
person 13.5
table 13.1
harp 12.7
furniture 12.5
home 12
old 11.8
support 11.8
upright 11.7
black 11.4
light 11.3
building 11.3
sexy 11.2
body 11.2
window 11
dark 10.8
marimba 10.3
man 10.1
house 10
adult 9.8
attractive 9.8
one 9.7
sitting 9.4
bowed stringed instrument 9.4
art 9.3
travel 9.1
vintage 9.1
women 8.7
architecture 8.6
erotic 8.5
passion 8.5
fashion 8.3
inside 8.3
human 8.2
alone 8.2
seat 8.2
office 8.1
lifestyle 7.9
hair 7.9
male 7.8
model 7.8
nude 7.8
men 7.7
barbershop 7.7
comfortable 7.6
silhouette 7.4
floor 7.4
lady 7.3
sensual 7.3
sensuality 7.3
music 7.2
religion 7.2
history 7.1
posing 7.1
love 7.1
shop 7
modern 7
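
The Imagga tags above pair an English keyword with a confidence score, which is the shape of the response from Imagga's v2 tagging endpoint. A minimal sketch under that assumption follows, using the requests library; the API key, secret, and image URL are placeholders.

```python
# Hypothetical sketch: tag/confidence pairs like the Imagga list above,
# assuming the v2 /tags REST endpoint. Credentials and image URL are
# placeholders, not values taken from the museum record.
import requests

IMAGGA_API_KEY = "your_api_key"        # placeholder
IMAGGA_API_SECRET = "your_api_secret"  # placeholder

def imagga_tags(image_url):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    # Each entry carries an English tag and a 0-100 confidence score.
    return [
        (t["tag"]["en"], t["confidence"])
        for t in response.json()["result"]["tags"]
    ]

if __name__ == "__main__":
    for tag, confidence in imagga_tags("https://example.com/orpheus_fountain.jpg"):
        print(f"{tag} {confidence:.1f}")
```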

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

statue 97.8
black and white 94.4
person 93.4
clothing 87.9
monochrome 79.2
man 63.3
text 58.1
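
The Microsoft tags above resemble output from the Azure Computer Vision "Analyze Image" REST endpoint. A minimal sketch assuming the v3.2 endpoint follows; the resource endpoint, subscription key, and image file are placeholders, and the 0-1 confidence values are scaled to percentages to match the scores listed above.

```python
# Hypothetical sketch: tags like the Microsoft list above, assuming the
# Azure Computer Vision "Analyze Image" REST endpoint (v3.2 here).
# Endpoint, key, and image file are placeholders.
import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your_subscription_key"                                     # placeholder

def azure_tags(image_path):
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
            timeout=30,
        )
    response.raise_for_status()
    # Confidence comes back on a 0-1 scale; multiply by 100 to match
    # the percentage-style scores listed above.
    return [(t["name"], t["confidence"] * 100) for t in response.json()["tags"]]

if __name__ == "__main__":
    for name, confidence in azure_tags("orpheus_fountain.jpg"):
        print(f"{name} {confidence:.1f}")
```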

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Male, 51.1%
Surprised 45.1%
Calm 50.1%
Fear 46.3%
Disgusted 45.1%
Angry 45.2%
Confused 45.1%
Sad 48.1%
Happy 45%

AWS Rekognition

Age 23-37
Gender Female, 50.4%
Surprised 49.5%
Angry 49.6%
Calm 49.5%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Sad 50.3%
Confused 49.5%

AWS Rekognition

Age 34-50
Gender Male, 50.2%
Confused 49.5%
Surprised 49.5%
Disgusted 49.5%
Fear 49.5%
Calm 49.5%
Angry 49.5%
Sad 50.5%
Happy 49.5%
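
The face entries above (age range, gender, and per-emotion confidence) follow the shape of AWS Rekognition's DetectFaces response. A minimal sketch of how such attributes could be obtained with boto3 follows; the image filename is an assumption, not part of the museum record.

```python
# Hypothetical sketch: face attributes (age range, gender, emotions)
# like the AWS Rekognition entries above, via boto3 DetectFaces.
# The image path is a placeholder, not part of the museum record.
import boto3

def detect_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    detect_faces("orpheus_fountain.jpg")
```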

Feature analysis

Amazon

Person 98.6%

Categories

Imagga

interior objects 97.2%