Human Generated Data

Title

[People near the Orpheus Fountain at the Stockholm Town Hall, 1936]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.135

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Each entry below pairs a machine-generated label with the service's confidence score, expressed as a percentage.

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 99.4
Person 99
Person 98.1
Person 97.5
Person 96.1
Person 95.7
Person 90.2
Person 89.5
Person 89.3
Pedestrian 89.2
Architecture 89
Building 89
Art 88.1
Statue 88.1
Sculpture 88.1
Person 68.9
Apparel 68.2
Clothing 68.2
Banister 66.5
Handrail 66.5
Column 65
Pillar 65
Crowd 64.4
Leisure Activities 61.7
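
Labels of this kind are what Amazon Rekognition's DetectLabels operation returns: label names paired with 0-100 confidence scores, like the entries above. A minimal sketch via boto3 (the file name and MinConfidence threshold are illustrative placeholders, not values from this record):

    import boto3

    # Minimal sketch: request image labels from Amazon Rekognition.
    # The file name and confidence threshold are placeholders.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=60,
        )
    for label in response["Labels"]:
        # Each label has a name and a 0-100 confidence score,
        # matching the "Person 99.4"-style entries above.
        print(label["Name"], round(label["Confidence"], 1))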

Clarifai
created on 2019-11-16

people 99.9
group 98.7
woman 97.3
music 97.1
group together 96.8
man 96.3
adult 96
administration 93
many 89.8
leader 89.3
street 89
shadow 87.8
two 86.1
child 85.9
musician 85.4
room 85.3
silhouette 84.9
one 83
recreation 82.3
wear 80.8
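
Clarifai's concept tags, like those above, come back from its predict endpoint as name/value pairs with values on a 0-1 scale (scaled to percentages here). A hedged sketch against the v2 REST API (the model ID, API key, and image URL are all placeholders, not taken from this record):

    import requests

    # Hedged sketch of a Clarifai v2 predict call; GENERAL_MODEL_ID,
    # the API key, and the image URL are placeholders.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/GENERAL_MODEL_ID/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))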

Imagga
created on 2019-11-16

percussion instrument 65.1
grand piano 63.5
piano 58.3
musical instrument 56.4
stringed instrument 51.6
keyboard instrument 42.1
man 29.6
marimba 28
people 24.5
business 23.7
male 18.4
black 18.1
office 18
person 17.5
businessman 16.8
indoors 16.7
sitting 16.3
sax 14.8
adult 14.5
corporate 13.7
suit 13.5
room 13.5
women 13.4
silhouette 13.2
table 13
men 12.9
indoor 12.8
interior 12.4
classroom 12.1
lifestyle 11.6
building 11.1
smiling 10.8
chair 10.8
light 10.7
executive 10.5
desk 10.4
window 10.3
modern 9.8
job 9.7
group 9.7
couple 9.6
love 9.5
work 9.4
inside 9.2
travel 9.2
alone 9.1
professional 9
worker 9
happy 8.8
together 8.8
happiness 8.6
glass 8.6
laptop 8.4
color 8.3
businesswoman 8.2
music 8.1
computer 8.1
working 8
urban 7.9
airport 7.8
performance 7.7
two 7.6
finance 7.6
relaxation 7.5
hall 7.5
evening 7.5
cheerful 7.3
time 7.3
open 7.2
night 7.1
architecture 7
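
Imagga serves tags like those above from its /v2/tags endpoint, each with an English tag name and a confidence score. A minimal sketch (the API credentials and image URL are placeholders):

    import requests

    # Minimal sketch of Imagga's tagging endpoint; the credentials
    # and image URL are placeholders.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))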

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 97.3
person 94.7
clothing 93
man 92.4
monochrome 84.5
statue 82.5
text 82.1
street 70.7
people 55.5
footwear 51.4
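
Microsoft's tags correspond to the Analyze Image operation of Azure Computer Vision, which returns tag names with 0-1 confidence scores (shown above as percentages). A sketch of such a call, assuming a 2019-era v2.0 endpoint (the region, subscription key, and image URL are placeholders):

    import requests

    # Sketch of an Azure Computer Vision "analyze" request; the
    # region, key, and image URL are placeholders.
    resp = requests.post(
        "https://YOUR_REGION.api.cognitive.microsoft.com/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/photo.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))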

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 40-58
Gender Male, 51.1%
Angry 45.1%
Fear 45%
Calm 53%
Sad 46.8%
Disgusted 45%
Happy 45%
Confused 45%
Surprised 45.1%

AWS Rekognition

Age 26-42
Gender Male, 51.4%
Disgusted 45.2%
Calm 52.4%
Confused 45.3%
Sad 46.1%
Happy 45.1%
Angry 45.2%
Fear 45.5%
Surprised 45.3%

AWS Rekognition

Age 34-50
Gender Female, 51%
Disgusted 45%
Sad 46.9%
Happy 45%
Surprised 45.1%
Calm 52.6%
Fear 45%
Confused 45.1%
Angry 45.3%

AWS Rekognition

Age 25-39
Gender Female, 50.2%
Happy 45.3%
Confused 45.6%
Calm 50.7%
Angry 45.3%
Disgusted 45.5%
Surprised 46.2%
Fear 45.2%
Sad 46.2%

AWS Rekognition

Age 22-34
Gender Male, 50.2%
Angry 49.5%
Calm 50.3%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.7%
Fear 49.5%

AWS Rekognition

Age 39-57
Gender Male, 54.6%
Disgusted 45.1%
Angry 45.1%
Confused 45%
Sad 45.4%
Calm 52.6%
Happy 45.2%
Fear 46.1%
Surprised 45.4%

AWS Rekognition

Age 35-51
Gender Male, 50.1%
Happy 49.5%
Disgusted 49.5%
Calm 50.1%
Sad 49.6%
Surprised 49.5%
Confused 49.5%
Fear 49.5%
Angry 49.6%

AWS Rekognition

Age 42-60
Gender Male, 52.5%
Angry 45.1%
Calm 52.6%
Confused 45%
Disgusted 45%
Surprised 45.1%
Happy 46.5%
Sad 45.6%
Fear 45%
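
The age ranges, gender estimates, and per-emotion confidences in the blocks above follow the FaceDetails structure that Amazon Rekognition's DetectFaces operation returns when all attributes are requested. A minimal sketch via boto3 (the file name is a placeholder):

    import boto3

    # Minimal sketch: request full face attributes from Amazon Rekognition.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 40, "High": 58}
        gender = face["Gender"]     # {"Value": "Male", "Confidence": ...}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # One confidence score per emotion type, as listed above.
            print(emotion["Type"].capitalize(), round(emotion["Confidence"], 1))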

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

interior objects 91.6%
food drinks 5.1%