Human Generated Data

Title

[People on ship and off, Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.200.23

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-11-18

Human 99.2
Person 99.2
Person 97.8
Person 97.3
Person 96.2
Person 95.8
Person 95.2
Clothing 92.9
Apparel 92.9
Person 87.4
Stage 87.4
Person 86.8
Person 86.7
Person 86.2
Crowd 85.6
Person 85
Leisure Activities 81.5
Performer 81.2
Dance Pose 74.5
Person 73.6
Person 72.5
Person 72.5
Person 71.2
Musician 66.5
Musical Instrument 66.5
Skin 64.9
Face 64.6
People 62.9
Hat 60.3
Audience 59.6
Finger 57.4
Guitarist 57.1
Guitar 57.1
Coat 55.3
Overcoat 55.3
Person 49.9
Person 48.6
Person 45.9
Person 44.9
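
The Amazon tags above pair a detected label with a confidence score. As a minimal sketch of how such output is obtained, assuming the tags came from Amazon Rekognition's DetectLabels API called via boto3 (the record names only the vendor and a creation date, and the filename below is a placeholder):

import boto3

# Sketch: reproduce label/confidence pairs like those listed above.
rekognition = boto3.client("rekognition")

with open("brlf-200-23.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # the weakest tag listed above scores 44.9
)

for label in response["Labels"]:
    # e.g. "Person 99.2" at the label level
    print(label["Name"], round(label["Confidence"], 1))
    for instance in label.get("Instances", []):
        # repeated rows such as "Person 97.8" match per-instance confidences
        print(label["Name"], round(instance["Confidence"], 1))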

Clarifai
created on 2019-11-18

people 100
group together 98.8
many 98.5
adult 97.9
group 97.8
man 97.6
audience 96.2
music 95.7
woman 95.5
one 93.2
recreation 92.5
crowd 92.3
musician 92
wear 90.7
administration 87.5
stage 87.5
singer 86.7
actor 86.2
monochrome 84.3
spectator 83.5
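
A comparable sketch for the Clarifai concepts, assuming Clarifai's v2 predict REST endpoint with its general image-recognition model; the model ID, API key, and image URL below are placeholders, since the record names only the vendor and date:

import requests

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in [0, 1]; scale to match the percentages above
    print(concept["name"], round(concept["value"] * 100, 1))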

Imagga
created on 2019-11-18

stage 46.1
sax 39
musical instrument 37.8
wind instrument 32.2
platform 27.8
silhouette 22.4
man 22.2
music 19.9
male 19.9
black 19.8
night 19.5
person 18.2
horn 17.5
people 17.3
device 16.6
accordion 16.1
keyboard instrument 15.9
musician 15.6
dark 15
guitar 14.8
singer 14.6
concert 13.6
performer 13.3
adult 13
light 12.7
stringed instrument 12.5
power 11.8
brass 10.8
performance 10.5
art 10.4
entertainment 10.1
instrumentality 10
sunset 9.9
body 9.6
rock 9.6
cornet 9.5
dance 9.5
men 9.4
club 9.4
sound 9.4
business 9.1
danger 9.1
technology 8.9
group 8.9
lights 8.3
digital 8.1
metal 8.1
disco 7.9
party 7.7
smoke 7.4
style 7.4
protection 7.3
industrial 7.3
harmonica 7.2
life 7.2
shadow 7.2
love 7.1
conceptual 7.1
sky 7
modern 7
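
For the Imagga tags, a minimal sketch assuming Imagga's /v2/tags REST endpoint with basic-auth API credentials; all identifiers below are placeholders:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)

for tag in response.json()["result"]["tags"]:
    # e.g. "stage 46.1", matching the tag/confidence pairs above
    print(tag["tag"]["en"], round(tag["confidence"], 1))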

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

person 99.1
black and white 95.3
text 92.9
clothing 87
black 78
white 66
crowd 61.2
monochrome 60.7
people 57.6
street 55.7
watching 44.1
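
For the Microsoft tags, a sketch assuming Azure Computer Vision's tag operation; the resource endpoint, subscription key, and image URL below are placeholders:

import requests

response = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.org/photo.jpg"},  # placeholder URL
)

for tag in response.json()["tags"]:
    # Azure returns confidences in [0, 1]; scale to match the values above
    print(tag["name"], round(tag["confidence"] * 100, 1))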

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 50.1%
Confused 49.5%
Surprised 49.5%
Disgusted 49.5%
Fear 50.3%
Calm 49.5%
Angry 49.5%
Sad 49.7%
Happy 49.5%

AWS Rekognition

Age 26-42
Gender Male, 50.3%
Disgusted 49.5%
Sad 50.2%
Fear 49.5%
Happy 49.5%
Surprised 49.5%
Confused 49.5%
Calm 49.7%
Angry 49.5%

AWS Rekognition

Age 35-51
Gender Male, 50.4%
Happy 49.5%
Angry 49.6%
Confused 49.5%
Fear 49.8%
Disgusted 49.5%
Surprised 49.5%
Sad 50%
Calm 49.6%

AWS Rekognition

Age 32-48
Gender Female, 50.1%
Confused 49.5%
Calm 49.6%
Sad 49.7%
Happy 49.5%
Surprised 49.6%
Angry 49.6%
Disgusted 49.5%
Fear 49.9%

AWS Rekognition

Age 7-17
Gender Female, 50.1%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Fear 49.5%
Disgusted 49.5%
Calm 50.1%
Sad 49.7%
Happy 49.5%

AWS Rekognition

Age 24-38
Gender Female, 50.3%
Happy 49.5%
Surprised 49.5%
Calm 49.8%
Confused 49.5%
Fear 49.6%
Disgusted 49.5%
Sad 49.9%
Angry 49.6%
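
Each block above reports an estimated age range, a gender guess, and eight emotion scores for one detected face. A minimal sketch, assuming the estimates come from AWS Rekognition's DetectFaces API with full attributes requested (the filename is a placeholder):

import boto3

rekognition = boto3.client("rekognition")

with open("brlf-200-23.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # age range, gender, and emotions require ALL
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]  # e.g. {"Low": 22, "High": 34}
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]  # e.g. {"Value": "Male", "Confidence": 50.1}
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # eight scores, as in the blocks above
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')

That nearly every emotion and gender score above clusters around 49-50% suggests the model had little confidence in these estimates, which is consistent with small, indistinct faces in a crowd scene.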

Feature analysis

Amazon

Person 99.2%

Text analysis

Google

L UN
L
UN
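
The Google text-analysis fragments above ("L UN", "L", "UN") read like OCR output. A minimal sketch, assuming Google Cloud Vision's text detection produced them (the record names only the vendor; the filename is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("brlf-200-23.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; later ones are individual
# fragments, which on a photograph are often stray marks read as letters.
for annotation in response.text_annotations:
    print(annotation.description)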