Human Generated Data

Title

[Onboard the S.S. Pennsylvania, passengers viewed from above]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.158.7

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Human 99.8
Person 99.8
Person 98.8
Person 96.6
Person 91.7
Person 90.7
Person 90.2
Crowd 86.8
Person 84.8
Audience 84.8
Person 84.2
Person 81
People 73.5
Person 70.4
Indoors 65.8
Room 65.8
Person 61.4
Clothing 60.3
Apparel 60.3
Chair 59.5
Furniture 59.5
Musical Instrument 56.9
Musician 56.9
Silhouette 55.7
Person 52.2

Clarifai
created on 2021-04-04

people 98.9
monochrome 98.6
dark 97.4
woman 95.4
street 95.2
light 93.9
music 93.8
audience 93.8
concert 93.3
art 93.2
man 92.6
girl 92
crowd 91
portrait 90.7
adult 89.6
city 88.8
festival 87.4
child 87.4
stage 87
subway system 86

Imagga
created on 2021-04-04

light 24.7
night 18.6
black 18.6
stage 17.7
dark 17.5
art 14.3
smoke 13.9
man 13.4
old 12.5
metal 12.1
sky 11.5
person 11
music 10.8
silhouette 10.8
wall 10.7
male 10.6
theater curtain 10.5
hot 10
religion 9.9
people 9.5
garage 9.4
fire 9.4
energy 9.2
disk jockey 9.1
design 9
color 8.9
platform 8.8
construction 8.6
modern 8.4
curtain 8.3
vintage 8.3
broadcaster 8.3
pattern 8.2
industrial 8.2
digital 8.1
work 8
bright 7.9
architecture 7.8
structure 7.8
equipment 7.7
texture 7.6
lamp 7.5
technology 7.4
heat 7.4
percussion instrument 7.4
graphic 7.3
protection 7.3
danger 7.3
musical instrument 7.2
science 7.1
interior 7.1
working 7.1

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

black and white 85.7
concert 81.9
person 78.8
text 78.3
man 54.9
crowd 1.3

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Female, 55.3%
Sad 52.1%
Calm 20.2%
Fear 8.3%
Angry 7%
Surprised 4.4%
Confused 2.7%
Disgusted 2.7%
Happy 2.5%

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people in a dark room 71.9%
a group of people standing in a dark room 61.2%
a group of people that are standing in the dark 42.6%