Human Generated Data

Title

[Onboard the S.S. Pennsylvania, passengers viewed from above]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.158.9

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Human 98.6
Person 98.6
Person 97.2
Person 92.9
Person 90.7
Person 90.1
Person 83.5
Nature 78.2
Outdoors 73.6
Clothing 60.5
Apparel 60.5
People 60.4
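
The label/confidence pairs above are the kind of output Amazon Rekognition's label detection returns. A minimal sketch of how such tags are typically retrieved with boto3 follows; the file name, MaxLabels, and MinConfidence values are illustrative assumptions and not taken from this record, which only lists the resulting scores.

```python
# Minimal sketch: retrieving label/confidence pairs like those above
# with Amazon Rekognition via boto3. The image path and thresholds
# are assumptions, not values recorded in this catalog entry.
import boto3


def detect_labels(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # cap on the number of labels returned
        MinConfidence=60.0,  # drop labels scored below 60%
    )
    # Each label carries a name and a confidence score in percent,
    # e.g. ("Person", 98.6).
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]


if __name__ == "__main__":
    for name, confidence in detect_labels("image.jpg"):
        print(f"{name} {confidence:.1f}")
```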

Clarifai
created on 2021-04-04

monochrome 99
shadow 98.4
dark 98.3
people 97.9
art 97.6
church 97.5
silhouette 97.4
light 97.4
street 97.2
cemetery 96.9
architecture 95.2
abstract 95.1
city 94.8
grave 93.8
old 93.8
girl 93.1
black and white 92.6
Halloween 92.3
desktop 91.9
scary 91.1

Imagga
created on 2021-04-04

cell 33
dark 25
light 22.1
architecture 19.6
night 16.9
building 16.6
silhouette 16.5
old 16
black 15.3
city 15
sky 14.7
device 13.7
support 13.3
upright 11.6
wall 11.5
piano 11.1
mystery 10.6
urban 10.5
landscape 10.4
structure 10.2
musical instrument 10.1
step 9.9
art 9.8
interior 9.7
window 9.6
evening 9.3
percussion instrument 9.2
keyboard instrument 9.2
negative 9.2
sun 9
stringed instrument 9
sunset 9
color 8.9
stone 8.9
room 8.3
man 8.1
metal 8
tunnel 8
home 8
film 8
construction 7.7
grunge 7.7
texture 7.6
skyline 7.6
person 7.4
inside 7.4
design 7.3
fantasy 7.2
shadow 7.2
colorful 7.2
travel 7
sea 7

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

text 84.4
white 83.5
black and white 75.8
black 72.6
dark 54.9
monochrome 50.6

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 63.9%
Fear 65.1%
Sad 14%
Angry 6%
Calm 5.4%
Surprised 4%
Happy 3%
Confused 1.3%
Disgusted 1.1%

AWS Rekognition

Age 45-63
Gender Female, 56.2%
Sad 50.3%
Confused 16.5%
Calm 12.3%
Fear 8.1%
Happy 4.7%
Disgusted 4.1%
Angry 2.7%
Surprised 1.2%

AWS Rekognition

Age 32-48
Gender Female, 54.6%
Calm 58.2%
Fear 18.7%
Surprised 8.6%
Sad 5.8%
Angry 3.6%
Disgusted 2.3%
Happy 2.1%
Confused 0.7%

AWS Rekognition

Age 29-45
Gender Female, 87.1%
Happy 26.5%
Fear 25%
Angry 20.1%
Sad 12.9%
Calm 12.4%
Surprised 1.4%
Confused 0.9%
Disgusted 0.8%
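
The age range, gender, and emotion scores above follow the structure of Amazon Rekognition's face detection output. A minimal sketch of how such results are usually obtained is shown below; the use of boto3 and a local image file are assumptions, since this record only reproduces the scores themselves.

```python
# Minimal sketch: face analysis of the kind shown above via Amazon
# Rekognition's DetectFaces API. The boto3 call and the local file
# path are assumptions; the record does not state how it was run.
import boto3


def detect_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]       # e.g. {"Low": 30, "High": 46}
        gender = face["Gender"]      # e.g. {"Value": "Female", "Confidence": 63.9}
        emotions = sorted(face["Emotions"],  # e.g. [{"Type": "FEAR", "Confidence": 65.1}, ...]
                          key=lambda e: e["Confidence"], reverse=True)
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in emotions:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")


if __name__ == "__main__":
    detect_faces("image.jpg")
```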

Feature analysis

Amazon

Person 98.6%

Captions