Human Generated Data

Title

[Julia Feininger aboard ocean liner]

Date

June 1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.161.10

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Clarifai
created on 2021-04-04

people 99.3
one 97.2
adult 96.3
man 94.8
wear 94.7
art 92.5
two 87.9
portrait 85.9
music 85
monochrome 83
veil 80.6
woman 80.3
military 80.1
retro 78.7
actor 78.7
old 78.1
group 75.2
administration 73.8
musician 71.5
lid 71.4

Imagga
created on 2021-04-04

negative 63.8
film 52.2
photographic paper 38.8
photographic equipment 25.9
bowed stringed instrument 23.3
stringed instrument 20.1
old 19.5
architecture 18.9
violin 18.3
art 15.6
musical instrument 14.6
building 13.7
vintage 13.2
ancient 12.1
black 12
history 11.6
person 10.9
color 10.6
decoration 10.5
man 10.1
city 10
tourism 9.9
statue 9.8
grunge 9.4
clock 9.3
travel 9.1
retro 9
pattern 8.9
cello 8.8
stone 8.7
sculpture 8.7
culture 8.5
design 8.4
people 8.4
town 8.3
sky 8.3
work 7.8
face 7.8
male 7.8
scene 7.8
adult 7.8
snow 7.7
wall 7.7
famous 7.4
historic 7.3
time 7.3
business 7.3
religion 7.2
tower 7.2
modern 7

Google
created on 2021-04-04

Jaw 88
Hat 78.1
Monochrome 71.9
Monochrome photography 71.8
Elbow 71.3
Sleeve 71.1
Vintage clothing 67
Tie 66.7
Sitting 63.3
Stock photography 63.1
Room 60.7
Art 60.6
Font 58.3
Machine 56.5
Knee 53.8
Wrist 51.4
Paper product 50.2

Microsoft
created on 2021-04-04

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Female, 50%
Calm 53.9%
Happy 27.8%
Sad 15%
Fear 1.1%
Confused 0.8%
Angry 0.8%
Disgusted 0.4%
Surprised 0.3%

Feature analysis

Amazon

Person 91.5%

Captions

Microsoft
created on 2021-04-04

a person holding a guitar 25.7%