Human Generated Data

Title

[Woman balancing on train rail and group looking on]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.349.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-12

Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.1
Person 99.1
Female 99.1
Woman 99.1
Person 99.1
Terminal 98.4
Person 98.1
Clothing 97.7
Glove 97.7
Railway 97.5
Train Station 97.5
Transportation 97.5
Vehicle 97.5
Train 85.7
Train 84.6
Coat 57.8
Photography 56.9
Face 55.9
Head 55.9
Portrait 55.9
Locomotive 55.2

Clarifai
created on 2023-10-12

movie 99.9
negative 99.9
filmstrip 99.4
cinematography 99.1
people 98.6
exposed 97.1
adult 96.7
bobbin 96.3
man 96.2
slide 94.9
photograph 94.6
monochrome 94.2
woman 93.6
transportation system 92.3
group 92
two 89.7
wear 88.5
many 88.1
outdoors 87.8
three 87.3

Imagga
created on 2019-01-30

bridge 56.3
film 49.8
viaduct 44.8
negative 40.6
device 37.9
pier 37.4
structure 37.2
support 36.6
strip 33
frame 30.8
movie 27.1
camera 25.9
border 24.4
cinema 23.5
slide 23.4
retro 22.9
old 22.3
vintage 21.5
35mm 20.7
black 19.8
filmstrip 19.7
tie 19.5
photograph 19.2
photographic 18.6
roll 18
blank 18
grunge 17
graphic 16.8
art 16.3
brace 16.2
reel 15.8
architecture 15.5
photography 15.2
entertainment 14.7
texture 14.6
damaged 14.3
screen 14
percussion instrument 13.8
digital 13.8
city 13.3
silhouette 13.2
antique 13
design 12.9
empty 12.9
track 12.9
space 12.4
strengthener 12.2
musical instrument 11.9
cinematography 11.8
dirty 10.8
video 10.6
aged 10
tower 9.8
exposure 9.8
frames 9.8
noise 9.8
urban 9.6
cityscape 9.5
banner 9.2
travel 9.1
rough 9.1
sky 8.9
building 8.8
scratch 8.8
gondola 8.7
tape 8.7
rust 8.7
marimba 8.7
edge 8.7
ancient 8.6
equipment 8.6
grungy 8.5
buildings 8.5
boat 8.4
clip 8.4
element 8.3
historic 8.2
style 8.2
business 7.9
text 7.9
scene 7.8
shoot 7.7
construction 7.7
vibraphone 7.7
house 7.6
canvas 7.6
pattern 7.5
tourism 7.4
positive 7.4
backgrounds 7.3
river 7.1

Google
created on 2019-01-30

Microsoft
created on 2019-01-30

outdoor 89.4
person 89.3
people 77.6
white 76.4
black 65.7
black and white 24.3
design 15.5

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 93.2%
Calm 56%
Angry 32.4%
Surprised 7.2%
Fear 6.3%
Confused 4.1%
Sad 3.4%
Happy 0.7%
Disgusted 0.5%

Feature analysis

Amazon

Adult
Male
Man
Person
Female
Woman
Glove
Train
Adult 99.3%

Text analysis

Amazon

8S
A
25008
25008 XTAR
T.
XTAR
(1)
AM
butts

Google

8S
8S