Human Generated Data

Title

Tramstop

Date

1977

People

Artist: Joseph Beuys, German, 1921–1986

Publisher: Edizioni Lucrezia De Domizio Durini

Classification

Multiples

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Louise Haskell Daly Fund, 1995.362

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Clarifai
created on 2019-11-09

no person 98
people 93.4
one 90.1
monochrome 85.3
art 83.9
winter 83.9
outdoors 82
light 81.6
building 81.5
architecture 80.4
snow 79.2
military 79.1
vehicle 78.9
war 77.4
water 76
calamity 75.6
silhouette 75.2
industry 74.7
travel 74.5
transportation system 73.7

Imagga
created on 2019-11-09

pole 49.4
sky 40.2
rod 37.4
tower 26
industry 25.6
construction 23.1
city 21.6
sunset 19.8
structure 19.7
steel 19.6
crane 19.3
metal 18.5
urban 17.5
architecture 17.2
device 16.7
building 16.7
equipment 16.4
antenna 16.2
high 15.6
industrial 15.4
landscape 14.9
cloud 14.6
column 14.3
tall 14.1
water 14
rifle 12.9
tree 12.6
power 12.6
work 12.6
silhouette 12.4
firearm 11.9
hook 11.6
sun 11.3
travel 11.3
gun 10.7
totem pole 10.5
old 9.8
outdoors 9.7
instrument 9.7
technology 9.6
weapon 9.6
cloudy 9.4
monument 9.3
communication 9.2
tourism 9.1
mechanism 9
river 8.9
wire 8.8
concrete 8.6
build 8.5
site 8.4
energy 8.4
town 8.3
snag 8.3
environment 8.2
mountain 8
scenic 7.9
business 7.9
instrument of execution 7.9
sea 7.8
cable 7.5
commercial 7.5
mechanical device 7.5
ocean 7.5
light 7.3

Google
created on 2019-11-09

Microsoft
created on 2019-11-09

sky 78.3
pole 65.8
text 64.3
weapon 56.8
black and white 54.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-29
Gender Male, 54.2%
Sad 45%
Angry 45%
Happy 45.4%
Fear 45%
Calm 53.9%
Confused 45%
Surprised 45.6%
Disgusted 45%

Feature analysis

Amazon

Utility Pole 84.6%

Categories

Captions

Microsoft
created on 2019-11-09

a view of a street 62.9%
a view of a city street 50.4%
a close up of a street 50.3%

Text analysis

Amazon

prarys