Human Generated Data

Title

from the "Strauss Mappe" portfolio

Date

1972

People

Artist: Joseph Beuys, German, 1921–1986

Publisher: Kunstverein zur Förderung moderner Kunst e.V., Göttingen

Classification

Multiples

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Gift of Charlotte Reber, 1996.152.6

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Human 97.9
Apparel 97.7
Clothing 97.7
Hat 97.7
Person 97.4
Person 97.4
Outdoors 88.7
Military Uniform 77.2
Military 77.2
Worker 72.9
Coat 70.6
Overcoat 68.2
Photography 67.2
Photo 67.2
Face 62.1
Portrait 62.1
Nature 60.3
Army 58.6
Armored 58.6
Weaponry 57.6
Weapon 57.6
Gardening 56.3
Gardener 56.3
Garden 56.3
Soldier 55.5
Bronze 55.1

Clarifai
created on 2019-11-09

people 99.9
adult 99
one 98.5
man 98
war 95.2
military 93.8
two 93.2
wear 92.8
soldier 92
woman 90.8
weapon 89.9
portrait 89.9
gun 85.6
group 84.3
skirmish 83.8
veil 83.4
administration 80.6
uniform 80
mammal 78.8
three 78.1

Imagga
created on 2019-11-09

shovel 48.5
tool 31.6
weapon 28.1
paddle 27.2
hand tool 26.1
man 24.2
male 17.7
person 17.1
warrior 16.6
outdoors 16.4
people 16.2
sword 15.7
protection 15.5
outdoor 15.3
oar 15.2
stick 14.3
adult 14.2
crutch 13.8
military 13.5
sport 13.2
fire iron 12.8
cleaner 12.2
clothing 12.1
men 12
instrument 11.8
staff 11.7
war 11.5
black 11.4
helmet 11.4
old 11.1
portrait 11
bow and arrow 10.6
fashion 10.5
blade 10.4
cleaning implement 10
active 9.9
history 9.8
soldier 9.8
conflict 9.8
clothes 9.4
dark 9.2
attractive 9.1
fight 8.7
swab 8.3
danger 8.2
metal 8
holiday 7.9
two 7.6
power 7.6
style 7.4
safety 7.4
dress 7.2
recreation 7.2
activity 7.2
summer 7.1
work 7.1

Google
created on 2019-11-09

Microsoft
created on 2019-11-09

outdoor 99.9
clothing 97.4
person 97.3
text 95.6
man 75.1
black and white 74.8
human face 70.1

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 68.8%
Surprised 0.9%
Angry 14.1%
Sad 69.5%
Happy 1.7%
Disgusted 2.3%
Fear 1.3%
Confused 1.4%
Calm 8.7%

AWS Rekognition

Age 23-35
Gender Male, 96.3%
Disgusted 2.3%
Confused 11%
Calm 24.1%
Angry 48.8%
Sad 7.2%
Happy 0.5%
Fear 3.4%
Surprised 2.6%

Feature analysis

Amazon

Hat 97.7%
Person 97.4%

Captions

Microsoft

a black and white photo of a person 79.7%
a person holding a baseball bat 30.5%
an old photo of a person 30.4%