Human Generated Data

Title

Bonnefanten

Date

1975

People

Artist: Joseph Beuys, German, 1921–1986

Publisher: Edition Staeck, Heidelberg

Classification

Prints

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Louise Haskell Daly Fund, 1995.644

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 98.5
Human 98.5
Person 98.4
Dog 95.6
Mammal 95.6
Canine 95.6
Animal 95.6
Pet 95.6
Wheel 95.3
Machine 95.3
Clothing 95.1
Apparel 95.1
Person 92.9
Wheel 85.6
Transportation 79.6
Vehicle 79.6
Bicycle 79.6
Bike 79.6
Wheel 71.6
Advertisement 63.9
Poster 63.9
Hat 60.8
Coat 58.1
Overcoat 58.1

Imagga
created on 2022-01-15

barrow 36.6
handcart 29.6
shovel 28
wheeled vehicle 23.6
people 19
man 18.1
vehicle 17.7
person 16.6
adult 14.2
hand tool 14
urban 14
graffito 13.6
fashion 13.6
dirty 13.5
tool 12.6
portrait 12.3
old 11.1
conveyance 11.1
building 11.1
street 11
city 10.8
decoration 10.2
danger 10
dress 9.9
male 9.9
clothing 9.6
cold 9.5
men 9.4
wall 9.4
black 9.2
industrial 9.1
style 8.9
women 8.7
pedestrian 8.7
smile 7.8
industry 7.7
youth 7.7
happy 7.5
human 7.5
clothes 7.5
crutch 7.5
smoke 7.4
light 7.3
protection 7.3
sexy 7.2
weapon 7.2
snow 7.1
posing 7.1
working 7.1

Google
created on 2022-01-15

Wheel 94.7
White 92.2
Tire 91.9
Human 88.9
Motor vehicle 88.4
Font 83.5
Luggage and bags 77.8
Snapshot 74.3
Street fashion 72.8
Advertising 71.1
Bag 70.2
Street 69.9
History 65.6
Vintage clothing 65.2
Baby carriage 64.4
Monochrome 64.2
Baby Products 64.1
Sidewalk 63.9
Art 63.4
Beard 60.7

Microsoft
created on 2022-01-15

building 99.4
outdoor 98.9
clothing 98.3
text 95.5
poster 94
person 93.9
man 81.5
old 60.5
book 56.9
cart 31.8

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Calm 96.2%
Angry 2.6%
Happy 0.7%
Confused 0.2%
Surprised 0.1%
Sad 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Sad 82.3%
Calm 16.8%
Confused 0.3%
Angry 0.3%
Surprised 0.1%
Happy 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 35-43
Gender Male, 84.4%
Sad 57.2%
Calm 38.5%
Happy 1.7%
Confused 1%
Angry 0.5%
Disgusted 0.4%
Fear 0.4%
Surprised 0.3%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 57
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Dog 95.6%
Wheel 95.3%
Bicycle 79.6%

Captions

Microsoft

a group of people riding on the back of a horse drawn carriage 73%
a group of people standing in front of a building 72.9%
a group of men riding on the back of a horse drawn carriage 67.1%

Text analysis

Amazon

BONNEFANTEN

Google

BONNEFANTEN
BONNEFANTEN