Human Generated Data

Title

How to Make a Happening

Date

1964

People

Artist: Allan Kaprow, American, 1927–2006

Classification

Prints

Machine Generated Data

Tags

Amazon

Human 99.7
Person 99.7
Person 99.3
Person 96
Apparel 95.7
Clothing 95.7
Person 87
Military Uniform 78.3
Military 78.3
Person 78
Animal 74.9
Mammal 74.9
Transportation 72.2
Car 72.2
Automobile 72.2
Vehicle 72.2
Hat 68.1
Person 65.3
Machine 63.9
Spoke 63.9
People 62.3
Bull 61.5
Outdoors 60.2
Army 59
Armored 59
Officer 58.2
Face 55.6

Clarifai

people 99.9
many 99.5
group 99.1
group together 99
adult 98.8
military 98.1
vehicle 97.6
war 97.4
soldier 97.1
man 96.2
administration 90.5
wear 90.5
weapon 88.4
transportation system 88
skirmish 87.4
woman 87.3
child 86.8
gun 86.2
uniform 85.8
several 84.9

Imagga

man 23.5
outdoor 19.1
weapon 18.3
danger 18.2
male 17.7
person 17.3
people 17.3
snow 17
horse 16.7
adult 15.9
gun 15.7
outdoors 14.9
sky 14.7
old 14.6
clothing 13.8
rifle 13.7
protection 13.6
military 13.5
soldier 12.7
dirty 12.7
uniform 12.1
industrial 11.8
destruction 11.7
toxic 11.7
nuclear 11.6
mask 11.6
park 11.5
landscape 11.2
radioactive 10.8
radiation 10.8
protective 10.7
chemical 10.6
gas 10.6
travel 10.6
black 10.2
sport 10
stalker 9.9
sunset 9.9
environment 9.9
history 9.8
accident 9.8
child 9.7
vehicle 9.6
musical instrument 9.4
winter 9.4
two 9.3
chemical weapon 9.2
animal 9.1
tank 8.9
camouflage 8.8
disaster 8.8
building 8.7
dangerous 8.6
industry 8.5
walking 8.5
tree 8.5
portrait 8.4
smoke 8.4
military uniform 8.3
firearm 8.3
silhouette 8.3
cowboy 8.3
holding 8.3
tourism 8.2
active 8.1
tracked vehicle 8.1
trees 8
to 8
forest 7.8
summer 7.7
culture 7.7
risk 7.7
weapon of mass destruction 7.6
walk 7.6
safety 7.4
lifestyle 7.2
wheeled vehicle 7.2
saddle 7.2
grass 7.1
love 7.1
wind instrument 7.1
happiness 7.1
architecture 7
autumn 7
season 7

Google

Motor vehicle 97.4
Vehicle 85.6
Photography 62.4
Car 60.6
History 54.1

Microsoft

outdoor 99.9
person 92.3
text 90.6
clothing 87.2
man 78.6
old 74.7
tree 72.6
vehicle 61.7
car 54
several 10.2

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Female, 50%
Sad 50.2%
Fear 49.8%
Calm 49.5%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Surprised 49.5%
Confused 49.5%

Feature analysis

Amazon

Person 99.7%
Car 72.2%

Captions

Microsoft

a group of people in an old photo of a horse 91.1%
a vintage photo of a group of people standing next to a horse 87.9%
a group of people riding on the back of a horse 81.5%

Text analysis

Amazon

a,