Human Generated Data

Title

7000 Oaks

Date

1982

People

Artist: Joseph Beuys, German, 1921–1986

Classification

Prints

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Louise Haskell Daly Fund, 1995.654

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 99.7
Person 99.7
Person 99.5
Person 99.3
Person 98.6
Person 98.3
Person 98.2
Person 98
Person 97
Outdoors 91
Person 87.5
Soil 83.9
Tool 76.3
Apparel 75.7
Shoe 75.7
Footwear 75.7
Clothing 75.7
Person 75.4
Field 65.1
Crowd 64.8
Musical Instrument 63.2
Musician 63.2
Garden 61.8
Person 57.2
Hoe 56.9
Leisure Activities 55.9

Clarifai
created on 2018-04-19

people 99.4
group 94.3
adult 93.7
administration 93.5
man 91.6
religion 91.3
many 87.9
military 86.5
war 85.3
home 83.9
road 83.5
battle 82.1
leader 81.9
offense 81.4
woman 79.8
city 79.5
wear 79
ceremony 78.4
vehicle 77.6
calamity 77.1

Imagga
created on 2018-04-19

weapon 29.3
man 24.9
people 19
outdoor 18.3
outdoors 18
male 17.7
military 16.4
walking 16.1
tool 15.6
travel 15.5
sport 15
sword 14.9
person 13.8
men 13.7
uniform 13.6
shovel 13.6
summer 13.5
war 13.5
active 12.6
park 12.3
boy 12.2
old 11.8
soldier 11.7
conflict 11.7
adult 11.7
engineer 11.1
tourist 11.1
rifle 10.8
history 10.7
army 10.7
mountain 10.7
hiking 10.6
building 10.4
clothing 10.2
protection 10
pedestrian 9.8
backpack 9.8
warrior 9.8
rake 9.7
gun 9.4
danger 9.1
private 8.7
day 8.6
city 8.3
tourism 8.2
countryside 8.2
suit 8.1
activity 8.1
group 8.1
trombone 8
armor 8
brass 8
forest 7.8
outside 7.7
sky 7.7
two 7.6
stone 7.6
adventure 7.6
palace 7.5
leisure 7.5
landscape 7.4
vacation 7.4
child 7.2
track 7.2
sunset 7.2
spring 7.1
hand tool 7

Google
created on 2018-04-19

tree 65

Microsoft
created on 2018-04-19

outdoor 97.4
person 96.3
group 82.4
people 81.2
standing 77.6
crowd 0.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 35-52
Gender Male, 54.3%
Happy 45.4%
Disgusted 45.4%
Calm 45.9%
Surprised 45.2%
Angry 45.4%
Confused 45.2%
Sad 52.6%

AWS Rekognition

Age 35-52
Gender Male, 52.4%
Happy 45.4%
Confused 45.3%
Calm 45.3%
Sad 46.3%
Surprised 45.2%
Angry 51.9%
Disgusted 45.7%

AWS Rekognition

Age 26-43
Gender Male, 54.6%
Sad 50.5%
Calm 47.6%
Confused 45.2%
Angry 45.8%
Happy 45.3%
Surprised 45.3%
Disgusted 45.4%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Confused 49.5%
Calm 49.7%
Angry 49.6%
Happy 49.7%
Sad 49.8%
Disgusted 49.7%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Calm 49.6%
Happy 49.6%
Disgusted 49.9%
Sad 49.7%
Surprised 49.5%
Angry 49.6%
Confused 49.6%

AWS Rekognition

Age 38-59
Gender Female, 50.1%
Surprised 49.5%
Sad 49.6%
Confused 49.5%
Calm 49.8%
Disgusted 50%
Angry 49.6%
Happy 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.1%
Happy 49.5%
Surprised 49.5%
Angry 49.9%
Confused 49.5%
Disgusted 49.5%
Sad 49.7%
Calm 49.7%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Disgusted 49.5%
Calm 49.5%
Confused 49.5%
Angry 49.6%
Sad 50.3%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.3%
Happy 49.9%
Confused 49.5%
Calm 49.6%
Disgusted 49.5%
Angry 49.6%
Sad 49.8%
Surprised 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.2%
Angry 49.5%
Happy 50.1%
Surprised 49.5%
Sad 49.8%
Disgusted 49.5%
Confused 49.5%
Calm 49.5%

AWS Rekognition

Age 12-22
Gender Male, 53%
Sad 50.7%
Angry 45.5%
Calm 48%
Surprised 45.3%
Disgusted 45.1%
Happy 45.3%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Confused 49.5%
Surprised 49.5%
Angry 49.6%
Calm 49.8%
Happy 49.7%
Disgusted 49.7%
Sad 49.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 75.7%