Human Generated Data

Title

7000 Oaks

Date

1982

People

Artist: Joseph Beuys, German, 1921-1986

Classification

Prints

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Louise Haskell Daly Fund, 1995.655

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.6
Person 99.6
Person 99.6
Person 99.3
Person 98.8
Person 98.7
Person 98.3
Person 98.1
Person 97.7
Clothing 96.6
Shoe 96.6
Footwear 96.6
Apparel 96.6
Outdoors 96.3
Musician 96.3
Musical Instrument 96.3
Garden 78.9
Person 78.2
Soil 77.3
Leisure Activities 76.2
Music Band 74.6
Field 69.5
Gardening 58.2
Tool 55
Shoe 50.4

Imagga
created on 2022-01-15

man 29.6
outdoors 26.2
trombone 24.7
weapon 24.3
outdoor 23.7
tool 22.2
male 22
people 21.7
brass 20.8
summer 19.9
sport 18.3
wind instrument 16.6
grass 16.6
shovel 16.1
rake 16
adult 15.6
active 15.5
leisure 14.1
person 14
boy 13.9
happy 13.8
travel 13.4
outside 12.8
field 12.5
activity 12.5
hiking 12.5
child 12.5
walking 12.3
senior 12.2
sky 12.1
men 12
landscape 11.9
lifestyle 11.6
musical instrument 11.1
work 11
family 10.7
couple 10.4
sword 10.3
play 10.3
spring 10.2
countryside 10
hat 10
park 9.9
mountain 9.8
autumn 9.7
walk 9.5
happiness 9.4
two 9.3
garden 9.2
playing 9.1
suit 9
fun 9
farm 8.9
trees 8.9
worker 8.9
working 8.8
trek 8.8
warrior 8.8
swing 8.8
day 8.6
golf 8.6
adventure 8.5
old 8.4
joy 8.3
sports 8.3
group 8.1
romance 8
game 8
country 7.9
love 7.9
hiker 7.9
forest 7.8
backpack 7.8
hike 7.8
retired 7.8
war 7.7
instrument 7.6
hobby 7.6
club 7.5
tourist 7.5
hand tool 7.4
tree 7.4
vacation 7.4
protection 7.3
smiling 7.2
recreation 7.2
farmer 7.1
women 7.1
kid 7.1
to 7.1
agriculture 7
together 7

Google
created on 2022-01-15

Footwear 98.1
Trousers 96.5
Plant 86.5
Building 84.6
Garden tool 82.8
People in nature 82.6
Window 80.3
Adaptation 79.3
Grass 77.8
Pole 75
Event 74.4
Sky 71.1
Arbor day 70.5
Formal wear 70.1
Soil 67
Shovel 66.6
Suit 65.6
Tree 64.2
Stock photography 63.9
Hoe 62

Microsoft
created on 2022-01-15

outdoor 93.2
person 93
clothing 88
man 84.3
people 83.3
group 77.6
standing 77
musical instrument 76
crowd 0.8

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 98%
Confused 1.1%
Sad 0.4%
Angry 0.2%
Happy 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 38-46
Gender Male, 100%
Calm 96.4%
Happy 1.5%
Confused 0.5%
Sad 0.4%
Angry 0.4%
Disgusted 0.3%
Surprised 0.3%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Male, 100%
Calm 80.2%
Angry 11.7%
Confused 4.2%
Sad 1.2%
Disgusted 1.2%
Fear 0.8%
Surprised 0.4%
Happy 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Happy 57.5%
Calm 21.7%
Sad 7.8%
Disgusted 4.4%
Angry 3.1%
Surprised 2.4%
Confused 1.7%
Fear 1.3%

AWS Rekognition

Age 24-34
Gender Female, 96.5%
Calm 26.6%
Angry 16.8%
Fear 16.8%
Happy 16%
Sad 10.4%
Surprised 5.9%
Confused 4.4%
Disgusted 3.1%

AWS Rekognition

Age 22-30
Gender Male, 94.5%
Calm 80.7%
Disgusted 5.6%
Sad 4.7%
Confused 3.4%
Happy 3%
Angry 1%
Fear 0.9%
Surprised 0.8%

AWS Rekognition

Age 24-34
Gender Female, 99.2%
Disgusted 96.1%
Sad 1.3%
Angry 0.8%
Calm 0.5%
Confused 0.5%
Fear 0.3%
Happy 0.2%
Surprised 0.2%

AWS Rekognition

Age 22-30
Gender Male, 85.9%
Sad 66.4%
Calm 19%
Confused 4.5%
Angry 3.7%
Disgusted 3.5%
Happy 1%
Fear 0.9%
Surprised 0.9%

AWS Rekognition

Age 30-40
Gender Female, 97.9%
Disgusted 78.6%
Calm 10.1%
Sad 3.9%
Angry 3.3%
Fear 1.6%
Surprised 1.1%
Confused 0.9%
Happy 0.5%

AWS Rekognition

Age 20-28
Gender Male, 61.1%
Sad 99.8%
Fear 0%
Angry 0%
Calm 0%
Happy 0%
Disgusted 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 16-24
Gender Female, 89.6%
Calm 54.7%
Fear 16.5%
Sad 12.9%
Angry 6%
Surprised 4.2%
Disgusted 2.3%
Happy 2.3%
Confused 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 96.6%

Captions

Microsoft

a group of people standing in front of a building 96.9%
a group of people standing next to a building 96.6%
a group of people standing outside of a building 96.5%