Human Generated Data

Title

Untitled, from the portfolio "Nature Experiences in Africa"

Date

1980

People

Artist: Joseph Beuys, German, 1921 - 1986

Artist: Charles Wilp, German, 1932 - 2005

Publisher: Qumran Verlag, Frankfurt

Classification

Multiples

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Louise Haskell Daly Fund, 1995.431.6

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Face 99.9
Human 99.9
Person 99.4
Head 98.8
Skin 96.2
Shelf 79.9
Man 71.2
Finger 69.1
Monk 68.7
Portrait 68.1
Photography 68.1
Photo 68.1
Monitor 59
Screen 59
Electronics 59
Display 59
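
The Amazon tags above pair a label with a confidence score. A minimal sketch of how such labels can be requested from AWS Rekognition with boto3 follows; the region, bucket name, object key, and thresholds are placeholders, not the values used for this record.

import boto3

# Placeholder region, bucket, and key; substitute your own image location.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1995-431-6.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score on a 0-100 scale,
    # which is the figure shown in the tag list above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')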

Imagga
created on 2022-02-25

man 51.1
television 48.5
portrait 44
male 41.9
senior 41.2
person 39
adult 36.9
telecommunication system 36.1
elderly 34.5
mature 34.4
face 33.4
old 32.8
grandfather 30.5
handsome 26.7
hair 26.2
mug shot 25.6
people 25.1
looking 24
black 23.7
photograph 21.8
expression 19.6
happy 19.4
one 19.4
retirement 19.2
serious 19.1
age 19.1
aged 19
retired 17.5
smiling 17.4
confident 17.3
men 17.2
head 16.8
glasses 16.7
older 16.5
eyes 16.4
businessman 15.9
smile 15.7
representation 15.5
human 15
professional 14.4
attractive 14
close 13.1
pensioner 12.9
elder 12.8
casual 12.7
gray 12.6
suit 12.6
guy 12.4
studio 12.2
business 12.1
wrinkle 11.8
model 11.7
aging 11.5
executive 11.1
work 11
grandmother 10.8
60s 10.7
boss 10.5
muscular 10.5
creation 10.2
computer 10.1
emotion 10.1
friendly 10.1
hand 9.9
wrinkled 9.8
look 9.6
laptop 9.6
closeup 9.4
happiness 9.4
lifestyle 9.4
dark 9.2
bald 8.9
success 8.9
bust 8.8
home 8.8
formal 8.6
tie 8.5
skin 8.5
liquid crystal display 8.4
modern 8.4
manager 8.4
horizontal 8.4
alone 8.2
worker 8
job 8
citizen 7.9
nose 7.9
gray hair 7.9
health 7.6
beard 7.6
fitness 7.2
sculpture 7.1
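
The Imagga tags are likewise label-confidence pairs. A minimal sketch of a call to Imagga's v2 tagging endpoint with the requests library is below; the API key, secret, and image URL are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/1995-431-6.jpg"  # placeholder image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Each entry holds a confidence score and a tag keyed by language code.
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')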

Google
created on 2022-02-25

Nose 98.4
Cheek 97.9
Jaw 88.1
Ear 80.6
Rectangle 75.5
Wrinkle 75.4
Throat 66.4
Gesture 66.3
Symmetry 64.4
Stock photography 62.4
Flesh 61.5
Event 61.2
Eyelash 59.8
Visual arts 57.1
Portrait photography 56.5
Flash photography 56.3
Art 54.2
Display device 54
Chest 53.2
Facial hair 50.9
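
The Google tags correspond to label detection in the Cloud Vision API, where scores are returned on a 0-1 scale and shown above as percentages. A minimal sketch with a recent (2.x) google-cloud-vision client library, using a placeholder image URL:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/1995-431-6.jpg"  # placeholder

response = client.label_detection(image=image)
for label in response.label_annotations:
    # score is 0-1; multiply by 100 to match the listing above.
    print(f"{label.description} {label.score * 100:.1f}")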

Microsoft
created on 2022-02-25

monitor 99.5
television 99.4
wall 99.3
screen 99
man 98.8
human face 97.6
indoor 97
person 96.4
text 94.6
face 87.9
portrait 79.2
electronics 74.1
screenshot 60.4
display 54.3
picture frame 50.5
flat 29.9
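
The Microsoft tags can be reproduced in outline with the Azure Computer Vision SDK's tagging call. A minimal sketch is below; the endpoint, subscription key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

result = client.tag_image("https://example.org/1995-431-6.jpg")
for tag in result.tags:
    # Confidence is 0-1; the listing above shows percentages.
    print(f"{tag.name} {tag.confidence * 100:.1f}")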

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 100%
Sad 57.1%
Confused 14.6%
Surprised 8.7%
Fear 7.5%
Calm 6%
Angry 3.4%
Disgusted 2.1%
Happy 0.6%
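
The AWS Rekognition face estimates (age range, gender, and ranked emotion scores) come from the face-detection call rather than label detection. A minimal sketch with boto3, again with placeholder bucket and key:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1995-431-6.jpg"}},
    Attributes=["ALL"],  # request age, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "SAD") with 0-100 confidences.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')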

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
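
Google Vision reports face attributes as likelihood levels rather than percentages. A minimal sketch of the corresponding face-detection call, assuming a recent (2.x) client library and a placeholder image URL:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/1995-431-6.jpg"  # placeholder

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)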

Feature analysis

Amazon

Person 99.4%
Monitor 59%

Captions

Microsoft

a screen shot of Joseph Beuys 89.4%
a screen shot of Joseph Beuys in front of a television 62.6%
Joseph Beuys and a flat screen television 54.1%
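
The Microsoft captions pair a generated sentence with a confidence score. A minimal sketch of caption generation with the Azure Computer Vision SDK's describe call is below; the endpoint, key, and image URL are placeholders, and the sketch does not reproduce the named-person wording above, which depends on the service's own recognition models.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

description = client.describe_image(
    "https://example.org/1995-431-6.jpg", max_candidates=3
)
for caption in description.captions:
    # Each candidate caption has text and a 0-1 confidence.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")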