Human Generated Data

Title

Untitled, from the portfolio "Nature Experiences in Africa"

Date

1980

People

Artist: Joseph Beuys, German 1921 - 1986

Artist: Charles Wilp, German 1932 - 2005

Publisher: Qumran Verlag, Frankfurt

Classification

Multiples

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Louise Haskell Daly Fund, 1995.431.1

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Head 100
Human 98.5
Person 98.5
Face 91.3
Art 73.3
LCD Screen 66.3
Electronics 66.3
Display 66.3
Screen 66.3
Monitor 66.3
Sculpture 59.1
Indoors 55.5
Interior Design 55.5
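
The Amazon tag list above pairs each label with a confidence score. As a minimal sketch, the same data can be held as a mapping and filtered by a cutoff; the labels and scores below are copied from this record, while the threshold value (55.0) and the function name are assumptions for illustration.

```python
# Labels and confidence scores copied from the Amazon section above.
amazon_tags = {
    "Head": 100.0, "Human": 98.5, "Person": 98.5, "Face": 91.3,
    "Art": 73.3, "LCD Screen": 66.3, "Electronics": 66.3,
    "Display": 66.3, "Screen": 66.3, "Monitor": 66.3,
    "Sculpture": 59.1, "Indoors": 55.5, "Interior Design": 55.5,
}

def top_tags(tags, min_confidence=55.0):
    """Return labels at or above the confidence cutoff, highest first."""
    kept = [(label, score) for label, score in tags.items()
            if score >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(top_tags(amazon_tags)[0])  # highest-confidence label: ('Head', 100.0)
```
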

Imagga
created on 2022-02-25

television 65.8
senior 33.7
man 33.6
telecommunication system 30.1
portrait 29.1
elderly 27.8
male 27.7
person 26.6
monitor 26.5
old 26.5
adult 25.3
looking 24
laptop 23.6
computer 23.2
mature 23.2
screen 23.2
happy 21.3
broadcasting 21.2
people 20.6
background 19.5
office 19.5
smile 19.2
technology 18.5
retirement 18.2
business 17.6
display 17.5
retired 17.4
liquid crystal display 17.3
eyes 17.2
smiling 16.6
work 16.5
gray 16.2
sitting 15.5
telecommunication 15.1
businessman 15
lady 14.6
hair 14.3
face 13.5
modern 13.3
electronic equipment 13.2
equipment 13
home 12
expression 11.9
grandmother 11.7
serious 11.4
age 11.4
device 11
casual 11
handsome 10.7
working 10.6
one 10.5
desk 10.4
executive 10.3
glasses 10.2
lifestyle 10.1
medium 10
aged 10
hand 9.9
elder 9.9
attractive 9.8
human 9.7
look 9.6
black 9.6
professional 9.5
men 9.4
couple 8.7
aging 8.6
happiness 8.6
space 8.5
billboard 8.4
health 8.3
alone 8.2
citizen 7.9
maturity 7.9
electronic device 7.8
pensioner 7.8
hands 7.8
corporate 7.7
thinking 7.6
head 7.6
suit 7.2
job 7.1
indoors 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

wall 99.7
television 99
human face 99
monitor 98.5
man 98.2
screen 97
indoor 94.7
person 91.7
text 79.4
glasses 76.2
screenshot 56.3
picture frame 44.3
flat 25.3

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Sad 46.8%
Confused 38.8%
Disgusted 11.7%
Calm 1.2%
Angry 0.5%
Surprised 0.5%
Fear 0.4%
Happy 0.2%
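
Face-analysis summaries usually report one dominant emotion; the per-emotion percentages above can be reduced to that single label by taking the highest-scoring entry. The scores below are copied from the AWS Rekognition section; the dictionary layout is an assumption for illustration.

```python
# Emotion confidences (in %) copied from the AWS Rekognition results above.
emotions = {
    "Sad": 46.8, "Confused": 38.8, "Disgusted": 11.7, "Calm": 1.2,
    "Angry": 0.5, "Surprised": 0.5, "Fear": 0.4, "Happy": 0.2,
}

# The key with the highest confidence is the dominant emotion.
dominant = max(emotions, key=emotions.get)
print(dominant)  # -> Sad
```
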

Microsoft Cognitive Services

Age 35
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a screen shot of a man 87.5%
a man standing in front of a television screen 52.8%
a man with a large screen television 52.2%