Human Generated Data

Title

Untitled, 1976

Date

1976, printed 2008

People

Artist: Robert Gober, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.197

Copyright

© Robert Gober

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 96.7
Human 96.7
Person 96.1
Postage Stamp 78.5
Text 67.9

Clarifai
created on 2018-03-23

retro 97.7
vintage 93
old 91.6
paper 91.4
people 87.5
man 83.7
illustration 81.6
no person 81.4
desktop 80.9
bill 79.6
letter 79
post 78.2
art 77.4
military 77.3
antique 75.8
flag 74.7
symbol 74.6
business 72.2
adult 72.2
text 71.4

Imagga
created on 2018-03-23

stamp 49.4
mail 48.8
vintage 44.6
postmark 43.3
envelope 40.4
postage 40.3
postal 38.3
letter 36.7
philately 35.5
retro 34.4
circa 33.5
old 29.3
printed 28.5
card 28.2
message 26.5
aged 26.2
shows 24.6
container 23.6
ancient 23.3
address 22.5
post 20
negative 18.7
post office 17.8
film 17.1
paper 16.5
newspaper 15.4
stamps 14.8
global 14.6
product 14.3
money 13.6
creation 12.7
1967 11.9
philatelic 11.9
currency 11.7
symbol 11.4
grunge 11.1
packet 10.9
cutting 10.6
package 10.3
photographic paper 10.2
paint 10
male 9.9
soviet 9.9
art 9.8
communications 9.6
international 9.5
unique 9.5
people 9.5
frame 9.4
closeup 9.4
man 9.4
collection 9
science 8.9
space 8.5
face 8.5
black 8.4
head 8.4
sport 8.3
competition 8.2
pattern 8.2
design 8.1
bank 8.1
history 8
bag 8
business 7.9
antique 7.8
portrait 7.8
states 7.7
decoration 7.7
hobby 7.6
savings 7.5
banking 7.4
cash 7.3

Google
created on 2018-03-23

white 95.8
text 92.3
black and white 91.7
product 80.3
textile 78
design 73.4
font 70.8
monochrome 68.9
monochrome photography 68.3
material 67.1
pattern 66
brand 55.3
pattern 53.8

Face analysis

AWS Rekognition

Age 19-36
Gender Female, 54.8%
Sad 50.7%
Surprised 45.3%
Disgusted 46.6%
Confused 45.1%
Calm 45.5%
Happy 46%
Angry 45.9%

AWS Rekognition

Age 16-27
Gender Female, 54.2%
Confused 45.1%
Surprised 45.1%
Angry 45.2%
Sad 45.3%
Happy 52.6%
Calm 46.6%
Disgusted 45.1%

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 9
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 96.7%

Captions

Microsoft

a person in a white shirt 48.1%
a black and white photo of a person 48%
a person posing for a photo 47.9%

Text analysis

Amazon

arbus
dianc
O0
9 dianc arbus 009-
:t
E
9
009-
DPEPODOO
omcugetomesorpooo