Human Generated Data

Title

Detail from "1978-2000"

Date

1978-2000, printed 2003-2005

People

Artist: Robert Gober, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.202.5

Copyright

© Robert Gober

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2019-04-06

Interior Design 95.1
Indoors 95.1
Room 74.7
Home Decor 65.8
Mirror 60
Silhouette 58.3
Nature 57.5
Electronics 57.1
Screen 57.1

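The Amazon tags above, each paired with a confidence score, match the output format of the AWS Rekognition DetectLabels operation. A minimal sketch of such a call with boto3 follows; the image filename, region, and thresholds are illustrative assumptions, not values recorded here.

```python
import boto3

# Hypothetical inputs: region and local image file are assumptions for illustration.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gober_detail.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=20,
        MinConfidence=50.0,
    )

# Each label carries a name and a 0-100 confidence, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```
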
Clarifai
created on 2018-03-23

people 94.8
picture frame 94.7
no person 94
art 93.8
window 93.4
monochrome 91.1
nature 90.5
abstract 89.2
desktop 89
blur 87.4
margin 87.2
winter 87.2
water 86.9
landscape 86.9
snow 86.8
sky 85.8
painting 85.3
street 85.2
light 84.7
empty 84.2

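Clarifai concept tags like those above are returned by Clarifai's v2 predict endpoint, with each concept scored between 0 and 1 (shown here scaled to 0-100). The sketch below assumes a plain REST call with the requests library; the API key, model identifier, and image URL are placeholders, and the endpoint path may differ by account and API version.

```python
import requests

# Placeholders: these values are assumptions, not taken from this record.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "GENERAL_MODEL_ID"  # e.g. Clarifai's general image-recognition model
IMAGE_URL = "https://example.org/gober_detail.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concepts come back with a 0-1 value; scale to 0-100 to compare with the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```
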
Imagga
created on 2018-03-23

car mirror 100
mirror 100
reflector 100
grunge 40
vintage 33.9
antique 32.9
old 32.7
texture 31.3
retro 27
material 26.8
border 26.2
wall 25.7
frame 25
aged 24.4
damaged 23.8
dirty 20.8
rusty 20
ancient 19.9
grungy 19
backdrop 19
black 18.6
design 18.6
paper 18
empty 18
structure 17.4
old fashioned 17.1
pattern 17.1
rough 16.4
weathered 16.2
textured 15.8
art 15.6
decay 15.4
road 15.4
dark 15
car 14.8
space 14.7
spot 14.4
wallpaper 13.8
grime 13.7
faded 13.6
stains 13.6
screen 13
mottled 12.7
messy 12.6
transportation 12.6
rust 12.5
stain 12.5
parchment 12.5
graphic 12.4
television 12.1
ragged 11.7
fracture 11.7
crack 11.6
grain 11.1
gray 10.8
tracery 10.7
crumpled 10.7
aging 10.5
obsolete 10.5
digital 10.5
blank 10.3
inside 10.1
paint 10
broad 9.8
crease 9.8
interior 9.7
succulent 9.7
surface 9.7
driving 9.7
text 9.6
automobile 9.6
dirt 9.5
worn 9.5
drive 9.5
vehicle 9.4
historic 9.2
cement 8.7
layer 8.7
light 8.7
concrete 8.6
nobody 8.6
night 8
decoration 8
urban 7.9
plaster 7.9
scratched 7.8
noise 7.8
abandoned 7.8
scratch 7.8
driver 7.8
travel 7.7
broken 7.7
edge 7.7
stained 7.7
stone 7.6
element 7.4
display 7.4
film 7.2
computer 7.2
person 7.1

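Imagga's tagger, responsible for the list above, pairs each tag with a confidence score. A rough sketch of a request to Imagga's v2 tagging endpoint follows; the API key/secret pair and image URL are placeholders, and the response field names reflect Imagga's documented v2 API rather than anything recorded here.

```python
import requests

# Placeholders: credentials and image URL are assumptions for illustration.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/gober_detail.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)

# Each entry holds a confidence score and a localized tag name.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```
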
Google
created on 2018-03-23

Microsoft
created on 2018-03-23

window 91
image 30.6

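The two Microsoft tags above resemble output from the Azure Computer Vision Analyze Image operation, which reports tag confidences between 0 and 1. A hedged sketch follows; the endpoint region, API version, subscription key, and image URL are placeholders that would need to match the Azure resource actually used.

```python
import requests

# Placeholders: endpoint, key, and image URL are assumptions for illustration.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com/vision/v3.2/analyze"
HEADERS = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}
PARAMS = {"visualFeatures": "Tags"}
BODY = {"url": "https://example.org/gober_detail.jpg"}

response = requests.post(ENDPOINT, headers=HEADERS, params=PARAMS, json=BODY)

# Tag confidences are 0-1; scale to 0-100 to compare with the values above.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```
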
Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 84.9%
Happy 2%
Confused 1.9%
Surprised 2.9%
Angry 3.2%
Calm 44.9%
Sad 43.5%
Disgusted 1.5%

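The age range, gender, and emotion percentages above follow the structure of the AWS Rekognition DetectFaces response when all facial attributes are requested. The boto3 sketch below is illustrative only; the image filename and region are assumptions.

```python
import boto3

# Hypothetical inputs: region and local image file are assumptions for illustration.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("gober_detail.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# Print the same attributes reported above for each detected face.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```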