Human Generated Data

Title

Untitled ("New York, New Haven, and Hartford")

Date

c. 1970

People

Artist: Michael Mathers, American (born 1945)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1819

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 93.2
Apparel 93.2
Text 88
Human 86.8
Person 81
People 61.3
Train 59.7
Transportation 59.7
Vehicle 59.7
Pants 57.5
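
Label tags paired with confidence scores, as listed above, match the shape of AWS Rekognition's detect_labels output. A minimal sketch of how such tags can be generated, assuming boto3 credentials are configured and "photo.jpg" is a hypothetical local copy of the image:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50% or higher
    )

# Each label carries a name and a confidence score, e.g. "Clothing 93.2".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```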

Imagga
created on 2022-01-22

old 29.2
texture 29.2
container 28.9
mailbox 27
vintage 26.5
grunge 26.4
box 22.3
antique 20.8
retro 19.7
lock 19.4
paper 18.8
device 18.4
aged 18.1
design 18
frame 17.5
rusty 17.1
padlock 16.3
bag 16
damaged 15.3
pattern 15
ancient 14.7
rough 14.6
dirty 14.5
material 14.3
empty 13.7
metal 13.7
old fashioned 13.3
structure 13
art 13
wall 12.8
border 12.7
briefcase 12.6
decay 12.5
parchment 12.5
spot 12.5
faded 11.7
close 11.4
weathered 11.4
textured 11.4
water 11.3
security 11
grime 10.7
ragged 10.7
backdrop 10.7
fracture 10.7
stains 10.7
crumpled 10.7
surface 10.6
grungy 10.4
fastener 10.2
tracery 9.7
business 9.7
crack 9.7
detail 9.6
rust 9.6
black 9.6
iron 9.5
graphic 9.5
space 9.3
grain 9.2
wallpaper 9.2
historic 9.2
mailbag 9.1
silver 8.8
broad 8.8
crease 8.8
mottled 8.8
succulent 8.7
restraint 8.2
drop 8.2
paint 8.1
backgrounds 8.1
closeup 8.1
symbol 8.1
light 8
steel 8
decoration 7.9
scratched 7.8
travel 7.7
money 7.6
worn 7.6
textures 7.6
equipment 7.5
style 7.4
metallic 7.4
brown 7.4
gray 7.2
open 7.2
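
Tags in this style can be requested from Imagga's v2 tagging endpoint. A minimal sketch, assuming the requests library; the API credentials and image URL are hypothetical placeholders:

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("api_key", "api_secret"),  # placeholder credentials
)
response.raise_for_status()

# Each entry pairs an English tag with a confidence score, e.g. "old 29.2".
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```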

Google
created on 2022-01-22

(no tags returned)

Microsoft
created on 2022-01-22

text 99.9
drawing 94.6
black and white 90
handwriting 86.6
poster 83.3
cartoon 69.8
street 59.3
monochrome 59.3
old 48.2

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 96.5%
Calm 98.2%
Angry 0.5%
Sad 0.4%
Surprised 0.3%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
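
The age range, gender, and emotion percentages above match the shape of Rekognition's detect_faces output when all facial attributes are requested. A minimal sketch, assuming the same hypothetical "photo.jpg":

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotion types come back uppercase (e.g. "CALM"), each with a confidence
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```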

Feature analysis

Amazon

Person 81%
Train 59.7%

Captions

Microsoft

a close up of a sign 74.5%
graffiti on a wall 42.9%
an old photo of a person 40.8%
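
Ranked captions of this form resemble the output of Azure Computer Vision's describe operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are hypothetical placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("<your-key>"),  # placeholder
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidence is returned in the 0-1 range, e.g. 0.745 for 74.5%.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```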

Text analysis

Amazon

0976
L.D.
GENER 14 ELECTRIC L.D.
LOCANOTIVE
75.333
ELECTRIC
ANTRICAN LOCANOTIVE LTD,
GENER 14
vellarged
LTD,
ANTRICAN
adidas
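
The repeated fragments above (for example "GENER 14 ELECTRIC L.D." alongside "GENER 14", "ELECTRIC", and "L.D.") are characteristic of Rekognition's detect_text output, which returns whole lines as well as their individual words, misreads included. A minimal sketch, assuming the same hypothetical "photo.jpg":

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back as whole LINEs and individual WORDs, which is why
# the same fragments appear more than once in the list above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```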

Google

0976 JONENEETANY 75533
0976
JONENEETANY
75533
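
Google Cloud Vision's text detection returns the full detected string first, followed by each word on its own, which matches the pattern above ("0976 JONENEETANY 75533" followed by its pieces). A minimal sketch, assuming the google-cloud-vision library and default application credentials:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses default application credentials

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0].description is the whole block; the rest are single words.
for annotation in response.text_annotations:
    print(annotation.description)
```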