Human Generated Data

Title

Untitled

Date

2007

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.285

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Apparel 92.2
Clothing 92.2
Car 91.4
Formula One 91.4
Vehicle 91.4
Automobile 91.4
Transportation 91.4
Spoke 84.7
Machine 84.7
Electronics 79.6
Screen 79.6
Monitor 79.6
Display 79.6
Wheel 72.2
Advertisement 71.2
Helmet 67.3
Tire 66.6
Alloy Wheel 66.6
Poster 66.5
Text 59.8
Crash Helmet 58.6

Clarifai
created on 2018-11-05

no person 96.2
desktop 94.6
picture frame 94
retro 91.1
old 87.4
image 85.8
design 85.4
travel 84.8
screen 83.3
vintage 82.9
isolated 82.7
technology 82.7
business 82.5
margin 82.2
illustration 82.1
art 82.1
antique 81.9
track 81.7
modern 81
collage 80.1

Imagga
created on 2018-11-05

car 79.4
vehicle 49.6
racer 46.5
transportation 35.9
motor vehicle 33.7
speed 30.2
automobile 23.9
transport 23.7
auto 22
race 22
drive 21.8
bobsled 21.4
road 19.9
fast 19.6
equipment 17.7
sled 17.1
sport 16.7
wheeled vehicle 16.6
motor 15.5
travel 15.5
driving 14.5
device 14
blur 13.9
engine 13.5
helmet 13.5
wheel 13.4
traffic 12.3
movement 12.2
light 12
black 12
technology 11.9
color 11.7
cassette tape 11.4
luxury 11.1
safety 11
automotive 10.8
highway 10.6
yellow 10.6
industry 10.2
sports 10.2
street 10.1
power 10.1
snowmobile 10
modern 9.8
motion 9.4
magnetic tape 9.4
glass 9.3
crash helmet 9.2
cab 9.1
driver 8.9
metal 8.9
racing 8.8
cars 8.8
wheels 8.8
gear 8.7
fire 8.4
toy 8.3
tracked vehicle 8
close 8
track 7.7
blurred 7.7
moving 7.6
style 7.4
machine 7.3
business 7.3
industrial 7.3
people 7.3
memory device 7.2
bumper car 7.2
shiny 7.1
steel 7.1
work 7.1

Google
created on 2018-11-05

poster 79.6
font 52

Microsoft
created on 2018-11-05

Color Analysis

Feature analysis

Amazon

Monitor 79.6%

Categories

Imagga

food drinks 74.8%
cars vehicles 22.5%
interior objects 1.7%

Captions

Microsoft
created on 2018-11-05

a close up of a screen 69.2%
a close up of a computer screen 58.1%
a close up of a sign 58%

Text analysis

Amazon

PATROL
MARS PATROL
MARS
/
DQ
DQ te /
te

Google

MARS PATIROL
MARS
PATIROL