Human Generated Data

Title

Untitled (front view of wrecked early model car)

Date

1930-1940

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9094

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Machine 96.7
Wheel 96.7
Transportation 93.2
Car 93.2
Automobile 93.2
Vehicle 93.2
Brick 86.1
Drawing 83.3
Art 83.3
Spoke 79.2
Sketch 71.2
Leisure Activities 67.9
Apparel 62.9
Clothing 62.9
Outdoors 61.5
Alloy Wheel 61.2
Tire 60.1
Musician 59.4
Guitar 59.4
Musical Instrument 59.4
Performer 59.4
Guitarist 59.4
Nature 57.9
Face 57.4

Imagga
created on 2022-01-23

pay-phone 26.5
equipment 24.6
telephone 23.9
electronic equipment 18.9
old 18.1
shopping cart 17.4
building 16
gas pump 15.6
window 14.1
wheeled vehicle 13.9
handcart 13.7
pump 13.5
business 13.3
architecture 13.3
transportation 12.5
device 11.3
travel 11.3
city 10.8
container 10.8
modern 10.5
urban 10.5
wheel 10.5
technology 10.4
black 10.2
transport 10
mechanical device 9.7
metal 9.6
street 9.2
antique 9
digital 8.9
chair 8.8
light 8.7
cold 8.6
wall 8.5
mechanism 8.4
house 8.3
speed 8.2
cup 8.2
graffito 8.1
life 7.8
people 7.8
bike 7.8
scene 7.8
industry 7.7
winter 7.7
finance 7.6
newspaper 7.5
drawing 7.4
support 7.3
computer 7.2
work 7.2
snow 7.1
tire 7.1
steel 7.1
parking meter 7

Microsoft
created on 2022-01-23

text 99.2
wheel 97.4
land vehicle 94.9
tire 94.5
vehicle 93.3
sketch 79.2
black and white 77.9
auto part 77
drawing 75.7
car 59.1
old 58.4

Feature analysis

Amazon

Wheel 96.7%
Car 93.2%

Captions

Microsoft

a vintage photo of a person 48.7%

Text analysis

Amazon

R
THAN
-603
R GARANE
GARANE
567 -603
567
I
DELL
835
are
MD
AZOPEA
Arrooor

Google

R
R GARE
GARE