Human Generated Data

Title

Untitled

Date

July 28, 1952

People

Artist: Minor White, American, 1908 - 1976

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan, 3.1994.52

Copyright

© The Trustees of Princeton University

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Machine 96.5
Car 96.5
Automobile 96.5
Vehicle 96.5
Transportation 96.5
Spoke 96
Wheel 95.6
Tire 94.9
Alloy Wheel 92.8
Advertisement 91.8
Car Wheel 89.7
Wheel 89.4
Collage 86.5
Poster 86.5
Sports Car 81.5
Art 72.2
Car 64
Coupe 58.9
Text 55.1
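
The exact call that produced these tags is not documented in this record, but as a rough, hedged illustration, label/score pairs in this form could come from Amazon Rekognition's DetectLabels API. The sketch below is an assumption for illustration only; the file name and confidence threshold are hypothetical and not taken from this record.

# Minimal sketch (assumption): labels with confidence scores, similar to the
# Amazon tag list above, retrieved with Amazon Rekognition's DetectLabels API.
import boto3

def detect_labels(image_path, min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a Name and a Confidence score (0-100),
    # matching the tag/score pairs listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, score in detect_labels("untitled_1952.jpg"):  # hypothetical file name
        print(f"{name} {score:.1f}")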

Clarifai
created on 2023-10-29

no person 99.7
art 95.5
one 94.4
nature 93.9
retro 93.5
monochrome 92.7
water 92.1
people 91.5
outdoors 90.6
graphic design 90
two 88.2
paper 88
street 87.4
wear 86.1
travel 84.3
painting 84.1
artistic 83.5
architecture 83.3
interior design 82.7
summer 82.5

Imagga
created on 2022-01-30

fastener 35.1
locker 29.8
old 27.8
restraint 25.3
paper 24.4
grunge 23
vintage 22.3
tool 21
aged 19.9
device 19.9
texture 18.7
antique 17.3
plane 16.4
pencil sharpener 16
wood 15.8
retro 15.6
brown 15.4
cutting implement 14.9
hand tool 14.9
box 14.6
catch 13.9
sharpener 13.8
empty 13.7
edge tool 13.4
frame 13.3
wall 12.8
rusty 12.4
design 12.4
door 12.4
close 12
dirty 11.7
decoration 11.6
furniture 11.5
worn 11.4
hasp 11.4
art 11.1
border 10.8
blade 10.7
container 10.6
blank 10.3
card 10.2
color 10
cutter 9.9
wooden 9.7
pattern 9.6
ancient 9.5
metal 8.8
detail 8.8
surface 8.8
envelope 8.8
cabinet 8.7
stamp 8.4
black 8.4
paint 8.1
home 8
interior 8
yellow 7.9
business 7.9
textured 7.9
album 7.8
space 7.8
obsolete 7.7
mail 7.7
damaged 7.6
binder 7.6
weathered 7.6
sign 7.5
page 7.4
letter 7.3
object 7.3
artwork 7.3
book 7.2
building 7.2
material 7.1
decor 7.1

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

text 99.5
car 93.5
vehicle 92.6
land vehicle 90.1

Color Analysis

Feature analysis

Amazon

Car 96.5%
Car 64%
Wheel 95.6%
Wheel 89.4%

Categories

Imagga

paintings art 97.8%
interior objects 1.5%

Captions

Microsoft
created on 2022-01-30

text 32.8%

Text analysis

Amazon

the
ve
experience the
When ve know from
from
object
know
experience
61
When
object moves.
moves.
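
The mix of whole phrases and single words above is consistent with an OCR service that returns both line-level and word-level detections. As a hedged illustration only (the actual pipeline is not documented in this record), Amazon Rekognition's DetectText API returns both LINE and WORD items, which would explain why "experience the" and "experience" both appear; the file name below is hypothetical.

# Minimal sketch (assumption): text detections like the Amazon list above,
# retrieved with Amazon Rekognition's DetectText API, which returns both
# LINE and WORD entries with confidence scores.
import boto3

def detect_text(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Each detection has a Type ("LINE" or "WORD"), the detected text,
    # and a confidence score (0-100).
    return [
        (item["Type"], item["DetectedText"], item["Confidence"])
        for item in response["TextDetections"]
    ]

if __name__ == "__main__":
    for kind, text, conf in detect_text("untitled_1952.jpg"):  # hypothetical file name
        print(f"{kind:5s} {conf:5.1f} {text}")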

Google

61 When we know from experience the object moves.
61
When
we
know
from
experience
the
object
moves.