Human Generated Data

Title

Untitled (machine parts displayed on white ground)

Date

c. 1950

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1853

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Text 71
Rug 63.4
Food 59.6
Meal 59.6

Imagga
created on 2021-12-14

blackboard 66.6
grunge 28.1
black 20.5
design 19.7
texture 18.8
vintage 17.4
art 16.9
tracing 16
board 15.7
chalk 15.6
business 14.6
frame 14.4
pattern 14.4
old 13.9
graphic 13.1
retro 13.1
decoration 13
wallpaper 13
silhouette 12.4
drawing 11.3
finance 11
aged 10.9
symbol 10.8
idea 10.7
digital 10.5
education 10.4
color 10
element 9.9
backdrop 9.9
modern 9.8
chalkboard 9.8
antique 9.5
grungy 9.5
space 9.3
dark 9.2
data 9.1
school 9.1
puzzle 9.1
card 9.1
dirty 9
sign 9
blade 9
technology 8.9
style 8.9
success 8.9
creative 8.8
text 8.7
artistic 8.7
floral 8.5
learn 8.5
game 8.3
shape 8.2
paint 8.1
lesson 7.8
wall 7.8
classroom 7.8
blank 7.7
stain 7.7
money 7.7
communication 7.6
study 7.5
light 7.4
connection 7.3
computer 7.2
celebration 7.2
material 7.1
science 7.1
clock 7.1

Google
created on 2021-12-14

Font 82.6
Rectangle 80.9
Blackboard 76.3
Art 74.3
Pattern 71.6
Circle 70.4
Graphics 64.5
Event 63.8
Wood 62.5
Visual arts 61.4
Brand 60.1
Graphic design 53.9
Room 53.3
Illustration 53.1
Metal 50.2

Microsoft
created on 2021-12-14

text 99.7
drawing 68.7
black and white 66.4
art 52.6

Feature analysis

Amazon

Rug 63.4%

Captions

Microsoft

a close up of a mans face 49%
a close up of a logo 48.9%
a screenshot of a cell phone 43.8%

Text analysis

Amazon

ЧАНОЧ