Human Generated Data

Title

Untitled (pattern)

Date

1952

People

Artist: Brett Weston, American 1911 - 1993

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Robert M. Sedgwick II Fund, 2.2002.1329

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Plant 93.2
Art 88.1
Mammal 86.9
Animal 86.9
Pet 86.9
Cat 86.9
Tree 74.4
Apparel 71.3
Clothing 71.3
Wall 67.8
Graphics 64.8
Outdoors 62.4
Crowd 61.7
Modern Art 61.5
Crystal 59.2
Drawing 58.2
Floral Design 58
Pattern 58
Suit 57.5
Coat 57.5
Overcoat 57.5

Imagga
created on 2022-01-08

zebra 100
equine 79.6
ungulate 37.7
graffito 34
decoration 27.6
black 27
safari 21.2
pattern 19.8
wildlife 19.6
wild 18.3
mammal 16
portrait 15.5
game 15.2
horse 14.2
stripes 12.5
motion 12
design 11.9
hair 11.9
face 11.4
color 11.1
head 10.9
fantasy 10.8
art 10.5
detail 10.5
texture 10.4
newspaper 10.1
model 10.1
light 10
close 9.7
shape 9.7
sexy 9.6
fur 9.6
looking 9.6
animals 9.3
park 9.1
negative 9
lines 9
futuristic 9
style 8.9
zoo 8.7
male 8.5
striped 8.4
dark 8.4
fractal 8.3
effect 8.2
creation 8.2
eye 8
graphic 8
zebras 7.9
grass 7.9
cute 7.9
curve 7.9
product 7.8
artistic 7.8
south 7.5
element 7.4
backdrop 7.4
generated 7.4
flow 7.4
digital 7.3
soft 7.2
body 7.2
film 7.2
look 7

Google
created on 2022-01-08

Nature 90.1
Black 89.5
Organism 86.1
Rectangle 85.4
Art 83.5
Black-and-white 83
Line 81.9
Adaptation 79.3
Feather 79.3
Pattern 76.7
Grass 76.3
Eyelash 75.6
Monochrome photography 75.5
Font 75.4
Beauty 75
Painting 74.8
Monochrome 73.3
Visual arts 69.3
Design 68.4
Plant 67.1

Microsoft
created on 2022-01-08

text 95.2
sketch 95
drawing 91.3
black and white 64.5
painting 64.1
art 53.1

Feature analysis

Amazon

Cat 86.9%

Captions

Microsoft

a group of people that are standing in the water 28.9%