Human Generated Data

Title

Band with Human and Animal Figures

Date

People

Classification

Textile Arts

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Charles Bain Hoyt, 1931.40

Machine Generated Data

Tags (label followed by confidence, in percent)

Amazon
created on 2022-01-28

Rug 98.1
Tapestry 68.4
Ornament 68.4
Art 68.4
Applique 62.9
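
The Amazon figures above pair each label with a confidence score in percent. Below is a minimal sketch of how comparable labels could be generated, assuming the tags came from Amazon Rekognition's DetectLabels operation called through boto3; the record only names the vendor, and the image path and confidence cutoff here are placeholders.

import boto3

# Placeholder path; the record above does not include the source image.
IMAGE_PATH = "band_with_human_and_animal_figures.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed cutoff, not stated in the record
    )

# Rekognition reports confidences on a 0-100 scale, the same format
# as the "Rug 98.1" style list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')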

Imagga
created on 2022-01-28

wallet 100
container 100
case 92.4
texture 33.4
bag 32.3
old 30
paper 28.2
pencil box 27.9
vintage 27.3
pattern 26.7
box 25.9
retro 24.6
grunge 23.8
purse 20.6
material 19.7
textured 19.3
color 18.9
design 18.6
art 16.9
decoration 16.7
antique 16.4
wallpaper 16.1
decor 15.9
close 15.4
aged 15.4
decorative 15
surface 15
frame 15
brown 14
ancient 13.8
colorful 13.6
card 13.6
border 13.6
backdrop 11.5
canvas 11.4
blank 11.1
document 11.1
wall 11.1
rough 10.9
cotton 10.8
worn 10.5
detail 10.5
textile 10.4
empty 10.3
symbol 10.1
damaged 9.5
floral 9.4
fabric 9.2
collection 9
style 8.9
ornament 8.6
paint 8.2
board 8.1
business 7.9
handmade 7.8
travel 7.8
cloth 7.7
rusty 7.6
fashion 7.5
creativity 7.4
greeting 7.4
page 7.4
cover 7.4
letter 7.3
artwork 7.3
ornate 7.3
graphic 7.3
yellow 7.3
doormat 7.2
dirty 7.2
creative 7.1
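
The Imagga list follows the same pattern of label plus confidence in percent. A rough sketch of how such tags could be requested, assuming Imagga's v2 tagging REST endpoint; the credentials and image path below are placeholders, not values taken from this record.

import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_PATH = "band_with_human_and_animal_figures.jpg"  # placeholder

with open(IMAGE_PATH, "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence,
# matching the "wallet 100" style list above.
for entry in resp.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))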

Google
created on 2022-01-28

Brown 98
Rectangle 89.9
Textile 87.8
Beige 83.3
Art 81.6
Wood 77.8
Creative arts 76.3
Pattern 74.9
Font 72.8
Linens 68.3
Fashion accessory 66.7
Visual arts 66.5
Metal 64.4
Magenta 64.4
Motif 62.8
Circle 56.3
Home accessories 55.8
Rug 55.5
Needlework 53.7
Peach 52.6
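
Google's labels are shown here as percentages, although the Cloud Vision API itself reports scores between 0 and 1. A minimal sketch, assuming the labels came from Cloud Vision label detection via the google-cloud-vision client; the image path is a placeholder.

from google.cloud import vision

IMAGE_PATH = "band_with_human_and_animal_figures.jpg"  # placeholder

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores come back in [0, 1]; scaling by 100 gives the "Brown 98" style above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")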

Microsoft
created on 2022-01-28

text 86.8
furniture 75.2
rug 57.5
embroidery 54.5
fabric 10.1
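
For the Microsoft tags, a comparable sketch, assuming the Azure Computer Vision SDK's image-tagging call; the endpoint, key, and image path are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_azure_key"                                             # placeholder
IMAGE_PATH = "band_with_human_and_animal_figures.jpg"              # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open(IMAGE_PATH, "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidence is returned in [0, 1]; scaled to percent to match "text 86.8" above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")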

Feature analysis

Amazon

Rug 98.1%

Captions

Microsoft

a close up of a rug 61.5%
close up of a rug 53.5%
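
The captions read like output from an automatic image-description service. A small sketch of how such captions could be produced, assuming Azure Computer Vision's describe operation (the record itself only names Microsoft); the endpoint, key, and image path are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_azure_key"                                             # placeholder
IMAGE_PATH = "band_with_human_and_animal_figures.jpg"              # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open(IMAGE_PATH, "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate pairs a caption with a [0, 1] confidence, which matches
# "a close up of a rug 61.5%" above when scaled to percent.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")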