Human Generated Data

Title

Fragment with Figures Between Urns

Date

-

People

-

Classification

Textile Arts

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Nanette B. Rodney, 1985.120

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Rug 98.9
Pattern 82.8
Embroidery 69.6
Art 59.6
Ornament 59.6
Tapestry 59.6
Applique 56.4

Clarifai
created on 2019-07-06

vintage 98.4
old 98.2
paper 97.9
retro 97.6
antique 97.3
art 97.1
ancient 96.3
map 96.2
symbol 92.8
desktop 92.6
snap 92.5
collection 91.3
letter 90.4
texture 90.2
wear 90
print 87.9
parchment 87.7
manuscript 87.6
design 87.3
card 86.9

Imagga
created on 2019-07-06

lace 64.1
old 31.4
texture 29.9
floor cover 27.5
paper 26.7
vintage 25.7
pattern 25.3
prayer rug 24
grunge 23
retro 23
rug 22.1
doormat 21.8
antique 20.8
covering 20.3
brown 19.9
mat 19.2
art 18.8
design 18.1
textured 16.7
burlap 16.6
ancient 16.5
surface 15.9
color 15.6
decoration 15.4
wallpaper 15.3
close 14.9
detail 14.5
dirty 14.5
furnishing 14.4
frame 14.2
letter 13.8
rough 13.7
aged 13.6
parchment 13.5
obsolete 13.4
material 13.4
canvas 13.3
fabric 13.2
text 13.1
border 12.7
worn 12.4
decorative 11.7
decor 11.5
weathered 11.4
grungy 11.4
sheet 11.3
paint 10.9
flower 10.8
backdrop 10.7
empty 10.3
symbol 10.1
blank 9.4
floral 9.4
document 9.3
collection 9
backgrounds 8.9
closeup 8.8
stained 8.7
used 8.7
yellow 8.6
colorful 8.6
card 8.5
finance 8.5
tile 8.4
note 8.3
cash 8.2
style 8.2
currency 8.1
mosaic 8
stamp 7.9
postage 7.9
travel 7.8
handmade 7.8
torn 7.7
spotted 7.7
cardboard 7.7
money 7.7
blanket 7.6
set 7.6
post 7.6
rusty 7.6
textile 7.6
page 7.4
book 7.3
history 7.2
cotton 7.2

Google
created on 2019-07-06

Textile 72.9
Tapestry 58.4
Art 58.1
Beige 54.8

Microsoft
created on 2019-07-06

gold 55
motif 54.4
handwriting 54.4

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 80.9%
Sad 2%
Angry 3.2%
Calm 78.1%
Surprised 5.8%
Disgusted 4.3%
Confused 5%
Happy 1.6%

Feature analysis

Amazon

Rug 98.9%

Categories

Imagga

food drinks 72.3%
paintings art 23.3%
nature landscape 4.1%

Captions

Microsoft
created on 2019-07-06

a pair of jeans 39.5%

Azure OpenAI

created on 2024-02-06

The image shows an ancient textile fragment, possibly from the medieval period. It is a rectangular piece with frayed edges and appears to be made of a brownish-gold fabric with darker brown designs woven into it. The motif includes symmetrical decorative elements and what appear to be two human figures, possibly warriors or knights, as they seem to be holding weapons and riding animals resembling horses. Between the figures are two upright vases, and in the upper corners there are circular emblems containing figures, along with bird-like designs. The textile is damaged, with visible holes and signs of wear, indicating its age. The background on which the fragment lies is light grey, contrasting with the textile's colors.

Anthropic Claude

created on 2024-03-29

The image appears to be a fragment of a textile or fabric, likely an ancient or historical artifact. The fabric is predominantly yellow or gold in color, with intricate brown or dark brown designs and patterns woven or embroidered into it. The designs feature various abstract shapes, figures, and symbols that appear to have a tribal or indigenous aesthetic. The fabric seems to be quite aged and worn, with some damage or fraying visible along the edges. Overall, the image suggests a cultural or artistic textile from a historical or archaeological context.

Text analysis

Google

465
465