Human Generated Data

Title

Wine pitcher

Date

1629

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Purchase in memory of Eda K. Loeb, BR61.88

Machine Generated Data

Tags

Amazon
created on 2023-08-30

Archaeology 90.4
Cup 74.6
Stein 74.6
Pottery 72.9
Smoke Pipe 67.9
Architecture 57.8
Pillar 57.8
Emblem 57.7
Symbol 57.7
Totem 55.4
Jar 55.2

Clarifai
created on 2023-11-01

no person 99.3
ancient 97.3
container 97.1
one 95.1
old 93.8
sculpture 92.7
religion 91.7
antique 91.6
still life 90.7
art 90
retro 88.7
wealth 86
metalwork 79.3
isolate 77.8
wear 77.5
vintage 76.3
spirituality 73.9
people 73.8
desktop 72.8
Buddha 71.6

Imagga
created on 2018-12-27

bottle 75.2
vessel 36.4
water bottle 34.9
container 34.8
drink 24.2
glass 21
liquid 20.9
beverage 20.8
column 19
alcohol 16.6
dinner dress 16
clean 15.9
close 15.4
beer bottle 14.7
refreshment 14.5
water 13.3
object 13.2
clear 13.1
cold 12.9
wet 12.5
cap 12.5
gold 12.3
food 12.1
bottled 11.8
bottles 11.7
thirst 11.7
transparent 11.6
refreshing 11.5
pure 11.1
fresh 11.1
freshness 10.8
beer 10.7
metal 10.5
nobody 10.1
tower 9.6
bubbles 9.5
healthy 9.4
natural 9.4
bar 9.2
plastic 9.2
full 9.1
drop 9.1
cool 8.9
art 8.8
closeup 8.8
party 8.6
culture 8.5
old 8.4
health 8.3
dagger 8.3
single 8.2
weapon 8.1
diet 8.1
spa 8.1
steel 8
yellow 7.9
cooking 7.9
drunk 7.8
minaret 7.8
thirsty 7.8
condensation 7.8
color 7.8
ancient 7.8
golden 7.7
herb 7.6
knife 7.6
lager 7.5
oil 7.4
reflection 7.4
brown 7.4
cash 7.3
black 7.3
kitchen 7.2
pop bottle 7.1
life 7

Google
created on 2018-12-27

Microsoft
created on 2018-12-27

indoor 85.9
monochrome 85.9
black and white 74.8
art 57.7
winter 49.8
ancient 31.2
museum 27
ceramic 26.7

Feature analysis

Amazon

Smoke Pipe 67.9%
