Human Generated Data

Title

Albarello

Date

c. 1580

Classification

Vessels

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1954.146

Machine Generated Data

Tags (confidence scores, 0–100)

Amazon
created on 2022-06-17

Tin 95.6
Can 93.2
Person 80.2
Human 80.2

Imagga
created on 2022-06-17

container 100
cup 94.1
glass 55.2
drink 44.3
tableware 42.3
beverage 34
ice 32.6
alcohol 31.7
cold 28.4
liquid 28.2
water 28
ware 27.6
vessel 26.6
drop 26.3
splash 23.4
bar 23.1
cool 23.1
cocktail 22.1
refreshment 21.8
wet 21.5
transparent 20.6
reflection 20.3
clear 19.2
fresh 18.3
close 17.7
bubble 16.9
can 16.5
object 16.1
vodka 15.9
freshness 15.8
bottle 15.7
currency 15.3
jar 15.2
drops 15.1
cash 14.6
splashing 14.5
cube 14.5
money 14.5
party 13.8
vase 13.8
food 13.6
cubes 13.6
mug 13.1
dollar 13
refresh 12.7
finance 12.7
clean 12.5
wealth 11.7
bucket 11.6
financial 11.6
bin 11.5
milk can 11.4
bubbles 11.4
pure 11.1
paper 11
alcoholic 10.7
pour 10.7
health 10.4
crystal 10.3
china 10.1
closeup 10.1
healthy 10.1
bank 9.9
liquor 9.8
thirst 9.7
aqua 9.5
tea 9.2
tumbler 8.9
soda 8.8
restaurant 8.6
refreshing 8.6
ceramic ware 8.6
life 8.6
porcelain 8.4
savings 8.4
banking 8.3
coffee mug 8.3
ashcan 8.2
single 8.2
goblet 8.1
business 7.9
amber 7.8
nobody 7.8
frozen 7.6
bill 7.6
empty 7.5
purity 7.4
investment 7.3
full 7.3
punch 7.3
black 7.2
shiny 7.1

Google
created on 2022-06-17

Microsoft
created on 2022-06-17

soft drink 94.9
text 90.3
black and white 87.6
cup 72.7
drink 68.2
vase 67.6
bottle 64.3
can 21.4

Feature analysis

Amazon

Person 80.2%

Captions

Microsoft

a can of soda 43.7%
a close up of a can 43.6%
an old photo of a can 43.5%

Text analysis

Amazon

feha-magi
UND

Google

mar
fchide
fchide mar