Human Generated Data

Title

Albarello

Date

c. 1540

Classification

Vessels

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1948.87

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Jar 94.3
Wristwatch 93.9
Pottery 74.2
Vase 67.2
Milk 62.2
Beverage 62.2
Drink 62.2
Stein 59.1
Jug 59.1

Imagga
created on 2022-06-11

saltshaker 100
shaker 100
container 100
jar 38.6
glass 38.3
object 24.9
drink 21.7
cup 20.2
vase 19.8
earthenware 19
bottle 16.9
ceramic ware 16.1
liquid 15.7
beverage 15.5
majolica 15.3
close 14.8
utensil 14.4
old 13.9
vessel 13.8
food 12.7
money 11.9
nobody 11.7
transparent 11.6
alcohol 11.4
antique 11.4
traditional 10.8
closeup 10.8
mug 10.8
beer 10.7
ingredient 10.6
metal 10.5
health 10.4
dollar 10.2
cash 10.1
refreshment 10
currency 9.9
single 9.9
ancient 9.5
empty 9.5
healthy 9.4
culture 9.4
finance 9.3
business 9.1
vintage 9.1
kitchen 8.9
water 8.7
cold 8.6
golden 8.6
pitcher 8.4
pot 8.4
jug 8.3
bar 8.3
china 8.2
wealth 8.1
bank 8.1
decoration 8
clear 7.8
table 7.8
party 7.7
herbal 7.6
craft 7.6
bubbles 7.6
wood 7.5
gold 7.4
natural 7.4
brown 7.4
banking 7.4
full 7.3
detail 7.2
life 7

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

vase 96.9
lid 71.3
plant 66.4
old 62.2
earthenware 61.3
black and white 55.8
pottery 55.2
jar 53.2
ceramic ware 37.7
drawn 37
image 31
can 29.8

Feature analysis

Amazon

Wristwatch 93.9%
Milk 62.2%

Captions

Microsoft

a close up of a glass vase 63.5%
a close up of a can vase on a table 56.5%
a close up of a vase 56.4%

Text analysis

Google

·C··C 3
·
C
··
3