Human Generated Data

Title

Nutritum; Faience Jar

Date

1764

People
Classification

Vessels

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Mr. and Mrs. Edward M. Pflueger, BR69.232

Machine Generated Data

Tags

Amazon
created on 2022-06-18

Pottery 99.9
Art 99.9
Porcelain 99.9
Jar 94.4
Urn 83.2
Cake 73
Birthday Cake 73
Food 73
Dessert 73
Person 62.3
Human 62.3
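The tag lists above pair a label with a confidence score (0–100) on each line. As an illustration only (the parsing function and threshold below are assumptions, not part of the museum's pipeline), one might load such lines and keep only high-confidence labels like this:

```python
def parse_tags(lines):
    """Parse lines of the form 'Label 99.9' into a dict of label -> confidence.

    The label may contain spaces (e.g. 'Birthday Cake 73'), so the confidence
    is taken from the final whitespace-separated token.
    """
    tags = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags[label] = float(score)
    return tags


def high_confidence(tags, threshold=90.0):
    """Return labels whose confidence meets the (assumed) threshold."""
    return [label for label, conf in tags.items() if conf >= threshold]


if __name__ == "__main__":
    # Sample lines taken from the Amazon tag list above.
    sample = ["Pottery 99.9", "Birthday Cake 73", "Person 62.3"]
    tags = parse_tags(sample)
    print(high_confidence(tags))
```

Filtering at a threshold like 90 would keep Pottery, Art, Porcelain, and Jar from the Amazon list while dropping the lower-confidence food and person tags.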

Imagga
created on 2022-06-18

container 76.3
china 72.2
porcelain 60
jar 52.9
ceramic ware 52.2
saltshaker 49.1
glass 41.4
shaker 40.2
vase 38.7
utensil 38
vessel 29.7
bottle 29
drink 28.4
liquid 22.9
food 20.3
health 19.5
cup 18.6
beverage 18.1
healthy 15.7
fresh 15.7
close 15.4
alcohol 14.9
natural 14.7
water 14.7
object 14.7
mug 14.4
earthenware 13.7
clean 13.4
cold 12.9
empty 12
refreshment 11.8
transparent 11.6
bar 11.1
full 11
soap dispenser 10.6
luxury 10.3
closeup 10.1
drop 10
pot 10
gold 9.9
diet 9.7
medicine 9.7
ingredient 9.7
color 9.5
dispenser 9.4
bubble 9.4
freshness 9.1
traditional 9.1
milk 9.1
cool 8.9
jug 8.9
thirsty 8.8
thirst 8.8
beer 8.7
clear 8.7
foam 8.7
bubbles 8.5
tea 8.5
meal 8.1
majolica 7.9
glassware 7.8
aromatherapy 7.7
therapy 7.5
hot 7.5
cream 7.5
single 7.4
reflection 7.3
spa 7.2
celebration 7.2
kitchen 7.2
medical 7.1

Google
created on 2022-06-18

Microsoft
created on 2022-06-18

vase 98.7
indoor 98.1
tableware 97
plant 93
ceramic 88.2
jug 85.7
ceramic ware 84.7
mug 84.3
pottery 83.2
earthenware 82.2
teapot 81.8
kettle 79.9
pitcher 79.2
white 79
lid 78.9
black and white 72
bottle 71.6
cup 68.2
coffee cup 66.1
still life photography 64.5
still life 64.1
serveware 62.5
saucer 50.9
porcelain 39.4

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 98.1%
Sad 99.8%
Fear 10.9%
Surprised 8.1%
Happy 5%
Calm 5%
Angry 2.1%
Confused 2%
Disgusted 1.4%

AWS Rekognition

Age 28-38
Gender Male, 72.4%
Surprised 57.5%
Angry 19.5%
Fear 17.6%
Happy 8.5%
Calm 8.4%
Disgusted 5.2%
Sad 3.4%
Confused 1.8%

Feature analysis

Amazon

Person 62.3%

Captions

Microsoft

a vase sitting on a table 61.3%
a vase sitting on top of a table 59%
a vase sitting on a counter 53.8%

Text analysis

Amazon

NUTRITUM

Google

NUTRITUN