Human Generated Data

Title

Red-figure Bell Krater: Youth Crowned by Nike

Date

c. 400-380 BCE

People

Artist: The Tarporley Painter

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of David M. Robinson, 1960.359

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Pottery 99
Jar 97.7
Art 97.7
Porcelain 97.7
Vase 97.4
Human 91.6
Person 91.6
Person 88.4
Potted Plant 84.3
Plant 84.3
Beer 79.6
Beverage 79.6
Alcohol 79.6
Drink 79.6
Urn 76
Planter 67
Person 65.9
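
The Amazon labels above pair each tag with a confidence score on a 0-100 scale. A minimal sketch of filtering such output by a confidence threshold (label/score pairs copied from the listing above; the threshold value of 90 is an illustrative choice, not part of the source data):

```python
# Label/confidence pairs as reported in the Amazon tag listing above.
labels = [
    ("Pottery", 99.0), ("Jar", 97.7), ("Art", 97.7), ("Porcelain", 97.7),
    ("Vase", 97.4), ("Human", 91.6), ("Person", 91.6), ("Person", 88.4),
    ("Potted Plant", 84.3), ("Plant", 84.3), ("Beer", 79.6),
    ("Beverage", 79.6), ("Alcohol", 79.6), ("Drink", 79.6),
    ("Urn", 76.0), ("Planter", 67.0), ("Person", 65.9),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
# → ['Pottery', 'Jar', 'Art', 'Porcelain', 'Vase', 'Human', 'Person']
```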

Imagga
created on 2022-06-11

cup 100
container 56
china 54.3
ceramic ware 46.1
drink 42.6
porcelain 39.4
vase 38.5
utensil 37.5
tea 36.7
beverage 35.9
punch 34
earthenware 31.6
vessel 31.4
glass 29.5
pot 27.5
mixed drink 25.6
jar 24.8
alcohol 24.3
object 22
refreshment 20.9
tableware 20.8
liquid 20
traditional 20
ceramic 19.4
pottery 17.6
hot 17.6
breakfast 16.8
mug 16.6
herbal 16.2
bowl 15.8
healthy 15.7
closeup 15.5
color 15
nobody 14.8
transparent 14.3
kitchen 14.3
close 14.3
empty 13.7
saucer 13.6
freshness 13.3
food 13.3
brown 13.2
coffee 13
cups 12.7
morning 12.7
fresh 12.4
oriental 12.3
water 12
culture 12
yellow 11.9
old 11.9
teapot 11.8
art 11.7
decoration 10.9
teacup 10.8
ware 10.8
gold 10.7
single 10.7
restaurant 10.4
health 9.7
ceremony 9.7
shiny 9.5
aroma 9.4
bar 9.2
tradition 9.2
relaxation 9.2
full 9.1
pitcher 9
majolica 8.9
jug 8.9
metal 8.9
antique 8.7
craft 8.6
clean 8.4
heat 8.3
wet 8
natural 8
ceramics 7.8
clay 7.8
pour 7.8
serving 7.7
handle 7.6
dish 7.6
gourmet 7.6
break 7.6
milk 7.6
spoon 7.6
plate 7.6
relax 7.6
elegance 7.6
decorative 7.5
taste 7.4
lifestyle 7.2
black 7.2

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

vase 97.4
museum 88.5
art 82.7
earthenware 75.8
pottery 72.7
ceramic 67.9
tableware 67.5
black and white 61
cup 59.2
ceramic ware 28.6
painted 20.6

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 50.7%
Calm 83.9%
Surprised 14.6%
Fear 7%
Sad 2.3%
Confused 0.7%
Disgusted 0.5%
Angry 0.3%
Happy 0.2%

AWS Rekognition

Age 20-28
Gender Female, 96.4%
Calm 76.1%
Sad 15.7%
Surprised 6.9%
Fear 6.2%
Angry 4.5%
Happy 1.2%
Disgusted 0.6%
Confused 0.4%
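
Each AWS Rekognition face record above reports one confidence value per emotion, so the dominant emotion is simply the highest-scoring entry. A minimal sketch, using values copied from the two face records above:

```python
# Emotion scores (percent) for the two faces reported above.
face_1 = {"Calm": 83.9, "Surprised": 14.6, "Fear": 7.0, "Sad": 2.3,
          "Confused": 0.7, "Disgusted": 0.5, "Angry": 0.3, "Happy": 0.2}
face_2 = {"Calm": 76.1, "Sad": 15.7, "Surprised": 6.9, "Fear": 6.2,
          "Angry": 4.5, "Happy": 1.2, "Disgusted": 0.6, "Confused": 0.4}

def dominant_emotion(scores):
    """Return the emotion with the highest reported confidence."""
    return max(scores, key=scores.get)

print(dominant_emotion(face_1), dominant_emotion(face_2))
# → Calm Calm
```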

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.6%
Beer 79.6%

Captions

Microsoft

a vase sitting on a table 53.1%
a vase sitting on top of a table 49.8%
a vase is sitting on a table 43.1%

Text analysis

Google

XGGXGGXGGX