Human Generated Data

Title

Sugar Sifter

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 1940.75

Machine Generated Data

Tags (label and confidence score, in percent)

Amazon
created on 2022-06-11

Jar 99.1
Pottery 97.3
Vase 90.8
Lamp 85.3
Urn 70.7
Plant 64.6
Potted Plant 60.8
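
The labels above follow the shape of Amazon Rekognition's label detection output (label name plus a confidence in percent). Below is a minimal sketch of retrieving such labels with boto3; the record names only "Amazon", so the specific service, the local image filename, and the thresholds are assumptions.

import boto3

# Rekognition returns labels with confidence scores in percent,
# matching the "Jar 99.1" style of the list above.
client = boto3.client("rekognition")  # region/credentials come from the environment

with open("sugar_sifter.jpg", "rb") as f:  # hypothetical local image of the object
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=60.0,  # the list above bottoms out near 60
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")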

Clarifai
created on 2023-10-29

no person 98.9
metalwork 98.9
art 98.5
container 97.3
antique 97.3
vintage 97.2
retro 97.1
one 95.7
ancient 94.8
old 94.4
still life 94.2
brass 93.5
kitchenware 92.9
religion 92.5
gold 91.9
decoration 89.9
lamp 88.9
lantern 88
trophy 87.7
cup 87.7
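
Clarifai's tags (concept name plus confidence) match the output of its model-prediction endpoint. Below is a hedged sketch against the Clarifai v2 REST API; the model ID, API key placeholder, and image URL are assumptions, since the record does not say which model produced these tags.

import requests

# Hypothetical credentials and input.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # Clarifai's general model
IMAGE_URL = "https://example.org/sugar_sifter.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with values in 0-1; scale by 100 to match
# the "no person 98.9" style of the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")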

Imagga
created on 2022-06-11

thimble 100
cap 100
protective covering 100
container 100
covering 69
metal 27.3
equipment 20.8
gold 18.1
object 17.6
microphone 17.2
glass 16.8
classic 14.8
old 13.9
music 13.5
audio 13.4
sound 13.1
metallic 12.9
closeup 12.1
golden 12
light 12
technology 11.9
religion 11.6
lamp 11.5
retro 11.5
style 11.1
nobody 10.9
voice 10.8
radio 10.7
concert 10.7
steel 10.6
antique 10.5
ancient 10.4
religious 10.3
culture 10.2
entertainment 10.1
vintage 9.9
karaoke 9.9
recording 9.8
performance 9.6
classical 9.6
power 9.2
traditional 9.1
black 9
silver 8.8
worship 8.7
shiny 8.7
musical 8.6
faith 8.6
bright 8.6
media 8.6
close 8.6
temple 8.5
chrome 8.5
travel 8.4
communication 8.4
studio 8.4
drink 8.3
wealth 8.1
money 7.7
statue 7.6
old fashioned 7.6
iron 7.5
instrument 7.4
glow 7.4
perfume 7.4
bottle 7.4
aged 7.2
tool 7.2
celebration 7.2
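
The Imagga list follows the output of Imagga's tagging endpoint, which reports an English tag alongside a confidence already expressed in percent. Below is a minimal sketch using the v2 /tags endpoint with HTTP basic auth; the key/secret placeholders and the image URL are assumptions.

import requests

# Hypothetical credentials and input image.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/sugar_sifter.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)
resp.raise_for_status()

# Each entry carries a percent confidence and a localized tag;
# "en" matches the English tags listed above.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")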

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

sitting 94.2
indoor 94.1
still life photography 73.8
lamp 67.2
green 61
vase 52.5
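
The Microsoft tags resemble the Tags feature of Azure's Computer Vision Analyze Image API, which reports confidences in the 0-1 range (shown above as percentages). Below is a hedged sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are assumptions.

import requests

# Hypothetical Azure resource endpoint, key, and input image.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/sugar_sifter.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Scale the 0-1 confidences to percent to match "sitting 94.2" above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")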

Color Analysis

Feature analysis

Amazon

Lamp 85.3%

Categories

Imagga

food drinks 93.5%
interior objects 6.4%

Captions