Human Generated Data

Title

Sugar Sifter

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 1940.75

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Jar 99.1
Pottery 97.3
Vase 90.8
Lamp 85.3
Urn 70.7
Plant 64.6
Potted Plant 60.8
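
Labels like these are the kind of output returned by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of such a call, assuming boto3 is configured with AWS credentials; the filename sugar_sifter.jpg is a hypothetical local image of the object, not part of this record.

    import boto3

    # DetectLabels returns label names with confidence scores,
    # comparable to the Jar/Pottery/Vase list above.
    rekognition = boto3.client("rekognition")

    with open("sugar_sifter.jpg", "rb") as f:  # hypothetical local image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,        # cap the number of labels returned
        MinConfidence=60.0,  # drop low-confidence labels
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')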

Imagga
created on 2022-06-11

cap 100
thimble 100
container 100
protective covering 100
covering 69
metal 27.3
equipment 20.8
gold 18.1
object 17.6
microphone 17.2
glass 16.8
classic 14.8
old 13.9
music 13.5
audio 13.4
sound 13.1
metallic 12.9
closeup 12.1
golden 12
light 12
technology 11.9
religion 11.6
lamp 11.5
retro 11.5
style 11.1
nobody 10.9
voice 10.8
radio 10.7
concert 10.7
steel 10.6
antique 10.5
ancient 10.4
religious 10.3
culture 10.2
entertainment 10.1
vintage 9.9
karaoke 9.9
recording 9.8
performance 9.6
classical 9.6
power 9.2
traditional 9.1
black 9
silver 8.8
worship 8.7
shiny 8.7
musical 8.6
faith 8.6
bright 8.6
media 8.6
close 8.6
temple 8.5
chrome 8.5
travel 8.4
communication 8.4
studio 8.4
drink 8.3
wealth 8.1
money 7.7
statue 7.6
old fashioned 7.6
iron 7.5
instrument 7.4
glow 7.4
perfume 7.4
bottle 7.4
aged 7.2
tool 7.2
celebration 7.2
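
Imagga exposes a similar tagging service over plain HTTP. A minimal sketch using the requests library follows; the API key, secret, and image URL are placeholders, not values from this record.

    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder credentials
    API_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.com/sugar-sifter.jpg"  # hypothetical image URL

    # The /v2/tags endpoint returns tags with confidence scores,
    # like the cap/thimble/container list above.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )

    for tag in response.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')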

Google
created on 2022-06-11
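
The Google section lists no tags for this record. Label detection with the Google Cloud Vision client would look roughly like the sketch below, assuming the google-cloud-vision package and application-default credentials are set up; the filename is again hypothetical.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("sugar_sifter.jpg", "rb") as f:  # hypothetical local image
        image = vision.Image(content=f.read())

    # label_detection returns annotations scored in [0, 1]; an empty
    # list is consistent with the empty Google section above.
    response = client.label_detection(image=image)

    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")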

Microsoft
created on 2022-06-11

sitting 94.2
indoor 94.1
still life photography 73.8
lamp 67.2
green 61
vase 52.5
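
Microsoft's tags come from the Azure Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision package follows; the endpoint, key, and image URL are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_KEY"                                    # placeholder
    IMAGE_URL = "https://example.com/sugar-sifter.jpg"  # hypothetical image URL

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # tag_image returns tags with confidence in [0, 1], matching the
    # sitting/indoor/vase list above once scaled to percentages.
    for tag in client.tag_image(IMAGE_URL).tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")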

Feature analysis

Amazon

Lamp 85.3%

Captions

Microsoft

a vase sitting on a table 54.5%
a vase sitting on top of a table 52.2%
a tall glass vase on a table 52.1%
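
Candidate captions like the three above can be requested from the same Azure service via describe_image, which returns several scored captions per image. A sketch follows, repeating the placeholder setup from the previous example so the snippet stands alone.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_KEY"                                    # placeholder
    IMAGE_URL = "https://example.com/sugar-sifter.jpg"  # hypothetical image URL

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # describe_image returns multiple candidate captions, each with a
    # confidence score, as in the three caption variants listed above.
    for caption in client.describe_image(IMAGE_URL, max_candidates=3).captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")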