Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.186

Machine Generated Data

Tags (scores are model confidence values on a 0-100 scale)

Amazon
created on 2022-01-23

Bread 99.6
Food 99.6
Pottery 71.6
Bronze 56
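
For reference, a minimal sketch of how labels like these could be produced with Amazon Rekognition's DetectLabels API via boto3. The filename and parameter values are assumptions for illustration, not part of this record:

```python
# Sketch: label detection with Amazon Rekognition (boto3).
# Assumes AWS credentials are configured; "lamp.jpg" is a hypothetical filename.
import boto3

client = boto3.client("rekognition")

with open("lamp.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,      # cap the number of labels returned
        MinConfidence=50,  # drop labels below 50% confidence
    )

# Each label carries a Name and a 0-100 Confidence score,
# matching the "Bread 99.6" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```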

Clarifai
created on 2023-10-27

pottery 100
clay 99.8
no person 99.7
old 98.8
ancient 98.7
antique 98.3
retro 98.2
handmade 97.9
veil 97.9
one 97.7
earthenware 97.1
sculpture 97
wear 96.3
ceramic 96.3
art 96.2
arts and crafts 94.9
rusty 94
hard 93
dirty 92.4
container 92.3
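
A sketch of how concept tags like these could be requested from Clarifai's v2 REST API. The model ID, credential, and image URL below are assumptions; adjust them to your own Clarifai app:

```python
# Sketch: concept tagging with Clarifai's v2 REST API via requests.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"           # hypothetical credential
MODEL_ID = "general-image-recognition"      # Clarifai's general model (assumed ID)
IMAGE_URL = "https://example.com/lamp.jpg"  # hypothetical image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; scale by 100 to match
# the percentages listed above (e.g. "pottery 100").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```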

Imagga
created on 2022-01-23

ocarina 92.9
earthenware 77.8
wind instrument 74.3
ceramic ware 57.1
musical instrument 55.7
utensil 37.3
brown 26.5
food 24.7
close 18.8
snack 15.4
tasty 14.2
delicious 14
fresh 13.1
old 12.5
chocolate 12.2
object 11.7
closeup 11.4
wooden 11.4
sweet 11.1
gourmet 11
eat 10.9
cup 10.8
culture 10.2
plate 10.2
dinner 10.1
traditional 10
kitchen 9.8
breakfast 9.7
dessert 9.7
metal 9.6
black 9.6
round 9.5
healthy 9.4
tea 9.4
sugar 9.4
money 9.3
hot 9.2
wood 9.2
meal 8.9
color 8.9
lunch 8.6
nobody 8.5
finance 8.4
fat 8.4
slice 8.2
box 8.1
colorful 7.9
cooking 7.8
table 7.8
ancient 7.8
ceramic 7.7
pot 7.7
pattern 7.5
dark 7.5
savings 7.4
coffee 7.4
business 7.3
financial 7.1
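
A sketch of how tags like these could be fetched from Imagga's tagging endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. Key, secret, and image URL are placeholders:

```python
# Sketch: image tagging with Imagga's /v2/tags REST endpoint.
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders
IMAGE_URL = "https://example.com/lamp.jpg"                # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=auth,
)
response.raise_for_status()

# Each entry has a 0-100 confidence and a language-keyed tag name.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```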

Google
created on 2022-01-23

Art 78.9
Wood 78.8
Firefighter 72
Sculpture 71.9
Artifact 66.7
Metal 66.3
Tool 65.7
Circle 60.3
Carving 57.4
Household hardware 52.8
Rust 50.3
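
A sketch of how labels like these could be produced with the Google Cloud Vision client library (google-cloud-vision). It assumes application-default credentials are set up; the filename is hypothetical:

```python
# Sketch: label detection with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("lamp.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are returned on a 0-1 scale; scale by 100 to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```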

Microsoft
created on 2022-01-23

ocarina 96.6
wall 95
indoor 94.6
rust 54.9
close 22.8
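
A sketch of how tags like these could be requested from the Azure Computer Vision REST API. The endpoint, key, filename, and API version (v3.2) are assumptions:

```python
# Sketch: image tagging with the Azure Computer Vision /tag endpoint.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("lamp.jpg", "rb") as f:  # hypothetical filename
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()

# Tags carry a 0-1 confidence; scale by 100 to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```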

Feature analysis

Amazon

Bread 99.6%

Categories

Imagga

food drinks 98.8%
paintings art 1.2%
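
These two categories match Imagga's stock "personal_photos" categorizer, whose category set includes "food drinks" and "paintings art". A sketch of how such scores could be fetched, with the same placeholder credentials as above:

```python
# Sketch: scene categorization with Imagga's categorizer endpoint.
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders
IMAGE_URL = "https://example.com/lamp.jpg"                # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=auth,
)
response.raise_for_status()

# Categories carry a 0-100 confidence and a language-keyed name.
for cat in response.json()["result"]["categories"]:
    print(f'{cat["name"]["en"]} {cat["confidence"]:.1f}')
```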

Captions

Microsoft
created on 2022-01-23

a close up of a doughnut 36.9%
a close up of a donut 32.9%
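
Ranked captions like these resemble the output of the Azure Computer Vision "describe" endpoint, which returns several candidate captions with confidences. A sketch under the same placeholder endpoint, key, filename, and API-version assumptions as the tagging example above:

```python
# Sketch: image captioning with the Azure Computer Vision /describe endpoint.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("lamp.jpg", "rb") as f:  # hypothetical filename
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},  # request several ranked captions
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()

# Captions carry a 0-1 confidence; scale by 100 to match the list above.
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```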