Human Generated Data

Title

Female Mask

Date

323-31 BCE

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Edward P. Bliss, 1916.318

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2022-06-18

Figurine 92.2
Animal 85
Bird 84.5
Archaeology 75.3
Person 66.6
Human 66.6
Mammal 58.4
Soil 57.4
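
The Amazon labels above have the shape of AWS Rekognition DetectLabels output: a label name plus a confidence score. A minimal sketch of how such tags could be produced with boto3, assuming configured AWS credentials and a placeholder local file name for the object photograph:

```python
import boto3

# Assumes AWS credentials are configured; "female_mask.jpg" is a placeholder path.
client = boto3.client("rekognition")

with open("female_mask.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence scores on a 0-100 scale.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```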

Clarifai
created on 2023-10-29

sculpture 99.5
art 98.2
people 98
statue 97.6
one 95.4
monochrome 95.1
adult 94
group 94
no person 93.3
doll 90.9
bird 89.5
portrait 89.5
two 89.4
mammal 89.4
marble 88.5
figurine 88
child 87.2
religion 87
woman 86.8
stone 86.5
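
The Clarifai concepts above follow the same name-plus-confidence pattern and resemble output from Clarifai's general prediction model. A hedged sketch against the v2 REST API; the model ID, API key, and image URL are all placeholder assumptions, not values taken from this record:

```python
import requests

# Placeholder values; a real call needs your own Clarifai key and an accessible image URL.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed ID of Clarifai's general model
IMAGE_URL = "https://example.org/female_mask.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a probability in [0, 1]; scale it to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```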

Imagga
created on 2022-06-18

bird 34.5
animal 26
wildlife 24.9
mollusk 22.5
conch 20.3
wild 20
gastropod 19.6
beak 19.3
sea 15.4
sand 14.5
invertebrate 14
feathers 13.6
eye 13.4
close 12
brown 11.8
falcon 11.6
water 11.3
soil 11
outdoor 10.7
hawk 10.6
feather 10.6
sky 10.2
ocean 10.1
hunter 10
trainer 9.7
animals 9.3
turtle 9.2
predator 9.2
wing 8.9
birds 8.7
rock 8.7
beach 8.6
mammal 8.6
marine 8.5
male 8.5
armadillo 8.4
portrait 8.4
head 8.4
lifestyle 7.9
travel 7.7
outdoors 7.5
closeup 7.4
sculpture 7.4
statue 7.3
cute 7.2
seal 7.1
owl 7.1
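
The Imagga tags are again label-plus-confidence pairs. A sketch against Imagga's /v2/tags endpoint, assuming HTTP Basic authentication with an API key/secret pair and a publicly reachable image URL (all placeholders):

```python
import requests

# Placeholder credentials and URL; Imagga's tagging endpoint uses HTTP Basic auth.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/female_mask.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry holds a confidence score (0-100) and a localized tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```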

Google
created on 2022-06-18

Microsoft
created on 2022-06-18

bird 93.7
statue 83.5
animal 82.8
black and white 73
sculpture 62.5
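
The Microsoft tags match what Azure Computer Vision's Analyze Image operation returns when the Tags visual feature is requested. A hedged sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are placeholders:

```python
import requests

# Placeholder Azure resource endpoint, key, and image URL.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_VISION_KEY"
IMAGE_URL = "https://example.org/female_mask.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Tag confidences come back in [0, 1]; scale them to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```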

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 50-58
Gender Male, 58.8%
Sad 66.3%
Calm 44.9%
Surprised 8.5%
Fear 7.8%
Confused 5%
Disgusted 4.5%
Angry 2.9%
Happy 2.8%

AWS Rekognition

Age 23-33
Gender Female, 97.7%
Angry 96.4%
Surprised 6.8%
Fear 6.7%
Sad 2.2%
Disgusted 0.1%
Happy 0.1%
Confused 0%
Calm 0%
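
The two face readings above (age range, gender, ranked emotions) mirror AWS Rekognition DetectFaces output with all attributes requested. A minimal boto3 sketch, reusing the placeholder image path from the label example earlier:

```python
import boto3

client = boto3.client("rekognition")

with open("female_mask.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates for each detected face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```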

Feature analysis

Amazon

Bird 84.5%
Person 66.6%

Categories

Imagga

pets animals 74.4%
paintings art 24.1%
nature landscape 1.3%

Captions