Human Generated Data

Title

Seated Girl

Date

323-31 BCE

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Edward P. Bliss, 1916.337

Machine Generated Data

Tags (label and confidence score, %)

Amazon
created on 2022-06-18

Human 95.3
Figurine 94.2
Person 83.3
Person 82
Military 76.5
Person 74.6
Military Uniform 74.3
Soldier 70.7
Elephant 69.5
Animal 69.5
Mammal 69.5
Wildlife 69.5
Flooring 61.1
Army 58.6
Armored 58.6
Silhouette 57.8
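
The Amazon tags above follow the label/confidence format returned by the AWS Rekognition image-labeling API. The sketch below shows how such scores can be requested with the boto3 SDK; the region, image file name, and thresholds are placeholder assumptions, not details taken from this record.

```python
import boto3

# Hypothetical client configuration and local image file (assumptions).
rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("seated_girl.jpg", "rb") as f:
    image_bytes = f.read()

# detect_labels returns label names with confidence scores on a 0-100 scale,
# matching the shape of the tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```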

Clarifai
created on 2023-10-29

people 99.4
group 98.6
child 98.3
art 97.1
man 97.1
wear 95
woman 94.7
group together 94.2
figurine 93.7
adult 93.7
model 92.9
sculpture 92.8
family 90.8
monochrome 90.8
figure 89.4
several 89.3
doll 89.2
many 88.9
statue 85.3
cooperation 82

Imagga
created on 2022-06-18

astronaut 52.5
uniform 40.3
military uniform 38.9
man 29.5
clothing 29.4
travel 19.7
person 17
sea 16.4
protection 16.4
covering 15.9
consumer goods 15.7
male 15.6
engineer 15.6
danger 15.4
people 15.1
military 14.5
soldier 13.7
destruction 13.7
mask 13.6
sport 13.4
sky 13.4
nuclear 12.6
ocean 12.4
tourism 12.4
tourist 11.8
protective 11.7
city 11.6
silhouette 11.6
gas 11.5
beach 11.2
radioactive 10.8
coast 10.8
disaster 10.7
toxic 10.7
water 10.7
urban 10.5
old 10.4
industrial 10
stalker 9.9
park 9.9
world 9.9
camouflage 9.8
radiation 9.8
adult 9.8
accident 9.8
chemical 9.6
gun 9.5
color 9.4
dirty 9
black 9
horizon 9
history 8.9
private 8.9
sun 8.8
architecture 8.6
motion 8.6
weapon 8.4
outdoor 8.4
summer 8.4
freedom 8.2
automaton 8.2
landscape 8.2
recreation 8.1
statue 8
commodity 7.7
winter 7.7
clouds 7.6
walking 7.6
street 7.4
business 7.3
sunset 7.2
life 7.2
building 7.1
to 7.1
day 7.1
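
The Imagga tags take the same label/confidence form. A minimal sketch of retrieving them through Imagga's REST tagging endpoint with the requests library follows; the credentials and image URL are placeholders, and the response layout should be verified against Imagga's current API documentation.

```python
import requests

# Placeholder credentials and image URL (assumptions, not from this record).
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/seated_girl.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score (0-100).
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```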

Google
created on 2022-06-18

Microsoft
created on 2022-06-18

black and white 89
clothing 88.8
person 88.4
man 64.1
statue 58.9
sculpture 48.7
different 34.6
megalith 30
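
The Microsoft tags are consistent with the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision client is shown below; the endpoint, key, and image URL are placeholders. The service reports confidence on a 0-1 scale, so values are scaled to percentages to match the list above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint, key, and image URL (assumptions).
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your_subscription_key"
IMAGE_URL = "https://example.org/seated_girl.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns tag names with confidence in the 0-1 range.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```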

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 35-43
Gender Male, 72%
Sad 100%
Calm 7.8%
Surprised 6.4%
Fear 6%
Happy 3.9%
Confused 1.7%
Angry 0.6%
Disgusted 0.4%

AWS Rekognition

Age 7-17
Gender Male, 98%
Sad 99%
Calm 26.8%
Confused 10.1%
Surprised 6.8%
Fear 6.2%
Happy 0.8%
Disgusted 0.7%
Angry 0.6%
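
The age range, gender, and emotion estimates above match the output of AWS Rekognition face detection. A minimal boto3 sketch follows; as before, the region and image file are placeholder assumptions.

```python
import boto3

# Hypothetical client and local image file (assumptions).
rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("seated_girl.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotion confidences are reported independently, so they need not sum to 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```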

Feature Analysis

Amazon

Person 83.3%
Elephant 69.5%

Categories

Imagga

paintings art 84.1%
interior objects 11.2%
pets animals 2.5%