Human Generated Data

Title

Boy in front of a tree trunk

Date

c. 1770

People

Artist: Joseph Nees, active c. 1745-1773

Manufacturer: Zurich Porcelain, 1763-1790

Classification

Sculpture

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Mr. and Mrs. H. Graves Terwilliger, BR59.139

Machine Generated Data

Tags

Amazon
created on 2022-06-18

Figurine 99.8
Hat 90.1
Clothing 90.1
Apparel 90.1
Person 83.2
Human 83.2
Toy 65.5
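
The Amazon tags above pair a label with a confidence score, the format returned by Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch of how such tags could be generated; the bucket name, object key, and thresholds are placeholder assumptions, not part of this record:

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "BR59.139.jpg"}},
        MaxLabels=25,
        MinConfidence=50,
    )

    # Print label/confidence pairs, e.g. "Figurine 99.8"
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")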

Clarifai
created on 2023-10-29

lid 99.4
people 99.3
wear 99.2
one 99
veil 98.8
child 98.7
portrait 98.2
outfit 97.9
cap 97.4
costume 97.1
uniform 96.3
art 96.3
military 96
retro 92.8
soldier 91.8
adult 90.4
war 90
man 89
figurine 88.6
sculpture 87.7
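
The Clarifai concepts above read like output from Clarifai's general image-recognition model. A hedged sketch against what I take to be the v2 predict REST endpoint; the API key, model id, image URL, and auth scheme are assumptions and may differ by account setup:

    import requests

    API_KEY = "your_clarifai_api_key"               # hypothetical credential
    MODEL_ID = "general-image-recognition"          # assumed model id
    IMAGE_URL = "https://example.org/BR59.139.jpg"  # placeholder image URL

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    # Concepts come back as 0-1 values; scale to match the 0-100 figures above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")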

Imagga
created on 2022-06-18

military uniform 38.3
statue 32
clothing 30.2
uniform 29.4
sculpture 22.3
covering 20.9
consumer goods 19.5
art 18
model 17.1
person 16.9
body 16.8
hat 16.6
figure 15
religion 14.3
portrait 14.2
fashion 13.6
monument 13.1
style 12.6
cowboy hat 11.9
posing 11.5
man 11.4
male 11.4
sexy 11.2
ancient 11.2
old 11.1
culture 11.1
stone 11
3d 10.8
people 10.6
lady 10.5
pretty 10.5
attractive 10.5
decoration 10.1
face 9.9
human 9.7
headdress 9.6
god 9.6
helmet 9.3
tourism 9.1
dress 9
ballplayer 9
child 8.8
warrior 8.8
athlete 8.8
hair 8.7
doll 8.5
dark 8.3
traditional 8.3
costume 8.3
makeup 8.2
pose 8.1
player 7.9
antique 7.9
smile 7.8
render 7.8
naked 7.7
bust 7.6
commodity 7.6
historical 7.5
silhouette 7.4
digital 7.3
weapon 7.1
history 7.1
cartoon 7.1
love 7.1
travel 7
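
Imagga's tagging service returns the same tag/confidence structure shown above. A minimal sketch assuming the v2 /tags REST endpoint with HTTP Basic authentication; the key, secret, and image URL are placeholders:

    import requests

    API_KEY = "your_imagga_api_key"                 # hypothetical credential
    API_SECRET = "your_imagga_api_secret"           # hypothetical credential
    IMAGE_URL = "https://example.org/BR59.139.jpg"  # placeholder image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a confidence, e.g. "military uniform 38.3"
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")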

Google
created on 2022-06-18

Microsoft
created on 2022-06-18

statue 96.5
human face 90.7
person 85.5
text 78.5
sculpture 78.1
clothing 74.9
black and white 51.3
posing 42
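
The Microsoft tags above match the tag/confidence output of Azure Computer Vision image analysis. A sketch assuming the v3.2 Analyze Image REST endpoint; the resource endpoint, key, and image URL are placeholders:

    import requests

    ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # hypothetical resource
    KEY = "your_azure_key"                                          # hypothetical credential
    IMAGE_URL = "https://example.org/BR59.139.jpg"                  # placeholder image URL

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    # Confidences are 0-1; scale to match the 0-100 figures above, e.g. "statue 96.5"
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")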

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 99.6%
Calm 89.2%
Surprised 6.7%
Fear 6.1%
Happy 4.2%
Sad 2.7%
Disgusted 2.5%
Confused 0.8%
Angry 0.4%
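
The age range, gender, and emotion estimates above are the attributes Amazon Rekognition's DetectFaces operation returns when all attributes are requested. A minimal boto3 sketch; the bucket and key are placeholder assumptions:

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "BR59.139.jpg"}},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]  # e.g. {"Low": 1, "High": 7}
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # e.g. Calm 89.2%
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")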

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
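
The Google Vision likelihoods above (surprise, anger, sorrow, joy, headwear, blurred) are fields of the face detection response in the Cloud Vision API. A minimal sketch using the google-cloud-vision client; the local file path is a placeholder and credentials are assumed to be configured:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("BR59.139.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods are enum values such as VERY_UNLIKELY, as shown in the record above.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)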

Feature analysis

Amazon

Hat 90.1%
Person 83.2%

Categories

Imagga

paintings art 98.3%
interior objects 1.2%