Human Generated Data

Title

Three Young Korean Women Wearing Traditional Robes, Sitting on Mats, and Preparing to Pound Cloth

Date

c. 1857-1874

People
Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.91.2

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Person 99.5
Human 99.5
Person 99.4
Person 98.6
Sitting 94.3
Painting 92.2
Art 92.2
Clothing 88.8
Apparel 88.8
People 68.9
Child 65.7
Kid 65.7
Female 62.2
Girl 62.2
Sled 57.5

Clarifai
created on 2019-07-07

people 100
adult 99.4
two 99.4
wear 98.6
group 98.3
one 97.8
portrait 96.6
position 95.7
sit 95.2
woman 95.1
three 94.9
child 94.7
leader 93.3
outfit 92.4
furniture 92
veil 91.1
man 91.1
offspring 90.4
actress 90.2
sibling 90

Imagga
created on 2019-07-07

kin 100
statue 29.9
old 23
people 21.8
sculpture 21.6
man 20.2
mother 19.8
adult 19.5
happy 19.4
home 18.3
love 18.2
portrait 18.1
couple 17.4
religion 17
male 15.6
art 15.3
stone 15.2
family 15.1
happiness 14.9
person 14
monument 14
sitting 13.7
history 13.4
god 13.4
senior 13.1
ancient 13
architecture 12.5
religious 12.2
face 12.1
women 11.9
aged 11.8
marble 11.6
smile 11.4
together 11.4
lifestyle 10.8
parent 10.7
sofa 10.5
antique 10.4
culture 10.3
two 10.2
smiling 10.1
historic 10.1
attractive 9.8
detail 9.7
married 9.6
elderly 9.6
building 9.5
hair 9.5
father 9.5
famous 9.3
mature 9.3
fashion 9
cheerful 8.9
lady 8.9
catholic 8.8
pray 8.7
spiritual 8.6
husband 8.6
loving 8.6
historical 8.5
travel 8.5
church 8.3
vintage 8.3
tourism 8.3
child 8.2
landmark 8.1
group 8.1
decoration 8
sepia 7.8
couch 7.7
men 7.7
saint 7.7
bride 7.7
outdoor 7.6
wife 7.6
city 7.5
figure 7.4
indoor 7.3
girls 7.3
black 7.2

Google
created on 2019-07-07

Microsoft
created on 2019-07-07

person 99.6
clothing 98.4
old 98
sitting 96.2
human face 96.2
player 91.3
outdoor 89.3
smile 77
black 74.7
child 71.3
white 70.5
vintage 70.1
woman 60.7
photograph 52.5
posing 42.4
raft 11.9

Face analysis

AWS Rekognition

Age 10-15
Gender Female, 78.1%
Confused 14.6%
Happy 3.8%
Surprised 7.2%
Calm 17.9%
Sad 31.4%
Angry 23.5%
Disgusted 1.7%

AWS Rekognition

Age 1-5
Gender Female, 65.2%
Calm 26.3%
Confused 7.9%
Disgusted 3.1%
Sad 11.9%
Angry 41.1%
Surprised 3.3%
Happy 6.4%

AWS Rekognition

Age 15-25
Gender Female, 100%
Disgusted 1%
Happy 3.9%
Surprised 1.5%
Calm 11.5%
Confused 1.8%
Sad 77.9%
Angry 2.4%

Microsoft Cognitive Services

Age 5
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Likely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Painting 92.2%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 95.5%
a vintage photo of a group of people posing for a picture 95.4%
a vintage photo of a group of people sitting in chairs 95.3%