Human Generated Data

Title

Seated Young Korean Man and Two Small Girls

Date

c. 1857-1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.89.2

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Human 98.2
Person 98.2
Person 97.9
Person 97.2
Art 80.7
Painting 80.7
People 78
Photography 66.8
Photo 66.8
Sailor Suit 64.6
Urban 63.6
Face 62.6
Portrait 62.6
Clothing 58.3
Apparel 58.3
Child 57.4
Kid 57.4
Footwear 55.2
Shoe 55.2
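
The tag/confidence pairs above follow the shape of Amazon Rekognition's DetectLabels response. A minimal sketch of such a call with boto3 is shown below; the local file name, region, and the 55% confidence floor are illustrative assumptions, not part of the original record.

import boto3

# Hypothetical region; credentials come from the usual AWS configuration.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("1978.484.89.2.jpg", "rb") as f:  # assumed local filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score listed above is 55.2
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")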

Clarifai
created on 2019-07-07

people 100
child 99.2
wear 99
group 98.7
adult 98.1
two 97.8
one 96.3
three 96.2
group together 95.5
man 94.2
four 93.6
veil 93.4
outfit 93
portrait 92.9
offspring 92.8
boy 92.4
administration 91.8
son 91.2
woman 89.6
several 89.6
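
The Clarifai concepts above are the kind of output returned by Clarifai's v2 predict endpoint. The sketch below uses plain requests; the API key, model ID, and image URL are placeholders, and the endpoint/payload shape is an assumption based on the v2 REST API rather than anything stated in the record.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"      # placeholder
GENERAL_MODEL_ID = "GENERAL_MODEL_ID"  # placeholder for Clarifai's general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))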

Imagga
created on 2019-07-07

man 29.6
male 24.3
people 23.4
person 22.1
old 21.6
kin 20.7
portrait 18.8
adult 18.3
religion 16.1
culture 14.5
face 13.5
family 13.3
vintage 13.2
art 12.6
statue 12.6
history 12.5
mother 12.4
church 12
happy 11.9
attractive 11.9
two 11.9
dress 11.7
traditional 11.6
couple 11.3
fashion 11.3
wall 11.1
sculpture 10.9
god 10.5
ancient 10.4
religious 10.3
love 10.3
holy 9.6
faith 9.6
historical 9.4
human 9
lady 8.9
sibling 8.9
hat 8.8
room 8.8
hair 8.7
smiling 8.7
happiness 8.6
architecture 8.6
sitting 8.6
men 8.6
model 8.6
expression 8.5
dark 8.3
holding 8.3
clothing 8.2
retro 8.2
child 8.2
costume 8.1
group 8.1
together 7.9
antique 7.9
black 7.8
sacred 7.8
worship 7.7
pretty 7.7
war 7.7
spiritual 7.7
cathedral 7.7
grunge 7.7
serious 7.6
stone 7.6
monument 7.5
one 7.5
father 7.4
grandfather 7.4
tradition 7.4
historic 7.3
decoration 7.2
lifestyle 7.2
home 7.2
smile 7.1
romantic 7.1
interior 7.1
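
The Imagga tags above resemble the response of Imagga's /v2/tags endpoint, which uses HTTP basic auth. A minimal sketch follows; the key, secret, and image URL are placeholders.

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))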

Google
created on 2019-07-07

People 95.4
Photograph 95.4
Snapshot 84.1
Standing 82.8
Family 64.8
Vintage clothing 61.7
Child 52
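
The Google labels above are consistent with Cloud Vision label detection. A minimal sketch with the google-cloud-vision client is shown below; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses default application credentials

image = vision.Image()
image.source.image_uri = "https://example.org/image.jpg"  # placeholder URI

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))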

Microsoft
created on 2019-07-07

clothing 99.1
person 98.3
human face 96.8
baby 96.8
toddler 94.4
smile 93.8
child 92.2
old 89.7
boy 87.7
posing 86.2
group 67.8
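
The Microsoft tags above look like the output of the Azure Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision client follows; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

result = client.tag_image("https://example.org/image.jpg")  # placeholder URL
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))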

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 85.7%
Surprised 4%
Angry 4.3%
Sad 3.9%
Confused 5.6%
Happy 4.9%
Calm 63.9%
Disgusted 13.4%

AWS Rekognition

Age 4-9
Gender Female, 89.4%
Sad 18%
Happy 11.5%
Surprised 2.9%
Angry 4.8%
Calm 54%
Confused 4.9%
Disgusted 3.9%

AWS Rekognition

Age 6-13
Gender Female, 99.8%
Angry 8%
Surprised 5.6%
Happy 10.3%
Calm 50.4%
Disgusted 2.7%
Confused 3%
Sad 20%
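
The three AWS Rekognition face entries above (age range, gender, and per-emotion scores) match the shape of the DetectFaces response when all attributes are requested. A minimal boto3 sketch follows; the local file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("1978.484.89.2.jpg", "rb") as f:  # assumed local filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]     # e.g. {'Low': 26, 'High': 43}
    gender = face["Gender"]    # e.g. {'Value': 'Male', 'Confidence': 85.7}
    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:   # e.g. CALM 63.9, DISGUSTED 13.4, ...
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")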

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female
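
The Microsoft age and gender estimates above are the kind of result the Azure Face API's detect operation returned when those attributes were requested. A sketch with the azure-cognitiveservices-vision-face client follows; the endpoint, key, and URL are placeholders, and age/gender attributes may not be available on current API versions.

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder

face_client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

faces = face_client.face.detect_with_url(
    url="https://example.org/image.jpg",        # placeholder URL
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}, Gender {attrs.gender}")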

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
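
The Google Vision rows above (surprise, anger, sorrow, joy, headwear, and blur, each rated on a likelihood scale) correspond to face detection annotations. A minimal sketch follows; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses default application credentials

image = vision.Image()
image.source.image_uri = "https://example.org/image.jpg"  # placeholder URI

likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])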

Feature analysis

Amazon

Person 98.2%
Painting 80.7%
Shoe 55.2%
