Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.58

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-01-09

Human 98.5
Person 98.5
Person 98.1
People 92.2
Clothing 91.2
Apparel 91.2
Shoe 91.2
Footwear 91.2
Shoe 83.2
Family 72.7
Photo 65.7
Photography 65.7
Portrait 61.5
Face 61.5
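
The Amazon labels above follow the name-plus-confidence format returned by the Rekognition DetectLabels API. A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local scan named album_page.jpg:

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# Assumes AWS credentials are configured; "album_page.jpg" is a hypothetical scan.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("album_page.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of labels returned
    MinConfidence=60.0,  # drop labels below 60% confidence
)

# Each label carries a name and a 0-100 confidence, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```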

Clarifai
created on 2023-10-25

people 100
group 99.8
two 98.9
three 98.6
adult 98.6
child 98.5
wear 97.8
offspring 97.6
woman 97.2
sibling 96.8
four 96.5
son 96.2
gown (clothing) 95.8
portrait 95.1
leader 95
family 94.8
man 94
several 93.5
five 92.1
sit 91.4
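
Every tag dump in this section uses the same "tag score" line format, so the provider outputs can be normalized into one structure for side-by-side comparison. A small pure-Python sketch (no vendor SDKs; the 90% threshold is an arbitrary choice):

```python
# Sketch: parse "tag score" lines, as printed above, into (tag, confidence) pairs.
def parse_tags(block: str, min_confidence: float = 90.0):
    """Return [(tag, confidence), ...] for lines like 'gown (clothing) 95.8'."""
    results = []
    for line in block.strip().splitlines():
        tag, _, score = line.rpartition(" ")  # the score is the last token
        try:
            confidence = float(score)
        except ValueError:
            continue  # skip lines without a trailing numeric score
        if confidence >= min_confidence:
            results.append((tag, confidence))
    return results

clarifai_block = """people 100
group 99.8
gown (clothing) 95.8
sit 91.4"""
print(parse_tags(clarifai_block))  # [('people', 100.0), ('group', 99.8), ...]
```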

Imagga
created on 2022-01-09

person 23
old 20.9
man 20.8
people 19.5
dress 19
adult 18.8
religion 17.9
statue 16.3
male 15.6
religious 15
portrait 14.9
clothing 14.5
fashion 14.3
vintage 14.1
art 14.1
style 13.3
faith 12.4
couple 12.2
face 12.1
catholic 11.7
god 11.5
sculpture 11.3
culture 11.1
mother 11
costume 10.9
traditional 10.8
nurse 10.7
ancient 10.4
home 10.4
love 10.3
church 10.2
happy 10
history 9.8
pretty 9.8
attractive 9.8
posing 9.8
lady 9.7
black 9.7
coat 9.7
antique 9.7
holy 9.6
saint 9.6
kimono 9.5
hair 9.5
monument 9.3
teacher 9.2
elegance 9.2
child 9
uniform 8.7
happiness 8.6
men 8.6
garment 8.6
two 8.5
indoor 8.2
groom 8.1
decoration 8
celebration 8
standing 7.8
cathedral 7.7
senior 7.5
historic 7.3
makeup 7.3
robe 7.3
make 7.3
pose 7.2
detail 7.2
hat 7.2
color 7.2
sexy 7.2
professional 7.2
kin 7.2
romantic 7.1
family 7.1
women 7.1
interior 7.1
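
The Imagga tags above can be reproduced with Imagga's REST tagging service. A hedged sketch using requests, assuming the v2 /tags endpoint and its documented JSON layout; the credentials and image URL are placeholders:

```python
# Sketch: image tagging with the Imagga REST API (v2 /tags endpoint).
import requests

API_KEY = "your_imagga_key"       # placeholder credentials
API_SECRET = "your_imagga_secret"
IMAGE_URL = "https://example.org/album_page.jpg"  # hypothetical scan URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Tags come back with confidences on a 0-100 scale, as in the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```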

Google
created on 2022-01-09

Hat 84.2
Classic 74.1
Vintage clothing 71.9
Art 68.8
Sitting 68.3
Event 65
History 62.5
Retro style 56.8
Recreation 56
Family 53.6
Room 53.4
Painting 52.7
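
The Google labels above correspond to label detection in the Cloud Vision API. A minimal sketch with the google-cloud-vision client library, assuming application-default credentials and a hypothetical local scan:

```python
# Sketch: label detection with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("album_page.jpg", "rb") as f:  # hypothetical scan of the album page
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; scaling by 100 matches the percentages above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```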

Microsoft
created on 2022-01-09

person 99.8
clothing 98.8
wall 95.4
text 89.5
old 86.3
photograph 78.3
vintage clothing 71.9
people 71
white 69.7
black 68.9
woman 68.4
retro style 60.5
man 50.2
posing 47.1
vintage 40
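
The Microsoft tags above are the kind of output returned by the Azure Computer Vision tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
# Sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_azure_key"                                             # placeholder
IMAGE_URL = "https://example.org/album_page.jpg"                   # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)

# Confidences are 0-1 floats; scaling by 100 matches the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```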

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 88.5%
Confused 3.4%
Fear 3.1%
Surprised 2.3%
Sad 0.9%
Disgusted 0.8%
Angry 0.6%
Happy 0.4%

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 97.5%
Confused 1.2%
Sad 0.4%
Angry 0.4%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
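
The two AWS Rekognition face records above (age range, gender, and per-emotion confidence) match the output of the DetectFaces API when all attributes are requested. A minimal boto3 sketch, again using a hypothetical local scan:

```python
# Sketch: face attributes (age range, gender, emotions) with Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("album_page.jpg", "rb") as f:  # hypothetical scan of the album page
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions carry 0-100 confidences, as in the records above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```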

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
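
The Google Vision face entries above report likelihood buckets rather than numeric scores. A minimal sketch with the google-cloud-vision client library showing where those values come from:

```python
# Sketch: face likelihoods (surprise, anger, sorrow, joy, headwear, blur)
# with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("album_page.jpg", "rb") as f:  # hypothetical scan of the album page
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
# which is how the values above are reported.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```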

Feature analysis

Amazon

Person 98.5%
Shoe 91.2%

Categories