Human Generated Data

Title

Untitled (portrait of four women wearing tags with handwriting)

Date

c. 1856 - c. 1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Andrew S. Dibner, P2003.131.13005

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Person 98.4
Human 98.4
Person 97.4
Clothing 95
Apparel 95
Person 94.2
Person 92.9
Painting 87.7
Art 87.7
People 85.8
Hat 66.4
Portrait 65.9
Face 65.9
Photography 65.9
Photo 65.9
Cabinet 58.5
Furniture 58.5
Family 55.8
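
These label/confidence pairs have the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such tags could be reproduced with boto3 (the file name and thresholds are illustrative assumptions, not part of the record):

    import boto3

    rekognition = boto3.client("rekognition")

    # "photo.jpg" stands in for a local copy of the image.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,        # cap roughly matching the list above
            MinConfidence=50.0,  # drop weak guesses
        )

    # Each label carries a name and a 0-100 confidence score, which is
    # what lines like "Person 98.4" above record.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')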

Clarifai
created on 2023-10-27

people 99.9
child 99.3
group 98.7
son 98.2
art 98
lid 97.5
two 97.1
wear 96.4
three 96.3
portrait 93.9
adult 93.9
retro 93.7
vintage 92.5
woman 91.1
four 91
veil 90.6
offspring 89.8
family 88.6
nostalgia 88.4
sepia 88.1
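
Clarifai's numbers are concept probabilities from its general image-recognition model, shown here scaled to 0-100. A hedged sketch using the clarifai-grpc client (the credential, user/app IDs, and model ID follow Clarifai's published quickstart and are assumptions):

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_CLARIFAI_PAT"),)  # placeholder credential

    with open("photo.jpg", "rb") as f:  # placeholder local copy of the image
        request = service_pb2.PostModelOutputsRequest(
            user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
            model_id="general-image-recognition",
            inputs=[resources_pb2.Input(data=resources_pb2.Data(
                image=resources_pb2.Image(base64=f.read())))],
        )

    response = stub.PostModelOutputs(request, metadata=metadata)

    # Concept values are 0-1 probabilities; scaling by 100 matches the
    # "people 99.9" style scores listed above.
    for concept in response.outputs[0].data.concepts:
        print(f"{concept.name} {concept.value * 100:.1f}")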

Imagga
created on 2022-01-24

statue 38.2
sculpture 34.1
ancient 32
old 28.6
religion 27.8
art 25.9
architecture 24.3
stone 24.2
cemetery 21.7
history 21.5
monument 20.5
culture 20.5
kin 19.5
temple 18.7
travel 18.3
historic 17.4
antique 16.6
vintage 16.5
tourism 16.5
god 16.3
famous 15.8
religious 15
world 14.4
building 14.3
historical 14.1
figure 14
carving 13.6
holy 12.5
spirituality 12.5
column 11.8
marble 11.6
face 11.4
book jacket 11.4
decoration 11.2
landmark 10.8
spiritual 10.6
church 10.2
people 9.5
man 9.4
person 9.4
structure 9.3
covering 9.2
traditional 9.2
jacket 8.8
prayer 8.7
city 8.3
close 8
mythology 7.9
museum 7.8
past 7.7
heritage 7.7
ruler 7.7
human 7.5
one 7.5
retro 7.4
tourist 7.3
portrait 7.1
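
Imagga exposes its tagger as a plain REST endpoint, so the equivalent call is a single authenticated POST. A sketch with the requests library (the key/secret pair is a placeholder; the endpoint follows Imagga's v2 API docs):

    import requests

    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials

    with open("photo.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=auth,
            files={"image": f},
        )

    # Tags come back as {"confidence": ..., "tag": {"en": ...}} records,
    # matching the "statue 38.2" style list above.
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')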

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

old 99.3
human face 95.1
clothing 94.4
text 92
person 90.1
child 86.3
boy 81.9
photograph 65.9
hat 59.7
posing 50.8
vintage 50.1
clothes 15.4
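
Microsoft's tags correspond to the Tags visual feature of Azure Computer Vision's analyze call. A sketch using the azure-cognitiveservices-vision-computervision SDK (endpoint and key are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    with open("photo.jpg", "rb") as f:
        analysis = client.analyze_image_in_stream(
            f, visual_features=[VisualFeatureTypes.tags])

    # Confidence is 0-1; scale to 0-100 to match the "old 99.3" style list above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")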

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 60.1%
Calm 95.6%
Sad 1.8%
Confused 1.1%
Surprised 0.5%
Happy 0.4%
Angry 0.2%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Calm 99.8%
Confused 0.1%
Sad 0%
Surprised 0%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 100%
Calm 88.8%
Confused 4.3%
Sad 3.7%
Angry 1.1%
Surprised 0.8%
Happy 0.5%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 33-41
Gender Male, 89.2%
Confused 90.1%
Calm 5.8%
Angry 2%
Sad 0.7%
Surprised 0.5%
Disgusted 0.4%
Happy 0.4%
Fear 0.2%
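
The four age/gender/emotion blocks above have the shape of AWS Rekognition's DetectFaces output with all attributes enabled. A sketch of how such per-face readings are produced with boto3 (the file name is a placeholder):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]   # e.g. {"Low": 18, "High": 26}
        gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 99.9}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are reported per face; print highest confidence first.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')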

Microsoft Cognitive Services

Age 47
Gender Female

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 19
Gender Female
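
The single-number age and gender estimates above match what Microsoft's legacy Face API returned when those attributes were requested (access to them has since been restricted, so this is a historical sketch using the azure-cognitiveservices-vision-face SDK; endpoint and key are placeholders):

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    face_client = FaceClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    with open("photo.jpg", "rb") as f:
        faces = face_client.face.detect_with_stream(
            f, return_face_attributes=["age", "gender"])

    for face in faces:
        attrs = face.face_attributes
        print(f"Age {attrs.age:.0f}")
        print(f"Gender {str(attrs.gender).capitalize()}")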

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
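
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the four blocks above read "Very unlikely" or "Likely". A sketch with the google-cloud-vision client (the file name is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum value, not a numeric score.
    for face in response.face_annotations:
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())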

Feature analysis

Amazon

Person 98.4%
Person 97.4%
Person 94.2%
Person 92.9%
Painting 87.7%

Categories