Human Generated Data

Title

Untitled (close up portrait of woman in hat)

Date

c. 1856 - c. 1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Andrew S. Dibner, P2003.131.13351

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 86.5
Human 86.5
Hat 77.2
Clothing 77.2
Apparel 77.2

Clarifai
created on 2023-10-27

portrait 99.2
people 98
painting 96.3
art 96.2
one 96.2
man 96.1
lid 96
adult 94.8
wear 93
woman 92.7
old 92
face 89.5
two 87.5
vintage 82.9
retro 82.4
wall 80.6
historic 80.3
museum 78.6
girl 76.7
antique 76

Imagga
created on 2022-01-28

burlap 45.6
prayer rug 25.5
rug 20.3
portrait 20
face 17.7
eyes 17.2
furnishing 16.8
towel 16.1
floor cover 15.6
covering 14.8
hamper 14.7
close 13.7
people 13.4
vintage 13.2
child 13
container 12.6
basket 12.2
paper 11.8
old 11.8
person 11.8
adult 11.6
bath towel 11.3
texture 11.1
hair 11.1
linen 10.7
fashion 10.5
culture 10.2
pattern 10.2
cute 10
eye 9.8
little 9.7
art 9.4
grunge 9.4
bath linen 9.1
style 8.9
brown 8.8
man 8.7
love 8.7
money 8.5
black 8.4
pretty 8.4
fabric 8.4
retro 8.2
happy 8.1
currency 8.1
blanket 8.1
material 8
sexy 8
decoration 8
kid 8
design 7.9
smile 7.8
expression 7.7
closeup 7.4
symbol 7.4
note 7.3
lady 7.3
cotton 7.2
blond 7.2
religion 7.2

Google
created on 2022-01-28

Hair 98.4
Head 97.3
Chin 96.7
Outerwear 95.5
Hairstyle 95.1
Eyebrow 93.7
Hat 93.4
Cap 89.3
Fashion 88
Jaw 87.9
Rectangle 87.3
Sleeve 86.5
Fedora 85
Headgear 85
Art 83.5
Costume hat 83.4
Sun hat 79.5
Font 76.9
Facial hair 73.3
Vintage clothing 70.5

Microsoft
created on 2022-01-28

human face 98.2
person 93.4
old 88.5
hat 88.4
clothing 85
white 74.4
portrait 53.2

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 99.9%
Calm 88.1%
Confused 6.1%
Sad 1.7%
Surprised 1.3%
Angry 1.2%
Fear 0.7%
Disgusted 0.6%
Happy 0.3%

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person
Hat
Person 86.5%
Hat 77.2%

Categories

Imagga

paintings art 94.4%
people portraits 2.8%
food drinks 1.6%