Human Generated Data

Title

Untitled (woman in striped dress holding ornate book, seated, half-length)

Date

c.1856 - c.1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3856

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Human 93.4
Person 93.1
Art 89.7
Clothing 73.1
Apparel 73.1
Portrait 66.1
Photography 66.1
Face 66.1
Photo 66.1
Female 63
Painting 58.4
Drawing 55.2

Clarifai
created on 2021-04-03

people 99.2
portrait 99.1
art 98.8
one 96.3
adult 95.9
man 95.5
wear 93
painting 88.2
retro 87.2
illustration 81.7
vintage 80.3
woman 80.1
son 80
old 77.8
music 77
gown (clothing) 76.6
leader 76.3
print 75.2
street 74.7
lid 74.4

Imagga
created on 2021-04-03

statue 33.3
sculpture 29.8
art 21.6
religion 21.5
portrait 21.4
face 18.5
person 18
model 17.1
religious 16.9
black 16.6
comic book 16.6
sexy 16.1
ancient 15.6
god 15.3
old 14.6
decoration 14.4
adult 14.2
antique 13.9
fashion 13.6
hair 13.5
male 12.9
people 12.8
culture 12.8
man 12.1
attractive 11.9
tattoo 11.9
stone 11
figure 10.9
vintage 10.8
history 10.7
marble 10.7
pretty 10.5
style 10.4
historical 10.4
architecture 10.3
church 10.2
head 10.1
book jacket 10.1
makeup 10.1
dark 10
jacket 9.8
catholic 9.7
pray 9.7
detail 9.7
design 9.4
world 9.1
comedian 9.1
dress 9
body 8.8
mask 8.7
brunette 8.7
spiritual 8.6
golden 8.6
covering 8.4
historic 8.3
human 8.3
carving 8.2
make 8.2
eyes 7.7
prayer 7.7
holy 7.7
sketch 7.6
performer 7.6
representation 7.6
traditional 7.5
monument 7.5
famous 7.4
symbol 7.4
posing 7.1

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

sketch 98.1
wall 97.9
drawing 97
human face 93.6
text 92.5
person 91.7
cartoon 74.7
clothing 66.3
retro 60.2
painting 51.2
old 42.7

Color Analysis

Face analysis

AWS Rekognition

Age 35-51
Gender Female, 92.3%
Calm 97.9%
Sad 0.8%
Angry 0.8%
Fear 0.1%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0.1%

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.1%

Captions

Microsoft
created on 2021-04-03

an old photo of a man 71.7%
old photo of a man 66.9%
a old photo of a man 66.6%