Human Generated Data

Title

Untitled (unidentified woman wearing sari, seated, right arm resting on table, left leg crossed over right)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.7

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.4
Human 97.4
Art 93.1
Painting 87.7
Drawing 62.6
Monk 59.2
Clothing 56
Apparel 56
Archaeology 55.6

Clarifai
created on 2023-10-29

portrait 99.9
people 99.3
art 99.3
wear 98.5
vintage 97.1
sepia pigment 97
man 96.9
adult 96.2
old 95.9
retro 95.8
sepia 95.7
one 95.7
painting 94.1
antique 91.9
scan 87.8
documentary 87.5
profile 85.4
veil 84.4
paper 82.7
facial hair 80.7

Imagga
created on 2022-02-26

statue 42.7
sculpture 38
book jacket 37.5
jacket 29.2
religion 28.7
art 26.9
ancient 26
wrapping 22.2
religious 20.6
history 20.6
culture 20.5
architecture 18.9
old 18.8
stone 18
god 15.3
covering 15.1
marble 15
monument 15
vintage 14.9
antique 14.7
figure 14.6
historical 14.1
temple 13.8
column 13.6
world 13.4
famous 13
church 13
historic 12.8
face 12.8
holy 12.5
carving 12.2
building 12
catholic 11.7
spiritual 11.5
travel 11.3
sand 11.1
portrait 11
head 10.9
city 10.8
museum 10.8
symbol 10.8
ruler 10.8
child 10.7
pray 10.7
spirituality 10.6
monk 10.2
man 10.1
detail 9.7
cemetery 9.6
decoration 9.2
earth 9.1
soil 9.1
tourism 9.1
landmark 9
roman 8.8
prayer 8.7
cross 8.5
person 8.4
structure 8.4
people 8.4
closeup 8.1
male 8
sepia 7.8
golden 7.7
saint 7.7
grunge 7.7
money 7.7
obelisk 7.6
human 7.5
artwork 7.3
currency 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.9
person 98.7
clothing 98.5
old 97.4
human face 95.2
book 92.9
black 90.7
vintage 85.4
photograph 84.2
white 72.7
vintage clothing 60.6
posing 56.8
woman 51.2

Color Analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 93.1%
Angry 98.2%
Sad 0.8%
Calm 0.4%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%
Confused 0.1%
Happy 0.1%

Feature analysis

Amazon

Person
Painting
Person 97.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2022-02-26

a vintage photo of a boy 81.2%
a vintage photo of a girl 81.1%
a vintage photo of a person 81%