Human Generated Data

Title

Untitled (unidentified woman, seated, right arm resting on table)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.24

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.6
Human 99.6
Soil 86
Art 83.3
Painting 81.7
Archaeology 81.4
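
The label tags above are the kind of per-label confidence scores returned by an image-labeling API. As a minimal sketch only (not the project's actual pipeline; the local file name, region, and thresholds are assumptions), similar tags could be generated with AWS Rekognition's DetectLabels operation via boto3:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph (file name is an assumption).
with open("P1982_329_24.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of returned labels
        MinConfidence=80.0,  # drop labels below 80% confidence
    )

# Print "Label confidence" pairs, matching the format of the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')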

Clarifai
created on 2023-10-28

people 99.9
portrait 99.7
art 99.3
adult 99
wear 98.9
one 97.8
man 97.5
painting 96.2
veil 95
two 94.8
print 94.1
woman 92.6
sepia pigment 91.2
military 87.3
retro 86.8
facial hair 86.2
vintage 86.1
elderly 86
furniture 84.8
documentary 83.9

Imagga
created on 2022-02-25

statue 45.3
book jacket 43.5
jacket 34.8
sculpture 34.2
ancient 26.8
wrapping 25.7
art 25.1
religion 22.4
old 20.2
stone 19.1
architecture 18.8
religious 18.7
covering 18.5
history 17.9
antique 17.5
monument 16.8
marble 16.7
culture 16.2
sax 15.5
god 15.3
detail 15.3
ruler 15.1
catholic 14.6
historical 14.1
travel 14.1
roman 13.8
product 12.4
vintage 12.4
figure 12.4
newspaper 12.4
symbol 12.1
famous 12.1
historic 11.9
city 11.7
world 11.5
carving 11.4
face 11.4
building 11.1
church 11.1
column 11
holy 10.6
saint 10.6
portrait 10.4
closeup 10.1
tourism 9.9
creation 9.8
carved 9.8
decoration 9.5
money 9.4
person 8.7
man 8.7
pray 8.7
paper 8.6
dollar 8.4
landmark 8.1
currency 8.1
monk 8
museum 7.9
angel 7.8
spiritual 7.7
spirituality 7.7
outdoor 7.7
sign 7.5
design 7.4
aged 7.2
sky 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.8
book 98.8
person 98.7
clothing 98.4
old 97.6
man 92
white 85.9
black 76.5
vintage 74.7
photograph 72.3
posing 54.5

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 94.8%
Angry 70%
Calm 24.7%
Confused 2.3%
Sad 1.5%
Surprised 0.5%
Disgusted 0.5%
Fear 0.4%
Happy 0.1%
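
As a minimal sketch only (again with a hypothetical file name and default region, and not a claim about how the numbers above were actually produced), age, gender, and emotion estimates of this kind are returned by AWS Rekognition's DetectFaces operation when all facial attributes are requested:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph (file name is an assumption).
with open("P1982_329_24.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back as upper-case types, each with a confidence score.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')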

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.6%

Categories

Imagga

paintings art 99%