Human Generated Data

Title

Untitled (unidentified woman seated in chair, book in lap, table to right)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.5

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.1
Human 99.1
Chair 97.1
Furniture 97.1
Text 92.3
Poster 88
Advertisement 88
Face 72.6
Sitting 61.3
Label 59.4
Alphabet 55.3

Clarifai
created on 2023-10-28

people 99.9
portrait 99.7
wear 98.6
art 98.4
adult 97.7
man 96.3
one 94.9
painting 94.8
retro 93.8
vintage 92.6
sepia pigment 92.1
two 90.8
nostalgia 90.6
print 90.3
woman 89
documentary 88.5
child 87.1
old 86.8
veil 86.3
chair 85.7

Imagga
created on 2022-02-25

book jacket 23.3
newspaper 18.1
jacket 18.1
old 18.1
antique 17.6
ancient 16.4
covering 15.8
sketch 15.6
product 15.2
military uniform 15
vintage 14.9
art 14
wrapping 13.8
portrait 13.6
person 13.1
drawing 13
face 12.8
black 12.6
statue 12.4
retro 12.3
man 12.1
culture 12
creation 11.9
clothing 11.6
uniform 11.5
sculpture 11.5
decoration 11.3
people 11.2
grunge 11.1
head 10.9
dress 10.8
religion 10.8
money 10.2
currency 9.9
history 9.8
sepia 9.7
style 9.6
historical 9.4
adult 9.2
cash 9.1
paper 9.1
fashion 9
bank 9
marble 8.7
male 8.7
golden 8.6
luxury 8.6
old fashioned 8.6
religious 8.4
design 8.4
monument 8.4
dollar 8.3
banking 8.3
historic 8.2
wealth 8.1
child 8.1
hair 7.9
model 7.8
sitting 7.7
blackboard 7.7
mask 7.7
finance 7.6
world 7.6
room 7.5
closeup 7.4
aged 7.2
detail 7.2
home 7.2
cute 7.2
representation 7.2

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.6
old 98.2
person 96.4
book 96.2
clothing 94.7
black 81.7
vintage 76.4
photograph 74
white 73.4
furniture 66.6
chair 66.2
posing 65.5

Color Analysis

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 99.9%
Calm 98.6%
Sad 0.5%
Confused 0.3%
Surprised 0.3%
Angry 0.2%
Fear 0.1%
Happy 0%
Disgusted 0%

Feature analysis

Amazon

Person
Poster
Person 99.1%

Categories

Imagga

paintings art 99.9%

Captions