Human Generated Data

Title

Untitled (young woman holding a closed book, seated, three-quarter view)

Date

c. 1860

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs; Kate, Maurice R. and Melvin R. Seiden Purchase Fund for Photographs; Discretionary Fund for the Photograph Department; Robert M. Sedgwick II Fund, P1997.37.8

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2022-01-29

Person 98.3
Human 98.3
Art 98.1
Painting 92.1
Portrait 67.6
Photography 67.6
Face 67.6
Photo 67.6
Drawing 56
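
A minimal sketch of how label lists like the Amazon tags above could be produced, assuming AWS Rekognition's DetectLabels API via boto3 (the image file name and thresholds are assumptions, not part of this record):

```python
import boto3


def detect_labels(image_path: str, min_confidence: float = 50.0):
    """Return (name, confidence) pairs for labels detected in a local image."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=min_confidence,
        )
    # Each label carries a Name and a Confidence score in percent,
    # matching the "Person 98.3", "Art 98.1", ... format listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]


if __name__ == "__main__":
    for name, confidence in detect_labels("P1997_37_8.jpg"):  # hypothetical file
        print(f"{name} {confidence:.1f}")
```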

Clarifai
created on 2023-10-27

people 99.7
portrait 99.5
art 98.6
one 98.3
painting 96.3
man 96.1
adult 95.7
elderly 93.2
sit 92.5
wear 89.5
woman 88.5
room 85.1
two 83.7
seat 81.7
leader 81.5
furniture 81.4
indoors 79.6
old 78
family 75.8
book bindings 71.9

Imagga
created on 2022-01-29

armchair 24
man 18.1
male 17.1
person 17.1
portrait 14.9
old 14.6
ancient 13.8
black 13.8
cell 13.7
adult 12.9
people 12.8
vintage 10.7
light 10.7
cemetery 10.5
antique 10.4
suit 10.4
business 10.3
art 9.8
one 9.7
grunge 9.4
model 9.3
face 9.2
fashion 9
handsome 8.9
sculpture 8.7
statue 8.6
dark 8.3
room 8.3
style 8.2
child 8.1
happy 8.1
aged 8.1
dress 8.1
religion 8.1
boy 7.8
attractive 7.7
culture 7.7
stone 7.5
historic 7.3
prison 7.3
sexy 7.2
history 7.2
hair 7.1
family 7.1
design 7.1
work 7.1
paper 7.1
architecture 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

wall 98.9
drawing 96.7
person 96.1
sketch 95.2
human face 93.3
clothing 93.1
text 92.5
man 91.5
indoor 86.2
old 72.6
painting 70.7
portrait 58.1

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 97.7%
Confused 97.1%
Calm 1.4%
Sad 0.6%
Angry 0.3%
Surprised 0.3%
Fear 0.2%
Disgusted 0.1%
Happy 0.1%
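
A minimal sketch of how the age, gender, and emotion estimates above could be obtained, assuming AWS Rekognition's DetectFaces API with all facial attributes requested (the image file name is an assumption):

```python
import boto3


def analyze_faces(image_path: str):
    """Print age range, gender, and emotion confidences for each detected face."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")


if __name__ == "__main__":
    analyze_faces("P1997_37_8.jpg")  # hypothetical file
```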

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
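
A minimal sketch of how the likelihood ratings above could be obtained, assuming the Google Cloud Vision face detection endpoint via the google-cloud-vision client (the image file name is an assumption):

```python
from google.cloud import vision


def face_likelihoods(image_path: str):
    """Print Likelihood ratings (e.g. VERY_UNLIKELY) for each detected face."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum value, as reported above.
        print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger:", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy:", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)


if __name__ == "__main__":
    face_likelihoods("P1997_37_8.jpg")  # hypothetical file
```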

Feature analysis

Amazon

Person 98.3%
Painting 92.1%

Categories