Human Generated Data

Title

Untitled (portrait of a woman seated on a bench in studio)

Date

1920

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1866

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Art 89.5
Painting 77.4
Clothing 73.3
Apparel 73.3
Portrait 63.8
Photography 63.8
Face 63.8
Photo 63.8

Clarifai
created on 2023-10-25

people 99.7
adult 98.3
one 97.5
art 96.6
man 96.4
portrait 94.3
woman 93.4
leader 91.7
wear 91.3
two 88.6
antique 87.5
sit 87.2
position 87.1
retro 83.9
administration 82.5
monochrome 79.9
print 79.5
old 79
sepia pigment 75.9
paper 73.8

Imagga
created on 2021-12-14

negative 42.6
film 35.9
statue 25.6
photographic paper 24.8
book jacket 23.2
sculpture 23
jacket 22.9
art 17.3
photographic equipment 16.5
wrapping 13.7
dress 13.6
old 13.2
marble 12.9
face 12.8
model 12.4
portrait 12.3
people 12.3
black 12
attractive 11.2
head 10.9
covering 10.6
fashion 10.6
body 10.4
daily 10.4
clothing 10.3
monument 10.3
religion 9.9
travel 9.9
newspaper 9.8
posing 9.8
adult 9.8
stone 9.4
architecture 9.4
grunge 9.4
pretty 9.1
human 9
lady 8.9
building 8.9
sexy 8.8
ancient 8.6
product 8.5
historical 8.5
church 8.3
vintage 8.3
tourism 8.3
makeup 8.2
person 8.1
history 8
bride 8
catholic 7.9
design 7.9
angel 7.8
culture 7.7
elegance 7.6
healthy 7.6
figure 7.5
clothes 7.5
famous 7.4
future 7.4
tourist 7.4
creation 7.4
park 7.4
symbol 7.4
man 7.4
pose 7.2
looking 7.2

Microsoft
created on 2021-12-14

text 99.4
sketch 95.4
drawing 93
clothing 89.7
person 87.4
old 85
white 80.3
black and white 75.9
gallery 67.4
posing 66.8
painting 55.4
room 45.8
vintage 30.5
picture frame 15.2

Face analysis

AWS Rekognition

Age 36-52
Gender Male, 90.6%
Calm 59.1%
Confused 33.9%
Sad 4.8%
Surprised 1%
Happy 0.8%
Angry 0.3%
Disgusted 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

paintings art 99.8%