Human Generated Data

Title

Untitled (unidentified woman, standing, three-quarter length, leaning on back of chair)

Date

1866-1900

People

Artist: Clark, American, active mid-19th to early-20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3666

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.6
Human 99.6
Face 82.6
Clothing 77.8
Apparel 77.8
Poster 74.7
Advertisement 74.7
Text 68.2
Photography 62
Photo 62
Portrait 60.4
Collage 57.7
Paper 55.3

Clarifai
created on 2023-10-29

portrait 99
art 98.2
moment 95.8
picture frame 95.5
dirty 95.3
sepia 95.1
people 94.6
collage 94.2
retro 94.1
paper 93.7
old 93.5
empty 93.4
woman 93.1
vintage 93.1
blank 91.6
man 90.7
painting 89.2
antique 88.9
museum 88.6
wear 87.6

Imagga
created on 2022-01-30

frame 43.5
grunge 30.7
blank 29.2
retro 28.7
old 27.9
border 27.1
paper 26.7
vintage 26.5
empty 25.8
aged 24.4
film 24.3
antique 23.4
texture 22.9
black 21.9
screen 21.2
book 20.6
blackboard 19.9
design 18.7
product 18.2
space 17.1
damaged 16.2
ancient 15.6
book jacket 15.5
dirty 15.4
material 15.2
textured 14.9
creation 14.8
album 14.6
background 14.3
room 13.9
instant 13.8
display 13.7
photograph 13.6
art 13.4
photography 13.3
grungy 13.3
jacket 13
computer 13
camera 12.9
edge 12.5
wall 12
parchment 11.5
office 11.3
television 11.2
people 11.2
slide 10.7
person 10.7
stains 10.7
crumpled 10.7
laptop 10.7
notebook 10.6
spot 10.6
attractive 10.5
old fashioned 10.5
home 10.4
business 10.3
brown 10.3
monitor 10.1
board 10.1
wallpaper 10
photographic 9.8
grime 9.8
lady 9.7
interior 9.7
portrait 9.7
structure 9.7
graphic 9.5
sheet 9.4
grain 9.2
wrapping 9.2
message 9.1
rough 9.1
element 9.1
backdrop 9.1
pattern 8.9
scratch 8.8
mottled 8.8
fracture 8.8
text 8.7
movie 8.7
memory 8.7
decay 8.7
adult 8.6
rusty 8.6
wood 8.3
historic 8.3
style 8.2
paint 8.2
happy 8.2
copy space 8.1
covering 8.1
decoration 8.1
card 8
highly 7.9
broad 7.9
frames 7.8
tracery 7.8
male 7.8
ragged 7.8
succulent 7.8
detailed 7.7
worn 7.6
dirt 7.6
snapshot 7.6
sign 7.5
man 7.4
template 7.3
smiling 7.2
negative 7.2
businessman 7.1
model 7
pretty 7

Microsoft
created on 2022-01-30

wall 99.4
person 97.2
text 97
clothing 96.8
man 92.4
drawing 88.7
indoor 85.2
painting 74.2
sketch 73.8
human face 60.5
gallery 55.3
room 48.4
picture frame 16.6

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 99.8%
Calm 94.3%
Sad 2.2%
Confused 1.3%
Angry 1.2%
Surprised 0.6%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.6%

Categories

Imagga

events parties 94.9%
text visuals 4.7%

Text analysis

Amazon

Clark
PITTSFIELD,
PITTSFIELD, MASS,
MASS,
2.2002.3666

Google

|Clante PITTSFIELD, MASS, 2.J002.3666
|Clante
PITTSFIELD,
MASS,
2.J002.3666