Human Generated Data

Title

Untitled (proof print: older woman with eyeglasses)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1138

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.5
Human 98.5
Text 82.4
Face 80.6
Advertisement 74.5
Art 66.4
Female 65.3
Clothing 63.6
Apparel 63.6
Finger 56.9
Poster 53.6
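
The name/score pairs above are typical object-label output. A minimal sketch of how such tags could be produced with AWS Rekognition's DetectLabels call via boto3; the file name and thresholds here are illustrative assumptions, not part of this record:

import boto3

# Hypothetical local copy of the photograph; the museum's actual
# image source is not part of this record.
with open("proof_print.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")

# DetectLabels returns label names with confidence scores on a
# 0-100 scale, matching the pairs listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")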

Clarifai
created on 2023-10-26

people 99.9
portrait 99.9
monochrome 98.7
vintage 98.1
one 97.7
adult 96.8
old 96.6
book series 96.2
retro 96.1
woman 95.8
art 95.5
writer 94.2
facial expression 94
wear 93.7
music 93.3
poet 92.9
antique 92.7
man 92.2
sepia 90.3
book bindings 90.1
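
Clarifai exposes a similar concept-prediction call. A hedged sketch against its v2 REST API; the model name, API key, and image URL are assumptions for illustration:

import requests

# Hypothetical model name, key, and image URL.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/proof_print.jpg"}}}]},
)
resp.raise_for_status()

# Clarifai reports concept confidences on a 0-1 scale; the list
# above shows them as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")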

Imagga
created on 2022-01-22

world 32.7
man 19.6
portrait 18.8
child 16.8
male 16.6
black 15.8
cemetery 15.3
cash 14.6
money 14.5
currency 14.4
antique 13.9
statue 13.6
vintage 13.2
religion 12.6
kin 12.5
old 12.5
sculpture 12.4
people 12.3
banking 12
memorial 11.6
person 11.5
bill 11.4
face 11.4
dollar 11.1
finance 11
book jacket 10.9
bank 10.8
financial 10.7
jacket 10.4
ancient 10.4
savings 10.3
head 10.1
wealth 9.9
exchange 9.5
love 9.5
culture 9.4
religious 9.4
stone 9.1
business 9.1
pretty 9.1
attractive 9.1
art 9.1
mother 9
family 8.9
sexy 8.8
creation 8.8
sibling 8.8
newspaper 8.7
adult 8.4
economy 8.3
product 8.3
fashion 8.3
human 8.3
one 8.2
body 8
hair 7.9
gravestone 7.9
banknotes 7.8
sepia 7.8
bills 7.8
eyes 7.7
men 7.7
expression 7.7
bride 7.7
youth 7.7
dark 7.5
decoration 7.5
blackboard 7.5
paper 7.4
closeup 7.4
detail 7.2
history 7.2
smile 7.1
market 7.1
model 7
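
Imagga's tagging endpoint returns comparable word/score pairs. A sketch under the assumption of its v2 REST API, with placeholder credentials and image URL:

import requests

# Placeholder credentials and image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/proof_print.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Imagga scores tags on a 0-100 scale, like the pairs above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")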

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.9
human face 98.3
person 96.9
clothing 94.2
book 86.8
woman 78.6
poster 73.2
old 51.1
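
Microsoft's tags likely come from the Computer Vision image-tagging operation. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

import requests

# Placeholder Azure resource endpoint, key, and image URL.
resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/proof_print.jpg"},
)
resp.raise_for_status()

# Confidences come back on a 0-1 scale; the list above shows percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")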

Color Analysis

Face analysis

AWS Rekognition

Age 64-74
Gender Female, 99.9%
Calm 98.4%
Sad 0.6%
Angry 0.3%
Surprised 0.2%
Fear 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
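
The age range, gender, and emotion scores above match the shape of Rekognition's DetectFaces output. A minimal boto3 sketch; the file name is an illustrative assumption:

import boto3

with open("proof_print.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotions,
# the fields shown above.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")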

Microsoft Cognitive Services

Age 58
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely
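
Google Vision reports face attributes as graded likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the rows above read "Very unlikely" and "Possible". A sketch with the google-cloud-vision client; the image URI is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder image URI; the client also accepts raw bytes.
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/proof_print.jpg"))
response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)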

Feature analysis

Amazon

Person 98.5%
Poster 53.6%

Categories

Imagga

paintings art 99.7%

Captions

Microsoft
created on 2022-01-22

an old photo of a person 50.2%
an old photo of a person 45.9%
old photo of a person 44.6%
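
The ranked captions above look like output of the Computer Vision describe operation, which returns several candidate captions with confidences. A sketch against the v3.2 REST endpoint; endpoint, key, and image URL are placeholders:

import requests

# Placeholder Azure resource endpoint, key, and image URL.
resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/proof_print.jpg"},
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")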

Text analysis

Amazon

of
this
make
like
me
Please
make Is like me this this hert of this justine Please
not
loce
not loce cate
Is
justine
hert
cate
Unfurned
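
The word list above, plus the single long line, matches Rekognition's DetectText output: WORD detections for individual tokens and a LINE detection for the full transcription attempt (here, uncertain handwriting). A minimal boto3 sketch:

import boto3

with open("proof_print.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")
response = client.detect_text(Image={"Bytes": image_bytes})

# Each detection is typed LINE or WORD; handwritten text often
# comes back garbled, as in the transcription above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])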

Google

ut lace pao Uufuahed make lme thig f this pictisi
ut
lace
pao
Uufuahed
make
lme
thig
f
this
pictisi
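
Google's output has the same shape: the first annotation is the full transcription, followed by one annotation per word. A sketch with google-cloud-vision; the image URI is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder image URI.
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/proof_print.jpg"))
response = client.text_detection(image=image)

# text_annotations[0] is the full block; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)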