Human Generated Data

Title

Two Girls

Date

1934

People

Artist: Raphael Soyer, American 1899 - 1987

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Sachs, M12316

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.4
Human 98.4
Art 97
Person 94.7
Drawing 91.7
Painting 88.3
Sketch 71.6
Text 58.9
Face 58
Sitting 57.1
Portrait 56.8
Photography 56.8
Photo 56.8

Clarifai
created on 2023-10-26

people 100
print 99.9
art 99.7
engraving 99.4
adult 99.3
vintage 99.3
portrait 99.2
man 98.8
two 98.7
etching 98.4
antique 98.2
illustration 98.1
group 97.8
retro 97.5
painting 97.3
old 95.9
wear 94.8
affection 93.9
woman 93.7
reproduction 93.7

Imagga
created on 2022-01-22

vintage 35.6
doormat 34.6
stamp 29.4
old 28.6
letter 28.4
mail 27.8
mat 27.4
paper 24.3
currency 24.2
money 23.8
postmark 23.7
postage 23.6
retro 22.1
envelope 21.4
cash 21.1
floor cover 20.8
postal 19.6
dollar 19.5
finance 19.4
aged 19
covering 18.3
art 18.3
creation 17.7
ancient 17.3
post 17.2
bill 17.1
financial 16.9
printed 16.7
grunge 16.2
business 15.8
circa 15.8
shows 15.8
texture 15.3
stencil 14.8
bank 14.8
card 14.5
close 14.3
banking 13.8
philately 12.8
stamps 12.8
global 12.8
dollars 12.6
burlap 12.2
symbol 12.1
savings 12.1
note 11.9
newspaper 11.9
product 11.9
message 11.9
representation 11.3
dirty 10.8
black 10.3
icon 10.3
economy 10.2
communication 10.1
sketch 10
one 9.7
states 9.7
exchange 9.5
antique 9.5
united 9.5
closeup 9.4
wallpaper 9.2
mosaic 9.2
frame 9.2
baby 9.1
wealth 9
drawing 8.9
material 8.9
masterpiece 8.9
banknotes 8.8
banknote 8.7
notes 8.6
communications 8.6
painted 8.6
design 8.6
culture 8.5
rich 8.4
sign 8.3
collection 8.1
museum 8
known 7.9
address 7.8
photograph 7.8
paintings 7.8
bills 7.8
portrait 7.8
hundred 7.7
cutting 7.7
finances 7.7
international 7.6
fine 7.6
mug shot 7.6
weathered 7.6
unique 7.6
china 7.5
office 7.2
history 7.2
book jacket 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

drawing 96.3
text 96.3
sketch 95.6
person 95.2
human face 94.6
woman 89.8
clothing 81.3
painting 80.7

Face analysis

AWS Rekognition

Age 16-22
Gender Female, 98%
Calm 96%
Surprised 1.1%
Sad 0.8%
Fear 0.8%
Happy 0.5%
Disgusted 0.3%
Angry 0.3%
Confused 0.1%

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%
Painting 88.3%

Text analysis

Amazon

my
RAPHAEL
RAPHAEL Sayre
Sayre
head my
head

Google

RAPHARI Seysk
RAPHARI
Seysk