Human Generated Data

Title

Study of a Woman

Date

1947

People

Artist: Caroline van Evera, American 1889-1987

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from Fogg Art Museum Archives, 1968.9

Machine Generated Data

Tags

Amazon
created on 2020-05-02

Art 99.6
Human 99.6
Drawing 99.6
Person 98.6
Sketch 96.3
Face 82.6
Doodle 72.1
Photo 63.9
Photography 63.9
Portrait 63.9

Clarifai
created on 2020-05-02

portrait 99.6
people 99.6
one 99
adult 98.1
man 96.7
wear 96.5
royalty 94.5
lid 93.5
leader 92.8
veil 92.1
engraving 90.6
administration 88.6
music 88.5
art 87.2
print 83.9
medal 80.2
woman 79.7
facial expression 78
musician 76.9
old 75.7

Imagga
created on 2020-05-02

mug shot 100
photograph 84.7
representation 66.3
creation 43.7
sculpture 31.1
statue 31
brass 29.3
money 28.9
currency 27.8
memorial 26.9
cash 24.7
ancient 23.4
knocker 23.2
dollar 23.2
art 21.5
face 21.3
bill 20.9
close 18.8
bank 18.8
finance 18.6
device 18
religion 17.9
head 17.6
old 17.4
culture 17.1
banking 16.6
wealth 16.2
paper 15.7
hundred 15.5
us 15.4
stone 15.4
structure 15.2
business 15.2
financial 15.2
one 14.9
portrait 14.9
dollars 14.5
god 14.4
history 14.3
man 14.1
franklin 13.8
architecture 13.3
antique 13
male 12.8
pay 12.5
monument 12.1
banknotes 11.8
temple 11.4
travel 11.3
religious 11.2
famous 11.2
banknote 10.7
loan 10.5
exchange 10.5
savings 10.3
rich 10.2
decoration 10.1
bust 10
marble 9.7
finances 9.6
closeup 9.4
human 9
carving 8.1
figure 8.1
roman 7.8
economic 7.8
golden 7.7
spirituality 7.7
notes 7.7
mask 7.6
economy 7.4
church 7.4
historic 7.3
object 7.3
person 7.2
market 7.1

Google
created on 2020-05-02

Drawing 82.7
Illustration 77.2
Sketch 71.2
Portrait 63.4
Jaw 63
Art 58.1
Artwork 55.2

Microsoft
created on 2020-05-02

sketch 99.8
drawing 99.5
text 99.1
human face 95.2
illustration 89.2
person 83.2
black 72.2
art 71.1
ink 69.4
clothing 66.5
white 62.5
man 58.2
cartoon 50.8
vintage 25.4

Face analysis

AWS Rekognition

Age 42-60
Gender Female, 52.1%
Angry 3.3%
Surprised 0.5%
Disgusted 0.4%
Confused 1.1%
Calm 36.5%
Fear 3.5%
Happy 0.6%
Sad 54.1%

Microsoft Cognitive Services

Age 65
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a black and white photo of a man 84.6%
a vintage photo of a man 84.3%
black and white photo of a man 78.6%

Text analysis

Amazon

VANEVRA
MM
00000eoo

Google

VVIMIN
VAN
EVERA
00
000
VVIMIN 00 000 VAN EVERA