Human Generated Data

Title

Agnes Mongan (1905 - 1996)

Date

1933

People

Artist: Eleanor E. Randall, American 1899 - 1979

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Agnes Mongan, 1954.2

Machine Generated Data

Tags

Amazon
created on 2019-05-31

Person 98.7
Human 98.7
Art 95.2
Drawing 95.2
Sketch 87.5
Text 74.8
Photography 60.6
Photo 60.6
Painting 55

Clarifai
created on 2019-05-31

people 99
one 96.8
portrait 96.7
wear 93.7
retro 93.5
man 93.3
antique 93
adult 92.7
art 88.8
paper 88.5
old 84.8
print 84.4
vintage 82.8
pensive 80.2
woman 79.6
sepia 78.2
side view 77.4
sepia pigment 74.3
profile 71.4
painting 69

Imagga
created on 2019-05-31

mug shot 100
photograph 97
representation 96.6
creation 49.5
model 35.8
portrait 35.6
sketch 35.2
face 34.8
attractive 32.9
hair 30.1
adult 29.2
sexy 27.3
drawing 26.3
pretty 25.9
fashion 24.9
brunette 24.4
person 24.3
people 21.2
skin 20
lady 19.5
smile 17.8
pose 17.2
head 16.8
eyes 16.4
happy 16.3
expression 15.4
looking 15.2
human 15
close 14.9
style 14.8
posing 14.2
smiling 13.8
makeup 13.7
studio 13.7
youth 13.6
sensuality 13.6
male 13.5
man 13.5
body 12.8
sensual 12.7
gorgeous 12.7
black 12.6
women 11.9
cute 11.5
one 11.2
lips 11.1
healthy 10.7
book jacket 10.6
business 10.3
jacket 10.2
happiness 10.2
hand 9.9
handsome 9.8
look 9.6
professional 9.5
closeup 9.4
natural 9.4
make 9.1
health 9
lovely 8.9
glamorous 8.7
lifestyle 8.7
glamor 8.6
hairstyle 8.6
money 8.5
cosmetics 8.4
feminine 8.4
dark 8.4
brown 8.1
currency 8.1
spa 8.1
seductive 7.7
skincare 7.6
bath 7.6
enjoying 7.6
laughing 7.6
joy 7.5
clean 7.5
care 7.4
alone 7.3
confident 7.3
eye 7.2
modern 7

Google
created on 2019-05-31

Photograph 96.7
Portrait 86.7
Retro style 79.5
Forehead 78.2
Vintage clothing 76.9
Self-portrait 67.3
Drawing 63.2
Jaw 57.4
Sketch 55.9
Paper product 53.3
Art 50.2

Microsoft
created on 2019-05-31

text 99.6
sketch 99.6
drawing 99.2
human face 93.5
person 83.7
portrait 81.3
child art 79.4
woman 70.2
painting 65.8

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 17-27
Gender Female, 92.8%
Sad 26%
Happy 4.1%
Confused 13.9%
Angry 7.4%
Surprised 6.1%
Disgusted 7.1%
Calm 35.4%

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a close up of text on a black background 68%
a close up of text on a white background 67.9%
close up of text on a black background 65.7%

Text analysis

Amazon

.Randell
33
E2 .Randell 33
E2