Human Generated Data

Title

Study of a Man

Date

19th-20th century

People

Artist: Thomas Cowperthwait Eakins, American, 1844 - 1916

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Marian H. Phinney Fund, 1975.93

Machine Generated Data

Tags

Amazon
created on 2020-05-01

Art 97.6
Face 97.1
Human 97.1
Drawing 96.9
Person 96.4
Sketch 87.2
Head 81
Painting 73.5
Text 71.7
Photography 67
Portrait 67
Photo 67
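
The labels above follow the shape of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming locally configured AWS credentials and a placeholder image file for this drawing:

    import boto3

    # Assumptions: AWS credentials/region are configured locally and the drawing
    # has been exported to a local JPEG; the file name is a placeholder.
    rekognition = boto3.client("rekognition")

    with open("study_of_a_man.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=60,
    )

    # Each label carries a name and a 0-100 confidence, matching the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))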

Clarifai
created on 2020-05-01

portrait 99.9
people 99.8
one 99.5
adult 98.7
man 97.4
monochrome 96.8
paper 96.5
art 96.1
print 95.8
wear 95.7
old 94
sepia pigment 91.6
antique 89.8
administration 89.7
vintage 89.4
leader 86.8
engraving 85.9
two 83.8
black and white 82.1
retro 81.8
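
The Clarifai concepts resemble the output of Clarifai's v2 predict endpoint. A rough sketch over plain HTTP; the API key, model ID, and image URL below are placeholders, not values from this record:

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"              # placeholder credential
    MODEL_ID = "general-image-recognition"         # assumed general tagging model
    IMAGE_URL = "https://example.org/1975.93.jpg"  # placeholder image URL

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts come back with a 0-1 value; scale to match the percentages above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))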

Imagga
created on 2020-05-01

mug shot 67.7
photograph 52.6
representation 41.8
beard 38.3
portrait 36.9
face 29.9
man 28.9
hair 28.6
creation 27.4
adult 26.5
male 26.3
attractive 25.9
person 25.1
sexy 24.9
skin 24.4
body 23.2
people 20.7
model 20.2
pretty 18.2
eyes 18.1
human 18
world 17
love 16.6
guy 16.5
expression 16.2
handsome 15.2
emotion 14.8
spa 14.4
health 13.9
black 13.8
lady 13.8
fashion 13.6
lifestyle 13
sensual 12.7
one 12.7
wet 12.5
bath 12.4
smile 12.1
men 12
water 12
looking 12
happy 11.9
relaxation 11.7
bathroom 11.5
adolescent 11.4
hands 11.3
head 10.9
close 10.9
cute 10.8
healthy 10.7
care 10.7
erotic 10.5
look 10.5
serious 10.5
couple 10.5
clean 10
shower 10
posing 9.8
old 9.8
boy 8.7
naked 8.7
casual 8.5
juvenile 8.5
relax 8.4
treatment 8.3
makeup 8.2
lovely 8
women 7.9
depression 7.8
grandma 7.7
youth 7.7
facial 7.7
muscular 7.7
nose 7.6
hand 7.6
joy 7.5
feminine 7.5
mature 7.4
closeup 7.4
lips 7.4
wellness 7.3
gorgeous 7.3
gray 7.2
fresh 7.2
eye 7.2
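
Imagga tags of this shape are typically retrieved from its /v2/tags endpoint with HTTP Basic authentication. A sketch, assuming an Imagga key/secret pair and a placeholder image URL:

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"                    # placeholder credentials
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/1975.93.jpg"  # placeholder image URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each tag has an English label and a 0-100 confidence score.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))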

Google
created on 2020-05-01

Microsoft
created on 2020-05-01

sketch 99.8
drawing 99.7
text 99.3
human face 95
painting 89.4
person 88.4
man 86.3
portrait 78.9
art 77.6
child art 75.5
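
The Microsoft tags match the shape of Azure Computer Vision's image analysis. A sketch against the v3.1 REST API; the endpoint, subscription key, and file name are placeholders:

    import requests

    AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    with open("study_of_a_man.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.1/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": AZURE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # Tags carry a 0-1 confidence; scale to percentages to match the list above.
    for tag in response.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))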

Color Analysis

Face analysis

AWS Rekognition

Age 39-57
Gender Male, 98.8%
Calm 5.8%
Angry 10.8%
Disgusted 0.3%
Confused 81.3%
Sad 1.4%
Fear 0.1%
Surprised 0.3%
Happy 0%
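
The age range, gender, and emotion scores above follow the shape of AWS Rekognition's DetectFaces output. A minimal boto3 sketch, again assuming local credentials and a placeholder image file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("study_of_a_man.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')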

Microsoft Cognitive Services

Age 32
Gender Male
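
One way to get an age and gender estimate of this shape from Microsoft Cognitive Services is the Face API's detect call (an assumption about which service produced the values above). A sketch; the endpoint, key, and file name are placeholders:

    import requests

    FACE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    FACE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    with open("study_of_a_man.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": FACE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    for face in response.json():
        attrs = face["faceAttributes"]
        print("Age", round(attrs["age"]), "Gender", attrs["gender"].title())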

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
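
Google Vision reports face attributes as likelihood buckets rather than numeric scores, which is why the values above read "Very unlikely". A sketch with the google-cloud-vision client library, assuming application default credentials and a placeholder image file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("study_of_a_man.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood value (VERY_UNLIKELY .. VERY_LIKELY).
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood)
        print("Anger", face.anger_likelihood)
        print("Sorrow", face.sorrow_likelihood)
        print("Joy", face.joy_likelihood)
        print("Headwear", face.headwear_likelihood)
        print("Blurred", face.blurred_likelihood)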

Feature analysis

Amazon

Person 96.4%

Categories

Imagga

pets animals 79.1%
paintings art 20.4%

Captions