Human Generated Data

Title

Eakers

Date

1904

People

Artist: Robert Henri, American, 1865-1929

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Louise E. Bettens Fund, 1948.85

Machine Generated Data

Tags

Each tag below is paired with the service's confidence score, shown as a percentage.

Amazon
created on 2020-04-24

Art 95.4
Painting 93.5
Person 86.4
Human 86.4
Portrait 61.7
Face 61.7
Photo 61.7
Photography 61.7
Text 60.7
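
The Amazon tags are the kind of output AWS Rekognition's DetectLabels operation returns, with each label carrying a confidence score on a 0-100 scale. A minimal sketch using boto3 follows; the image file, region, and confidence threshold are placeholders, not details recorded with this object.

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# The image path, region, and MinConfidence value are illustrative.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("painting.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # only keep labels scored at 60% or higher
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```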

Clarifai
created on 2020-04-24

people 99.9
portrait 99.9
art 99.5
one 99.3
engraving 99
print 99
adult 98.2
man 97.7
leader 96.9
painting 96.7
administration 94.4
wear 94.3
vintage 92.9
antique 90.9
old 90.4
poet 90.3
music 90.2
writer 90.1
costume 88
politician 87.8

Imagga
created on 2020-04-24

person 41.8
portrait 33
man 28.9
male 28.5
adult 27.8
fashion 26.4
people 25.7
adolescent 23.9
face 23.4
hair 22.2
model 20.2
juvenile 19.5
attractive 18.9
black 18.4
jacket 18.2
expression 17.9
one 17.2
human 16.5
book jacket 16
dress 15.4
old 14.6
art 13.7
studio 13.7
looking 13.6
sexy 12.8
style 11.9
statue 11.8
pose 11.8
handsome 11.6
culture 11.1
church 11.1
love 11
lifestyle 10.8
vintage 10.8
posing 10.7
holy 10.6
lady 10.5
god 10.5
serious 10.5
guy 10.5
couple 10.5
antique 10.4
men 10.3
youth 10.2
sensuality 10
religion 9.9
mother 9.7
boy 9.7
crazy 9.7
ancient 9.5
wrapping 9.5
suit 9.5
wall 9.4
costume 9.4
cute 9.3
head 9.2
sculpture 8.8
brunette 8.7
standing 8.7
covering 8.5
religious 8.4
dark 8.3
cadaver 8.2
alone 8.2
happy 8.1
history 8
eye 8
cool 8
eyes 7.7
sitting 7.7
prayer 7.7
pretty 7.7
spiritual 7.7
clothing 7.6
hand 7.6
professional 7.6
city 7.5
historic 7.3
sensual 7.3
make 7.3
beard 7.2
body 7.2
businessman 7.1
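
The Imagga tags follow the same tag-plus-confidence pattern. A hedged sketch against Imagga's v2 tagging endpoint is below; the credentials and image URL are placeholders, and the response layout is assumed to match Imagga's documented JSON (a result.tags list pairing a confidence with an English tag name).

```python
# Minimal sketch: image tagging with Imagga's v2 REST API via requests.
# The API key/secret and image URL are placeholders.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/painting.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs a confidence score (0-100) with a localized tag name.
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```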

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 100
drawing 99.9
sketch 99.8
painting 99.7
book 99.7
person 96
child art 94.3
old 93.4
art 87.7
human face 81.9
portrait 73.6
black 69.1
posing 37.7
vintage 31.3
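
The Microsoft tags resemble the output of Azure Computer Vision's image tagging. A minimal sketch is shown below, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the SDK reports confidence on a 0-1 scale rather than the percentages listed above.

```python
# Minimal sketch: image tagging with Azure Computer Vision.
# Endpoint, subscription key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

result = client.tag_image("https://example.org/painting.jpg")
for tag in result.tags:
    # Scale the 0-1 confidence to a percentage to match the listing above.
    print(tag.name, round(tag.confidence * 100, 1))
```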

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 92.1%
Disgusted 0.4%
Confused 1%
Fear 1%
Calm 21%
Happy 0.4%
Surprised 0.4%
Sad 74.3%
Angry 1.5%
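
The age range, gender, and emotion scores above are the attributes AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal boto3 sketch follows; the image path and region are placeholders.

```python
# Minimal sketch: face analysis with AWS Rekognition via boto3.
# Attributes=["ALL"] requests the age range, gender, and emotion scores.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("painting.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```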

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
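
The Google Vision values are likelihood ratings rather than numeric scores; its face detection feature rates each attribute on a scale from "Very unlikely" to "Very likely". A minimal sketch with the google-cloud-vision client library is below, assuming a recent (2.x) release; the image path is a placeholder.

```python
# Minimal sketch: face detection with the Google Cloud Vision client library.
# Likelihoods come back as enum values, mapped here to readable labels.
from google.cloud import vision

LIKELIHOODS = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
)

client = vision.ImageAnnotatorClient()

with open("painting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOODS[face.surprise_likelihood])
    print("Anger", LIKELIHOODS[face.anger_likelihood])
    print("Sorrow", LIKELIHOODS[face.sorrow_likelihood])
    print("Joy", LIKELIHOODS[face.joy_likelihood])
    print("Headwear", LIKELIHOODS[face.headwear_likelihood])
    print("Blurred", LIKELIHOODS[face.blurred_likelihood])
```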

Feature analysis

Amazon

Painting 93.5%
Person 86.4%

Categories

Imagga

paintings art 65.9%
people portraits 33%

Captions