Human Generated Data

Title

Portrait of a Man

Date

c. 1630-1635

People

Artist: Unidentified Artist

Previous attribution: Thomas de Keyser, Dutch, 1596 or 1597-1667

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of James P. Warburg, 1969.61

Machine Generated Data

Tags

Amazon
created on 2019-03-28

Painting 99.5
Art 99.5
Clothing 97.9
Apparel 97.9
Human 96.1
Person 96.1
Footwear 69.2
Boot 61.9
Photo 61
Photography 61
Face 61
Portrait 61

Clarifai
created on 2018-02-09

people 99.7
lid 99
adult 98.7
one 98.4
wear 98
painting 97
veil 96.3
man 94.9
art 94.8
weapon 94.5
walking stick 94.3
print 93.3
illustration 86.4
woman 86.4
outerwear 85.4
portrait 85.1
pants 84.3
coat 81.7
bonnet 81.7
sword 81.5

Imagga
created on 2018-02-09

crutch 71.9
staff 57.5
hat 53.7
stick 42.7
clothing 35.1
cloak 34.8
headdress 28
covering 26.7
face 24.9
fashion 24.1
portrait 22
attractive 21
person 20.8
dress 20.8
people 20.1
costume 20
poncho 19.1
adult 18.1
hair 17.4
model 17.1
clothes 16.9
old 16.7
happy 16.3
style 14.8
man 14.8
pretty 14.7
black 13.7
lady 13
culture 12.8
looking 12.8
one 12.7
smile 12.1
sexy 12
human 12
traditional 11.6
posing 11.5
garment 11.5
male 11.3
sombrero 10.8
vintage 10.7
standing 10.4
lifestyle 10.1
cute 10
child 10
brunette 9.6
smiling 9.4
dark 9.2
mysterious 8.7
mask 8.6
ethnic 8.6
expression 8.5
art 8.5
make 8.2
building 7.9
look 7.9
holiday 7.9
wall 7.7
mystery 7.7
casual 7.6
joy 7.5
outdoors 7.5
jacket 7.4
long 7.3
teenager 7.3
religion 7.2
happiness 7

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 98.1
wall 95.1
floor 91.1
standing 86.8
indoor 85.5
black 68.1
posing 35.2

Color Analysis

Face analysis

AWS Rekognition

Age 19-36
Gender Male, 99.7%
Calm 65%
Confused 8.2%
Happy 1%
Sad 13.9%
Angry 5.8%
Surprised 4.1%
Disgusted 2.1%

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.5%
Person 96.1%