Human Generated Data

Title

Portrait of a Man, after Paris Bordone

Date

1904

People

Artist: Edward Waldo Forbes, American 1873 - 1969

Artist: Charles Fairfax Murray, British 1849 - 1919

Artist after: Paris Bordone, Italian 1500 - 1571

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1948.102

Machine Generated Data

Tags (confidence, 0-100)

Amazon
created on 2019-11-09

Art 96.8
Painting 96.5
Human 92.6
Person 92.6

Clarifai
created on 2019-11-09

people 100
portrait 99.9
one 99.7
adult 99.6
art 99.2
wear 98
man 97.5
painting 97.4
facial hair 95.6
mustache 91
music 90.3
side view 81.4
facial expression 80.4
musician 79.7
illustration 76.8
furniture 76.4
gown (clothing) 76.3
outerwear 75.9
writer 74.9
indoors 73.3

Imagga
created on 2019-11-09

elevator 48.5
lifting device 38.8
device 33.2
black 27.2
dark 21.7
television 21.2
man 20.1
light 16
hair 15.8
world 14.6
people 14.5
portrait 14.2
attractive 14
sexy 13.6
adult 13.6
face 13.5
telecommunication system 13.3
person 13.2
body 12.8
human 12.7
model 12.4
love 11.8
room 11.5
one 11.2
male 10.7
fashion 10.5
hands 10.4
lady 9.7
door 9.4
expression 9.4
skin 9.3
studio 9.1
posing 8.9
night 8.9
looking 8.8
home 8.8
happy 8.8
couple 8.7
garment 8.7
erotic 8.6
feather boa 8.6
sliding door 8.5
passion 8.5
child 8.4
old 8.4
window 8.3
sensual 8.2
sensuality 8.2
coat 8.2
dress 8.1
clothing 8
romance 8
wall 8
lifestyle 7.9
look 7.9
brunette 7.8
pretty 7.7
serious 7.6
house 7.5
fire 7.5
holding 7.4
lips 7.4
baby 7.4
furniture 7.1
interior 7.1

Google
created on 2019-11-09

Microsoft
created on 2019-11-09

human face 98.8
drawing 97.5
person 96.7
indoor 92.8
sketch 89.3
portrait 87.7
art 87.2
man 86.5
text 84.7
screen 70.7
picture frame 36.4
painting 22.8
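
Several labels recur across the per-service tag lists above. A minimal sketch (plain Python, with a few top scores transcribed from the lists above; the `tags` dict is illustrative, not a complete copy) of normalizing the lists and finding tags that two or more services agree on:

```python
# Confidence scores (0-100) transcribed from the tag lists above,
# abbreviated to a few representative labels per service.
tags = {
    "Amazon": {"art": 96.8, "painting": 96.5, "human": 92.6, "person": 92.6},
    "Clarifai": {"people": 100.0, "portrait": 99.9, "art": 99.2,
                 "man": 97.5, "painting": 97.4},
    "Imagga": {"man": 20.1, "people": 14.5, "portrait": 14.2},
    "Microsoft": {"human face": 98.8, "portrait": 87.7, "art": 87.2,
                  "man": 86.5, "painting": 22.8},
}

def consensus(tag_sets, min_services=2):
    """Return tags reported by at least `min_services` services,
    ordered by how many services agree, then alphabetically."""
    counts = {}
    for labels in tag_sets.values():
        for label in labels:
            counts[label] = counts.get(label, 0) + 1
    agreed = [t for t, n in counts.items() if n >= min_services]
    return sorted(agreed, key=lambda t: (-counts[t], t))

print(consensus(tags))  # ['art', 'man', 'painting', 'portrait', 'people']
```

Three of the four services label the work a painting of a man, which matches the human-generated classification above.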

Face analysis

AWS Rekognition

Age 24-38
Gender Male, 96.5%
Disgusted 23.7%
Confused 3.1%
Happy 0.1%
Angry 4.3%
Fear 1.5%
Surprised 0.5%
Sad 55.9%
Calm 10.8%
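
The emotion scores above follow the shape of Rekognition's DetectFaces response, which reports emotions as a list of Type/Confidence pairs. A minimal sketch, with the values above transcribed as sample data, of extracting the dominant emotion:

```python
# Emotion confidences (percent) as reported above, in the list-of-dicts
# shape that Rekognition's DetectFaces returns.
emotions = [
    {"Type": "DISGUSTED", "Confidence": 23.7},
    {"Type": "CONFUSED", "Confidence": 3.1},
    {"Type": "HAPPY", "Confidence": 0.1},
    {"Type": "ANGRY", "Confidence": 4.3},
    {"Type": "FEAR", "Confidence": 1.5},
    {"Type": "SURPRISED", "Confidence": 0.5},
    {"Type": "SAD", "Confidence": 55.9},
    {"Type": "CALM", "Confidence": 10.8},
]

def dominant_emotion(emotions):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(emotions))  # ('SAD', 55.9)
```

Note that the scores are per-emotion confidences, not a probability distribution, so the dominant label ("Sad" at 55.9%) is a ranking, not a certainty.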

Microsoft Cognitive Services

Age 35
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 96.5%
Person 92.6%

Captions

Microsoft

a painting of a man 81.2%
a man standing next to a painting 58.7%
a screen shot of a man 58.6%