Human Generated Data

Title

Virgin and Child and Saint Jerome

Date

16th century

People

Artist: Polidoro da Lanciano, Italian c. 1515 - 1565

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Randall Fund and several individual gifts, 1911.1

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Art 98.3
Painting 97.2
Human 96.5
Person 96.5
Person 82.4

Clarifai
created on 2020-04-24

people 99.9
adult 99.1
two 99.1
group 98.1
print 97.1
furniture 96.8
woman 96.6
portrait 96.5
three 95.9
art 95.7
offspring 95.6
sit 94.5
child 94.1
wear 93.8
administration 93.3
one 92.8
leader 92.7
family 91.0
man 91.0
painting 90.3

Imagga
created on 2020-04-24

sitting 33.5
adult 31.8
person 26.7
people 26.2
happy 21.3
attractive 21
lifestyle 21
newspaper 20.4
smiling 19.5
portrait 19.4
smile 19.2
pretty 18.2
product 16.8
man 16.8
outdoors 15.7
happiness 15.7
face 15.6
model 15.6
lady 15.4
dress 15.4
hair 15.1
love 15
wedding 14.7
looking 14.4
bride 14.4
mother 13.8
sexy 13.6
relaxing 13.6
fashion 13.6
scholar 13.2
creation 13.1
couple 13.1
women 11.9
male 11.5
one 11.2
youth 11.1
casual 11
relaxation 10.9
intellectual 10.5
parent 10.2
leisure 10
cheerful 9.8
together 9.6
married 9.6
marriage 9.5
sit 9.5
sensuality 9.1
computer 9
religion 9
clothing 9
interior 8.8
home 8.8
brunette 8.7
luxury 8.6
bouquet 8.5
two 8.5
relax 8.4
outdoor 8.4
mature 8.4
holding 8.3
human 8.2
sculpture 8.2
covering 8
art 8
laptop 7.9
cute 7.9
look 7.9
child 7.7
couch 7.7
summer 7.7
expression 7.7
jacket 7.7
health 7.6
book 7.6
wife 7.6
joy 7.5
senior 7.5
student 7.4
sofa 7.4
blond 7.4
romance 7.1
romantic 7.1
work 7.1
day 7.1
indoors 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 99.9
book 99.1
painting 98.8
drawing 98.1
sketch 94.4
outdoor 88.2
person 85.2
human face 83.3
woman 76.8
clothing 66.0
old 42.0

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 92.9%
Sad 1.8%
Confused 0.2%
Disgusted 0%
Happy 0.1%
Fear 0%
Calm 97.8%
Surprised 0%
Angry 0.1%

AWS Rekognition

Age 49-67
Gender Male, 90.5%
Happy 0.8%
Fear 0.3%
Calm 9.1%
Disgusted 0.1%
Confused 0.2%
Surprised 0.1%
Angry 0.2%
Sad 89.3%

AWS Rekognition

Age 1-7
Gender Female, 61.5%
Confused 1.7%
Sad 2.2%
Calm 10.4%
Fear 0.3%
Happy 0.1%
Disgusted 3%
Angry 81.7%
Surprised 0.6%
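Each face block above lists per-emotion confidences that sum to roughly 100%. A minimal sketch of how such scores could be reduced to a single dominant label (the dict holds the values reported for the first face, ages 12-22; the helper name `dominant_emotion` is hypothetical, not part of any listed service's API):

```python
# Emotion confidences (percent) as listed for the first detected face.
face_emotions = {
    "Calm": 97.8, "Sad": 1.8, "Confused": 0.2, "Happy": 0.1,
    "Angry": 0.1, "Surprised": 0.0, "Fear": 0.0, "Disgusted": 0.0,
}

def dominant_emotion(scores):
    """Return the (label, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_emotions))  # → ('Calm', 97.8)
```

The same reduction applied to the second and third faces would yield "Sad" (89.3%) and "Angry" (81.7%) respectively.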

Feature analysis

Amazon

Painting 97.2%
Person 96.5%

Captions

Microsoft

a group of people looking at a book 62.6%
a group of people sitting on a bench 62.5%
a group of people sitting on a book 54.8%