Human Generated Data

Title

Virgin and Child

Date

c. 1520

People

Artist: Bernardino Fasolo, Italian 1489 - after 1526

Previous attribution: Bartolomeo Veneto, Italian 1502 - 1546

Previous attribution: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Samuel H. Kress Foundation, 1962.161

Machine Generated Data

Tags

Amazon
created on 2021-11-11

Painting 98.6
Art 98.6
Person 89.7
Human 89.7

Imagga
created on 2021-11-11

brother 38.3
mother 37.5
sexy 30.5
attractive 29.4
parent 28.8
hair 28.5
black 28.3
adult 27.2
people 26.8
model 24.9
portrait 24.6
child 24.5
stethoscope 24
male 24
pretty 23.8
person 23.6
love 22.9
lady 21.9
fashion 21.1
medical instrument 20.2
body 19.2
sensual 19.1
sitting 18.9
erotic 18.6
happy 17.6
sensuality 17.3
passion 16.9
cute 16.5
human 15.8
posing 15.1
dark 15
brunette 14.8
man 14.8
studio 14.4
son 14.4
skin 14.4
women 14.2
family 14.2
wife 14.2
instrument 14.2
face 14.2
couple 14
one 13.4
smile 12.8
gorgeous 12.7
device 12.6
happiness 12.5
smiling 12.3
looking 12
elegance 11.8
dress 11.8
handsome 11.6
sofa 11.6
jeans 11.5
together 11.4
lingerie 11.3
father 11.2
style 11.1
lips 11.1
sexual 10.6
seductive 10.5
fun 10.5
youth 10.2
makeup 10.1
blond 10.1
girls 10
husband 9.7
couch 9.7
bed 9.5
eyes 9.5
pleasure 9.4
relaxing 9.1
cheerful 8.9
hands 8.7
glamor 8.6
chair 8.5
legs 8.5
relax 8.4
make 8.2
home 8
glamorous 7.7
baby 7.7
expression 7.7
loving 7.6
serious 7.6
casual 7.6
room 7.5
feminine 7.5
pose 7.2
lifestyle 7.2
kid 7.1
interior 7.1

Google
created on 2021-11-11

Microsoft
created on 2021-11-11

painting 99.1
drawing 94.9
art 91.5
human face 90.5
baby 90.2
text 84.6
person 80
sketch 75.8
cartoon 71.1

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 99%
Calm 57.7%
Sad 40.9%
Fear 0.5%
Angry 0.3%
Happy 0.2%
Surprised 0.2%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 35-51
Gender Male, 55.9%
Calm 93.4%
Sad 5.4%
Confused 0.4%
Happy 0.2%
Fear 0.2%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%

Microsoft Cognitive Services

Age 25
Gender Female

Feature analysis

Amazon

Painting 98.6%
Person 89.7%

Captions

Microsoft

a person sitting next to a stuffed animal 41.4%
a person holding a stuffed animal 39.1%
a person standing next to a stuffed animal 39%