Human Generated Data

Title

Andrea di Odoni

Date

17th century

People

Artist: Cornelis Visscher, Dutch c. 1629 - 1658

Artist after: Lorenzo Lotto, Italian 1480 - 1556/1557

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R4747

Machine Generated Data

Tags

Amazon
created on 2019-10-30

Art 99.2
Painting 99.2
Human 96.7
Person 96.7
Person 92.3
Person 89.8
Person 78.3
Portrait 58.1
Photo 58.1
Face 58.1
Photography 58.1

Clarifai
created on 2019-10-30

people 100
art 99.7
adult 99.6
print 99.6
portrait 99.4
facial hair 99.3
engraving 98.8
painting 98.5
man 98.4
group 97.6
one 97.4
leader 96.9
illustration 95.3
two 94
affection 93.8
wear 93.8
furniture 92
Renaissance 91.6
baroque 90.3
outerwear 89.9

Imagga
created on 2019-10-30

cadaver 71.6
statue 33.6
sculpture 32.6
religion 27.8
ancient 27.7
stone 23.7
art 23.5
old 22.3
religious 20.6
culture 20.5
architecture 20.3
temple 19.7
god 19.1
carving 17
history 17
spiritual 15.4
ruler 14.5
holy 14.4
face 14.2
monument 14
close 13.7
money 13.6
figure 13.2
dollar 13
antique 13
cash 12.8
one 11.9
historic 11.9
currency 11.7
portrait 11.7
pray 11.6
historical 11.3
travel 11.3
head 10.9
man 10.8
bank 10.7
people 10.6
spirituality 10.6
bill 10.5
church 10.2
finance 10.1
grandfather 10.1
vintage 9.9
famous 9.3
male 9.2
banking 9.2
dollars 8.7
prayer 8.7
us 8.7
peace 8.2
building 7.9
paper 7.8
carved 7.8
museum 7.8
hundred 7.7
worship 7.7
heritage 7.7
pay 7.7
faith 7.7
capital 7.6
person 7.6
east 7.5
business 7.3
detail 7.2
wealth 7.2
hair 7.1

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

text 100
book 99.8
drawing 99.3
sketch 99.1
painting 97.9
wall 97.4
person 93.8
indoor 93.3
human face 87.2
man 82
cartoon 63.6
clothing 53.8
old 53

Face analysis

AWS Rekognition

Age 32-48
Gender Male, 99.1%
Sad 42.6%
Calm 54%
Angry 1%
Happy 0.3%
Fear 0.2%
Confused 1.4%
Disgusted 0.1%
Surprised 0.4%

AWS Rekognition

Age 34-50
Gender Male, 99.4%
Disgusted 0%
Confused 0%
Fear 0%
Happy 0%
Angry 0.4%
Sad 0.5%
Calm 99%
Surprised 0%

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.2%
Person 96.7%

Captions

Microsoft

a person sitting on a book 51.4%
a person sitting on top of a book 50.7%
a person holding a book 50.6%