Human Generated Data

Title

Venus, Juno and Ceres

Date

16th century

People

Artist: Marco Dente, Italian 1488 - 1532

Artist after: Raphael, Italian 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2564

Machine Generated Data

Tags

Amazon
created on 2019-11-04

Person 99
Human 99
Person 98.2
Art 97.9
Painting 88.6
Archaeology 64.2

Clarifai
created on 2019-11-04

people 99.9
art 99.7
illustration 99.6
print 98.7
adult 98.6
man 97.9
engraving 96.6
group 96.5
two 94.7
woman 94.2
painting 92.8
portrait 90.2
visuals 86.2
woodcut 85.9
Renaissance 85.2
nude 84.9
one 83.8
leader 83.1
etching 82.9
veil 82.3

Imagga
created on 2019-11-04

sketch 100
drawing 83.7
representation 67.2
art 35
sculpture 31.7
ancient 26.8
decoration 26.1
statue 23
religion 20.6
architecture 20.3
old 19.5
history 17.9
religious 17.8
design 17.3
detail 16.9
vintage 16.5
god 15.3
stone 15.3
antique 14.9
symbol 14.8
artistic 14.8
retro 14.8
carving 14.6
culture 14.5
temple 14.2
church 13.9
holy 13.5
monument 13.1
comic book 12.5
figure 12.3
graffito 12.1
artwork 11.9
style 11.9
spiritual 11.5
travel 11.3
stamp 11.2
historic 11
tattoo 10.9
decorative 10.9
black 10.8
angel 10.8
spirituality 10.6
wall 10.3
pattern 10.3
man 10.1
catholic 9.7
faith 9.6
historical 9.4
traditional 9.2
city 9.1
gold 9
landmark 9
currency 9
museum 8.9
marble 8.9
pray 8.7
ornament 8.6
face 8.5
grunge 8.5
letter 8.3
tourism 8.3
cash 8.2
peace 8.2
painting 8.2
graphic 8
decor 8
postmark 7.9
holiday 7.9
palace 7.7
saint 7.7
mail 7.7
famous 7.4
close 7.4
building 7.1

Google
created on 2019-11-04

Microsoft
created on 2019-11-04

text 99.9
drawing 99.4
sketch 99.4
book 98.5
cartoon 88.1
old 74.1
person 73.9
art 66.9
illustration 65.7
painting 57.9
posing 46.1

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Happy 45%
Sad 45%
Calm 54.9%
Confused 45%
Disgusted 45%
Surprised 45%
Fear 45%
Angry 45%

AWS Rekognition

Age 21-33
Gender Female, 50%
Happy 45.9%
Disgusted 45%
Angry 45.1%
Calm 53.9%
Sad 45.1%
Surprised 45%
Confused 45%
Fear 45%

AWS Rekognition

Age 13-23
Gender Male, 97%
Confused 1.8%
Fear 5.1%
Surprised 4.2%
Happy 1.2%
Calm 50.5%
Angry 33.2%
Sad 1.7%
Disgusted 2.1%

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Painting 88.6%

Captions

Microsoft

an old photo of a person 87.5%
a group of people posing for a photo 74%
old photo of a person 73.9%

Text analysis

Amazon

R

Google

RA
RA