Human Generated Data

Title

Clothing the Naked

Date

1581-82

People

Artist: Adriaen Collaert, Flemish c. 1560 - 1618

Artist after: Maerten de Vos, Netherlandish 1532 - 1603

Publisher: Gerard de Jode, Flemish 1509 - 1591

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Ketcham Wheaton in memory of Robert Bradford Wheaton, 2011.613.285

Machine Generated Data

Tags

Amazon
created on 2019-06-26

Person 95.8
Human 95.8
Art 92.4
Painting 92.2
Person 87.7
Person 80.4
Person 80.2
Person 76.3
Person 74.4
Drawing 74
Person 70.8
Person 63.1
Text 61.1
Person 58.6
Person 57.8
Archaeology 56.7
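
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. As an illustration only (the exact request used for this record is not documented here), a minimal boto3 sketch might look like the following, with the image path and thresholds assumed:

    # Illustrative sketch, not the actual pipeline behind this record.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the print's digital image
    with open("clothing_the_naked.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,       # assumed cap on returned labels
        MinConfidence=55,   # assumed confidence threshold
    )

    # Each label carries a name and a confidence score, as listed above
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')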

Clarifai
created on 2019-06-26

painting 99.7
art 99.3
people 99.1
illustration 98
print 95.2
religion 94
man 93.5
museum 92.9
exhibition 92
adult 91.4
old 90.7
manuscript 88.1
saint 87.6
group 87
woman 86.5
mammal 85.8
Renaissance 85.7
war 84
god 83.9
lithograph 83.7

Imagga
created on 2019-06-26

graffito 67.3
decoration 46.9
tray 39.9
receptacle 32.3
vintage 30.6
old 27.2
container 26.2
retro 22.1
money 21.3
finance 20.3
letter 20.2
art 19.8
stamp 19.8
currency 19.7
postmark 18.7
mail 18.2
comic book 17.9
postage 17.7
postal 17.7
cash 17.4
envelope 16.5
paper 16.5
church 15.7
symbol 15.5
grunge 15.3
dollar 14.8
savings 14
business 14
ancient 13.8
banking 13.8
global 13.7
post 13.4
bill 13.3
culture 12.8
printed 12.8
icon 12.7
black 12
note 11.9
circa 11.8
texture 11.8
aged 11.8
chalkboard 11.8
religion 11.7
bank 11.6
financial 11.6
blackboard 10.7
board 10.5
painted 10.5
antique 10.4
wall 10.3
shows 9.9
design 9.8
painter 9.8
one 9.7
museum 9.7
office 9.6
communications 9.6
exchange 9.5
artwork 9.2
world 9
history 8.9
pattern 8.9
philately 8.9
renaissance 8.8
paintings 8.8
payment 8.7
unique 8.5
card 8.5
religious 8.4
frame 8.3
message 8.2
paint 8.1
painting 8.1
closeup 8.1
wealth 8.1
object 8.1
detail 8
close 8
post mail 7.9
zigzag 7.9
masterpiece 7.9
fame 7.9
known 7.9
stamps 7.9
delivery 7.8
banknote 7.8
cutting 7.7
dollars 7.7
saint 7.7
fine 7.6
map 7.6
earth 7.3
print media 7.2
idea 7.1
travel 7
window 7
drawing 7

Google
created on 2019-06-26

Microsoft
created on 2019-06-26

drawing 98.2
cartoon 93.3
indoor 92.9
person 90.9
text 82.2
clothing 76.9
museum 67.1
art 52.8
gallery 51.4
old 46.9
picture frame 21
painting 17.1

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Disgusted 45%
Angry 45%
Calm 45%
Confused 45%
Surprised 45%
Happy 45%
Sad 54.9%

AWS Rekognition

Age 26-43
Gender Male, 96.8%
Angry 1.7%
Disgusted 87.7%
Confused 1.5%
Sad 7.5%
Happy 0.3%
Calm 0.8%
Surprised 0.5%

AWS Rekognition

Age 16-27
Gender Male, 53.5%
Happy 45.1%
Angry 45.8%
Calm 47%
Confused 46.1%
Sad 49.5%
Disgusted 46.1%
Surprised 45.4%

AWS Rekognition

Age 38-59
Gender Male, 54.2%
Disgusted 45.1%
Surprised 45.4%
Angry 45.3%
Sad 45.6%
Confused 45.1%
Calm 53.5%
Happy 45.1%

AWS Rekognition

Age 20-38
Gender Female, 53.9%
Sad 46.6%
Calm 47.9%
Disgusted 45.4%
Confused 45.4%
Happy 45.7%
Angry 45.7%
Surprised 48.3%

AWS Rekognition

Age 35-52
Gender Female, 53.5%
Happy 45.3%
Sad 48%
Disgusted 45.8%
Calm 50.1%
Surprised 45.2%
Angry 45.3%
Confused 45.3%

AWS Rekognition

Age 6-13
Gender Female, 53%
Calm 45.3%
Angry 45.1%
Happy 45.1%
Surprised 45%
Disgusted 45.1%
Sad 54.4%
Confused 45%

AWS Rekognition

Age 45-65
Gender Male, 53.1%
Sad 52.4%
Angry 45.7%
Calm 45.5%
Surprised 45.2%
Confused 45.3%
Happy 45.7%
Disgusted 45.1%

AWS Rekognition

Age 48-68
Gender Male, 54.9%
Surprised 45.2%
Angry 45.5%
Sad 46.2%
Disgusted 45.1%
Happy 45.2%
Calm 52.6%
Confused 45.2%

AWS Rekognition

Age 30-47
Gender Female, 54%
Surprised 45.1%
Calm 45.2%
Sad 54.5%
Confused 45.1%
Disgusted 45%
Happy 45.1%
Angry 45.1%

AWS Rekognition

Age 45-63
Gender Female, 51.5%
Sad 49.4%
Disgusted 45.8%
Calm 48.1%
Confused 45.2%
Surprised 45.3%
Happy 45.4%
Angry 45.8%
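
The age range, gender, and emotion scores in the AWS Rekognition entries above correspond to the FaceDetails structure returned by Rekognition's DetectFaces operation. A minimal sketch, assuming a local copy of the image rather than the actual pipeline behind this record:

    # Illustrative sketch of reading per-face attributes from DetectFaces.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("clothing_the_naked.jpg", "rb") as f:  # hypothetical image file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')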

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.8%
Painting 92.2%

Categories

Imagga

paintings art 98.7%

Captions

Microsoft
created on 2019-06-26

a painting on the wall 68.3%
a painting on a wall 67.6%
a painting hanging on a wall 67.5%

Azure OpenAI

created on 2024-11-13

The image is a colored print from a book, presumably an old and possibly historical or religious text, considering the illustration style and clothing depicted. The scene appears to vividly depict a group of classical figures in a variety of expressive poses and outfits, which suggests a narrative context. The figures are shown in a setting that includes arches and columns, indicating a public space or a building of importance reminiscent of classical architecture. Specific features within the image include individuals in the foreground and background interacting with each other. Some characters are seated while others stand. The costumes range from draped cloaks and tunics to more ornate garments that imply different social statuses or roles. In the background, there is an architectural element with arches, and a figure can be seen framed within one of the archways, suggesting depth and space within the scene. The clothing and postures give the setting a theatrical or ceremonial feel. At the bottom of the print, there is text which appears to be Latin, hinting at a story or a moral lesson, common in historical prints that accompany texts. The use of color is fairly rich, adding to the dramatic impact of the scene. The book page is numbered 297 in the upper right corner, signifying its place within the volume. The old and worn edges of the book, along with the visible binding, indicate the age and physical history of the book itself.

Anthropic Claude

created on 2024-11-14

The image appears to be an illustration from an old book or text. It depicts a crowded scene of various figures, both clothed and unclothed, interacting with one another in what seems to be an indoor public setting. The figures include men, women, and children, some of whom are gesturing or engaged in conversation. The style and level of detail suggest this is an artistic rendering from an earlier time period.

Meta Llama

created on 2024-11-21

The image depicts an open book with a colorful illustration of a scene from ancient times. The illustration is surrounded by text, including the title "NVDVS ERAM ET OPERVITIS ME" and the date "1597." The book appears to be old and worn, with yellowed pages and a brown cover.

* Illustration:
  * The illustration shows a group of people in various poses, including some who are standing, sitting, or lying down.
  * The people are dressed in clothing that is typical of ancient times, such as togas and tunics.
  * Some of the people are holding objects, such as swords or shields.
  * The background of the illustration is a cityscape with buildings and arches.
* Text:
  * The title "NVDVS ERAM ET OPERVITIS ME" is written in large letters at the bottom of the illustration.
  * The date "1597" is written in smaller letters in the top-right corner of the page.
  * There is additional text written in Latin at the bottom of the page, but it is not legible.
* Book:
  * The book is old and worn, with yellowed pages and a brown cover.
  * The cover is made of leather or a similar material.
  * The spine of the book is visible and appears to be made of the same material as the cover.
  * The book is open to a page with the illustration and text.

Overall, the image appears to be a page from an old book that contains an illustration and text related to ancient times. The illustration depicts a scene with people in various poses, while the text provides additional context and information about the scene.

Text analysis

Amazon

OPERVITIS
ERAM
NVDVS
ET
ME
NVDVS ERAM ET OPERVITIS ME M.
97
rous
M.
jiyn, rous couze
couze
jiyn,
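
Line- and word-level detections like the Amazon text output above are what Rekognition's DetectText operation returns. A minimal sketch under the same assumptions as the earlier examples:

    # Illustrative sketch of retrieving OCR-style detections with DetectText.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("clothing_the_naked.jpg", "rb") as f:  # hypothetical image file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # LINE entries give whole inscriptions; WORD entries give single tokens
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))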

Google

497 G de fode caude NVDVS OPERVITIS ME M de vos imen. ERAM ET us, bous mau eouurt
497
G
de
fode
caude
NVDVS
OPERVITIS
ME
M
vos
imen.
ERAM
ET
us,
bous
mau
eouurt