Human Generated Data

Title

Daniel Intervening on Behalf of Susanna

Date

c. 1579

People

Artist: Hans Collaert the Elder, Netherlandish 1525/30 - 1580

Artist after: Maerten de Vos, Netherlandish 1532 - 1603

Publisher: Gerard de Jode, Flemish 1509 - 1591

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Ketcham Wheaton in memory of Robert Bradford Wheaton, 2011.613.171

Machine Generated Data

Tags

Amazon
created on 2019-06-25

Person 99.4
Human 99.4
Person 99.3
Person 99.3
Person 99.1
Person 98.8
Art 97.2
Person 96.4
Person 94.6
Painting 71.8
Drawing 59

Clarifai
created on 2019-06-25

art 99.5
painting 99.4
people 99.4
illustration 98.7
print 98.6
saint 97.3
Renaissance 96.5
religion 96.2
man 95
woman 94.6
book 93.7
adult 92.9
lithograph 91.7
cape 91.6
old 91.4
group 89.6
Gothic 89.4
manuscript 88.1
gown (clothing) 87.8
kneeling 86.5

Imagga
created on 2019-06-25

graffito 47.3
money 39.1
currency 35.9
cash 35.7
decoration 33
dollar 32.5
paper 28.2
bank 26.9
finance 26.2
business 24.9
bill 24.7
book jacket 21.5
financial 21.4
dollars 21.2
banking 21.1
old 20.9
postmark 20.7
wealth 20.6
postage 20.6
art 20.6
stamp 20.3
letter 20.2
mail 20.1
postal 19.6
savings 19.6
exchange 19.1
vintage 19
comic book 19
note 18.4
close 17.7
envelope 17
jacket 16.7
hundred 16.5
us 16.4
post 16.2
church 15.7
bills 15.5
pay 15.3
one 14.9
ancient 14.7
brass 14.7
loan 14.4
structure 14.3
philately 13.8
card 13.6
retro 13.1
investment 12.8
printed 12.8
wrapping 12.7
religion 12.5
notes 12.5
office 12.3
circa 11.8
payment 11.5
god 11.5
antique 11.2
memorial 11.2
economy 11.1
shows 10.8
franklin 10.8
closeup 10.8
symbol 10.8
banknotes 10.8
history 10.7
banknote 10.7
rich 10.2
message 10
global 10
economic 9.7
states 9.7
finances 9.6
product 9.6
profit 9.6
covering 9.1
sculpture 9.1
aged 9
address 8.8
funds 8.8
bible 8.8
object 8.8
painter 8.8
museum 8.7
debt 8.7
saint 8.7
change 8.7
faith 8.6
save 8.5
religious 8.4
black 8.4
sign 8.3
man 8.1
detail 8
daily 8
post office 7.9
stamps 7.9
salary 7.9
wages 7.8
creation 7.8
newspaper 7.8
prayer 7.7
holy 7.7
culture 7.7
sales 7.7
price 7.7
cathedral 7.7
communications 7.7
unique 7.6
print media 7.6
billboard 7.5
color 7.2
icon 7.1
market 7.1

Google
created on 2019-06-25

Microsoft
created on 2019-06-25

painting 98.4
text 98
drawing 94.9
book 88.1
person 85.5
cartoon 84.5
clothing 76.8
manuscript 75.5
gallery 74.2
illustration 50.2
picture frame 36.5

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 48-68
Gender Male, 54.6%
Disgusted 45.2%
Angry 46.1%
Calm 47.7%
Surprised 45.4%
Confused 45.8%
Sad 49.5%
Happy 45.1%

AWS Rekognition

Age 48-68
Gender Male, 54%
Disgusted 45.1%
Sad 46.9%
Happy 45.2%
Confused 45.2%
Angry 45.8%
Surprised 45.2%
Calm 51.5%

AWS Rekognition

Age 20-38
Gender Male, 53.5%
Angry 46%
Surprised 46%
Sad 48.2%
Calm 47.6%
Disgusted 45.7%
Happy 45.8%
Confused 45.7%

AWS Rekognition

Age 26-43
Gender Male, 53.6%
Confused 45.2%
Disgusted 45.1%
Angry 45.1%
Happy 45.1%
Sad 54.1%
Calm 45.3%
Surprised 45.1%

AWS Rekognition

Age 35-52
Gender Female, 53.6%
Confused 45.3%
Sad 50.7%
Surprised 45.5%
Disgusted 45.2%
Calm 45.9%
Angry 46.6%
Happy 46%

AWS Rekognition

Age 26-43
Gender Female, 54.8%
Angry 45.2%
Surprised 45.1%
Sad 54%
Confused 45.2%
Happy 45%
Disgusted 45.1%
Calm 45.4%

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Surprised 45.2%
Confused 46.1%
Happy 45.1%
Angry 46.3%
Calm 49.8%
Disgusted 45.1%
Sad 47.3%

AWS Rekognition

Age 23-38
Gender Female, 54.9%
Sad 50.7%
Calm 45.8%
Happy 45.1%
Angry 46%
Confused 45.7%
Disgusted 45%
Surprised 46.7%

AWS Rekognition

Age 35-52
Gender Female, 53.5%
Surprised 46.5%
Confused 45.2%
Angry 46.7%
Disgusted 45.2%
Sad 49.8%
Calm 45.3%
Happy 46.2%

AWS Rekognition

Age 23-38
Gender Female, 50.6%
Angry 45.3%
Sad 53.9%
Happy 45%
Surprised 45.1%
Calm 45.4%
Disgusted 45%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Male, 53.7%
Sad 46.1%
Happy 47.8%
Calm 49.4%
Surprised 45.4%
Confused 45.2%
Angry 46%
Disgusted 45.1%

AWS Rekognition

Age 23-38
Gender Female, 52.2%
Disgusted 45.1%
Sad 49.1%
Calm 45.1%
Surprised 45.2%
Angry 49.8%
Happy 45.5%
Confused 45.1%

AWS Rekognition

Age 35-52
Gender Male, 50.6%
Disgusted 45.1%
Happy 45.3%
Sad 49.7%
Angry 45.6%
Surprised 46.4%
Confused 45.5%
Calm 47.6%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Surprised 45.2%
Sad 52.9%
Happy 45.3%
Disgusted 45.5%
Confused 45.3%
Calm 45.3%
Angry 45.5%

AWS Rekognition

Age 27-44
Gender Female, 50.2%
Surprised 49.5%
Sad 49.9%
Disgusted 49.6%
Calm 49.5%
Confused 49.5%
Happy 49.8%
Angry 49.7%

AWS Rekognition

Age 17-27
Gender Female, 50.5%
Happy 49.5%
Confused 49.5%
Angry 49.5%
Sad 50.4%
Calm 49.5%
Disgusted 49.5%
Surprised 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Angry 49.5%
Calm 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 50.4%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 35-52
Gender Female, 53.2%
Happy 45.3%
Confused 45.6%
Disgusted 45.2%
Calm 47.7%
Angry 46.2%
Sad 49.2%
Surprised 45.7%

AWS Rekognition

Age 57-77
Gender Male, 50.9%
Angry 46.7%
Disgusted 46.6%
Confused 45.5%
Surprised 45.9%
Sad 47.2%
Calm 46.1%
Happy 47.1%

AWS Rekognition

Age 4-9
Gender Female, 50%
Angry 49.5%
Surprised 49.6%
Sad 49.7%
Calm 49.5%
Disgusted 49.9%
Happy 49.7%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Happy 49.5%
Calm 50%
Disgusted 49.5%
Confused 49.5%
Sad 49.8%
Angry 49.5%
Surprised 49.5%

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 25
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Painting 71.8%

Categories

Imagga

paintings art 99.1%

Captions

Azure OpenAI

Created on 2024-11-13

This is an image of an aged, colorized print from a book, which appears to show a historical or biblical scene set in a public square. Numerous figures are dressed in garments from an older era, featuring robes and tunics in vibrant colors such as red, blue, green, and purple. In the background, we see classic architecture, including buildings with arches and domes suggesting a European city, possibly from the Renaissance or earlier. The print includes captions and text in Latin, which hint at the illustration being educational or narrative in nature, perhaps part of a historical or religious text. The book's edge is visible on the right side, showing a well-worn page edge and the number 181 at the top, indicating the page number. The careful colorization and the detailed depiction of the scene suggest that this book was a valuable resource, meant to convey its message with visual impact to the reader.

Anthropic Claude

Created on 2024-11-13

The image appears to be an engraved illustration from a historical or religious text. It depicts a crowded street scene in a city, with a group of people dressed in ornate, period-accurate clothing. The buildings in the background feature domed roofs and other architectural details characteristic of an early European or Middle Eastern urban setting. The people in the foreground seem to be engaged in some sort of gathering or event, though the specific context is not entirely clear from the image alone. The illustration has a vintage, hand-colored appearance, suggesting it may be from an older printed work.

Meta Llama

Created on 2024-11-21

The image depicts a page from an antique book, featuring a colorful illustration of a scene from the Bible. The illustration is surrounded by text in an unknown language, possibly Latin or French, and is accompanied by handwritten notes in the same language.

* The illustration shows a group of people gathered in a town square, with some individuals dressed in robes and others in more casual attire. A man in the center of the group appears to be speaking to the others, while a woman stands nearby, looking on.
* The background of the illustration features buildings and a church steeple, suggesting that the scene takes place in a small town or village.
* The overall atmosphere of the illustration is one of community and discussion, with the people gathered together to hear the words of the speaker.

The image provides a glimpse into the artistic and cultural traditions of the time period in which it was created, and offers insight into the ways in which people have interpreted and represented religious stories throughout history.

Text analysis

Amazon

libera
facta
Ni
foret
ope
Sufanna
Daniel's
damnata
irant
nomme
Fnlufte damnata Uiris Sufanna yeriBet, Ni Daniel's ope libera facta foret Daniel.
3.45.
Uiris
mnoit
sycru
Minty
Daniel.
9
2dit)
mmo.
yeriBet,
signcier
sany
outill
Minty auun mnoit signcier outill lyprie, nomme Danicl 2dit)
i fol
lyprie,
dii.
Jugrh",
lut sycru Saulhe irant Jmnur.snd 9 mmo.
cnfan
Jugrh", Gun+9. giym) Danel 3.45.
Saulhe
Gun+9.
lafu
Fnlufte
Danel
iny 2ous i fol cnfan lafu
giym)
Danicl
auun
iny
Jmnur.snd
2ous
lut

Google

IeF Ni Danielis ope libera facta foret Daniel i3 viris Sufannaperißet Iiufte damnata dun Jrun nfant insy guon mnot Sutanne adam ort miter usill nomm Saulie, zantge uis Jmectnt sanyam tut l preplo s 3 wrratou ! artunn ant rt Vous i Jans ausr eamin unnu car e
IeF
Ni
Danielis
ope
libera
facta
foret
Daniel
i3
viris
Sufannaperißet
Iiufte
damnata
dun
Jrun
nfant
insy
guon
mnot
Sutanne
adam
ort
miter
usill
nomm
Saulie,
zantge
uis
Jmectnt
sanyam
tut
l
preplo
s
3
wrratou
!
artunn
ant
rt
Vous
i
Jans
ausr
eamin
unnu
car
e