Human Generated Data

Title

The Exodus from Egypt

Date

c. 1579

People

Artist: Johann Sadeler I, Netherlandish 1550 - 1600/08

Artist after: Maarten van Cleve, Netherlandish 1527 - 1581

Publisher: Gerard de Jode, Flemish 1509 - 1591

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Ketcham Wheaton in memory of Robert Bradford Wheaton, 2011.613.47

Machine Generated Data

Tags

Amazon
created on 2019-06-25

Human 97.2
Person 97.2
Person 97.1
Art 96.8
Person 95.4
Painting 95
Person 94.3
Person 90.3
Person 81.5
Person 74.1
Person 65.6
Mural 56.6
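
The label/confidence pairs above are the style of output produced by Amazon Rekognition's DetectLabels operation. As a rough sketch of how such tags could be generated, assuming boto3 credentials are configured and the print has been photographed to a local file (the file name here is hypothetical):

import boto3

# Hypothetical local photograph of the print.
IMAGE_PATH = "2011.613.47.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns object/scene labels with confidence scores (0-100),
# matching entries such as "Person 97.2" and "Painting 95" above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')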

Clarifai
created on 2019-06-25

painting 99.7
art 99.7
illustration 99.6
print 97.4
people 97.3
old 96.6
vintage 95.8
religion 94.7
antique 94.3
retro 93.1
man 92.7
manuscript 92.2
card 91.5
saint 90.9
ancient 90.9
god 90.6
letter 90.1
postcard 89.6
post 89.3
postal 88.3

Imagga
created on 2019-06-25

jigsaw puzzle 57.1
puzzle 45
tray 37.6
game 32.7
receptacle 30.4
vintage 29.8
currency 29.6
container 29.4
money 28.9
old 27.2
comic book 26.5
envelope 26.1
stamp 25.3
mail 24.9
paper 24.3
cash 23.8
postage 23.6
letter 22.9
finance 22.8
postmark 22.7
postal 21.6
retro 21.3
bank 18.8
bill 18.1
post 17.2
art 15.9
dollar 15.8
banking 15.6
financial 15.2
note 14.7
ancient 13.9
philately 13.8
stamps 13.8
global 13.7
exchange 13.4
wealth 12.6
decoration 12.5
notes 12.5
map 12.4
business 12.2
antique 12.1
savings 12.1
circa 11.9
printed 11.8
aged 11.8
banknote 11.7
grunge 11.1
card 11.1
symbol 10.8
bills 10.7
dollars 10.6
payment 10.6
print media 10.6
pay 10.6
international 10.5
united 10.5
culture 10.3
design 10.2
graffito 10.1
collection 9.9
banknotes 9.8
one 9.7
states 9.7
economy 9.3
church 9.3
close 9.1
religion 9
history 9
painter 8.9
world 8.9
geography 8.7
communications 8.6
painted 8.6
black 8.4
communication 8.4
investment 8.3
message 8.2
dirty 8.1
office 8
shows 7.9
binding 7.9
museum 7.8
union 7.7
used 7.7
profit 7.7
book 7.7
unique 7.6
hobby 7.6
china 7.5
icon 7.1
travel 7
country 7

Google
created on 2019-06-25

Painting 95.6
Art 88.7
Miniature 70.9
Textile 65.6
Illustration 65.5
Mythology 64.4
Tapestry 64.3
History 57.6
Visual arts 55
Middle ages 52.3
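
Google's labels and scores above resemble the label annotations returned by the Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up, and reusing the same hypothetical file name as the earlier example:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2011.613.47.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

# label_detection returns label annotations with scores in the 0-1 range;
# multiplying by 100 gives figures like "Painting 95.6" above.
response = client.label_detection(image=image)

for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")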

Microsoft
created on 2019-06-25

painting 99.6
drawing 98.9
text 97.3
cartoon 95.7
child art 92.2
book 79.5
person 79
sketch 77
gallery 51.6
picture frame 44

Face analysis

AWS Rekognition

Age 35-55
Gender Male, 50.5%
Surprised 45%
Angry 45.1%
Happy 45%
Sad 53.4%
Disgusted 45%
Calm 46.3%
Confused 45.2%

AWS Rekognition

Age 57-77
Gender Female, 51.8%
Happy 45.1%
Calm 50.1%
Disgusted 45.1%
Confused 45.1%
Sad 49.4%
Angry 45.2%
Surprised 45.1%

AWS Rekognition

Age 60-90
Gender Female, 54.7%
Surprised 45.1%
Disgusted 45.1%
Sad 53.7%
Calm 45.6%
Confused 45.2%
Happy 45.1%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Female, 51.5%
Angry 45.3%
Sad 51.8%
Happy 45.8%
Surprised 45.1%
Calm 46.7%
Disgusted 45.1%
Confused 45.2%

AWS Rekognition

Age 19-36
Gender Female, 53.1%
Happy 47.1%
Confused 45.7%
Calm 45.5%
Angry 46.1%
Surprised 45.8%
Sad 47%
Disgusted 47.8%

AWS Rekognition

Age 57-77
Gender Female, 52.7%
Surprised 45.2%
Sad 52.5%
Disgusted 45.1%
Calm 46.4%
Confused 45.2%
Happy 45.3%
Angry 45.3%

AWS Rekognition

Age 26-43
Gender Female, 54.8%
Disgusted 45.1%
Happy 45.4%
Surprised 45.9%
Calm 47.2%
Sad 47.9%
Angry 48%
Confused 45.5%

AWS Rekognition

Age 20-38
Gender Female, 52.3%
Angry 45.4%
Calm 47.8%
Disgusted 45.2%
Surprised 45.2%
Sad 49%
Happy 47.3%
Confused 45.2%

AWS Rekognition

Age 49-69
Gender Male, 53%
Angry 45.3%
Surprised 45.3%
Sad 48.8%
Calm 50.1%
Disgusted 45.1%
Happy 45.2%
Confused 45.3%

AWS Rekognition

Age 6-13
Gender Female, 50.4%
Disgusted 49.5%
Sad 50.1%
Happy 49.7%
Confused 49.5%
Angry 49.5%
Calm 49.5%
Surprised 49.6%

Microsoft Cognitive Services

Age 69
Gender Male
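
Each AWS Rekognition block above corresponds to one face returned by the DetectFaces operation with all attributes requested. A minimal sketch under the same assumptions as before (configured boto3 credentials, hypothetical image file):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2011.613.47.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face record,
# which is where rows like "Age 35-55 / Gender Male, 50.5% / Sad 53.4%" come from.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')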

Feature analysis

Amazon

Person 97.2%
Painting 95%

Categories

Imagga

paintings art 94.5%
text visuals 3.1%
food drinks 1.7%

Captions

Microsoft
created on 2019-06-25

a close up of a box 40.3%
close up of a box 33.8%

Azure OpenAI

created on 2024-11-19

This image appears to be a page from an illustrated manuscript or an early printed book, showing a colorful, detailed scene set in a historical or possibly biblical context. The image depicts a group of characters dressed in a variety of late medieval or Renaissance garments, suggesting some diversity in social status or role. Several individuals are carrying possessions or guiding animals, suggesting travel or migration. In the foreground, individuals are engaged in activities such as carrying baskets, some with contents visible, and there's a dog walking through the scene. To the left, one character wearing a red cloak and blue tunic raises a hand, while those around seem to be listening or attending to this figure, indicating some form of leadership or instruction. The background reveals a landscape with mountains, and additional groups of individuals continue the theme of movement with figures seen walking up a slope in the distance. At the bottom of the illustration, there is a line of Latin text, which is likely to be a caption or description related to the scene depicted above it. The presence of text and page number (55) indicates that this is part of a larger work, such as a chronicle or an illuminated manuscript. The image seems to be hand-colored, which was a common practice for illustrating printed books before the advent of full-color printing technologies.

Anthropic Claude

created on 2024-11-19

The image depicts a crowded scene from what appears to be a biblical or historical event. In the center, there are numerous figures engaged in some sort of activity or confrontation, with people of various ages and genders visible. The background includes mountainous terrain and a cloudy sky. The overall scene seems to be chaotic and bustling with activity. The text below the image provides additional context, but I will refrain from identifying or naming any of the human figures shown.
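
Captions like the two above come from multimodal models that accept an image alongside a text prompt. A hedged sketch of how the Anthropic caption might have been requested, using the Anthropic Python SDK; the model name and prompt are assumptions, since the record does not say which were used:

import base64
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("2011.613.47.jpg", "rb") as f:  # hypothetical file name
    image_data = base64.standard_b64encode(f.read()).decode("utf-8")

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # assumed model, not stated in the record
    max_tokens=300,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image",
                    "source": {
                        "type": "base64",
                        "media_type": "image/jpeg",
                        "data": image_data,
                    },
                },
                {"type": "text", "text": "Describe this image."},  # assumed prompt
            ],
        }
    ],
)

print(message.content[0].text)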

Text analysis

Amazon

egrediens
quod
populus
humeris
tollere
dr
capit
aurea
Aaypto egrediens populus capit aurea afa, Imponitque humeris tollere quod potuit Exod:
Aaypto
potuit
Imponitque
gptins
aux
d
Exod:
o dingent
sortans
aco..36.1
yartins.
dyrais.sux
Ls
afa,
Lis oraliny sortans d'e d.sman par d Moh aux gptins dyrais.sux dr o dingent o
aco..36.1 36..
d'e
Lis
o
oraliny
par
dTouillrine
Moh
re
36..
d.sman
des
des Cnphn). Donha re fhy.ner Imntl, dTouillrine Ls
Imntl,
fhy.ner
Cnphn).
Donha
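
The word and line fragments above are raw OCR output; Rekognition's DetectText returns both LINE and WORD detections, which is why full caption lines are mixed with individual, often garbled, words. A minimal sketch under the same assumptions as the earlier Rekognition examples:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2011.613.47.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection carries its type (LINE or WORD), the detected string,
# and a confidence score.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))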

Google

7 vafa, Imponitqur bumeris, tollere quod potuit yns dmin r -dingnt Agipto egrediens populus capit Exod: 12 aurea Lisevadies ds Chn4- dom prac4 dtgiph d.mand.vnt par csmmanSm aux Soptans nuirs
7
vafa,
Imponitqur
bumeris,
tollere
quod
potuit
yns
dmin
r
-dingnt
Agipto
egrediens
populus
capit
Exod:
12
aurea
Lisevadies
ds
Chn4-
dom
prac4
dtgiph
d.mand.vnt
par
csmmanSm
aux
Soptans
nuirs