Human Generated Data

Title

Madonna and Child

Date

19th-20th century

People

Artist: Timothy Cole, American 1852 - 1931

Artist after: Cenni di Pepo, called Cimabue, Italian 1240 - 1302

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gray Collection of Engravings Fund, G7121

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 98
Person 98
Hat 92.4
Clothing 92.4
Apparel 92.4
Art 91.2
Leisure Activities 91.1
Person 89
Musical Instrument 73.6
Painting 67.1
Lute 66.3
Drawing 61.9

Imagga
created on 2022-02-26

book jacket 64.4
jacket 50.1
wrapping 38.1
vintage 29.8
stamp 27.6
mail 26.8
covering 26.4
postage 25.6
postmark 24.7
currency 24.2
art 23.9
money 23.8
letter 22
envelope 22
old 21.6
postal 21.6
culture 21.4
cash 21.1
post 21
paper 19.7
bookmark 19.6
ancient 18.2
history 17.9
religion 17
printed 16.7
symbol 16.2
dollar 15.8
antique 15.7
sculpture 15.3
bank 15.2
bill 15.2
museum 15.2
finance 15.2
one 14.9
stamps 14.8
shows 14.8
banking 14.7
binding 14.4
financial 14.3
retro 13.9
renaissance 13.8
global 13.7
famous 13
masterpiece 12.9
unique 12.3
business 12.2
black 12
church 12
historic 11.9
post mail 11.9
zigzag 11.9
fame 11.9
known 11.9
philately 11.8
painter 11.8
paintings 11.7
delivery 11.7
cutting 11.6
communications 11.5
painted 11.5
fine 11.5
office 11.3
savings 11.2
economy 11.1
circa 10.9
wealth 10.8
historical 10.4
icon 10.3
monument 10.3
creation 10.3
grunge 10.2
carving 10.1
book 10.1
face 9.9
product 9.7
statue 9.7
dollars 9.7
notes 9.6
exchange 9.6
closeup 9.4
travel 9.2
close 9.1
banknotes 8.8
letters 8.7
stone 8.6
rich 8.4
tourism 8.3
aged 8.1
landmark 8.1
architecture 8
canceled 7.9
temple 7.8
banknote 7.8
finances 7.7
saint 7.7
payment 7.7
pay 7.7
loan 7.7
god 7.7
religious 7.5
note 7.4
investment 7.3
collection 7.2
portrait 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.2
drawing 98.2
sketch 94.1
person 83
human face 82.5
painting 76
clothing 69.6
old 51.4

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 20-28
Gender Male, 77.9%
Calm 89.9%
Confused 3.2%
Sad 1.8%
Surprised 1.8%
Disgusted 1.1%
Angry 0.9%
Fear 0.8%
Happy 0.5%

AWS Rekognition

Age 2-8
Gender Female, 52.1%
Calm 99.7%
Confused 0.2%
Sad 0%
Fear 0%
Surprised 0%
Angry 0%
Disgusted 0%
Happy 0%

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Hat 92.4%

Captions

Microsoft

an old photo of a person 74.9%
old photo of a person 70.5%
a old photo of a person 70.4%

Text analysis

Amazon

from
original
by
the
printed by me from the original
printed
Japan
is
of
me
for
paper,
one
han) de Japan paper, is one of 06/
Limothy
Limothy Cole
Cole
for Wood-engrown
06/
Bauer
han)
-
7121
Brofessomal
Brofessomal Broofspinter
Broofspinter
CICE -
Wood-engrown
Bauer adidas Colors
de
g a
Colors
CICE
adidas

Google

4ee an) on Japan papes, is one of one printed by me from the oniginal BBaver Brofesnmal Baof-printer for Weed-engravers.
papes,
from
the
Baof-printer
is
one
printed
me
oniginal
BBaver
Weed-engravers.
an)
for
Brofesnmal
4ee
on
Japan
of
by