Human Generated Data

Title

Illuminated Figures from Byzantine Manuscript of Tenth Century

Date

c. 1876-1878

People

Artist: Charles Herbert Moore, American, 1840-1930

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Fine Arts Department, Harvard University, 1926.33.117

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Human 98.1
Person 98.1
Person 97.5
Drawing 96.4
Art 96.4
Person 95.7
Person 92.7
Sketch 82.9
Doodle 70.6

Clarifai
created on 2020-04-25

wear 98
art 97.9
people 97.2
woman 95.3
man 94.5
girl 94.4
portrait 94.1
adult 93.7
dress 91.1
retro 90.1
no person 90.1
old 88.8
illustration 88.1
paper 88
outdoors 87.9
painting 87.5
bill 87.1
nature 86.2
bird 85.5
vintage 85.2

Imagga
created on 2020-04-25

envelope 100
container 100
paper 31.4
grunge 28.1
old 27.9
retro 27.9
design 24.2
vintage 24
blank 21.4
frame 20
texture 19.5
antique 19
card 18.7
aged 18.1
art 17.6
ancient 17.3
border 17.2
dirty 17.2
pattern 17.1
symbol 16.2
letter 14.7
space 14
page 13.9
message 13.7
textured 13.2
parchment 12.5
graphic 12.4
damaged 12.4
decoration 11.9
decorative 11.7
drawing 11.2
sign 10.5
gift 10.3
empty 10.3
floral 10.2
note 10.1
holiday 10
color 10
paint 10
wallpaper 10
copy 9.7
text 9.6
mail 9.6
canvas 9.5
document 9.3
greeting 9.3
silhouette 9.1
element 9.1
flag 9
material 8.9
sand 8.9
celebration 8.8
faded 8.8
stamp 8.7
artistic 8.7
ornament 8.6
money 8.5
rough 8.2
backgrounds 8.1
creative 7.9
postage 7.9
crumpled 7.8
torn 7.7
flower 7.7
stain 7.7
cardboard 7.7
used 7.7
weathered 7.6
grungy 7.6
beach 7.6
business 7.3
office 7.2
star 7.2
world 7.1
summer 7.1
day 7.1
travel 7
country 7

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

drawing 98
sketch 96
cartoon 94.5
child art 93.6
text 84.9
painting 74.9
clothing 58.3

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 50.2%
Confused 49.5%
Calm 49.6%
Disgusted 49.5%
Sad 49.5%
Surprised 49.6%
Happy 49.5%
Fear 50.2%
Angry 49.6%

AWS Rekognition

Age 26-40
Gender Male, 50.1%
Angry 49.6%
Disgusted 49.5%
Surprised 49.5%
Fear 49.5%
Calm 50.3%
Happy 49.5%
Sad 49.5%
Confused 49.5%

AWS Rekognition

Age 33-49
Gender Male, 50.3%
Angry 49.9%
Disgusted 49.5%
Calm 49.9%
Surprised 49.5%
Sad 49.6%
Fear 49.5%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 27-43
Gender Male, 50.3%
Surprised 49.7%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Angry 49.6%
Fear 50.1%
Confused 49.5%
Sad 49.6%

AWS Rekognition

Age 46-64
Gender Male, 50.1%
Surprised 49.5%
Angry 49.5%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Sad 50.4%
Disgusted 49.5%
Fear 49.5%

Feature analysis

Amazon

Person 98.1%

Captions

Microsoft

a group of people in a room 49.7%
people around each other 33.3%
a close up of a person 33.2%

Text analysis

Amazon

Pari.
64
iary
hazimae
Snur
Snur 7h.s. ME 64 hazimae iary Pari. och.
7h.s.
ME
och.

Google

M
Tn.S.
64
Lihary
10
Cent.
Srur Tn.S. M 64 hational Lihary Paris 10 Cent.
Srur
hational
Paris