Human Generated Data

Title

Christ

Date

c. 1646

People

Artist: Eustache Le Sueur, French, 1616 - 1655

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, The Melvin R. Seiden Fund and Louise Haskell Daly Fund, 1984.593

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Art 94.5
Human 91.6
Person 91.6
Painting 89.6
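The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's DetectLabels operation. As a minimal sketch of how such a response could be flattened into the tag list shown, the snippet below parses a mocked response dict (the shape mirrors Rekognition's `Labels` output; a real call would go through boto3's `rekognition` client, which is not invoked here):

```python
# Hypothetical sketch: flattening a DetectLabels-style response into the
# "name confidence" tag lines shown above. The response dict is mocked;
# it is not the output of a real API call.

def labels_to_tags(response, min_confidence=80.0):
    """Return (name, confidence) pairs from a DetectLabels-style response,
    rounded to one decimal place and sorted by descending confidence."""
    tags = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(tags, key=lambda t: t[1], reverse=True)

# Mocked response mirroring the Amazon tag list above.
sample = {
    "Labels": [
        {"Name": "Art", "Confidence": 94.53},
        {"Name": "Person", "Confidence": 91.62},
        {"Name": "Painting", "Confidence": 89.61},
    ]
}

for name, conf in labels_to_tags(sample):
    print(f"{name} {conf}")
```

The same pattern applies to the Clarifai, Imagga, and Microsoft tag lists, which also report one confidence score per concept.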

Clarifai
created on 2020-04-25

sepia pigment 98
antique 97.9
art 97.8
old 97.1
sepia 96.9
retro 96
man 95.6
ancient 95.4
paper 94.6
portrait 94
nude 93
people 92.7
one 92.2
no person 91.8
artistic 90.7
print 90.5
wear 89.5
vintage 88.5
classic 87.6
Renaissance 87

Imagga
created on 2020-04-25

texture 34.7
paper 33
old 32.1
fabric 29.2
grunge 29
textured 28.1
antique 26.8
garment 26.3
vintage 25.6
pattern 25.3
material 25
clothing 24.1
rough 21.9
wallpaper 21.5
sarong 21.4
aged 20.8
ancient 19.9
skirt 19.6
sand 19
blank 18.9
design 18.7
burlap 18.3
retro 18
canvas 18
brown 17.7
cardboard 17.3
parchment 17.3
page 16.7
backgrounds 16.2
textile 16.1
covering 16
bath towel 15.4
linen 15.1
sheet 15
satin 14.7
book 14.6
empty 14.6
history 14.3
towel 13.9
torn 13.5
stained 13.5
cloth 13.5
worn 13.4
backdrop 13.2
document 13
dirty 12.7
stain 12.5
bath linen 12.3
fashion 12.1
dune 11.7
aging 11.5
surface 11.5
silk 11.4
curtain 11.3
color 11.1
frame 10.8
pages 10.7
burnt 10.7
decay 10.6
art 10.5
weathered 10.5
luxury 10.3
shower curtain 10.3
cover 10.2
velvet 10.2
cotton 10.1
crumpled 9.7
structure 9.6
age 9.5
closeup 9.4
letter 9.2
detail 8.9
decor 8.8
wrinkled 8.8
decoration 8.8
manuscript 8.8
artistic 8.7
earth 8.7
obsolete 8.6
soil 8.4
elegance 8.4
decorative 8.4
element 8.3
style 8.2
soft 8.1
cloak 8
wave 7.8
fiber 7.7
damaged 7.6
grungy 7.6
dark 7.5
desert 7.5
dry 7.4
close 7.4
note 7.4
smooth 7.3
border 7.2
furnishing 7.1
shiny 7.1
curve 7

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

sketch 99.7
drawing 99.6
painting 92.7
art 91.6
human face 89.6
text 84.3
woman 68.3
portrait 53.7
child art 50.7

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Male, 90.2%
Confused 0.1%
Sad 0.3%
Calm 94.6%
Surprised 0.2%
Happy 4.2%
Fear 0.1%
Disgusted 0.2%
Angry 0.4%

AWS Rekognition

Age 23-35
Gender Female, 97.4%
Fear 0%
Confused 0.2%
Calm 95.1%
Angry 0.8%
Surprised 0.5%
Sad 0.3%
Disgusted 0.6%
Happy 2.4%
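The face records above follow the shape of AWS Rekognition's DetectFaces output, where each detected face carries an `AgeRange`, a `Gender` estimate, and a list of `Emotions` with confidences. As a minimal sketch, the snippet below summarizes one mocked FaceDetail-style record into the age/gender/emotion lines shown (a real call would use `detect_faces` with `Attributes=["ALL"]`; no API is invoked here):

```python
# Hypothetical sketch: condensing a DetectFaces-style FaceDetail record
# into a short summary. The record below is mocked from the first face
# listed above; it is not a real API response.

def summarize_face(face):
    """Return a summary dict for one FaceDetail-style record."""
    age = face["AgeRange"]
    # Emotions arrive unsorted; rank them by confidence.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    return {
        "age": f"{age['Low']}-{age['High']}",
        "gender": f"{face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
        "top_emotion": emotions[0]["Type"],
    }

sample_face = {
    "AgeRange": {"Low": 26, "High": 40},
    "Gender": {"Value": "Male", "Confidence": 90.2},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 4.2},
        {"Type": "CALM", "Confidence": 94.6},
        {"Type": "SAD", "Confidence": 0.3},
    ],
}

print(summarize_face(sample_face))
```

Note that the two faces above disagree on gender with high confidence each, a reminder that these per-face estimates are independent model outputs, not ground truth.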

Feature analysis

Amazon

Person 91.6%
Painting 89.6%

Captions

Microsoft

an old photo of a man 53.6%