Human Generated Data

Title

PAINTING

Date

-

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Friends of the Fogg Art Museum Fund, 1925.32.2

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2020-04-24

Art 85.7
Human 81.6
Soil 79.3
Drawing 75.8
Person 68.1
Archaeology 66.5
Text 64.5
Painting 61.4
Sketch 57.2
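The listing gives each service's labels with a confidence score but no guidance on how to use them. A minimal sketch of one common step, filtering labels by a confidence threshold, using the Amazon Rekognition values transcribed from the list above (the 75% threshold is an arbitrary choice for illustration, not something the page specifies):

```python
# Amazon Rekognition labels transcribed from the listing above: (label, confidence %).
amazon_tags = [
    ("Art", 85.7), ("Human", 81.6), ("Soil", 79.3), ("Drawing", 75.8),
    ("Person", 68.1), ("Archaeology", 66.5), ("Text", 64.5),
    ("Painting", 61.4), ("Sketch", 57.2),
]

def confident_tags(tags, threshold=75.0):
    """Keep only the labels whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident_tags(amazon_tags))  # prints ['Art', 'Human', 'Soil', 'Drawing']
```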

Clarifai
created on 2020-04-24

people 99.2
retro 96.2
vintage 95.7
adult 95.3
art 94.9
wear 94.6
old 94.5
illustration 94.4
one 94
antique 93.9
desktop 92.8
paper 91.3
wall 88.3
dirty 87.2
print 87.2
no person 86.9
texture 86.8
pattern 86.4
abstract 84
ancient 82.8

Imagga
created on 2020-04-24

graffito 90.6
decoration 62.4
grunge 52.8
texture 44.5
old 42.5
aged 37.1
rough 34.7
wall 33.2
grungy 32.3
paint 31.7
dirty 31.7
pattern 31.5
vintage 31.4
antique 28.6
textured 28.1
weathered 27.6
retro 23.8
frame 23.3
art 22.1
surface 21.2
stained 21.2
damaged 21
canvas 20.9
border 20.8
material 20.6
dirt 20.1
snow 20.1
messy 19.3
detail 19.3
black 19.2
design 19.1
crater 18.9
aging 18.2
brown 17.7
wallpaper 17.6
rust 17.4
stain 17.3
ancient 16.4
worn 16.2
color 15.6
rusty 15.3
natural depression 15.2
text 14.8
space 14.7
close 14.3
paper 14.1
blank 13.7
torn 13.6
decay 13.5
stone 13.3
scratch 12.7
structure 12.6
backdrop 12.4
digital 12.2
decorative 11.7
geological formation 11.5
weather 11.3
graphic 10.9
architecture 10.9
burned 10.8
burnt 10.7
edge 10.6
backgrounds 10.5
painted 10.5
artistic 10.4
drawing 9.6
forest 9.6
industry 9.4
history 8.9
metal 8.9
noise 8.8
cracked 8.8
urban 8.7
detailed 8.7
obsolete 8.6
concrete 8.6
iron 8.4
element 8.3
effect 8.2
industrial 8.2
painting 8.1
gray 8.1
building 8
noisy 7.9
eroded 7.9
peeling 7.9
negative 7.9
rock 7.8
cement 7.8
crack 7.8
cold 7.8
closeup 7.4
metallic 7.4
artwork 7.3
drop 7.3
ice 7.1
fountain 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

drawing 99.7
sketch 99.4
text 87.4
child art 87.4
black and white 67.1
painting 22.2
stone 10.3
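The services disagree widely, so one simple way to read these lists side by side is to look at the labels two services agree on. A sketch using set intersection over the Amazon and Microsoft label names transcribed from the lists above (lowercased so the two services' casing conventions compare cleanly):

```python
# Label names transcribed from the listings above, lowercased for comparison.
amazon = {"art", "human", "soil", "drawing", "person", "archaeology",
          "text", "painting", "sketch"}
microsoft = {"drawing", "sketch", "text", "child art",
             "black and white", "painting", "stone"}

# Labels both services independently assigned to the image.
shared = amazon & microsoft
print(sorted(shared))  # prints ['drawing', 'painting', 'sketch', 'text']
```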

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 95.9%
Disgusted 0%
Sad 2.2%
Happy 7.2%
Angry 0.4%
Confused 0.1%
Surprised 0.2%
Fear 0%
Calm 89.9%
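Rekognition reports a score for every emotion rather than a single verdict; the usual reading is to take the highest-scoring one. A sketch over the scores transcribed from the face analysis above:

```python
# Emotion scores (%) from the AWS Rekognition face analysis above.
emotions = {
    "Disgusted": 0.0, "Sad": 2.2, "Happy": 7.2, "Angry": 0.4,
    "Confused": 0.1, "Surprised": 0.2, "Fear": 0.0, "Calm": 89.9,
}

# The dominant emotion is simply the key with the largest score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # prints Calm
```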

Feature analysis

Amazon

Person 68.1%
Painting 61.4%
