Human Generated Data

Title

Nymph and Satyr

Date

1991

People

Artist: Craig Dongoski, American, born 1964

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M22989

Machine Generated Data

Tags

Amazon
created on 2019-11-03

Canvas 99.1
Art 97.9
Modern Art 96.3
Painting 86.7
Human 84.6
Drawing 84.6
Person 62.8
Sketch 58
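
The scores in this list read as confidence percentages. For reference, here is a minimal sketch of how labels in this form could be requested from AWS Rekognition's DetectLabels operation via boto3; the local file name nymph_and_satyr.jpg and the thresholds are assumptions, not details of the pipeline that produced the values above.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the print's image.
with open("nymph_and_satyr.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50.0,  # confidence percentages, as in the list above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")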

Clarifai
created on 2019-11-03

painting 99.2
illustration 98.2
art 97.9
people 95.3
wear 93.6
adult 92.6
one 92.3
graphic 89.5
woman 89
no person 88
print 87.9
desktop 87.6
paper 87.2
color 85.8
visuals 85.1
bill 85.1
design 84.9
chalk out 84.3
vintage 83.8
child 83.8

Imagga
created on 2019-11-03

sketch 64.2
drawing 52.2
representation 50.5
map 38
grunge 35.8
vintage 35.6
old 35.5
texture 31.3
graffito 29.9
decoration 29.8
antique 27.7
aged 26.3
handkerchief 24.5
retro 23.8
art 22.2
wallpaper 21.5
design 20.8
graphic 19.7
piece of cloth 19.6
paper 19.6
dirty 19
frame 17.5
pattern 17.1
floral 17
geography 16.4
color 15.6
ancient 15.6
fabric 15
flower 14.6
stain 14.4
envelope 14.3
material 14.3
canvas 14.2
world 14.2
textured 14
paint 13.6
obsolete 13.4
backgrounds 13
decay 12.5
silhouette 12.4
structure 12.2
grain 12
effect 11.9
fracture 11.7
grungy 11.4
travel 11.3
painting 11
atlas 11
container 11
border 10.9
grime 10.8
continent 10.7
crack 10.7
surface 10.6
aging 10.6
style 10.4
planet 10.4
globe 10.2
leaf 10.1
earth 10.1
rough 10
decorative 10
element 9.9
mottled 9.8
worn 9.6
card 9.4
letter 9.2
painterly 8.9
smudged 8.9
text 8.7
forest 8.7
states 8.7
detailed 8.7
edge 8.7
wall 8.6
textures 8.5
space 8.5
artwork 8.2
creative 7.9
postmark 7.9
artistic 7.8
stamp 7.7
torn 7.7
mail 7.7
direction 7.6
old fashioned 7.6
plant 7.5
global 7.3
black 7.2

Google
created on 2019-11-03

Microsoft
created on 2019-11-03

drawing 99.7
child art 99.2
painting 99.1
art 98.6
sketch 98.2
text 96.1
cartoon 88.5
illustration 78.5
gallery 75.9
abstract 75.7
acrylic 56.4
orange 51.9
envelope 51.6

Color Analysis

Face analysis

AWS Rekognition

Age 10-20
Gender Female, 52.2%
Happy 52.2%
Confused 45.1%
Disgusted 45%
Calm 45.9%
Fear 45.1%
Sad 45.1%
Angry 45%
Surprised 46.6%
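
As a reference for how age range, gender, and per-emotion confidences in this form can be read back, here is a minimal sketch using AWS Rekognition's DetectFaces operation via boto3; the file name is hypothetical and this is not necessarily the exact call behind the figures above.

import boto3

rekognition = boto3.client("rekognition")

with open("nymph_and_satyr.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition for age range, gender, emotions, etc.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")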

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
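
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of reading those buckets with the google-cloud-vision client, assuming a hypothetical local file and default credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("nymph_and_satyr.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)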

Feature analysis

Amazon

Painting 86.7%
Person 62.8%

Categories

Captions

Microsoft
created on 2019-11-03

a close up of a map 57.1%
a map on the wall 48.3%
close up of a map 48.2%
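
Captions with confidence scores in this form resemble the output of Azure Computer Vision's Describe Image operation. A minimal sketch follows, assuming the azure-cognitiveservices-vision-computervision SDK, a placeholder endpoint and key, and a hypothetical local file; it is not the exact call behind the figures above.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute your own Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("nymph_and_satyr.jpg", "rb") as f:  # hypothetical file name
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")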

Text analysis

Google

Nnn
Nnn
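
Detected strings such as the ones above are the kind of result returned by Google Cloud Vision text detection, where the first annotation is the full block of recognized text. A minimal sketch, assuming a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("nymph_and_satyr.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full recognized text; later entries are words.
for annotation in response.text_annotations:
    print(annotation.description)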