Human Generated Data

Title

A Landscape, Minerva and the Muses

Date

18th-19th century

People

Artist: Richard Earlom, British 1743 - 1822

Artist after: Claude Lorrain, French 1604 - 1682

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gray Collection of Engravings Fund, G8085

Machine Generated Data

Tags

Amazon
created on 2019-11-05

Painting 99.5
Art 99.5
Human 98.4
Person 98.4
Person 91.9
Person 89.4
Person 65.8
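
The label names and confidence scores above match the shape of Amazon Rekognition's DetectLabels response. The following is a minimal sketch of how such tags could be retrieved, assuming boto3 is configured with AWS credentials; the image filename is hypothetical.

    import boto3

    # Sketch: produce labels like the ones listed above with
    # Amazon Rekognition's DetectLabels API (filename is hypothetical).
    rekognition = boto3.client("rekognition")

    with open("earlom_minerva_and_the_muses.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=60,  # roughly matches the lowest score shown above
    )

    for label in response["Labels"]:
        # Each label carries a name and a confidence percentage,
        # e.g. "Painting 99.5", "Person 98.4".
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

The repeated Person rows above likely correspond to per-instance detections, which the API exposes under each label's Instances list.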

Clarifai
created on 2019-11-05

people 99.3
art 99.1
painting 98.8
adult 98.3
wear 98.1
illustration 95.5
print 93
one 92.5
group 91.3
man 90.8
vintage 90.4
picture frame 90.2
old 87.8
wall 86.6
no person 86
retro 85.6
antique 85.3
two 83.9
war 82.6
interaction 82.5
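
These concepts have the shape of a Clarifai v2 prediction response. A sketch under stated assumptions: an app-scoped Clarifai API key and the general image-recognition model; the key, model ID, and image URL below are placeholders.

    import requests

    # Sketch: request concepts like those above from Clarifai's v2
    # prediction endpoint (key, model ID, and image URL are placeholders).
    CLARIFAI_KEY = "YOUR_API_KEY"
    MODEL_ID = "general-image-recognition"
    IMAGE_URL = "https://example.org/earlom_minerva_and_the_muses.jpg"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Concepts come back with a name and a 0-1 score,
        # e.g. "people" at 0.993, shown above as "people 99.3".
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')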

Imagga
created on 2019-11-05

graffito 56.1
grunge 45.1
decoration 43.4
old 43.2
vintage 33.9
memorial 33.7
antique 33
aged 32.6
texture 32
structure 31.6
ancient 31.1
wall 30.6
brass 26.7
retro 25.4
art 24.2
frame 21.6
paper 21.2
grungy 20.9
border 20.8
textured 20.2
dirty 19.9
damaged 19.1
material 18.8
design 18.5
pattern 17.8
brown 16.2
empty 15.5
decay 15.4
detail 15.3
surface 15
faded 14.6
rusty 14.3
stone 14.3
weathered 14.3
architecture 14.1
parchment 13.4
worn 13.4
old fashioned 13.3
space 13.2
gravestone 13.1
dark 12.5
wallpaper 12.3
grain 12
graphic 11.7
crumpled 11.7
torn 11.6
backdrop 11.5
stain 11.5
stucco 11.3
blank 11.1
historic 11
grime 10.7
mottled 10.7
crack 10.7
detailed 10.6
stained 10.6
obsolete 10.5
sculpture 10.5
building 10.4
artwork 10.1
rough 10
fracture 9.7
stains 9.7
black 9.6
artistic 9.6
culture 9.4
wood 9.2
paint 9.1
history 8.9
backgrounds 8.9
color 8.9
ragged 8.8
aging 8.6
canvas 8.5
textures 8.5
travel 8.5
house 8.4
style 8.2
tracery 7.8
burnt 7.8
rust 7.7
edge 7.7
spot 7.7
dirt 7.6
decorative 7.5
close 7.4
exterior 7.4
carving 7.1
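
The Imagga tags follow the structure of Imagga's v2 tagging endpoint, which pairs each tag with a confidence score. A minimal sketch, assuming an Imagga API key and secret for HTTP basic auth; the credentials and image URL are placeholders.

    import requests

    # Sketch: fetch tags like those above from Imagga's v2 tagging
    # endpoint (credentials and image URL are placeholders).
    IMAGGA_KEY = "YOUR_API_KEY"
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/earlom_minerva_and_the_muses.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    for entry in resp.json()["result"]["tags"]:
        # Each entry pairs a confidence with a tag object keyed by language,
        # e.g. {"confidence": 56.1, "tag": {"en": "graffito"}}.
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')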

Google
created on 2019-11-05

Microsoft
created on 2019-11-05

drawing 95.2
sketch 82.1
old 80.7
text 64
person 52.1
painting 18.3
dirty 10.6
picture frame 7.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Male, 50.2%
Disgusted 49.5%
Sad 50.3%
Calm 49.7%
Confused 49.5%
Fear 49.5%
Surprised 49.5%
Happy 49.5%
Angry 49.5%

AWS Rekognition

Age 13-23
Gender Male, 50.3%
Fear 49.5%
Surprised 49.5%
Confused 49.5%
Calm 49.7%
Disgusted 49.6%
Happy 49.6%
Sad 49.5%
Angry 50%

AWS Rekognition

Age 23-35
Gender Male, 50.3%
Happy 49.5%
Calm 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Sad 49.5%
Angry 49.6%
Fear 50.3%

AWS Rekognition

Age 8-18
Gender Female, 50%
Sad 50.3%
Calm 49.6%
Disgusted 49.5%
Angry 49.5%
Confused 49.5%
Surprised 49.5%
Happy 49.6%
Fear 49.5%

AWS Rekognition

Age 25-39
Gender Female, 50.3%
Fear 49.6%
Happy 49.5%
Angry 49.6%
Sad 49.9%
Confused 49.5%
Calm 49.8%
Disgusted 49.5%
Surprised 49.5%

AWS Rekognition

Age 20-32
Gender Female, 50.2%
Angry 49.6%
Fear 49.9%
Calm 49.8%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Sad 49.6%
Happy 49.5%
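
The six face entries above, each with an age range, a gender estimate, and eight emotion scores, match the output of Amazon Rekognition's DetectFaces call with all facial attributes requested. A minimal sketch, assuming configured AWS credentials; the filename is hypothetical.

    import boto3

    # Sketch: produce per-face age, gender, and emotion estimates like those
    # above with Rekognition's DetectFaces API (filename is hypothetical).
    rekognition = boto3.client("rekognition")

    with open("earlom_minerva_and_the_muses.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # Emotion types such as CALM, SAD, HAPPY, each with a confidence,
            # corresponding to the rows listed above.
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')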

Feature analysis

Amazon

Painting 99.5%
Person 98.4%

Categories

Captions

Microsoft
created on 2019-11-05

a painting of a person 77.1%
an old photo of a person 76.4%
old photo of a person 72.8%
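
These candidate captions, each with a confidence score, have the shape of the Azure Computer Vision Describe operation. A sketch under stated assumptions: the endpoint, subscription key, image URL, and API version below are placeholders and may differ from what was actually used.

    import requests

    # Sketch: request captions like those above from the Azure Computer
    # Vision "describe" operation (endpoint, key, image URL, and API
    # version are placeholders).
    AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"
    IMAGE_URL = "https://example.org/earlom_minerva_and_the_muses.jpg"

    resp = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    for caption in resp.json()["description"]["captions"]:
        # Each candidate caption carries text and a 0-1 confidence,
        # e.g. "a painting of a person" at 0.771, shown above as 77.1%.
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')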