Human Generated Data
Title: Woman Seated on Tree Trunk ("Crazy Mag")
Date: 19th-20th century
Artist: Eugene Higgins (American, 1874-1958)
Classification: Drawings
Credit Line: Harvard Art Museums/Fogg Museum, Gift of James N. Rosenberg, 1929.185
Machine Generated Data
Tags
Amazon
created on 2020-05-02
Art (95.5%), Painting (86.3%), Person (79.3%), Human (79.3%)
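
Label-and-confidence pairs like these can be reproduced with Amazon Rekognition's DetectLabels API. A minimal boto3 sketch follows; the image file name is hypothetical, and this is not the museum's actual pipeline:

    import boto3

    def detect_labels(image_path, min_confidence=50.0):
        # Assumes AWS credentials are configured in the environment.
        client = boto3.client("rekognition")
        with open(image_path, "rb") as f:
            response = client.detect_labels(
                Image={"Bytes": f.read()},
                MinConfidence=min_confidence,
            )
        # Each label carries a name and a 0-100 confidence score,
        # the same "tag (score%)" pairs listed above.
        return [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]

    # Hypothetical usage:
    # for name, conf in detect_labels("1929.185.jpg"):
    #     print(f"{name} ({conf:.1f}%)")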
Clarifai
created on 2020-05-02
no person (97.4%), one (97.3%), people (95.8%), art (94.2%), two (92.9%), tree (91.2%), monochrome (90.8%), painting (90.2%), print (89.4%), tunnel (89.1%), mammal (88.8%), man (83.7%), illustration (80.4%), hole (80.1%), adult (78.6%), portrait (77.6%), old (75.3%), sooty (72.5%), wall (71.8%), child (70.7%)
Imagga
created on 2020-05-02
hole (58.7%), old (38.3%), wall (29.1%), texture (27.8%), antique (27.7%), grunge (27.2%), fastener (25.2%), vintage (24.0%), aged (22.6%), ancient (20.7%), stone (20.6%), nut and bolt (20.3%), device (20.1%), restraint (18.4%), grungy (17.1%), detail (16.9%), rust (16.4%), art (16.3%), brown (16.2%), rough (14.6%), dirty (14.5%), rusty (14.3%), weathered (14.2%), paint (13.6%), cell (13.3%), worn (12.4%), latch (12.4%), door (12.4%), surface (12.3%), retro (12.3%), knocker (11.5%), textured (11.4%), design (10.8%), material (10.7%), black (10.2%), architecture (10.2%), iron (9.6%), damaged (9.5%), building (9.5%), closeup (9.4%), culture (9.4%), catch (9.4%), house (9.2%), wallpaper (9.2%), baby (8.9%), metal (8.8%), paper (8.8%), rock (8.7%), stain (8.6%), empty (8.6%), close (8.6%), color (8.3%), frame (8.3%), religion (8.1%), rustic (7.7%), painted (7.6%), pattern (7.5%), water (7.3%), window (7.1%)
Google
created on 2020-05-02
Art (83.1%), Painting (82.2%), Tree (79.9%), Visual arts (78.2%), Illustration (73.2%), Black-and-white (68.3%), Stock photography (67.6%), Modern art (67.4%), Photography (62.4%), Drawing (60.7%), Plant (58.5%), Artwork (51.1%), Picture frame (50.8%), Still life photography (50.2%)
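
Google's labels correspond to the Cloud Vision label-detection endpoint. A minimal sketch with the google-cloud-vision client (file name hypothetical); note the client returns scores on a 0-1 scale, which the listing above presents as percentages:

    from google.cloud import vision

    def google_labels(image_path):
        client = vision.ImageAnnotatorClient()
        with open(image_path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.label_detection(image=image)
        # label.score is a 0-1 float; scale by 100 to match the
        # percentages shown above.
        return [(l.description, l.score * 100) for l in response.label_annotations]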
Microsoft
created on 2020-05-02
drawing (99.3%), sketch (98.7%), text (91.7%), child art (88.9%), cave (67.7%), art (66.1%), black and white (65.3%), old (52.1%), dirty (32.0%), dirt (25.6%), painting (18.0%)
Face analysis
Amazon
AWS Rekognition
Age: 42-60
Gender: Male (85.9%)
Emotions: Sad (32.6%), Calm (19.2%), Surprised (16.4%), Confused (14.2%), Angry (11.3%), Fear (4.1%), Disgusted (1.5%), Happy (0.7%)
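
The age, gender, and emotion estimates map directly onto the fields Rekognition's DetectFaces API returns. A minimal sketch under the same assumptions as the label example above:

    import boto3

    def analyze_faces(image_path):
        client = boto3.client("rekognition")
        with open(image_path, "rb") as f:
            response = client.detect_faces(
                Image={"Bytes": f.read()},
                Attributes=["ALL"],  # request age range, gender, and emotions
            )
        for face in response["FaceDetails"]:
            age = face["AgeRange"]    # {"Low": ..., "High": ...}
            gender = face["Gender"]   # {"Value": ..., "Confidence": ...}
            print(f"Age: {age['Low']}-{age['High']}")
            print(f"Gender: {gender['Value']} ({gender['Confidence']:.1f}%)")
            for e in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
                print(f"{e['Type'].title()} ({e['Confidence']:.1f}%)")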
Feature analysis
Amazon
Painting (86.3%), Person (79.3%)
Categories
Imagga
paintings art (36.2%), pets animals (34.9%), streetview architecture (18.0%), nature landscape (6.8%), people portraits (1.3%), interior objects (1.1%)
Captions
Microsoft
created on 2020-05-02
a painting in a dirty area (69.1%)
a painting on the wall (58.9%)
a painting of a dirty field (58.8%)
Text analysis
Google
Cogene
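
The detected string (likely a misreading of a signature or inscription) is the kind of output Cloud Vision's text detection returns. A minimal sketch, with the same hypothetical file name:

    from google.cloud import vision

    def google_text(image_path):
        client = vision.ImageAnnotatorClient()
        with open(image_path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.text_detection(image=image)
        # The first annotation, if any, is the full detected text block;
        # the rest are individual words/segments.
        annotations = response.text_annotations
        return annotations[0].description if annotations else ""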