Human Generated Data
Title
Portrait of M. Gampert
Date
1920
People
Artist: Roger de La Fresnaye, French, 1885-1925
Classification
Drawings
Credit Line
Harvard Art Museums/Fogg Museum, The Lois Orswell Collection, 1998.269
Machine Generated Data
Tags
Amazon
created on 2023-08-30
Art: 100%
Painting: 99.9%
Adult: 99.2%
Male: 99.2%
Man: 99.2%
Person: 99.2%
Drawing: 98.9%
Face: 91.4%
Head: 91.4%
Photography: 57.2%
Portrait: 57.2%
Clarifai
created on 2023-11-01
people: 99.6%
art: 99.1%
one: 98.5%
adult: 97.8%
man: 97.8%
portrait: 97.6%
print: 96.9%
illustration: 96.5%
wear: 95.8%
vintage: 93.7%
retro: 92.6%
old: 91.1%
antique: 90%
painting: 89%
nude: 87%
ancient: 86.5%
woman: 85.2%
veil: 84.5%
paper: 83.1%
chalk out: 82.6%
Imagga
created on 2018-12-27
sketch: 100%
representation: 100%
drawing: 100%
old: 23%
art: 20.3%
vintage: 19.8%
grunge: 19.6%
ancient: 18.2%
antique: 17.3%
paper: 14.9%
sculpture: 14.7%
retro: 13.9%
statue: 13.3%
man: 12.8%
texture: 12.5%
aged: 11.8%
religion: 11.6%
human: 11.2%
history: 10.7%
temple: 10.4%
black: 10.2%
body: 9.6%
famous: 9.3%
head: 9.2%
historic: 9.2%
portrait: 9.1%
symbol: 8.8%
architecture: 8.6%
culture: 8.5%
money: 8.5%
stone: 8.4%
person: 8.3%
one: 8.2%
currency: 8.1%
face: 7.8%
male: 7.8%
model: 7.8%
travel: 7.7%
bill: 7.6%
religious: 7.5%
monument: 7.5%
design: 7.5%
style: 7.4%
close: 7.4%
cash: 7.3%
detail: 7.2%
hair: 7.1%
Google
created on 2018-12-27
photograph: 95.3%
figure drawing: 89.7%
portrait: 87.1%
black and white: 86.6%
standing: 85.5%
drawing: 83.8%
sketch: 77.2%
art: 70.9%
artwork: 69%
monochrome: 68.8%
self portrait: 57.4%
visual arts: 56.9%
monochrome photography: 53.2%
Microsoft
created on 2018-12-27
drawing: 52.2%
sketch: 18.6%
Face analysis
Amazon
AWS Rekognition
Age: 50-58
Gender: Female, 77.6%
Surprised: 85.1%
Calm: 39.3%
Fear: 6.9%
Happy: 3.1%
Sad: 2.3%
Disgusted: 1.6%
Angry: 0.7%
Confused: 0.5%
Feature analysis
Amazon
Adult: 99.2%
Male: 99.2%
Man: 99.2%
Person: 99.2%
Categories
Imagga
text visuals
82.2%
paintings art
11.5%
interior objects
2.9%
food drinks
2.4%
Captions
Microsoft
created on 2018-12-27
an old photo of a person: 69.1%
old photo of a person: 62.5%
a black and white photo of a person: 56.8%
Text analysis
Amazon
1920