Human Generated Data

Title

Untitled LII

Date

1992-1993

People

Artist: Christopher Le Brun, British, b. 1951

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Pearl K. and Daniel Bell, M25344

Machine Generated Data

Tags

Amazon
created on 2019-11-01

Person 98.9
Human 98.9
Art 98.7
Person 97.8
Person 95.3
Painting 85.3
Drawing 73.6
Sketch 73.2
Art Gallery 68.2
Modern Art 59.3
Canvas 56.3

Clarifai
created on 2019-11-01

art 99.4
vintage 98.8
retro 98.5
illustration 98.4
design 98.1
old 97.7
artistic 97.7
shot 97.4
painting 97.4
wear 96.7
paper 96.6
letter 96.6
brush 96.6
graphic 96.2
picture frame 96.2
desktop 96
dirty 95.7
decoration 95.2
print 95.2
image 94.4

Imagga
created on 2019-11-01

book jacket 33.6
paper 31.4
old 30
envelope 28.2
vintage 28.1
jacket 26.2
container 25.9
blank 22.3
retro 22.1
wrapping 21.8
frame 21.6
binding 21.1
design 20.8
art 20.4
drawing 19.8
grunge 19.6
texture 19.4
packet 19.4
post 18.1
aged 18.1
package 17.9
sketch 17.4
stamp 17.3
card 17.1
letter 16.5
antique 16.4
border 16.3
covering 15.4
mail 15.3
symbol 14.1
page 13.9
black 13.8
note 13.8
postmark 12.8
postage 12.8
dirty 11.7
pattern 11.6
representation 11.5
textured 11.4
object 11
book 10.9
decoration 10.9
office 10.4
empty 10.3
floral 10.2
message 10
paint 10
postal 9.8
business 9.7
ancient 9.5
graphic 9.5
flower 9.2
notebook 9
sign 9
decor 8.8
album 8.8
grungy 8.5
greeting 8.3
global 8.2
brown 8.1
currency 8.1
philately 7.9
holiday 7.9
photograph 7.8
decorative 7.5
document 7.4
closeup 7.4
cover 7.4
collection 7.2
celebration 7.2
board 7.2
history 7.2
material 7.1
creative 7.1
leaf 7

Google
created on 2019-11-01

Microsoft
created on 2019-11-01

drawing 99.3
sketch 97.1
painting 95.2
gallery 95.1
scene 92.6
art 92
text 91.3
room 88
person 75.1

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 50.4%
Disgusted 49.5%
Sad 49.6%
Calm 50.4%
Confused 49.5%
Fear 49.5%
Surprised 49.5%
Happy 49.5%
Angry 49.5%

AWS Rekognition

Age 29-45
Gender Female, 50.1%
Happy 49.5%
Surprised 49.5%
Fear 49.5%
Disgusted 49.5%
Angry 50%
Confused 49.5%
Sad 49.5%
Calm 49.9%

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 99.5%

Captions

Microsoft
created on 2019-11-01

a close up of a sign 58.8%
a sign on a wall 44.7%
a white sign with black text 32.9%