Human Generated Data

Title

Olives

Date

1994

People

Artist: Sarah Smith, American, active 20th c.

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, George R. Nutter Fund, M24360

Machine Generated Data

Tags

Amazon
created on 2019-11-01

Art 88.5
Person 82.4
Human 82.4
Advertisement 75.6
Poster 74.8
Person 73.7
Drawing 60.9
Collage 58.1

Clarifai
created on 2019-11-01

retro 95.6
paper 95.2
blank 95.0
desktop 95.0
texture 94.4
empty 93.6
old 92.6
art 92.4
picture frame 92.4
vintage 91.4
cardboard 91.3
design 88.6
antique 88.6
abstract 88.5
nature 87.3
page 87.2
dirty 87.2
sand 84.7
card 84.4
parchment 84.3

Imagga
created on 2019-11-01

envelope 73.2
paper 47.9
container 47.3
old 39.8
texture 36.2
blank 36
grunge 35.8
vintage 34.8
notebook 33.6
antique 32.1
empty 30.1
retro 29.5
page 28.8
frame 28.5
aged 28.1
parchment 26.9
dirty 24.4
canvas 22.8
pattern 22.6
document 22.3
design 22.2
card 21.9
ancient 21.6
brown 21.4
border 19.9
space 19.4
textured 19.3
stained 19.2
message 19.2
rough 19.2
letter 17.4
worn 17.2
damaged 17.2
grungy 17.1
cardboard 16.3
book 16.3
post 16.2
board 15.9
decorative 15.9
note 14.7
yellow 14.6
art 14.4
material 14.3
stamp 13.3
copy 13.3
sheet 13.2
text 13.1
wallpaper 13.0
backgrounds 13.0
manuscript 12.7
wall 12.1
binding 11.5
aging 11.5
object 11
symbol 10.8
stains 10.7
torn 10.7
sign 10.5
mail 10.5
old fashioned 10.5
detail 10.5
office 10.5
write 10.4
surface 9.8
business 9.7
crumpled 9.7
color 9.5
book jacket 9.5
cover 9.3
drawing 9.2
collection 9
postage 8.9
artistic 8.7
sketch 8.7
corner 8.7
paint 8.2
package 7.9
scratched 7.8
faded 7.8
burnt 7.8
weathered 7.6
jacket 7.4

Google
created on 2019-11-01

Microsoft
created on 2019-11-01

wall 99.9
drawing 99.1
indoor 93.9
sketch 93.1
cartoon 90.8
text 86.8
gallery 70.2
white 63.3
envelope 58.3
room 56.3
painting 41.4
picture frame 7.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 54.6%
Disgusted 45.1%
Angry 45.6%
Surprised 46.0%
Happy 45.1%
Calm 49.3%
Sad 45.5%
Fear 48.3%
Confused 45.2%

Feature analysis

Amazon

Person 82.4%

Categories

Imagga

paintings art 99.9%

Captions

Text analysis

Amazon

resmSmh'sy

Google

Ale
Ale