Human Generated Data

Title

Plate III

Date

1992

People

Artist: Richard Ryan, American, born 1950

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M21808

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Art 98.3
Painting 90.2
Human 78.7
Person 78.7

Imagga
created on 2022-01-22

old 32.1
memorial 31.4
vintage 27.3
grunge 25.5
paper 25.1
brass 24.7
retro 23.8
symbol 23.6
stencil 23.4
structure 22
texture 19.5
board 19.4
wall 18.9
sign 18.1
aged 17.2
envelope 15.6
gravestone 15.5
business 15.2
education 14.7
money 14.5
design 14.4
finance 14.4
school 14.4
technology 14.1
stamp 14
chalkboard 13.7
blank 13.7
learn 13.2
note 12.9
art 12.9
letter 12.8
frame 12.5
stone 12.4
study 12.1
ancient 12.1
decoration 11.8
power 11.8
metal 11.3
antique 11.3
dirty 10.8
wallpaper 10.7
textured 10.5
object 10.3
black 10.2
blade 10.2
card 10.1
communication 10.1
currency 9.9
close 9.7
button 9.7
text 9.6
international 9.6
post 9.5
rusty 9.5
bill 9.5
stucco 9.4
metallic 9.2
blackboard 9.2
rough 9.1
history 8.9
measurement 8.7
mail 8.6
concrete 8.6
empty 8.6
dollar 8.4
bank 8.3
banking 8.3
cash 8.2
message 8.2
material 8
graphic 8
science 8
container 8
postage 7.9
mass 7.9
name 7.8
cutting implement 7.8
system 7.6
weight 7.6
word 7.5
element 7.4
cover 7.4
brown 7.4
backgrounds 7.3
push button 7.3
border 7.2
world 7.2
financial 7.1
icon 7.1
idea 7.1
tool 7.1

Google
created on 2022-01-22

Plant 91
Tree 85.4
Wood 82.6
Art 82.5
Rectangle 79
Painting 76.9
Font 75
Trunk 74.6
Visual arts 68.2
Relief 66.7
Drawing 65.3
Wildlife 65.1
Palm tree 64.7
Illustration 64.2
Printmaking 63.8
Monochrome photography 62.3
Monochrome 61.7
History 60.5
Paper product 58.3
Artwork 57.8

Microsoft
created on 2022-01-22

gallery 99.6
room 99.1
scene 99
drawing 96.6
indoor 93
sketch 90.1
art 89.3
old 77.1
white 68.4
black 65
text 63.4
person 57.6
picture frame 53.4
painting 44.3

Face analysis

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 90.2%
Person 78.7%

Captions

Microsoft

a vintage photo of a painting 60.6%
an old photo of a painting 60.5%
a painting on the wall 60.4%

Text analysis

Amazon

1992
R.R 1992
R.R