Human Generated Data

Title

Plate XX

Date

1992

People

Artist: Richard Ryan, American, born 1950

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M21975

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Human 99.3
Person 99.3
Art 93.5
Painting 90.9
Person 86.3

Clarifai
created on 2019-03-22

picture frame 98.9
margin 97.2
retro 97.2
art 97
desktop 94.8
museum 93.8
blank 93.7
old 93.4
illustration 93.2
painting 92.9
design 92.6
vintage 92.4
image 91.9
album 91.7
exhibition 91.6
moment 90.6
people 90.6
decoration 90.2
antique 89.4
wall 88.8

Imagga
created on 2019-03-22

vintage 32.3
old 31.4
insulating material 24.8
frame 24.3
grunge 22.1
paper 22
retro 20.5
blackboard 20.5
building material 18.6
texture 17.4
stamp 17.4
business 17
wall 16.3
symbol 16.2
envelope 16
chalkboard 15.7
antique 15.6
board 15.5
money 15.3
note 14.7
container 14.6
blank 14.6
aged 14.5
finance 14.4
design 14.2
decoration 14
letter 13.8
currency 13.5
blade 13.1
education 13
sign 12.8
global 12.8
dirty 12.7
black 12.6
school 12.6
bill 12.4
device 12.2
ancient 12.1
close 12
post 11.4
empty 11.2
cash 11
message 11
art 10.7
text 10.5
learn 10.4
drawing 10.3
object 10.3
card 10.2
cutting implement 10
bank 9.9
chalk 9.7
tool 9.4
study 9.3
savings 9.3
dollar 9.3
paint 9.1
border 9
seal 9
technology 8.9
microprocessor 8.9
postage 8.8
icon 8.7
mail 8.6
exchange 8.6
rusty 8.6
space 8.5
wallpaper 8.4
page 8.4
pattern 8.2
financial 8
reminder 7.8
stained 7.7
international 7.6
book jacket 7.6
worn 7.6
grungy 7.6
communication 7.6
poster 7.6
print 7.5
cover 7.4
chip 7.4
banking 7.4
film 7.3
world 7.2
home 7.2
copy 7.1
die 7.1
wooden 7
textured 7
photograph 7

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

gallery 98.2
scene 94.1
room 91.2
abstract 52
picture frame 12.6
art 12.6
stamp 10.9
monochrome 9.7
black and white 6.8
print 5.8
illustration 5.1

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 51.9%
Disgusted 45.1%
Angry 45.4%
Happy 45.2%
Confused 45.3%
Calm 50.2%
Surprised 45.1%
Sad 48.6%

AWS Rekognition

Age 20-38
Gender Male, 51.4%
Calm 52.8%
Disgusted 45%
Confused 45.1%
Happy 45.5%
Sad 45.5%
Surprised 45.3%
Angry 45.8%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

an old photo of a room 66.9%
a black and white photo 44.2%
a black and white photo of a room 44.1%