Human-Generated Data

Title

Plate XVII

Date

1992

People

Artist: Richard Ryan, American, born 1950

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M21972

Machine-Generated Data

Tags

Amazon
created on 2022-01-22

Art 94.3
Painting 86.6
Monitor 84.4
Electronics 84.4
Screen 84.4
Display 84.4
Canvas 82.7
Person 78.5
Human 78.5
Face 68.4
Person 66.8
Portrait 62.7
Photography 62.7
Photo 62.7
Floor 59.1
Indoors 57.4
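
The label/confidence pairs above are typical output of image-labeling APIs; the Clarifai, Imagga, and Microsoft lists that follow are analogous. As a minimal sketch of how such tags are produced, the following calls AWS Rekognition's DetectLabels via boto3 (credentials assumed configured; the filename "plate_xvii.jpg" is hypothetical):

```python
import boto3

# Minimal sketch: send a local image to AWS Rekognition and print
# label/confidence pairs like the Amazon tag list above.
# Assumes AWS credentials are configured; "plate_xvii.jpg" is hypothetical.
client = boto3.client("rekognition")

with open("plate_xvii.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```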

Clarifai
created on 2023-10-26

art 99.1
museum 98.8
painting 97.8
portrait 97.7
wall 96.9
picture frame 95.6
vintage 95.1
people 94.3
image 94.1
wood 93.7
old 93.3
architecture 92.5
retro 92
landscape 91.9
tree 91.7
window 91.4
illustration 91.3
abstract 91
desktop 90.9
square 89.2

Imagga
created on 2022-01-22

monitor 57.7
television 43.6
blackboard 42.3
equipment 34.1
electronic equipment 33.3
frame 27.9
telecommunication system 23.6
black 21.6
old 21.6
blank 21.4
grunge 20.4
technology 20
vintage 19.8
chalkboard 19.6
billboard 18.5
film 18.2
board 18.1
object 17.6
texture 17.4
retro 15.6
empty 15.4
screen 15.4
school 15.2
space 14.7
chalk 14.6
paper 14.1
business 14
display 13.7
computer 13.1
education 13
photograph 12.9
note 12.9
notice 12.6
microwave 12.3
digital 12.1
antique 12.1
wall 12
aged 11.8
copy 11.5
sign 11.3
design 11.2
electronic 11.2
finance 11
single 10.7
learn 10.4
symbol 10.1
message 10
border 9.9
kitchen appliance 9.8
modern 9.8
home 9.6
3d 9.3
global 9.1
dirty 9
broadcasting 9
negative 8.9
information 8.8
lesson 8.8
closeup 8.8
classroom 8.7
reminder 8.7
text 8.7
signboard 8.7
flat 8.7
wide 8.6
system 8.6
close 8.6
photography 8.5
grungy 8.5
money 8.5
card 8.5
write 8.5
communication 8.4
pattern 8.2
brown 8.1
financial 8
device 7.9
textured 7.9
snapshot 7.8
teach 7.8
laptop 7.8
worn 7.6
media 7.6
old fashioned 7.6
wood 7.5
savings 7.5
cover 7.4
home appliance 7.4
camera 7.4
banking 7.3
security 7.3
letter 7.3
appliance 7.3
structure 7.3
gray 7.2
wealth 7.2
bank 7.2
work 7.1
wooden 7

Microsoft
created on 2022-01-22

gallery 99.3
scene 98.9
room 98.4
art 97
drawing 94.4
painting 88.6
picture frame 86.4
sketch 75.5
black and white 74.4
white 61.1
text 58.7
old 53.6

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 58.5%
Calm 97.9%
Sad 0.9%
Disgusted 0.3%
Fear 0.2%
Surprised 0.2%
Angry 0.1%
Happy 0.1%
Confused 0.1%
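
The age range, gender, and emotion scores above are face attributes from Rekognition's DetectFaces. A minimal sketch of the same call with boto3, under the same assumptions as the labeling example:

```python
import boto3

# Minimal sketch: request full face attributes (age range, gender,
# emotions) from AWS Rekognition, as listed above.
# "plate_xvii.jpg" is a hypothetical filename.
client = boto3.client("rekognition")

with open("plate_xvii.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # the default returns only a minimal subset
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```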

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
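
The ratings above come from face detection in the Google Cloud Vision API, which reports enum likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch, assuming application credentials are set and the same hypothetical filename:

```python
from google.cloud import vision

# Minimal sketch: run Google Cloud Vision face detection and print
# likelihood ratings like the list above.
# "plate_xvii.jpg" is a hypothetical filename.
client = vision.ImageAnnotatorClient()

with open("plate_xvii.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{field}_likelihood")
        print(field.capitalize(), likelihood.name.replace("_", " ").capitalize())
```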

Feature analysis

Amazon

Monitor 84.4%
Person 78.5%

Captions

Microsoft
created on 2022-01-22

an old photo of a room 70.6%
an old photo of a living room 38%
old photo of a room 37.9%

Text analysis

Amazon

3/30
XVII
P.R.1992

Google

R.RIS72
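
Strings like "3/30", "XVII", and "P.R.1992" are line-level OCR detections; Rekognition's DetectText returns them alongside word-level duplicates. A minimal sketch, under the same assumptions as the examples above:

```python
import boto3

# Minimal sketch: detect text in the image with AWS Rekognition and
# print line-level results like "3/30", "XVII", "P.R.1992" above.
# "plate_xvii.jpg" is a hypothetical filename.
client = boto3.client("rekognition")

with open("plate_xvii.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"])
```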