Human Generated Data

Title

Plate X

Date

1992

People

Artist: Richard Ryan, American, born 1950

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M21820

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Art 98.4
Painting 89.2
Person 76.5
Human 76.5
Rock 75.6
Art Gallery 74.3
Person 69.9
Sculpture 66.3
Floor 64.3
Modern Art 60.9
Canvas 55.4
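Tag lists like the Amazon one above pair a label name with a confidence score (in percent). A minimal sketch of how such a listing could be formatted from an AWS Rekognition `detect_labels`-style response follows; the `response` dict here is a hypothetical stand-in for the real boto3 output, not data from this record's actual API call:

```python
# Hypothetical stand-in for the dict returned by
# boto3.client("rekognition").detect_labels(Image=..., MinConfidence=55)
response = {
    "Labels": [
        {"Name": "Art", "Confidence": 98.4},
        {"Name": "Painting", "Confidence": 89.2},
        {"Name": "Person", "Confidence": 76.5},
    ]
}

def format_labels(resp):
    # Emit one "Name Confidence" line per label, with the confidence
    # rounded to one decimal place, matching the listing style above.
    return [f'{label["Name"]} {label["Confidence"]:.1f}'
            for label in resp["Labels"]]

for line in format_labels(response):
    print(line)
```

The real response also carries bounding boxes and parent categories per label; this sketch keeps only the name/confidence pairs shown in listings like this one.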

Clarifai
created on 2023-10-26

art 99.3
museum 99.2
painting 98.8
portrait 98.1
people 96
wood 95.1
picture frame 94.7
wall 94.5
exhibition 94.4
square 93.6
landscape 93.4
image 93.1
nature 92.9
girl 92.7
desktop 90.5
sea 90.2
no person 89.1
vintage 88.6
tree 87.4
shadow 87.3

Imagga
created on 2022-01-22

blackboard 39.6
television 39
monitor 38.2
frame 33.7
blank 29.1
telecommunication system 26
black 22.3
vintage 22.3
note 21.1
paper 19.7
chalkboard 19.6
old 19.5
board 19
empty 18.9
chalk 17.5
design 17.4
school 17
education 16.4
message 16.4
electronic equipment 16.1
equipment 15.9
film 14.9
object 14.7
business 14.6
text 14
billboard 13.8
grunge 13.6
notice 13.6
sign 13.5
symbol 13.5
communication 13.4
photograph 13.3
web site 13.3
space 13.2
technology 12.6
texture 12.5
retro 12.3
screen 11.9
art 11.8
border 11.8
card 11.3
snapshot 11.1
aged 10.9
reminder 10.7
write 10.3
display 10.1
instant 9.8
modern 9.8
classroom 9.7
layout 9.7
antique 9.6
flat 9.6
college 9.5
learn 9.4
finance 9.3
global 9.1
single 9
computer 9
pattern 8.9
copy 8.8
silver 8.8
lesson 8.8
teach 8.8
drawing 8.7
nobody 8.5
photography 8.5
electronic 8.4
letter 8.3
office 8
class 7.7
mail 7.7
money 7.7
worn 7.6
writing 7.5
brown 7.4
digital 7.3
financial 7.1
idea 7.1
science 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

scene 99.8
gallery 99.7
room 99.6
art 92.8
picture frame 88.3
drawing 70.2
old 65.5
painting 33.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 99.5%
Calm 99.3%
Sad 0.3%
Fear 0.2%
Happy 0.1%
Surprised 0.1%
Confused 0%
Disgusted 0%
Angry 0%

Feature analysis

Amazon

Painting 89.2%
Person 76.5%

Captions

Microsoft
created on 2022-01-22

a vintage photo of a painting 59.3%
an old photo of a painting 59.2%
a painting on the wall 59.1%

Text analysis

Amazon

RR
RR 1992
1992

Google

RR1992