Human Generated Data

Title

Imagine an Anchor Plate 4

Date

2004

People

Artist: Nathaniel Hester, American, born 1976

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, George R. Nutter Fund, M26185.4

Copyright

© 2004 Nathaniel Christopher Hester

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Art 92.3
Person 90.6
Human 90.6
Drawing 81.9
Painting 80.1
People 60
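
For context, label-and-score lists like the Amazon block above are the typical output of AWS Rekognition's DetectLabels operation. A minimal sketch using boto3 follows; the bucket and object names are placeholders, not part of this record:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical image location; not part of this record.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "prints/plate-4.jpg"}},
    MaxLabels=10,
    MinConfidence=60,
)

# Each label carries a 0-100 confidence score, comparable to
# "Art 92.3" or "Person 90.6" above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")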

Clarifai
created on 2021-04-03

people 99.6
art 99.6
painting 98.7
portrait 97.4
adult 96.9
print 96.8
vintage 96.5
woman 95.7
illustration 94.5
man 93.8
retro 93.5
group 93
wear 92.2
one 92.2
old 89.7
veil 89.6
child 88
two 85.3
furniture 84.5
lid 82.9

Imagga
created on 2021-04-03

padlock 100
lock 100
fastener 75.5
restraint 51.1
device 32.3
box 32.3
old 31.4
security 27.6
metal 26.6
chest 26.3
key 22.1
hasp 20.3
safe 19.6
safety 19.3
vintage 18.2
retro 17.2
catch 16.7
business 16.4
container 15.5
protection 15.5
object 14.7
secure 14.5
aged 14.5
close 14.3
brown 14
open 13.5
steel 13.3
money 12.8
paper 12.5
wood 12.5
iron 12.1
symbol 12.1
currency 11.7
gold 11.5
rusty 11.4
wooden 11.4
door 11.4
antique 11.3
equipment 11.2
grunge 11.1
texture 10.4
finance 10.1
bank 9.9
black 9.6
gift 9.5
card 9.4
metallic 9.2
concepts 8.9
locked 8.9
closed 8.7
rust 8.7
protect 8.7
frame 8.3
sign 8.3
cash 8.2
closeup 8.1
design 8
home 8
information 8
treasure 7.7
communication 7.6
word 7.5
decoration 7.5
stock 7.5
traditional 7.5
number 7.5
technology 7.4
banking 7.4
shopping 7.3
investment 7.3
rough 7.3
board 7.2
office 7.2
dirty 7.2
wealth 7.2
silver 7.1
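
Tag lists like the Imagga block above can be retrieved over Imagga's REST tagging API. The sketch below assumes the v2 /tags endpoint and uses placeholder credentials and a placeholder image URL:

import requests

# Placeholder credentials and image URL; not part of this record.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/plate-4.jpg"},
    auth=("your_api_key", "your_api_secret"),
)
response.raise_for_status()

# Tags come back with confidence scores, comparable to
# "padlock 100" or "lock 100" above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")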

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

gallery 98.4
room 97.5
indoor 96.3
scene 95.8
text 93.9
person 86.7
drawing 69.2
clothing 63.8
picture frame 51.4
old 49.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 84.3%
Sad 52.9%
Calm 43.3%
Angry 1.6%
Confused 1.2%
Fear 0.4%
Happy 0.3%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 19-31
Gender Male, 87.3%
Calm 62.9%
Sad 29.1%
Fear 5%
Happy 1.3%
Angry 1.1%
Confused 0.3%
Surprised 0.1%
Disgusted 0.1%
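
Age ranges, gender estimates, and emotion scores like those above are the typical output of AWS Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, again with a placeholder image location:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "prints/plate-4.jpg"}},
    Attributes=["ALL"],  # include age range, gender, emotions, etc.
)

# One FaceDetails entry per detected face, matching the two
# Rekognition blocks listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")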

Feature analysis

Amazon

Person 90.6%
Painting 80.1%

Categories

Imagga

paintings art 97.6%
text visuals 1.4%

Captions

Microsoft
created on 2021-04-03

an old photo of a person 37.8%
a close up of a box 37.7%
an old photo of a room 37.6%
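
Captions of this kind come from Microsoft's Computer Vision image description feature. The sketch below assumes the v3.2 "describe" REST endpoint and uses a placeholder endpoint, key, and image URL; treat the exact route and response shape as assumptions to verify against the current Azure documentation:

import requests

# Placeholder endpoint, key, and image URL; not part of this record.
response = requests.post(
    "https://example-resource.cognitiveservices.azure.com/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": "your_key"},
    json={"url": "https://example.org/plate-4.jpg"},
)
response.raise_for_status()

# Assumed response shape: candidate captions with 0-1 confidences,
# comparable to the three captions listed above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")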

Text analysis

Amazon

"imgine
hmeho
"imgine An hmeho pLit
pLit
An
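
Detected strings like the Amazon lines above typically come from Rekognition's DetectText operation, which returns both LINE and WORD detections; that is why the same fragments appear once as a full line and again as single words. A minimal sketch with a placeholder image:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "prints/plate-4.jpg"}}
)

# Each detection is either a full LINE or an individual WORD.
for detection in response["TextDetections"]:
    print(detection["Type"], repr(detection["DetectedText"]),
          f"{detection['Confidence']:.1f}%")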

Google

"tumagime An Amehon- PLat 1/10
"tumagime
An
Amehon-
PLat
1/10