Human Generated Data

Title

Royal Road Test

Date

1967

People

Artist: Edward Ruscha, American, born 1937

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M24200

Copyright

© Ed Ruscha

Machine Generated Data

Tags

Amazon
created on 2022-06-25

Person 99.8
Human 99.8
Soil 96.4
Ground 89.5
Page 87.7
Text 87.7
Person 76.7
Road 66.3
Clothing 66.2
Apparel 66.2
Photography 64.3
Photo 64.3
Portrait 60.6
Face 60.6
Tarmac 58.6
Asphalt 58.6
Standing 57.8
Pants 57
Word 56.5
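
The label list above is Amazon Rekognition output. As a rough illustration, here is a minimal sketch of how such name/confidence pairs are typically retrieved with boto3's DetectLabels call; the file name, region, and thresholds are placeholders, not values tied to this record.

```python
# Hedged sketch: label/confidence pairs like "Person 99.8" via Amazon
# Rekognition DetectLabels (boto3). All inputs below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("royal_road_test.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # assumed cap on returned labels
    MinConfidence=50.0,  # assumed confidence floor
)

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```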

Clarifai
created on 2023-10-29

people 99.4
man 97.6
adult 97.3
outdoors 96.8
one 95.9
no person 93.4
woman 92.4
monochrome 91.9
sand 89.8
empty 89.3
two 87.7
nature 86.7
horizontal 86.7
child 86.4
freedom 85.2
beach 83.8
business 83.6
horizontal plane 81.9
water 81.1
solitude 79.9
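
The concept list above is Clarifai output. A hedged sketch against Clarifai's public v2 predict endpoint follows; the model id, access token, image URL, and exact request shape are assumptions and may additionally require user/app identifiers.

```python
# Hedged sketch: concept/confidence pairs via Clarifai's v2 predict REST
# endpoint. Model id, token, and image URL are placeholders/assumptions.
import requests

PAT = "YOUR_CLARIFAI_ACCESS_TOKEN"       # placeholder token
MODEL_ID = "general-image-recognition"   # assumed general-purpose model id

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value; scale to percent.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```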

Imagga
created on 2022-06-25

wall 38.5
old 31.4
texture 28.5
vintage 26.5
grunge 23.8
pattern 23.2
paper 21.2
aged 20.8
retro 20.5
stone 18.5
frame 18.3
rough 18.2
handsaw 18.2
antique 18.2
canvas 18
material 17.9
surface 17.6
padlock 17.6
ancient 17.3
border 16.3
dirty 16.3
grungy 16.1
textured 15.8
weathered 15.2
fortress 14.8
empty 14.6
blank 14.6
saw 14.5
paint 14.5
lock 13.7
detail 13.7
fastener 13.6
architecture 13.3
art 13.1
brown 12.5
space 12.4
design 12.4
wallpaper 12.3
backdrop 11.5
damaged 11.4
construction 11.1
hand tool 11
color 10.6
tourism 9.9
decay 9.6
style 9.6
rust 9.6
text 9.6
structure 9.4
sheet 9.4
restraint 9.2
travel 9.2
clothing 9.1
iron 9.1
notebook 9
newspaper 8.9
decoration 8.9
building 8.7
torn 8.7
worn 8.6
black 8.4
historic 8.3
copy space 8.1
backgrounds 8.1
closeup 8.1
graphic 8
envelope 7.9
sea 7.8
great 7.7
textile 7.6
fabric 7.5
note 7.4
template 7.3
letter 7.3
tower 7.2
brick 7.2
film 7.1
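
The tag list above is Imagga output. A hedged sketch against Imagga's /v2/tags endpoint follows; the credentials and image URL are placeholders.

```python
# Hedged sketch: tag/confidence pairs via Imagga's /v2/tags endpoint.
# Credentials and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),           # placeholder credentials
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, e.g. "wall 38.5".
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```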

Google
created on 2022-06-25

Microsoft
created on 2022-06-25

text 98.7
outdoor 98.3
person 87
black and white 72.8
man 62.8
monochrome 57.3

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 65.6%
Sad 99.9%
Calm 24.6%
Surprised 6.4%
Fear 5.9%
Angry 0.7%
Disgusted 0.3%
Happy 0.3%
Confused 0.2%
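
The age range, gender, and emotion percentages above follow the shape of Amazon Rekognition face detection. A minimal sketch with boto3's DetectFaces call and all facial attributes requested; the image source is a placeholder.

```python
# Hedged sketch: age range, gender, and emotion scores via Amazon
# Rekognition DetectFaces with all facial attributes requested.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("royal_road_test.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are returned as type/confidence pairs, e.g. SAD 99.9.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```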

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
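
The ratings above match Cloud Vision face detection, which reports each attribute as a coarse likelihood rather than a percentage. A minimal sketch with the google-cloud-vision client; the file path is a placeholder and credentials are assumed to come from the environment.

```python
# Hedged sketch: per-face likelihood ratings via Cloud Vision face detection.
# File path is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("royal_road_test.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```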

Feature analysis

Amazon

Person 99.8%

Categories

Captions

Microsoft
created on 2022-06-25

a man standing next to a woman 42.3%
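
The Microsoft tag list earlier and the caption above follow the shape of Azure Computer Vision's Analyze Image output. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

```python
# Hedged sketch: tags and a caption with confidences via Azure Computer
# Vision's Analyze Image operation. Endpoint, key, and URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
)

analysis = client.analyze_image(
    "https://example.org/image.jpg",                       # placeholder image URL
    visual_features=[VisualFeatureTypes.tags, VisualFeatureTypes.description],
)

# Tags carry name/confidence pairs like "text 98.7".
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")

# Captions carry text/confidence pairs like "a man standing next to a woman 42.3%".
for caption in analysis.description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```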

Text analysis

Amazon

Scene of strewn wreckage
(Right) Illustration showing distance wreckage traveled. (Figure in
foreground points to impact area. Figure in background stands at
fartbest point of wreckage 189 feet from point of impact.)
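
The lines above are Rekognition's line-level text detections in reading order (the raw export also interleaves word-level detections of the same content, and "fartbest" is the OCR's reading of "farthest"). A minimal sketch with boto3's DetectText call; the image source is a placeholder.

```python
# Hedged sketch: printed-caption text via Amazon Rekognition DetectText,
# which returns both LINE and WORD detections. Image source is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("royal_road_test.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep only line-level detections to reconstruct the caption in reading order.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```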

Google

Scene of strewn wreckage. (Right) Illustration showing distance wreckage traveled. (Figure in foreground points to impact area. Figure in background stands at farthest point of wreckage. 189 feet from point of impact.)
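
The full caption above matches Cloud Vision text detection, whose first annotation is the complete detected text and whose remaining annotations are individual words. A minimal sketch with the google-cloud-vision client; the file path is a placeholder and credentials are assumed to come from the environment.

```python
# Hedged sketch: full detected text plus word-level annotations via Cloud
# Vision TEXT_DETECTION. File path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("royal_road_test.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

annotations = response.text_annotations
if annotations:
    # First entry is the full text block; later entries are single words.
    print(annotations[0].description)
    for word in annotations[1:]:
        print(word.description)
```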