Human Generated Data

Title

Walking Ape

Date

1967

People

Artist: Arthur Herschel Lidov, American (1917 - 1990)

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M23620

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Art 98.2
Painting 93.3
Mammal 75.2
Bear 75.2
Animal 75.2
Wildlife 75.2
Archaeology 64.6
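
The Amazon tags above are confidence scores (0-100), presumably produced by AWS Rekognition label detection. A minimal sketch of how comparable labels could be requested with the boto3 SDK, assuming configured AWS credentials; the region and the file name walking_ape.jpg are illustrative, not from the record:

import boto3

# Illustrative region and file name; assumes AWS credentials are configured.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("walking_ape.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels above a minimum confidence (values are percentages).
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

Clarifai, Imagga, and Microsoft, listed below, expose analogous image-tagging endpoints through their own SDKs or REST APIs.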

Clarifai
created on 2019-10-29

people 98.2
mammal 97.5
print 96.8
art 96.6
illustration 95.8
antique 94.2
retro 93.9
old 93.8
adult 93.6
one 92.9
vintage 92.8
portrait 92.6
paper 92.3
museum 89.5
wear 89.2
animal 89.1
ancient 88.8
painting 88.7
man 88.6
picture frame 88.4

Imagga
created on 2019-10-29

corbel 73
bracket 58.4
support 42.4
old 32.8
texture 32
vintage 29.8
sand 29.3
device 29.1
grunge 28.1
stone 26.1
antique 25.1
aged 24.4
dirty 23.5
ancient 20.8
close 20.6
retro 20.5
memorial 19.7
paper 19.6
gravestone 19.6
art 19
textured 17.5
earth 17.3
pattern 17.1
soil 16.8
structure 16.4
wall 16.4
rough 15.5
wallpaper 15.3
history 15.2
design 14.7
grungy 14.2
brown 14
material 13.4
rusty 13.3
detail 12.9
blank 12.9
culture 12.8
decay 12.5
architecture 12.5
stain 12.5
aging 12.5
damaged 12.4
surface 12.4
cash 11.9
currency 11.7
stained 11.5
weathered 11.4
empty 11.2
money 11.1
book 10.9
age 10.5
banking 10.1
note 10.1
bank 9.9
messy 9.7
parchment 9.6
obsolete 9.6
bill 9.5
canvas 9.5
sheet 9.4
finance 9.3
beach 9.3
yellow 9.3
historic 9.2
travel 9.2
color 8.9
burnt 8.7
torn 8.7
spotted 8.7
sculpture 8.7
worn 8.6
face 8.5
frame 8.3
tile 8.3
backdrop 8.2
one 8.2
doormat 8.2
symbol 8.1
ragged 7.8
cardboard 7.7
dirt 7.6
statue 7.6
poster 7.6
historical 7.5
closeup 7.4
world 7.4
letter 7.3
backgrounds 7.3
global 7.3
wealth 7.2
financial 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

room 99.6
gallery 99.5
scene 98.7
indoor 89
text 72.2
drawing 54.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 93.3%
Happy 0.9%
Fear 56.6%
Sad 7.4%
Calm 23.1%
Confused 1.4%
Disgusted 0.2%
Angry 4.3%
Surprised 6.2%

AWS Rekognition

Age 51-69
Gender Male, 86.1%
Sad 46.9%
Surprised 0.1%
Fear 0.3%
Calm 51.9%
Disgusted 0.2%
Confused 0.4%
Happy 0%
Angry 0.2%
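
The two result blocks above match the shape of AWS Rekognition face detection output: an estimated age range, a gender call with confidence, and per-emotion confidences. A minimal sketch of retrieving these attributes with boto3, again using a hypothetical local image file:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("walking_ape.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] returns age range, gender, and emotion confidences
# for every face found in the image.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")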

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
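
Unlike Rekognition, Google Vision reports likelihood buckets rather than numeric scores. A minimal sketch of face detection with the google-cloud-vision client, assuming the same hypothetical image file; the bucket names come from the library's Likelihood enum (VERY_UNLIKELY through VERY_LIKELY):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("walking_ape.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, not a percentage.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)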

Feature analysis

Amazon

Painting 93.3%
Bear 75.2%

Categories

Captions

Text analysis

Google

) ES
)
ES