Human Generated Data

Title

Fish, after Egyptian Wall Painting

Date

c. 1876-1878

People

Artist: Charles Herbert Moore, American, 1840-1930

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Fine Arts Department, Harvard University, 1926.33.64

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Fish 100
Animal 100
Carp 92.8
Koi 56.5

Clarifai
created on 2020-04-25

fish 100
sea 99.8
ocean 99.7
water 99.2
carp 98.6
marine 98.6
nature 98.3
underwater 98.2
illustration 98.1
seafood 98
saltwater 97.9
animal 97.8
fin 97.7
swimming 97.4
desktop 97.3
print 97.2
tropical 96.4
beach 96.3
biology 96
no person 94.3

Imagga
created on 2020-04-25

mollusk 28.9
gastropod 28.7
sea 28.5
sea slug 28.1
invertebrate 23.6
fish 19.7
water 16.7
seawater 15.4
ocean 14.9
tropical 14.5
swordfish 13.7
tool 13.3
comb 12.6
beach 12.6
wildlife 12.5
shell 12.4
travel 12
sand 11.9
marine 11.4
hair slide 11.4
clip 11.3
lute 10.7
underwater 10.6
fly 10.5
wild 10.4
flying 10.4
hairbrush 10.4
design 10.3
texture 9.7
animal 9.4
bird 9.3
brush 9.1
exotic 9.1
vacation 9
closeup 8.7
space 8.5
decoration 8.5
drawing 8.4
color 8.3
wing 8.2
starfish 8.1
fastener 8.1
symbol 8.1
stringed instrument 8
seashell 7.9
diving 7.8
flight 7.7
snail 7.6
decorative 7.5
device 7.3
metal 7.2
art 7.2
star 7.2
wave 7.2
eye 7.1
summer 7.1

Google
created on 2020-04-25

Fish 98.7
Fish 94.9
Illustration 82
Fin 78.9
Tail 57.8
Drawing 57.6
Bony-fish 54.3
Pomacentridae 53
Carp 51

Microsoft
created on 2020-04-25

fish 99.4
animal 96.9
aquarium 95.8
text 95.2
map 91.8
fin 59
envelope 48.8

Feature analysis

Amazon

Fish 100%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2020-04-25

a close up of a map 66.6%
close up of a map 61.6%
a map with text 46.3%