Human Generated Data

Title

Catching the Ferry

Date

Mid to late 18th century

People

Artist: Kō Sūkoku, Japanese, 1730 - 1804

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Ernest B. and Helen Pratt Dane Fund for Asian Art, 1993.5

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 96.4
Human 96.4
Art 94.7
Drawing 90.9
Painting 88.6
Person 85
Person 79.1
Person 77.7
Sketch 71
Transportation 60.8
Boat 60.8
Vehicle 60.8
Photography 58.8
Photo 58.8

Clarifai
created on 2020-04-24

people 99.5
group 99.2
art 99
illustration 98.8
print 97.1
vintage 96.6
adult 96.3
watercraft 93.5
engraving 93.2
veil 92.7
wear 92.5
painting 91.4
cavalry 91.3
antique 91.1
furniture 90.5
man 90
desktop 89.6
tree 89.1
retro 86.8
visuals 86.6

Imagga
created on 2020-04-24

windowsill 44.3
fountain 35.5
sill 35.4
snow 32.1
structure 32.1
structural member 26.6
sketch 25.2
window screen 24.1
grunge 23.8
drawing 23.2
screen 22.9
old 20.9
water 19.3
vintage 18.2
cold 18.1
landscape 17.8
texture 17.4
support 17.1
winter 17
protective covering 15.7
ice 15.4
frozen 15.3
antique 14.7
aged 14.5
weather 14.3
representation 14.3
sky 14
retro 13.1
tree 13.1
pattern 13
ancient 13
dirty 12.6
architecture 12.5
river 12.4
forest 12.2
black 12
travel 12
outdoors 11.9
art 11.7
season 11.7
design 10.7
trees 10.7
rural 10.6
frost 10.6
grungy 10.4
covering 10.4
park 10.2
natural 10
rough 10
wood 10
frame 10
scenery 9.9
material 9.8
device 9.6
paper 9.4
peaceful 9.2
freight car 9.1
city 9.1
tourism 9.1
paint 9
history 8.9
textured 8.8
snowy 8.7
weathered 8.5
outdoor 8.4
wall 8.2
landmark 8.1
backgrounds 8.1
cool 8
color 7.8
freeze 7.8
empty 7.7
decoration 7.7
dirt 7.6
damaged 7.6
building 7.4
graphic 7.3
ocean 7.3
border 7.2
holiday 7.2
wet 7.2
country 7
scenic 7

Microsoft
created on 2020-04-24

drawing 99.1
sketch 98.3
text 96.4
painting 95.7
ship 77.3
child art 63.5
black and white 62.5
gallery 59.1
art 50

Face analysis

Amazon

AWS Rekognition

Age 13-25
Gender Female, 50.4%
Happy 49.7%
Angry 49.9%
Disgusted 49.5%
Confused 49.5%
Calm 49.7%
Surprised 49.5%
Sad 49.5%
Fear 49.5%

AWS Rekognition

Age 1-5
Gender Female, 50.1%
Sad 49.8%
Surprised 49.5%
Calm 49.6%
Angry 49.6%
Fear 49.5%
Confused 49.5%
Disgusted 49.5%
Happy 50%

AWS Rekognition

Age 15-27
Gender Male, 50.4%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.7%
Sad 49.8%
Happy 49.5%
Fear 49.5%
Angry 50%

Feature analysis

Amazon

Person 96.4%
Painting 88.6%
Boat 60.8%

Captions

Microsoft
created on 2020-04-24

an old photo of a person 33.8%