Human Generated Data

Title

Page from an album of Rice and Silk Culture

Date

Qing dynasty, 1644-1911

People

Artist: Qiu Ying 仇英, Chinese, ca. 1494-1552

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of the Hofer Collection of the Arts of Asia, 1985.852.8

Machine Generated Data

Tags

Amazon
created on 2019-05-31

Human 99.3
Person 99.3
Painting 97.5
Art 97.5
Person 95.4
Person 91
Person 85.4
Archaeology 61.5
Person 56.1
Plot 55.1

Clarifai
created on 2019-05-31

print 99.5
people 99.2
illustration 99.1
adult 99
painting 98.7
art 98.4
lithograph 97.9
group 97
wear 96.7
vehicle 95.7
mammal 95.2
transportation system 95.1
vintage 94.5
water 94.5
no person 92.5
man 92.3
text 91.7
one 91.7
two 89.7
river 88.1

Imagga
created on 2019-05-31

graffito 85.8
decoration 57.5
sketch 52
map 45.9
drawing 44.4
representation 43.7
grunge 42.6
old 40.5
vintage 37.3
antique 35.5
texture 31.3
wallpaper 26.8
geography 25.1
world 24
aged 23.6
wall 23.1
travel 21.9
atlas 21.7
paper 21.2
pattern 19.2
dirty 19
art 18.9
retro 18.1
globe 17.6
country 16.7
canvas 16.1
grain 15.7
stain 15.4
material 15.2
planet 15.1
earth 14.9
textured 14.9
ancient 14.7
continent 14.6
paint 14.5
nation 14.2
frame 14.2
design 14.1
tourism 14
sepia 13.6
navigation 13.5
artistic 13
detail 12.9
route 12.7
road 12.7
decay 12.6
rust 12.5
silhouette 12.4
worn 12.4
graphic 12.4
grungy 12.3
flower 12.3
floral 11.9
rough 11.9
decorative 11.7
city 11.7
detailed 11.6
north 11.5
surface 11.5
direction 11.4
sand 11.3
style 11.1
color 11.1
brown 11.1
effect 11
structure 11
global 11
architecture 10.9
border 10.9
location 10.8
grime 10.8
crack 10.7
edge 10.6
aging 10.6
page 10.2
space 10.1
geographic 9.9
smudged 9.9
boundary 9.9
distressed 9.8
navigate 9.8
backgrounds 9.7
fracture 9.7
states 9.7
stained 9.6
text 9.6
forest 9.6
weathered 9.5
capital 9.5
textures 9.5
plan 9.5
journey 9.4
plants 9.3
backdrop 9.1
painterly 8.9
expedition 8.9
find 8.8
guide 8.8
position 8.8
mottled 8.8
discovery 8.8
tour 8.7
obsolete 8.6
ocean 8.2
gold 8.2
painting 8.1
close 8
explore 7.8
leaf 7.8
card 7.7
china 7.5
artwork 7.3
yellow 7.3
season 7

Google
created on 2019-05-31

Art 81.3
Illustration 79.9
Painting 78.4
Visual arts 71.1
Drawing 53.7

Microsoft
created on 2019-05-31

drawing 96
painting 86.6
art 81.3
cartoon 70.4
sketch 62.6
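Each service above reports labels with a confidence score on a 0-100 scale, and the same subject appears under several names (e.g. "Person" five times at different confidences). A minimal sketch, using the Amazon Rekognition values listed above, of reducing such a list to distinct high-confidence labels (the 90-point threshold is an illustrative choice, not part of any service's API):

```python
# Labels and confidence scores as reported by Amazon Rekognition above.
amazon_labels = [
    ("Human", 99.3), ("Person", 99.3), ("Painting", 97.5), ("Art", 97.5),
    ("Person", 95.4), ("Person", 91.0), ("Person", 85.4),
    ("Archaeology", 61.5), ("Person", 56.1), ("Plot", 55.1),
]

def high_confidence(labels, threshold=90.0):
    """Keep each distinct label name at its highest score at/above the threshold."""
    kept = {}
    for name, score in labels:
        if score >= threshold and name not in kept:
            kept[name] = score
    return kept

# high_confidence(amazon_labels)
# -> {'Human': 99.3, 'Person': 99.3, 'Painting': 97.5, 'Art': 97.5}
```

Duplicate "Person" entries collapse to the first (highest) score because the list is already sorted by confidence.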

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 51.6%
Angry 45.5%
Disgusted 45.3%
Happy 47.7%
Sad 50.2%
Surprised 45.2%
Calm 45.9%
Confused 45.2%

AWS Rekognition

Age 26-44
Gender Female, 50.9%
Confused 45.6%
Calm 45.1%
Sad 52.1%
Surprised 45.3%
Angry 46.1%
Disgusted 45.3%
Happy 45.5%

AWS Rekognition

Age 4-7
Gender Female, 50.2%
Happy 45.7%
Disgusted 45.8%
Angry 45.9%
Surprised 45.3%
Sad 51.6%
Calm 45.2%
Confused 45.5%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Disgusted 49.5%
Sad 50.4%
Surprised 49.5%
Happy 49.6%
Angry 49.5%
Calm 49.5%
Confused 49.5%

AWS Rekognition

Age 27-44
Gender Female, 54.1%
Disgusted 45.4%
Calm 45.7%
Sad 46.2%
Confused 45.1%
Angry 46.4%
Surprised 45.2%
Happy 50.9%

AWS Rekognition

Age 4-9
Gender Female, 50.2%
Sad 46.9%
Angry 46%
Disgusted 46.9%
Surprised 46%
Happy 45.9%
Calm 47.7%
Confused 45.7%
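Rekognition reports a confidence for every emotion rather than a single verdict, so each face's scores can be reduced to a dominant emotion by taking the maximum. A small sketch, assuming the first face's values above:

```python
# Emotion confidences for the first detected face (values from above).
face_emotions = {
    "Angry": 45.5, "Disgusted": 45.3, "Happy": 47.7, "Sad": 50.2,
    "Surprised": 45.2, "Calm": 45.9, "Confused": 45.2,
}

def dominant_emotion(emotions):
    """Return the emotion label with the highest confidence score."""
    return max(emotions, key=emotions.get)

# dominant_emotion(face_emotions) -> 'Sad'
```

Note how close the scores are (all within about five points): the dominant label is a weak signal for faces in a painting, which is worth bearing in mind when reading the per-face results above.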

Feature analysis

Amazon

Person 99.3%
Painting 97.5%

Captions

Microsoft
created on 2019-05-31

a close up of a horse 53.4%
a close up of a giraffe 35%
close up of a horse 34.9%