Human Generated Data

Title

Page from an album of Rice and Silk Culture

Date

Qing dynasty, 1644-1911

People

Artist: Qiu Ying 仇英, Chinese ca. 1494-1552

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of the Hofer Collection of the Arts of Asia, 1985.852.1


Machine Generated Data

Tags

Amazon
created on 2019-05-31

Art 99.6
Painting 99.6
Person 99.2
Human 99.2
Person 97.6
Person 95.8
Person 73
Nature 60.4
Outdoors 57.6

Clarifai
created on 2019-05-31

painting 99.9
people 99.9
adult 99.7
print 99.4
art 99.3
group 99.2
illustration 99
lithograph 97.4
desert 96.9
landscape 96.3
man 96.2
home 95
wear 94.5
mammal 94.4
travel 93
two 92
woman 91.8
no person 91.7
transportation system 91.5
veil 91.1

Imagga
created on 2019-05-31

sand 96.9
beach 57.3
dune 49.9
soil 48.1
travel 36.7
landscape 35.8
earth 35.4
ocean 32.8
sky 31.4
sea 29.8
desert 29.4
tourism 25.6
scenic 25.5
rock 23.5
coast 23.4
park 22.3
vacation 22.1
water 20.7
mountain 19.6
shore 19.4
summer 19.3
rocks 17
coastline 16.9
island 16.5
sun 15.3
cliff 15.1
canyon 15
outdoors 15
tropical 14.5
mountains 13.9
cloud 13.8
holiday 13.6
national 13.6
sandbar 13.4
scenery 12.6
tree 12.6
seaside 12.5
clouds 11.9
stone 11.8
tourist 11.8
valley 11.8
geology 11.7
scenics 11.5
natural 11.4
ridge 10.9
erosion 10.8
outdoor 10.7
trees 10.7
resort 10.5
waves 10.2
bar 10
landmark 9.9
horizon 9.9
sandstone 9.8
palm 9.4
barrier 9
southwest 8.8
bay 8.7
geological formation 8.7
day 8.6
sunny 8.6
destination 8.4
peaceful 8.2
calm 8.2
seascape 7.7
paradise 7.5
lake 7.3
morning 7.2
road 7.2
nobody 7

Google
created on 2019-05-31

Painting 80.8
Visual arts 76.9
Art 74
Illustration 57.7
Sand 52.3
Dust 50.7

Microsoft
created on 2019-05-31

drawing 99.8
sketch 98.7
child art 98.1
art 97.2
cartoon 65
painting 31.1
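Each service above pairs a label with a confidence score. A minimal sketch of how a Rekognition-style DetectLabels response reduces to such a name/confidence list; the response dict is a hypothetical sample shaped like the AWS output, not this record's actual API payload:

```python
# Hypothetical sample shaped like an AWS Rekognition DetectLabels
# response; values echo a few of the Amazon tags listed above.
sample_response = {
    "Labels": [
        {"Name": "Art", "Confidence": 99.6},
        {"Name": "Painting", "Confidence": 99.6},
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Outdoors", "Confidence": 57.6},
    ]
}

def tag_list(response, min_confidence=50.0):
    """Return (name, confidence) pairs at or above a threshold,
    sorted by descending confidence."""
    labels = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(labels, key=lambda pair: -pair[1])

for name, conf in tag_list(sample_response):
    print(f"{name} {conf}")
```

Services typically expose a minimum-confidence cutoff like the `min_confidence` parameter here, which is why each provider's list above stops at a different score.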

Face analysis

Amazon

AWS Rekognition

Age 38-59
Gender Male, 50.1%
Disgusted 47.2%
Happy 46.1%
Sad 48.2%
Calm 45.4%
Angry 46.1%
Surprised 45.6%
Confused 46.4%

AWS Rekognition

Age 10-15
Gender Male, 53.6%
Angry 45.5%
Surprised 45.3%
Disgusted 45.6%
Sad 52.4%
Calm 45.1%
Happy 45.6%
Confused 45.5%

AWS Rekognition

Age 17-27
Gender Female, 50.8%
Sad 54.5%
Confused 45.1%
Disgusted 45.1%
Surprised 45%
Angry 45.1%
Happy 45.1%
Calm 45.2%

AWS Rekognition

Age 17-27
Gender Female, 50.2%
Angry 49.6%
Sad 49.9%
Disgusted 49.5%
Surprised 49.6%
Happy 49.7%
Calm 49.7%
Confused 49.6%
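Each face record above lists a confidence score per emotion; the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, using the values from the first face record above and a simplified dict shape (an assumption, not the raw DetectFaces structure):

```python
# Emotion confidences copied from the first AWS Rekognition face
# record above; the flat dict shape is a simplifying assumption.
emotions = {
    "Disgusted": 47.2, "Happy": 46.1, "Sad": 48.2, "Calm": 45.4,
    "Angry": 46.1, "Surprised": 45.6, "Confused": 46.4,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))
```

Note how close the scores are here (all within a few points of 45-48%), so the "dominant" emotion carries little signal for this painting's faces.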

Feature analysis

Amazon

Painting 99.6%
Person 99.2%

Captions

Microsoft

a painting of a horse drawn carriage traveling down a dirt road 45.5%
a painting of a man and a woman walking down a dirt road 45.4%
a painting of a person 45.3%