Human Generated Data

Title

Album of Chinese Export Paintings: Traditional Costume

Date

-

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Walter H. Trumbull, 1956.194.6

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Human 98
Performer 88.6
Drawing 83.2
Doodle 83.2
Art 83.2
Person 80.5
Crowd 78
Leisure Activities 71.6
Carnival 59.9
Costume 58.9
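
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label-detection API. Below is a minimal sketch of how similar tags could be requested with boto3; the file name, region, and thresholds are illustrative placeholders, not values taken from this record.

```python
import boto3

# Sketch: request Rekognition-style label tags for a local image file.
# The file name, region, and thresholds are illustrative placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("export_painting.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,        # cap the number of returned labels
    MinConfidence=50.0,  # drop low-confidence labels
)

# Each label carries a name and a confidence score in percent,
# matching the "label score" pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```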

Clarifai
created on 2019-07-07

painting 96.6
people 95.3
wear 92.7
art 92.4
picture frame 92
exhibition 90.3
retro 89.8
woman 88.4
card 87.8
illustration 86.9
vector 86.8
no person 85.1
decoration 83
man 82.9
adult 81.8
museum 80.1
child 77.7
design 77.7
desktop 76.6
one 75.8
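
The Clarifai tags follow the same label-plus-confidence pattern. The sketch below calls Clarifai's v2 predict REST endpoint with the `requests` library; the API key, model identifier, and image URL are placeholders, and the exact ID of the public "general" model should be taken from Clarifai's documentation.

```python
import requests

# Sketch: request concept tags from Clarifai's v2 predict endpoint.
# API_KEY, MODEL_ID, and the image URL are placeholders.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # placeholder model identifier

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/painting.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value; scale to percent
# to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```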

Imagga
created on 2019-07-07

person 30.3
adult 27.4
people 24
smiling 23.9
smile 22.8
fashion 22.6
attractive 21.7
women 21.3
pretty 21
maillot 20.4
happy 18.8
happiness 17.2
model 17.1
portrait 16.2
clothing 16.1
sexy 16.1
cute 15.8
holding 14.8
tights 14.8
cheerful 14.6
lady 14.6
lifestyle 14.5
shopping 13.8
dress 12.6
bag 12.2
hair 11.9
standing 11.3
one 11.2
gift 11.2
hosiery 11.1
art 10.9
holiday 10.7
face 10.6
fun 10.5
looking 10.4
shirt 10.3
joy 10
human 9.7
brunette 9.6
one person 9.4
traditional 9.1
dancer 9.1
studio 9.1
oriental 9.1
student 9.1
garment 8.9
shop 8.7
player 8.6
performer 8.6
culture 8.5
child 8.5
girls 8.2
kimono 8.2
style 8.2
man 8.1
posing 8
work 7.9
bags 7.8
brass 7.7
youth 7.7
customer 7.6
golfer 7.5
leisure 7.5
footwear 7.4
blond 7.4
box 7.4
sale 7.4
jersey 7.4
teenager 7.3
business 7.3
male 7.2
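
The Imagga tags can be reproduced with Imagga's REST tagging endpoint. A short sketch is below; the API key, secret, and image URL are placeholders.

```python
import requests

# Sketch: fetch Imagga-style tags for an image URL.
# The key, secret, and image URL are placeholders.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/painting.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score in percent.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```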

Google
created on 2019-07-07

Picture frame 65.7
Illustration 57.7
Art 50.2
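
The Google tags correspond to label detection in the Cloud Vision API. A minimal sketch with the official Python client is below; the file name is a placeholder and credentials are assumed to come from the environment (GOOGLE_APPLICATION_CREDENTIALS).

```python
from google.cloud import vision

# Sketch: label detection with the Google Cloud Vision client library.
# The file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("export_painting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scale to percent to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```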

Microsoft
created on 2019-07-07

cartoon 94.9
clothing 93.8
person 91.4
drawing 89.4
painting 87.7
design 80.4
illustration 80.4
child art 71.7
woman 50.4
picture frame 27.6
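
The Microsoft tags are the kind of output produced by Azure's Computer Vision tagging service. The sketch below uses the Azure Python SDK; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch: tag an image with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"
KEY = "YOUR_SUBSCRIPTION_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image("https://example.org/painting.jpg")

# Each tag carries a name and a 0-1 confidence; scale to percent.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```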

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Surprised 45.1%
Calm 54.4%
Angry 45.1%
Confused 45.1%
Disgusted 45%
Sad 45.2%
Happy 45.1%
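
Age range, gender, and emotion estimates like those above come from Rekognition's face-detection API when all attributes are requested. A minimal sketch with boto3 is below; the file name and region are placeholders.

```python
import boto3

# Sketch: face attribute estimation with Rekognition's DetectFaces API.
# File name and region are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("export_painting.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned as a list of type/confidence pairs.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```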

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
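
Google Vision reports face attributes as categorical likelihood buckets rather than percentages. The sketch below shows how those buckets could be read with the Cloud Vision Python client; the file name is a placeholder.

```python
from google.cloud import vision

# Sketch: face detection with Google Cloud Vision, reporting the same
# likelihood buckets as above. The file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("export_painting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are categorical (VERY_UNLIKELY .. VERY_LIKELY), not scores.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```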

Feature analysis

Amazon

Person 80.5%

Categories

Imagga

food drinks 93.9%
paintings art 4.8%