Human Generated Data

Title

Vasakasajja Nayika on a Terrace with Female Attendants, Being Helped with Her Shoes

Date

18th century

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of John Kenneth Galbraith, 1971.138

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 96
Person 92
Art 88.3
Female 86.6
Drawing 85.9
Person 80.7
Animal 76.5
Bird 76.5
Painting 76.5
Person 76
Girl 73.2
Clothing 71
Apparel 71
People 70.6
Face 64.5
Woman 63.8
Child 61.8
Kid 61.8
Photo 61.7
Portrait 61.7
Photography 61.7
Text 60.9
Teen 60
Dress 59.7
Blonde 55.4
Person 53.1

Clarifai
created on 2020-04-24

people 100
adult 99.3
group 99.1
two 98
woman 97
print 96.8
man 96.2
veil 95.5
leader 91.4
home 91.1
three 90.7
art 90.6
offspring 89.9
child 89.5
group together 89.2
four 88.7
gown (clothing) 88.5
several 88
furniture 87.8
administration 87.3

Imagga
created on 2020-04-24

graffito 68.6
decoration 47.4
building 36.9
architecture 33.1
sketch 29
old 28.6
drawing 23.5
wall 23.1
window 21.8
structure 21.8
house 21.7
city 21.6
stone 21.4
shop 21.3
door 20.6
travel 19
sculpture 18.2
ancient 18.2
representation 17.9
town 16.7
mercantile establishment 16.1
tourism 15.7
historic 15.6
vintage 14.9
facade 14.4
history 14.3
barbershop 14.1
urban 14
exterior 13.8
famous 13
art 12.4
brick 12.2
street 12
religion 11.7
antique 11.3
construction 11.1
landmark 10.8
frame 10.8
place of business 10.8
marble 10.7
retro 10.7
entrance 10.6
statue 10.5
detail 10.5
historical 10.4
monument 10.3
grunge 10.2
glass 10.1
sill 10.1
aged 10
dirty 9.9
architectural 9.6
village 9.6
windows 9.6
home 9.6
balcony 9.1
bakery 9
wooden 8.8
empty 8.6
culture 8.6
destination 8.4
texture 8.3
tourist 8.2
arch 7.8
abandoned 7.8
palace 7.7
medieval 7.7
memorial 7.7
structural member 7.6
temple 7.6
buildings 7.6
style 7.4
column 7.4
brown 7.4
sky 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 96.9
cartoon 91.7
drawing 88.4
outdoor 88.1
painting 78.4
gallery 74.5
black 74.4
woman 73.4
old 73.2
sketch 71.5
white 60.7
wedding dress 59.5
clothing 58.7
person 56.4
dress 53
black and white 52.4
picture frame 43.3
vintage 25.4

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Female, 53.5%
Calm 47.8%
Sad 45.3%
Fear 45.6%
Happy 45.4%
Disgusted 45.2%
Surprised 46.4%
Confused 45.1%
Angry 49.1%

AWS Rekognition

Age 17-29
Gender Female, 54.2%
Angry 45.1%
Fear 45.1%
Disgusted 45.1%
Happy 45.4%
Calm 52.5%
Sad 45.6%
Surprised 45.1%
Confused 46.2%

AWS Rekognition

Age 17-29
Gender Female, 54.8%
Surprised 45%
Angry 45.1%
Disgusted 45%
Calm 53.8%
Fear 45%
Confused 45%
Happy 45.7%
Sad 45.2%

AWS Rekognition

Age 13-23
Gender Female, 54.6%
Angry 45.1%
Fear 45%
Disgusted 52.3%
Happy 46.9%
Calm 45.1%
Sad 45.2%
Surprised 45.1%
Confused 45.3%

AWS Rekognition

Age 19-31
Gender Female, 50.4%
Disgusted 49.5%
Sad 49.5%
Happy 50.2%
Angry 49.5%
Confused 49.5%
Surprised 49.5%
Fear 49.6%
Calm 49.6%

AWS Rekognition

Age 13-23
Gender Female, 54.8%
Angry 45.4%
Surprised 45.4%
Confused 45.2%
Disgusted 45.1%
Happy 46.2%
Calm 50.9%
Fear 45.1%
Sad 46.7%

AWS Rekognition

Age 23-35
Gender Female, 53.6%
Fear 47.3%
Happy 45%
Confused 49%
Sad 48.2%
Angry 45.2%
Surprised 45.2%
Disgusted 45%
Calm 45.1%

Feature analysis

Amazon

Person 92%
Bird 76.5%
Painting 76.5%
