Human Generated Data

Title

Group of Female Musicians and Dancers

Date

18th century

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, John Witt Randall Fund, 1952.97

Machine Generated Data

Tags (confidence %)

Amazon
created on 2019-05-22

Art 92.5
Human 92.4
Person 92.4
Painting 92.3
Person 91.2
Person 88.7
Person 86.2
Apparel 81.9
Clothing 81.9
Person 62.8
Person 62
Tapestry 60.3
Ornament 60.3
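
The label/score pairs above match the output of Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of such a call with boto3; the file name and thresholds are illustrative assumptions, not details taken from this record.

```python
# Sketch: Rekognition-style image tags for a local file via boto3.
# "painting.jpg" and the MinConfidence floor are illustrative placeholders.
import boto3

client = boto3.client("rekognition")

with open("painting.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=60,  # the lowest score listed above is ~60
    )

for label in response["Labels"]:
    # One line per label, e.g. "Art 92.5", matching the listing above.
    print(label["Name"], round(label["Confidence"], 1))
```

Repeated labels such as Person at several different scores are consistent with per-instance detections: in the DetectLabels response each Label also carries an Instances list, where every bounding box has its own confidence.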

Clarifai
created on 2019-05-22

people 99.6
art 99.5
woman 99.2
wear 99
adult 98.8
painting 98.8
illustration 97.5
print 97.2
dress 96.3
group 96.1
religion 95.5
veil 95
man 92.8
princess 92.7
music 91.8
gown (clothing) 90.1
royalty 89.8
two 87.8
crown 87.8
three 86.8
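
Clarifai's concepts appear to come from its general prediction model. Here is a sketch using the 2.x-era Python client that was current in 2019; the API key and file name are placeholders, and newer Clarifai clients use a different, gRPC-based interface.

```python
# Sketch: Clarifai general-model concepts with the 2.x Python client.
# API key and file name are placeholders.
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")
model = app.public_models.general_model

response = model.predict_by_filename("painting.jpg")

for concept in response["outputs"][0]["data"]["concepts"]:
    # Concept scores are 0-1; scale to match the "people 99.6" style above.
    print(concept["name"], round(concept["value"] * 100, 1))
```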

Imagga
created on 2019-05-22

altar 42.5
structure 36.5
art 22.8
old 22.3
culture 16.2
religion 15.2
temple 15.1
oriental 15
religious 14
antique 13.9
traditional 13.3
wall 13
decoration 12.7
person 12.7
faith 12.4
holiday 12.2
historic 11.9
dress 11.7
vintage 11.6
festival 11.5
retro 11.4
ancient 11.2
grunge 11
costume 11
building 10.7
holy 10.6
travel 10.5
god 10.5
celebration 10.3
people 10
history 9.8
pretty 9.8
portrait 9.7
man 9.5
church 9.2
face 9.2
aged 9
design 9
prayer 8.7
saint 8.6
spiritual 8.6
architecture 8.6
black 8.4
texture 8.3
tradition 8.3
style 8.1
paint 8.1
border 8.1
gown 8
interior 8
adult 7.9
paper 7.9
icon 7.9
colorful 7.9
vestment 7.7
empty 7.7
winter 7.6
blackboard 7.6
historical 7.5
frame 7.5
east 7.5
celebrate 7.2
color 7.2
painting 7.2
home 7.2
material 7.1
clothing 7.1
hat 7
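
Imagga's tags match the shape returned by its v2 /tags REST endpoint. A sketch with requests follows; the key, secret, and image URL are placeholders.

```python
# Sketch: Imagga v2 tagging endpoint over REST.
# Key/secret pair and image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/painting.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
data = response.json()

for tag in data["result"]["tags"]:
    # e.g. "altar 42.5" -- Imagga reports confidence on a 0-100 scale.
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```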

Google
created on 2019-05-22

Painting 92.8
Art 79.6
Textile 76
History 74
Miniature 69
Middle ages 65.1
Tapestry 59.6
Visual arts 59.3
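
Google's labels correspond to Cloud Vision label detection. The sketch below targets current versions of the google-cloud-vision client (the 2019-era library spelled the image type vision.types.Image); the file name is a placeholder.

```python
# Sketch: Google Cloud Vision label detection with the official client.
# "painting.jpg" is a placeholder for the museum image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("painting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Scores are 0-1; scale to match "Painting 92.8" above.
    print(label.description, round(label.score * 100, 1))
```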

Microsoft
created on 2019-05-22

painting 93.2
cartoon 91.2
person 82.1
dance 81.7
clothing 78.5
woman 69.9
poster 60.1
old 41.5
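
Microsoft's tags are the kind returned by the Azure Computer Vision analyze endpoint. Below is a REST sketch pinned to the v2.0 API that was current in 2019; the region, key, and image URL are placeholders.

```python
# Sketch: Azure Computer Vision "analyze" over REST (v2.0 era).
# Endpoint region, subscription key, and image URL are placeholders.
import requests

endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
response = requests.post(
    endpoint,
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/painting.jpg"},
)

for tag in response.json()["tags"]:
    # e.g. "painting 93.2" -- confidence is 0-1 in the response.
    print(tag["name"], round(tag["confidence"] * 100, 1))
```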

Face analysis

AWS Rekognition

Age 23-38
Gender Female, 53.6%
Calm 51.4%
Angry 45.4%
Disgusted 45.1%
Surprised 47.1%
Sad 45.7%
Confused 45.3%
Happy 45.1%

AWS Rekognition

Age 15-25
Gender Male, 52.3%
Confused 45.1%
Surprised 45.1%
Sad 45.2%
Happy 45.3%
Calm 54.2%
Angry 45.1%
Disgusted 45%

AWS Rekognition

Age 12-22
Gender Female, 51.3%
Confused 45.5%
Disgusted 45.5%
Surprised 46.7%
Angry 49.2%
Sad 45.6%
Calm 47.4%
Happy 45.1%

AWS Rekognition

Age 20-38
Gender Female, 54.5%
Happy 45.1%
Surprised 45.9%
Calm 52.1%
Disgusted 45.2%
Sad 45.8%
Confused 45.2%
Angry 45.7%

AWS Rekognition

Age 11-18
Gender Female, 53.7%
Angry 45.6%
Sad 45.4%
Surprised 45.3%
Happy 49.4%
Calm 48.7%
Disgusted 45.3%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Female, 54.6%
Disgusted 45.7%
Sad 45.4%
Angry 46.1%
Confused 45.2%
Surprised 49.6%
Calm 47%
Happy 46%

AWS Rekognition

Age 19-36
Gender Male, 53.3%
Disgusted 45.4%
Angry 46.1%
Confused 45.4%
Happy 47.5%
Calm 48.6%
Surprised 46.3%
Sad 45.8%

AWS Rekognition

Age 20-38
Gender Female, 51.2%
Confused 45.4%
Surprised 46.3%
Sad 46%
Happy 46.2%
Calm 46.2%
Angry 49.5%
Disgusted 45.5%

AWS Rekognition

Age 19-36
Gender Female, 50.1%
Disgusted 45.3%
Sad 46.1%
Happy 45.8%
Surprised 48.7%
Confused 45.5%
Angry 47.3%
Calm 46.3%
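
Each block above corresponds to one FaceDetails entry from Rekognition's DetectFaces operation with full attributes requested. A minimal boto3 sketch that prints the same fields follows; the file name is a placeholder.

```python
# Sketch: per-face age range, gender, and emotion scores via Rekognition.
# "painting.jpg" is a placeholder for the museum image.
import boto3

client = boto3.client("rekognition")

with open("painting.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # One line per emotion, e.g. "Calm 51.4%".
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```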

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
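
The Google Vision blocks report face-level likelihood buckets rather than numeric scores. Here is a sketch with the google-cloud-vision client (2.x and later; older releases exposed the enum as vision.enums.Likelihood); the file name is a placeholder.

```python
# Sketch: Google Cloud Vision face detection, printing the same likelihood
# buckets ("Very unlikely" ... "Very likely") reported above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("painting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Likelihood is an enum: UNKNOWN, VERY_UNLIKELY, ..., VERY_LIKELY.
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())
```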

Feature analysis

Amazon

Person 92.4%
Painting 92.3%