Human Generated Data

Title

A Prince Receives a Water Jug from a Young Woman at a Well

Date

c. 1745

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of John Kenneth Galbraith, 1972.350

Machine Generated Data

Tags (label and confidence, 0-100)

Amazon
created on 2020-04-24

Art 97.7
Painting 97.7
Person 95.8
Human 95.8
Person 95.7
Mammal 94.3
Horse 94.3
Animal 94.3
Person 91.4
Person 90.2
Person 89.6
Person 78.9
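
The label/confidence pairs above are the kind of output AWS Rekognition's DetectLabels operation returns. A minimal sketch of such a request with boto3 follows; the file name and region are placeholder assumptions, not values taken from this record.

# Minimal sketch of an AWS Rekognition DetectLabels call via boto3.
# The file name and region are placeholder assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("painting.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=70,
    )

# Each label carries a name and a 0-100 confidence score,
# matching the label/score pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')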

Clarifai
created on 2020-04-24

woman 98.6
people 98.6
art 97.9
two 96.9
man 96.4
painting 95.6
illustration 95.5
child 94.5
adult 93.6
dress 93.3
veil 93.3
religion 93.2
print 93.1
recreation 91.3
old 90.1
family 89.5
fun 88.8
wear 88.5
outdoors 88.4
love 87.4
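
Clarifai concepts like those above are typically returned by a predict call against Clarifai's general model. A hedged sketch of that request over the v2 REST API; the API key, model ID, and image URL are placeholders, and the response shape is an assumption based on Clarifai's documented v2 format.

# Sketch of a Clarifai v2 predict request; key, model ID, and URL are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"               # placeholder
MODEL_ID = "GENERAL_MODEL_ID"                   # placeholder for the general model
IMAGE_URL = "https://example.com/painting.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; the list above shows it as a percentage.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')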

Imagga
created on 2020-04-24

person 30.2
adult 23.3
speaker 21.8
people 20.6
couple 19.2
portrait 18.8
dress 18.1
fan 17.9
groom 17.1
articulator 17
man 16.8
musical instrument 16.7
teacher 16.4
face 16.3
male 16.3
metropolitan 16.2
wind instrument 15.5
love 15
follower 13.7
traditional 13.3
religious 13.1
culture 12.8
costume 12.8
communicator 12.8
professional 12.7
religion 12.5
church 12
tradition 12
happy 11.9
educator 11.9
two 11.9
business 11.5
bride 11.5
new 11.3
fashion 11.3
men 11.2
wedding 11
clothing 11
happiness 11
romantic 10.7
lady 10.5
looking 10.4
architecture 10.1
statue 10.1
accordion 10.1
outdoors 9.7
attractive 9.1
pretty 9.1
park 9.1
gold 9
ceremony 8.7
lifestyle 8.7
spiritual 8.6
married 8.6
smile 8.5
marriage 8.5
wife 8.5
art 8.5
senior 8.4
old 8.4
city 8.3
keyboard instrument 8.1
world 8.1
smiling 8
building 7.9
women 7.9
husband 7.8
travel 7.7
golden 7.7
sax 7.7
faith 7.7
god 7.6
elegance 7.6
bouquet 7.5
suit 7.4
outfit 7.4
long 7.3
hat 7.3
group 7.2
celebration 7.2
romance 7.1
hair 7.1
together 7
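
The Imagga tags above can be reproduced with Imagga's /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A hedged sketch; the credentials and image URL are placeholders, and the response layout is an assumption based on Imagga's documented format.

# Sketch of an Imagga v2 tagging request; credentials and URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"                     # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"               # placeholder
IMAGE_URL = "https://example.com/painting.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry pairs a tag name with a 0-100 confidence, as listed above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')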

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

grass 99.3
painting 97.3
drawing 97.1
text 95.7
sketch 88.9
outdoor 85.5
cartoon 84.3
person 82.7
clothing 76.8
child art 73.7
horse 63.3
dress 62.4
woman 60.6
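
Microsoft's tags above correspond to the Computer Vision "tag" operation in Azure Cognitive Services. A hedged sketch over REST; the endpoint, key, API version, and image URL are placeholders, and the version in use when this record was generated may differ.

# Sketch of an Azure Computer Vision tag request; endpoint, key, and URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_KEY"                                   # placeholder
IMAGE_URL = "https://example.com/painting.jpg"                  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidence is returned on a 0-1 scale; the list above shows percentages.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')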

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 52.4%
Disgusted 46.1%
Sad 45.1%
Fear 45.1%
Angry 45.4%
Confused 45.1%
Calm 52.7%
Surprised 45.3%
Happy 45.2%

AWS Rekognition

Age 13-23
Gender Female, 50%
Calm 47.9%
Sad 45.1%
Surprised 45.9%
Disgusted 45%
Angry 45.3%
Happy 50.5%
Confused 45.1%
Fear 45.1%

AWS Rekognition

Age 12-22
Gender Female, 54.7%
Fear 45%
Angry 45.1%
Confused 45%
Disgusted 45%
Happy 45.1%
Calm 54.5%
Sad 45.2%
Surprised 45%

AWS Rekognition

Age 21-33
Gender Female, 53.1%
Surprised 45.1%
Happy 45%
Fear 45.2%
Confused 45%
Disgusted 45.1%
Sad 45%
Calm 45.3%
Angry 54.2%

AWS Rekognition

Age 13-23
Gender Female, 54.1%
Fear 45%
Disgusted 45%
Confused 45%
Sad 45.1%
Happy 45%
Surprised 45%
Angry 52.7%
Calm 47.1%

AWS Rekognition

Age 16-28
Gender Male, 52%
Disgusted 45%
Happy 45%
Angry 45%
Sad 45%
Confused 45%
Surprised 45%
Fear 45%
Calm 55%
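
The age-range, gender, and emotion estimates above come from AWS Rekognition's DetectFaces operation with full facial attributes enabled. A minimal sketch with boto3; the file name and region are placeholder assumptions.

# Sketch of an AWS Rekognition DetectFaces call requesting all facial attributes.
# The file name and region are placeholder assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("painting.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

# One entry per detected face, mirroring the per-face blocks listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')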

Feature analysis

Amazon

Painting 97.7%
Person 95.8%
Horse 94.3%
