Human Generated Data

Title

The Sacrifice of the Daughters of Jephtha

Date

17th century

People

Artist: Unidentified Artist

Artist after: Simon Vouet, French, 1590–1649

Previous attribution: Charles Le Brun, French, 1619–1690

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mr. and Mrs. George H. Monks, 1922.109

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 98.6
Person 98.6
Person 96.2
Person 94.7
Art 94.1
Person 92.6
Person 79.9
Painting 78
Person 77.4
Person 76.7
Person 76.6
Person 73.7
Person 72.9
Person 70.1
Person 69.5
Person 63.2
Person 54.3
Person 49.2
Person 47.4
Person 41.8
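
These Amazon tags are the output of an image-labeling API, presumably AWS Rekognition's DetectLabels given the "AWS Rekognition" face blocks below. What follows is a minimal sketch of how a comparable tag list could be regenerated with boto3; the local file name, label limit, and confidence floor are illustrative assumptions, not part of the record.

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("jephtha.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the list above contains 20 tags
    MinConfidence=40,    # lowest confidence above is 41.8
)

for label in response["Labels"]:
    # One line per label in the "Name Confidence" layout used above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    # The repeated "Person" rows above likely correspond to per-instance
    # detections, each with its own bounding box and confidence.
    for instance in label.get("Instances", []):
        print(f'{label["Name"]} {instance["Confidence"]:.1f}')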

Clarifai
created on 2020-04-24

people 100
group 99.8
many 98.6
adult 97
man 96.8
cavalry 95.8
transportation system 95.5
train 95
administration 94.3
vehicle 91.9
group together 91.6
leader 91.2
child 90.9
war 89.8
crowd 89.2
woman 89.2
art 86
street 84.8
boy 84.5
uniform 82.5

Imagga
created on 2020-04-24

case 52.6
television 33.7
window 17.5
shop 17.4
telecommunication system 17.3
tray 16.3
old 15.3
art 14.9
architecture 14
black 13.8
people 13.4
receptacle 12.4
man 12.1
container 11.6
vintage 11.6
building 11.2
home 11.1
mercantile establishment 10.9
business 10.9
house 10.8
design 10.7
male 10.6
office 10.5
broadcasting 10.5
church 10.2
person 10.1
passenger 9.6
culture 9.4
monitor 9.2
computer 8.9
interior 8.8
screen 8.8
adult 8.4
city 8.3
historic 8.2
one 8.2
retro 8.2
technology 8.2
symbol 8.1
postmark 7.9
postage 7.9
postal 7.8
envelope 7.8
stamp 7.7
modern 7.7
outside 7.7
mail 7.6
telecommunication 7.3
letter 7.3
place of business 7.3
religion 7.2
history 7.1
information 7.1
travel 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

window 99.9
person 96.7
painting 90.5
clothing 89
text 83.5
man 72.8
posing 72.7
old 72.3
museum 58.1
human face 53.1
gallery 51.2
picture frame 7.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 50.1%
Calm 45.5%
Disgusted 45.1%
Surprised 53.7%
Fear 45.3%
Happy 45.2%
Angry 45%
Sad 45.1%
Confused 45%

AWS Rekognition

Age 44-62
Gender Female, 50.1%
Angry 45.2%
Happy 46.1%
Calm 51.8%
Sad 46.2%
Disgusted 45.3%
Surprised 45.1%
Fear 45.2%
Confused 45.1%

AWS Rekognition

Age 21-33
Gender Male, 50.2%
Calm 49.7%
Surprised 49.5%
Happy 49.6%
Sad 50%
Disgusted 49.5%
Confused 49.5%
Angry 49.6%
Fear 49.6%

AWS Rekognition

Age 22-34
Gender Male, 50.3%
Calm 45.8%
Happy 45.5%
Fear 45.4%
Sad 49.2%
Angry 45.5%
Confused 45.5%
Disgusted 45.1%
Surprised 48%

AWS Rekognition

Age 17-29
Gender Female, 50.9%
Angry 45.1%
Disgusted 45.2%
Surprised 46.2%
Fear 53.3%
Calm 45%
Happy 45.1%
Sad 45%
Confused 45%

AWS Rekognition

Age 38-56
Gender Male, 52.3%
Fear 46.1%
Happy 46.5%
Calm 47.3%
Surprised 47.7%
Confused 45.4%
Sad 46.1%
Angry 45.6%
Disgusted 45.3%

AWS Rekognition

Age 52-70
Gender Male, 54.3%
Angry 45.6%
Fear 45.4%
Disgusted 45.1%
Happy 45.1%
Calm 46.3%
Sad 45.4%
Surprised 51.6%
Confused 45.4%

AWS Rekognition

Age 41-59
Gender Male, 53%
Happy 46.8%
Disgusted 45.2%
Angry 45.7%
Calm 45.5%
Sad 50.2%
Fear 46.3%
Confused 45.1%
Surprised 45.2%

AWS Rekognition

Age 14-26
Gender Male, 52.9%
Happy 45.1%
Fear 45%
Surprised 45%
Confused 45%
Sad 54.7%
Calm 45.1%
Disgusted 45%
Angry 45%
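
The nine blocks above are per-face results from Rekognition's face analysis. A minimal sketch of how such blocks could be regenerated with boto3's DetectFaces call follows; the file name is again an illustrative assumption, and Attributes=["ALL"] is what requests the age, gender, and emotion estimates.

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("jephtha.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # without this, only a default subset is returned
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "CALM"); title-case them
        # to match the layout above.
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')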

Feature analysis

Amazon

Person 98.6%
Painting 78%