Human Generated Data

Title

Joseph Interpreting Pharaoh's Dreams

Date

17th century

People

Artist: Rembrandt Harmensz. van Rijn, Dutch 1606 - 1669

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Paul J. Sachs Memorial Fund and Alfred Bader Fund, 1981.126

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Human 99.4
Person 99.4
Art 98.8
Painting 98.8
Person 89.5
Person 78.1
Archaeology 75.6
Person 75.5
Person 57.9
Person 47.2

Clarifai
created on 2020-04-25

art 99.6
people 99.1
man 98.2
religion 98.1
sculpture 97.9
veil 95.1
god 94.4
monument 94.2
statue 93.3
ancient 93.1
adult 92.4
two 90.4
old 90.4
saint 89.6
Renaissance 87.1
sword 86.9
church 85.9
group 85.9
woman 84.9
print 84.6

Imagga
created on 2020-04-25

column 87.5
sculpture 65.2
statue 53.3
architecture 39.9
art 38.5
stone 35.8
temple 33.5
ancient 32.9
cemetery 32.5
religion 32.3
history 30.5
monument 29
culture 28.3
sketch 27.2
god 26.8
carving 25.4
old 25.1
fountain 24.8
travel 24.7
tourism 23.1
religious 22.5
building 21.5
carved 20.6
famous 20.5
city 20
drawing 18.9
marble 18.9
landmark 18.1
representation 17.4
figure 16.7
historic 16.5
structure 15.8
decoration 15.4
spirituality 15.4
tourist 14.5
church 13.9
detail 13.7
spiritual 13.5
historical 13.2
holy 12.5
exterior 12
catholic 11.7
traditional 11.7
heritage 11.6
design 11.5
face 11.4
east 11.2
antique 11.2
roman 11.1
sculptures 10.9
statues 10.9
palace 10.6
century 9.8
symbol 9.4
museum 9.1
ruins 8.8
golden 8.6
oriental 8.5
head 8.4
decorative 8.4
gold 8.2
style 8.2
relief 8
southeast 7.9
artistic 7.8
ruin 7.8
king 7.8
worship 7.7
facade 7.7
outdoor 7.7
capital 7.6
water 7.4
bust 7.1

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

text 97.7
statue 94.5
book 92.8
ancient 84.5
person 76
art 71.7
sculpture 67.5
old 57.4
drawing 52.2
archaeology 50.6
stone 23
painting 21.5

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 54.4%
Confused 45%
Fear 45%
Surprised 45.8%
Sad 45.1%
Happy 46%
Angry 45.1%
Calm 52.9%
Disgusted 45%

AWS Rekognition

Age 44-62
Gender Male, 54.2%
Surprised 45.1%
Fear 45.1%
Happy 45.6%
Calm 48.5%
Confused 45%
Sad 45%
Angry 50.5%
Disgusted 45.1%

AWS Rekognition

Age 49-67
Gender Female, 52.3%
Happy 49.2%
Surprised 50.1%
Confused 45%
Disgusted 45%
Calm 45.5%
Sad 45%
Fear 45.1%
Angry 45%

AWS Rekognition

Age 49-67
Gender Male, 54.6%
Disgusted 45%
Happy 45%
Angry 45%
Sad 45%
Confused 45%
Surprised 45%
Fear 45%
Calm 54.9%

AWS Rekognition

Age 13-23
Gender Male, 53.9%
Happy 47.1%
Angry 45.1%
Surprised 45.1%
Calm 51.9%
Confused 45.1%
Fear 45%
Disgusted 45.1%
Sad 45.6%

AWS Rekognition

Age 4-12
Gender Male, 53.7%
Calm 46.2%
Disgusted 45%
Angry 45.3%
Fear 45.1%
Confused 45.2%
Happy 45%
Surprised 45%
Sad 53.1%
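
Each AWS Rekognition face block above lists one confidence per emotion; the predicted emotion is simply the highest-scoring one. A short sketch under that reading, using the values from the last face block (the 4–12 age range), where "Sad" at 53.1% dominates. The `dominant_emotion` helper is illustrative, not part of the Rekognition API.

```python
# Per-emotion confidences copied from the last face block above (Age 4-12).
face = {
    "Calm": 46.2, "Disgusted": 45.0, "Angry": 45.3,
    "Fear": 45.1, "Confused": 45.2, "Happy": 45.0,
    "Surprised": 45.0, "Sad": 53.1,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the top score."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # the highest-confidence emotion wins
```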

Feature analysis

Amazon

Person 99.4%
Painting 98.8%

Captions

Microsoft

a painting of a person 79.3%
a painting of a person 79.2%
a group of people standing next to a painting 66.4%

Text analysis

Amazon

pnhr