Human Generated Data

Title

Jesus in the Temple among the Pharisees

Date

16th century

People

Artist: Hans Wechtlin, German, 1480/1485 - 1526

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R948

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Art 96.8
Painting 95.5
Human 92.4
Person 92.4
Person 89.3
Person 82.6
Person 77.1
Drawing 68.1
Person 59.7
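
The label/confidence pairs above match the shape of AWS Rekognition's label-detection output (a label name plus a 0-100 confidence score). A minimal sketch of how tags like these could be reproduced with boto3; the image file name is hypothetical, while the client call and response fields are standard Rekognition:

```
import boto3

# DetectLabels returns a list of labels, each with a Name and a Confidence score (0-100).
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("wechtlin_jesus_in_the_temple.jpg", "rb") as image:  # hypothetical file name
    response = rekognition.detect_labels(
        Image={"Bytes": image.read()},
        MaxLabels=10,
        MinConfidence=50.0,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```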

Clarifai
created on 2019-08-10

art 99.2
people 99.1
illustration 98.9
religion 97.5
print 96.5
man 96.5
adult 94.7
god 94.5
engraving 94.3
ancient 92.4
painting 91.2
veil 90.9
chalk out 88.2
old 87.7
group 87.1
sword 86.8
retro 85.5
gown (clothing) 85.5
woodcut 85.4
saint 85.3

Imagga
created on 2019-08-10

art 33.4
old 29.3
drawing 29.2
sketch 26.9
ancient 26.8
stucco 25.6
history 23.3
arabesque 22.8
antique 22.5
vintage 21.5
culture 21.4
pattern 20.5
architecture 19.7
temple 19.5
design 19.2
religion 18.8
paper 18.8
texture 18.8
sculpture 18.6
decoration 18.3
carving 17.9
representation 17.5
retro 16.4
grunge 16.2
structure 15.6
monument 15
arch 14.9
ornate 14.7
landmark 14.5
carved 13.7
stone 13.6
ornament 12.9
wall 12.8
frame 12.7
travel 12.7
traditional 12.5
historic 11.9
statue 11.6
memorial 11.3
famous 11.2
style 11.1
money 11.1
floral 11.1
decor 10.6
building 10.6
symbol 10.1
triumphal arch 9.9
currency 9.9
detail 9.7
historical 9.4
classic 9.3
dollar 9.3
city 9.2
element 9.1
tourism 9.1
card 9.1
close 8.6
tile 8.6
finance 8.5
religious 8.4
east 8.4
stamp 8.2
aged 8.2
bank 8.1
facade 8
engraving 7.9
artistic 7.8
royal 7.7
column 7.7
architectural 7.7
relief 7.6
decorative 7.5
cash 7.3
painting 7.2
material 7.2
financial 7.1

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

nintendo 100
drawing 99.6
text 99.5
sketch 99.4
book 97.1
illustration 94.2
cartoon 92.5
person 82.8
engraving 78.3
woodcut 67.1
clothing 59.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-29
Gender Female, 50.2%
Disgusted 45%
Happy 45%
Calm 45%
Fear 45.2%
Surprised 45%
Confused 45%
Angry 45.1%
Sad 54.6%

AWS Rekognition

Age 23-37
Gender Male, 53.8%
Surprised 45%
Fear 45.2%
Calm 49.2%
Disgusted 45%
Happy 45.2%
Confused 45.1%
Sad 48.5%
Angry 46.8%

AWS Rekognition

Age 26-42
Gender Female, 50.2%
Disgusted 45.2%
Happy 45.3%
Fear 47.5%
Surprised 45.7%
Calm 49.4%
Angry 46.1%
Sad 45.7%
Confused 45.1%

AWS Rekognition

Age 26-40
Gender Male, 53.9%
Happy 45%
Angry 45.1%
Fear 46.3%
Sad 48.7%
Calm 49.8%
Surprised 45.1%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 26-40
Gender Male, 54.6%
Happy 45%
Disgusted 45%
Calm 55%
Fear 45%
Angry 45%
Sad 45%
Surprised 45%
Confused 45%

AWS Rekognition

Age 38-56
Gender Male, 52.9%
Surprised 45%
Happy 45%
Calm 52.6%
Fear 45%
Confused 45%
Sad 47.4%
Disgusted 45%
Angry 45%

AWS Rekognition

Age 26-40
Gender Female, 51.8%
Surprised 47.5%
Angry 47.6%
Disgusted 45.1%
Sad 46.4%
Happy 45.1%
Calm 47%
Fear 46.2%
Confused 45.1%
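
Each block above corresponds to one face in AWS Rekognition's face-detection output: an estimated age range, a gender guess with confidence, and a confidence value per emotion. A minimal sketch of how such output could be retrieved, assuming the same hypothetical image file as above:

```
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("wechtlin_jesus_in_the_temple.jpg", "rb") as image:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": image.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```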

Feature analysis

Amazon

Person 92.4%

Categories

Captions

Microsoft
created on 2019-08-10

a close up of a book 49.3%
close up of a book 43.3%
a close up of a book cover 43.2%
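
The captions above have the shape of Azure Computer Vision's "describe image" output (caption text plus a confidence score). A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are hypothetical placeholders:

```
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key; substitute real credentials.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("wechtlin_jesus_in_the_temple.jpg", "rb") as image:  # hypothetical file name
    description = client.describe_image_in_stream(image, max_candidates=3)

# Each caption carries text and a 0-1 confidence, shown above as a percentage.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```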