Human Generated Data

Title

Painted Panel with Crucifixion Inset

Date

1800 - 1825

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Philip M. Lydig, 1930.49

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Building 95.5
Architecture 95.5
Human 86.4
Person 86.4
Church 79.7
Altar 77.9
Art 70
Housing 60.4
Monastery 60.4
Person 59.1
Archaeology 57.6
Home Decor 56.3
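
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such a list might be reproduced with boto3, assuming AWS credentials are configured; the region and the local file name "panel.jpg" are placeholders, not part of the record:

import boto3

# Placeholder region and file name; assumes AWS credentials are already configured.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("panel.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # the lowest score in the list above is about 56
)

# Print name/confidence pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")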

Clarifai
created on 2020-04-24

art 98.6
religion 98.2
architecture 97.5
people 96.8
no person 94
illustration 92.6
church 92.5
door 92.1
decoration 91.6
ornate 90.3
old 90.2
building 89.1
grave 88.4
travel 88.3
mosaic 87.9
sculpture 86.7
temple 86.6
ancient 86.6
antique 86.4
castle 86.3
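
The Clarifai concepts above look like output from Clarifai's general image-recognition model. A rough sketch using the clarifai-grpc client; the personal access token, user/app IDs, model ID, and image URL are all placeholders, and the exact request fields vary between clarifai-grpc versions, so treat this as an assumption rather than the pipeline actually used:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc
from clarifai_grpc.grpc.api.status import status_code_pb2

channel = ClarifaiChannel.get_grpc_channel()
stub = service_pb2_grpc.V2Stub(channel)
metadata = (("authorization", "Key YOUR_CLARIFAI_PAT"),)  # placeholder token

request = service_pb2.PostModelOutputsRequest(
    # Placeholder IDs pointing at Clarifai's public general model.
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.org/panel.jpg")
            )
        )
    ],
)

response = stub.PostModelOutputs(request, metadata=metadata)
if response.status.code != status_code_pb2.SUCCESS:
    raise RuntimeError(response.status.description)

# Concept values are 0-1; scale to percentages to match the list above.
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))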

Imagga
created on 2020-04-24

art 30.1
architecture 29.8
religion 29.6
arabesque 26.1
ancient 24.2
stone 19.8
pattern 19.1
temple 18.9
facade 18.3
old 17.4
history 17
sculpture 16.8
church 16.6
wall 16.3
door 16.3
culture 16.2
travel 16.2
antique 16.2
mosaic 15.9
religious 15
relief 14.8
prayer rug 14.7
decoration 14.6
dollars 14.5
building 14.4
design 14.1
texture 13.9
carving 13.8
rug 13.7
currency 13.5
traditional 13.3
detail 12.1
money 11.9
hole 11.7
city 11.6
memorial 11.5
structure 11
cathedral 11
paper 11
wealth 10.8
entrance 10.6
god 10.5
style 10.4
ornament 10.3
close 10.3
east 10.3
century 9.8
oriental 9.4
monument 9.3
historic 9.2
floor cover 8.8
wages 8.8
furnishing 8.8
gravestone 8.8
house 8.8
hundred 8.7
spirituality 8.6
business 8.5
palace 8.5
historical 8.5
number 8.4
sign 8.3
tourism 8.3
cash 8.2
bank 8.2
board 8.1
symbol 8.1
metal 8.1
carved 7.8
gate 7.8
hardware 7.7
statue 7.6
finance 7.6
vintage 7.4
technology 7.4
gold 7.4
banking 7.4
investment 7.3
ornate 7.3
computer 7.2
home 7.2
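
The Imagga tags above match the shape of Imagga's /v2/tags REST endpoint, which returns English tag names with 0-100 confidences. A sketch using the requests library; the API key, secret, and image URL are placeholders:

import requests

# Placeholder credentials and image URL.
IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/panel.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth with key/secret
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))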

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 95.8
drawing 77.4
white 72.4
store 64.4
old 58.6
art 55.1
altar 12.2
curb 7.7
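
The Microsoft tags above are consistent with Azure Computer Vision image tagging, which returns tag names with confidences in [0, 1]. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint, key, and image URL.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_CV_KEY"
IMAGE_URL = "https://example.org/panel.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns confidences in [0, 1]; scale to percentages to match the list above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))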

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 36-54
Gender Male, 50.4%
Surprised 45.4%
Confused 45%
Happy 45.3%
Fear 45.1%
Sad 45.1%
Calm 53.9%
Angry 45.1%
Disgusted 45%

AWS Rekognition

Age 24-38
Gender Male, 50.1%
Surprised 49.5%
Sad 49.5%
Happy 49.6%
Calm 50.1%
Angry 49.5%
Fear 49.5%
Confused 49.5%
Disgusted 49.7%

AWS Rekognition

Age 54-72
Gender Male, 50.2%
Fear 49.5%
Disgusted 50%
Confused 49.5%
Angry 49.9%
Happy 49.5%
Calm 49.5%
Surprised 49.5%
Sad 49.5%

AWS Rekognition

Age 22-34
Gender Male, 51.5%
Disgusted 45%
Surprised 45.1%
Fear 45.1%
Sad 51.6%
Angry 45.1%
Happy 45%
Calm 48.1%
Confused 45%

AWS Rekognition

Age 36-52
Gender Female, 50.1%
Fear 49.7%
Happy 49.6%
Calm 49.9%
Surprised 49.7%
Confused 49.5%
Sad 49.6%
Angry 49.5%
Disgusted 49.5%
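
Each block above is one face detection in the format AWS Rekognition's DetectFaces returns: an estimated age range, a gender call with confidence, and a confidence score for each of eight emotions. A minimal sketch, again assuming configured AWS credentials and a hypothetical local copy of the image:

import boto3

# Placeholder region and file name; assumes AWS credentials are already configured.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("panel.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] is required to get age range, gender, and emotion scores.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")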

Feature analysis

Amazon

Person 86.4%
Altar 77.9%

Categories

Captions

Text analysis

Amazon

9
Eree3aH
ned noy 9 Eree3aH
Cnpe.stacia e
0
ned
noy
c
CeAuA
CeAuA ROnSFAE
ROnSFAE
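
The fragments above are raw OCR detections of the painted inscriptions, in the form AWS Rekognition's DetectText returns (LINE detections for the longer strings, WORD detections for the individual fragments). A minimal sketch under the same assumptions as the earlier Rekognition examples:

import boto3

# Placeholder region and file name; assumes AWS credentials are already configured.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("panel.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection carries its type (LINE or WORD), the detected string, and a confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))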

Google

eaue nonsIhee 41
eaue
nonsIhee
41