Human Generated Data

Title

Presentation in the Temple

Date

c. 1505

People

Artist: Albrecht Dürer, German, 1471–1528

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Denman W. Ross, M973

Machine Generated Data

Tags

Each label below is paired with the service's confidence score, expressed as a percentage.

Amazon
created on 2019-08-10

Art 95.1
Painting 91
Person 87
Human 87
Person 86.7
Person 80.5
Person 77.6
Person 69.7
Drawing 69.1
Person 68.9
Person 62.7
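
These labels have the shape of output from Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be produced with boto3, assuming configured AWS credentials; the image file name is a hypothetical placeholder:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the print; any JPEG/PNG bytes work.
    with open("presentation_in_the_temple.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=60,
        )

    # Each label carries a name and a confidence percentage,
    # matching the "label score" pairs listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")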

Clarifai
created on 2019-08-10

people 100
group 99.3
engraving 99.1
print 98.6
adult 98.4
many 97.4
administration 97
man 96.7
art 96.1
leader 94.8
illustration 89.1
etching 88.7
several 87.2
wear 86.7
woman 86.1
group together 85.2
veil 84.4
government 81.9
one 80.9
furniture 79.4
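
These concepts have the shape of Clarifai's v2 predict output. A minimal sketch over the REST API; the API key, model identifier, and image URL below are placeholders, not values from the source:

    import requests

    api_key = "YOUR_CLARIFAI_API_KEY"       # placeholder
    model_id = "general-image-recognition"  # hypothetical model id

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{model_id}/outputs",
        headers={"Authorization": f"Key {api_key}"},
        json={"inputs": [{"data": {"image": {
            "url": "https://example.com/presentation_in_the_temple.jpg"}}}]},
    )
    response.raise_for_status()

    # Concepts carry a name and a value in [0, 1];
    # the listing above shows the same values as percentages.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")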

Imagga
created on 2019-08-10

sketch 75.7
drawing 59.8
representation 44
architecture 40.4
building 39.1
city 36.8
skyscraper 27.2
urban 19.2
skyline 19
new 17.8
modern 17.5
vintage 17.4
office 17.3
art 16.6
tower 16.1
cityscape 16.1
travel 15.5
grunge 15.3
sky 15.3
structure 14.9
design 14.6
business 14
old 13.9
ancient 13.8
downtown 13.4
buildings 13.2
texture 12.5
landmark 11.7
retro 11.5
tall 11.3
border 10.8
frame 10.8
financial 10.7
district 10.7
brown 10.3
construction 10.3
pattern 10.3
town 10.2
finance 10.1
element 9.9
history 9.8
house 9.7
center 9.6
graphic 9.5
paper 9.5
china 9.4
tourism 9.1
black 9
column 8.9
style 8.9
antique 8.8
high 8.7
window 8.6
historic 8.2
metropolis 7.8
skyscrapers 7.8
glass 7.8
palace 7.8
windows 7.7
grungy 7.6
landscape 7.4
decoration 7.4
night 7.1
scenic 7
season 7
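
These tags follow the shape of Imagga's v2 tagging endpoint. A minimal sketch over REST; the credentials and image URL are placeholders, and the response field names reflect Imagga's documented v2 schema as an assumption:

    import requests

    # Placeholder credentials; Imagga uses HTTP Basic auth with key/secret.
    auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/presentation_in_the_temple.jpg"},
        auth=auth,
    )
    response.raise_for_status()

    # Each entry pairs a confidence percentage with a localized tag name,
    # matching the pairs listed above.
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")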

Google
created on 2019-08-10

Art 78.1
History 73.3
Painting 70.4
Stock photography 66.6
Illustration 57.7
Column 53.3
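
These labels match the output of Google Cloud Vision label detection. A minimal sketch with the google-cloud-vision client library, assuming configured Google Cloud credentials; the file name is a hypothetical placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical path; in older client versions this is vision.types.Image.
    with open("presentation_in_the_temple.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Each annotation has a description and a score in [0, 1];
    # the listing above shows the same scores as percentages.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")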

Microsoft
created on 2019-08-10

drawing 98
text 97.9
sketch 96.7
illustration 88.3
cartoon 84.6
book 74.2
person 73.9
clothing 56
engraving 55.5

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 52.7%
Surprised 45.1%
Angry 45.1%
Fear 45.1%
Happy 45.1%
Disgusted 45.3%
Confused 45.1%
Calm 53%
Sad 46.2%

AWS Rekognition

Age 20-32
Gender Male, 51.2%
Sad 45.1%
Confused 45%
Fear 45%
Calm 54.3%
Happy 45%
Angry 45%
Disgusted 45%
Surprised 45.5%

AWS Rekognition

Age 23-35
Gender Male, 54.7%
Disgusted 45%
Happy 46.1%
Angry 45%
Fear 45%
Surprised 45.3%
Calm 53.5%
Sad 45%
Confused 45%

AWS Rekognition

Age 43-61
Gender Male, 54.7%
Disgusted 45%
Sad 45.1%
Angry 45.1%
Happy 45%
Surprised 45%
Fear 45%
Calm 54.8%
Confused 45%

AWS Rekognition

Age 21-33
Gender Male, 50.2%
Angry 49.5%
Disgusted 49.5%
Fear 49.5%
Sad 49.6%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Calm 50.3%

AWS Rekognition

Age 24-38
Gender Male, 50.1%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Fear 49.5%
Surprised 49.5%
Confused 49.5%
Angry 50.4%
Sad 49.6%

AWS Rekognition

Age 37-55
Gender Male, 54%
Confused 45%
Angry 45%
Sad 45.1%
Surprised 45%
Happy 45.1%
Disgusted 45%
Calm 54.8%
Fear 45%
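
These age ranges, gender estimates, and per-face emotion scores follow the schema of Amazon Rekognition's DetectFaces API. A minimal sketch with boto3, again with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("presentation_in_the_temple.jpg", "rb") as f:  # hypothetical path
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    # One FaceDetail per detected face, mirroring the per-face blocks above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")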

Feature analysis

Amazon

Painting 91%
Person 87%

Captions

Microsoft
created on 2019-08-10

an old photo of a building 83.6%
a close up of an old building 82.8%
an old photo of a person 70.1%
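
Captions of this form are what Microsoft's Computer Vision "Describe Image" operation returns. A minimal sketch against the v2.0 REST endpoint; the region, subscription key, and image URL are placeholders:

    import requests

    # Placeholder endpoint and key; the region and resource vary per account.
    endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/describe"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}
    body = {"url": "https://example.com/presentation_in_the_temple.jpg"}

    response = requests.post(endpoint, headers=headers,
                             params={"maxCandidates": 3}, json=body)
    response.raise_for_status()

    # Each caption carries a confidence in [0, 1]; the listing above
    # shows the same values as percentages.
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")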