Human Generated Data

Title

The Annunciation

Date

1475-1499

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Professor and Mrs. John Tucker Murray, 1938.80

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 96.2
Human 96.2
Person 87.7
Art 87.2
Painting 83.3
Drawing 83.2
Clothing 80
Apparel 80
Female 77.1
Advertisement 73
Poster 71.9
Text 70.3
Girl 60.2
Woman 60
Blonde 60
Kid 60
Child 60
Teen 60
Sketch 56.7
Label 56.5

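The Amazon list above pairs each label with a confidence score out of 100. As a hedged illustration (not part of the catalog record), such a list can be reduced to its high-confidence labels with a simple threshold filter; the 80.0 cutoff and the data structure here are assumptions.

```python
# Hypothetical sketch: filtering machine-generated labels by confidence.
# The (label, confidence) pairs are copied from the Amazon list above;
# the 80.0 threshold is an assumption, not part of the record.
amazon_labels = [
    ("Person", 96.2), ("Human", 96.2), ("Person", 87.7), ("Art", 87.2),
    ("Painting", 83.3), ("Drawing", 83.2), ("Clothing", 80.0),
    ("Apparel", 80.0), ("Female", 77.1), ("Advertisement", 73.0),
    ("Poster", 71.9),
]

def confident_labels(labels, threshold=80.0):
    """Return the distinct label names at or above the given confidence."""
    seen = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(confident_labels(amazon_labels))
# Labels below the cutoff, such as "Female" and "Poster", are dropped.
```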
Clarifai
created on 2020-04-24

people 99.9
adult 99.1
illustration 99
art 98.7
woman 97.6
painting 97.3
print 96
group 94
gown (clothing) 92.8
wear 92.6
portrait 92.4
crown 92.4
veil 90.8
one 90.3
princess 89.9
leader 89
kneeling 88.8
saint 88.5
two 86.6
religion 86.4

Imagga
created on 2020-04-24

architecture 31.6
door 27
old 26.5
decoration 25.9
building 23.2
graffito 21.9
wall 21.5
sculpture 20.7
stone 20.6
art 19.7
ancient 19
device 18.1
antique 17.3
window 16.6
detail 16.1
history 15.2
vintage 14.9
facade 14.4
memorial 14.4
monument 14
carving 14
corbel 13.9
travel 12.7
structure 12.6
entrance 12.6
religion 12.5
temple 12.5
culture 12
historic 11.9
bracket 11.6
tourism 11.6
statue 11.2
classic 11.1
support 10.8
texture 10.4
historical 10.4
pattern 10.3
famous 10.2
ornate 10.1
house 10
book jacket 10
latch 9.9
arch 9.7
design 9.7
weathered 9.5
religious 9.4
knocker 9.3
blackboard 9.2
landmark 9
gravestone 9
carved 8.8
architectural 8.7
grunge 8.5
brass 8.5
fastener 8.5
city 8.3
retro 8.2
tourist 8.2
brown 8.1
metal 8
catch 8
paper 7.9
ornament 7.8
jacket 7.7
stucco 7.6
buildings 7.6
details 7.6
iron 7.5
shop 7.4
symbol 7.4
church 7.4
exterior 7.4
black 7.2
home 7.2
currency 7.2
column 7.1
textured 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 99.9
book 99.7
cartoon 96.9
drawing 96.3
painting 90.4
window 89.5
sketch 86.4
poster 83.6
illustration 66.6
person 61.5
human face 57.2
graffiti 44.4

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 98.7%
Confused 0.3%
Fear 0%
Disgusted 0.1%
Sad 10.5%
Calm 88.1%
Happy 0.5%
Surprised 0.1%
Angry 0.3%

AWS Rekognition

Age 23-37
Gender Female, 96.1%
Fear 1.5%
Disgusted 1.4%
Angry 5.1%
Happy 1.3%
Surprised 7.6%
Calm 73.8%
Sad 7.5%
Confused 1.7%
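The two Rekognition face blocks above report an emotion score for each of eight categories; the dominant emotion is simply the category with the highest percentage. A minimal sketch, with the percentages copied from the first face above:

```python
# Sketch: picking the dominant emotion from Rekognition-style scores.
# Percentages are copied from the first AWS Rekognition face block above.
face_1 = {
    "Confused": 0.3, "Fear": 0.0, "Disgusted": 0.1, "Sad": 10.5,
    "Calm": 88.1, "Happy": 0.5, "Surprised": 0.1, "Angry": 0.3,
}

def dominant_emotion(scores):
    """Return the (emotion, percentage) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face_1))
```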

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Feature analysis

Amazon

Person 96.2%
Painting 83.3%

Captions

Microsoft

a person with graffiti on the side of a building 43.8%
a person with graffiti on the side of a window 35.8%
a close up of a person with graffiti on the side of a building 34.8%

Text analysis

Google

ERA
ERA