Human Generated Data

Title

Christian Captives

Date

19th century

People

Artist: Carl Joseph Alois Agricola, German, 1779–1852

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, 1898.171

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Human 99.5
Person 99.5
Person 99.3
Person 98.3
Painting 98
Art 98
Person 97.9
Person 96.1
Person 94.9

Clarifai
created on 2020-04-25

people 100
group 99.8
print 99.7
art 99.7
adult 99.3
engraving 98.5
man 98
wear 97.9
portrait 97
woman 95.6
etching 95.1
painting 95
illustration 94.6
veil 94
leader 93.3
gown (clothing) 93.2
baby 92.8
sepia pigment 92.6
vintage 92.2
three 91.7

Imagga
created on 2020-04-25

sculpture 69.9
statue 47
art 37.7
carving 37
decoration 34
graffito 32.9
architecture 31.4
structure 29.8
ancient 29.4
history 27.7
memorial 25.7
stone 24.4
figure 24.1
monument 23.4
culture 23.1
old 23
landmark 21.7
brass 20.7
support 20.6
marble 20.4
religion 19.7
pedestal 18.6
column 18
building 17.5
travel 16.9
famous 16.8
historical 16
historic 15.6
sketch 15.6
temple 15.4
god 15.3
detail 15.3
plastic art 14.2
city 14.1
tourism 14
fountain 13.5
cemetery 13.2
antique 13.1
carved 12.7
religious 12.2
roman 12.1
church 12
drawing 11.6
exterior 11.1
design 11
tourist 10.9
statues 10.8
supporting structure 10.8
symbol 10.8
bust 10.8
catholic 10.7
facade 10.6
vintage 9.9
representation 9.7
museum 9.7
spiritual 9.6
capital 9.5
money 9.4
face 9.2
holy 8.7
spirituality 8.6
details 8.5
decorative 8.4
cash 8.2
style 8.2
bookend 8
carvings 7.9
sculptures 7.9
mythology 7.9
paper 7.8
heritage 7.7
palace 7.7
detailed 7.7
wall 7.7
architectural 7.7
bill 7.6
finance 7.6
destination 7.5
dollar 7.4
ornate 7.3
currency 7.2

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

text 100
book 99.6
drawing 97.8
sketch 95.1
person 93.3
clothing 91
painting 89.9
old 88.7
posing 84.9
white 65
cartoon 64.4
vintage 63.4

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 4-14
Gender Female, 54.9%
Disgusted 45%
Calm 55%
Surprised 45%
Angry 45%
Happy 45%
Sad 45%
Confused 45%
Fear 45%

AWS Rekognition

Age 22-34
Gender Male, 52.1%
Confused 45.1%
Angry 45.3%
Calm 50.3%
Sad 46.6%
Happy 46.3%
Surprised 46.1%
Fear 45.1%
Disgusted 45.2%

AWS Rekognition

Age 23-35
Gender Female, 53.8%
Disgusted 45%
Confused 45%
Calm 53.3%
Angry 45.1%
Fear 45%
Surprised 45%
Happy 46.5%
Sad 45%

AWS Rekognition

Age 21-33
Gender Female, 53.4%
Angry 45.2%
Fear 45.1%
Disgusted 45.2%
Happy 47.9%
Calm 47.7%
Sad 45.1%
Surprised 45.8%
Confused 48%

AWS Rekognition

Age 23-35
Gender Male, 51.6%
Angry 45.2%
Fear 45%
Confused 45%
Sad 45%
Surprised 45.4%
Disgusted 45%
Happy 45%
Calm 54.3%

Microsoft Cognitive Services

Age 29
Gender Female

Feature analysis

Amazon

Person 99.5%
Painting 98%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 77.6%
a vintage photo of a group of people posing for a picture 77.5%
a vintage photo of a group of people posing for a photo 74.7%

Text analysis

Amazon

Ngrirla

Google

Mgrieda
Mgrieda aruola
aruola