Human Generated Data

Title

Male and Female Bacchants Installing a Herm

Date

c. 1792

People

Artist: Louise Pithoud, French, active circa 1782

Artist after: Jean-Guillaume Moitte, French, 1746–1810

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Acquisition Fund for Prints, 2018.105

Machine Generated Data

Tags

Amazon
created on 2019-04-17

Human 98.5
Person 98.5
Person 98.5
Art 97.8
Painting 95.5
Person 92
Person 87.5
Person 86.9
Sculpture 85.5
Person 77.3
Statue 76.3
Person 72.6
Person 71
Archaeology 68.9
Person 67.4
Person 52.5
Person 45.8

Clarifai
created on 2019-04-17

people 100
adult 99.7
group 99.4
man 98.9
administration 98
music 97.8
woman 97.4
two 96.6
many 96.2
leader 96.1
art 93
several 91.5
three 90.9
one 90.6
print 90.4
wear 88.9
furniture 88.9
victory 87.9
seat 87.8
theater 87.7

Imagga
created on 2019-04-17

statue 34.9
sculpture 34.1
newspaper 34
product 29.2
art 25.4
treasury 25.3
creation 24.9
money 23.8
daily 23.4
ancient 22.5
history 22.4
currency 21.5
dollar 21.4
depository 20.9
stone 20.3
marble 20.1
architecture 19.5
cash 18.3
culture 18
finance 17.7
old 17.4
religion 17
bank 16.2
facility 16
travel 15.5
bill 15.2
savings 14.9
paper 14.9
banking 14.7
dollars 14.5
loan 14.4
wealth 14.4
financial 14.3
one 14.2
monument 14
business 14
rich 14
us 13.5
god 13.4
carving 13.2
religious 13.1
historic 12.8
landmark 12.6
tourism 12.4
historical 12.2
famous 12.1
detail 12.1
figure 12.1
hundred 11.6
closeup 11.5
face 11.4
building 11.1
church 11.1
temple 10.9
franklin 10.8
city 10.8
memorial 10.7
bills 10.7
pay 10.5
exchange 10.5
capital 10.4
close 10.3
fountain 10.3
structure 10.2
roman 10
vintage 9.9
catholic 9.7
symbol 9.4
antique 9.3
investment 9.2
brass 8.9
web site 8.6
tourist 8.2
success 8.1
column 8
banknotes 7.8
carved 7.8
portrait 7.8
holy 7.7
price 7.7
spiritual 7.7
cathedral 7.7
sky 7.7
economy 7.4
market 7.1

Google
created on 2019-04-17

Stock photography 72.2
Art 72.1
History 70.9
Painting 59.3
Antique 55.3
Illustration 50.4

Microsoft
created on 2019-04-17

text 99.2
book 92.5
old 53.4
vintage 29.3
boy 29.3
art 18.9
museum 18.7
ballet 18.2
black and white 15.4
person 13.6

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 20-38
Gender Female, 52.5%
Confused 45.4%
Angry 45.8%
Sad 48.9%
Happy 46%
Disgusted 45.5%
Surprised 45.7%
Calm 47.7%

AWS Rekognition

Age 16-27
Gender Female, 50.5%
Surprised 49.5%
Angry 49.6%
Calm 49.8%
Confused 49.5%
Happy 49.6%
Sad 49.9%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 51.5%
Confused 45.1%
Happy 45.1%
Sad 47.3%
Calm 52%
Surprised 45.1%
Angry 45.2%
Disgusted 45.1%

AWS Rekognition

Age 26-43
Gender Female, 52%
Calm 50.7%
Angry 46.3%
Disgusted 45.2%
Surprised 45.3%
Sad 47%
Happy 45.2%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Male, 50%
Angry 49.7%
Calm 49.8%
Disgusted 49.5%
Confused 49.7%
Sad 49.7%
Happy 49.5%
Surprised 49.6%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Sad 49.7%
Disgusted 49.6%
Confused 49.5%
Angry 49.6%
Calm 50.1%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 17-27
Gender Male, 53.1%
Surprised 45.8%
Happy 45.1%
Disgusted 45.6%
Confused 45.3%
Sad 45.8%
Angry 45.8%
Calm 51.5%

AWS Rekognition

Age 29-45
Gender Female, 51%
Sad 49.7%
Confused 45.3%
Calm 45.4%
Happy 48.5%
Angry 45.4%
Disgusted 45.2%
Surprised 45.4%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Calm 49.7%
Surprised 49.6%
Confused 49.5%
Sad 49.8%
Happy 49.5%
Disgusted 49.6%
Angry 49.8%

AWS Rekognition

Age 49-69
Gender Male, 54.4%
Surprised 45.2%
Calm 52.3%
Happy 45.1%
Angry 45.7%
Confused 45.3%
Sad 46%
Disgusted 45.3%

Microsoft Cognitive Services

Age 28
Gender Female

Feature analysis

Amazon

Person 98.5%
Painting 95.5%

Captions

Microsoft

a vintage photo of a person 87.9%
a vintage photo of a person holding a book 56.1%
a vintage photo of a group of people posing for the camera 56%

Text analysis

Google

Л Per r ohr MTri n y R w (/E y e", Maar we é Mm ary r://hrare. .V0/29