Human Generated Data

Title

Eugène de Beauharnais

Date

19th century

People

Artist: Giuseppe Longhi, Italian, 1766–1831

Artist after: François-Pascal-Simon Gérard, French, 1770–1837

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2402

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Human 98.8
Person 98.8
Apparel 95
Clothing 95
Art 93.1
Painting 89.4
Female 74.1
Drawing 69.8
Face 66.9
Photo 66.9
Photography 66.9
Portrait 66.9
Sketch 56.8
Woman 56.5
Person 56

Clarifai
created on 2019-11-07

people 99.8
print 98.6
adult 97.5
one 97.2
two 95.2
woman 95.2
group 94.4
wear 93.7
illustration 93.6
lithograph 92.6
man 91.7
art 91.5
portrait 90.6
furniture 88.7
music 85.1
canine 84.7
actress 82.2
leader 78.9
mammal 77
administration 77

Imagga
created on 2019-11-07

statue 39.4
sculpture 34.5
clothing 26.2
military uniform 24.7
art 23.1
old 23
religion 21.5
monument 20.6
uniform 19.9
history 19.7
culture 17.9
tourism 17.3
architecture 17.2
dress 17.2
religious 15.9
famous 15.8
ancient 15.6
covering 15.4
stone 15.3
historical 15.1
travel 14.8
city 14.1
antique 14.1
marble 13.7
consumer goods 13.2
historic 12.8
face 12.8
traditional 12.5
person 12.2
lady 12.2
people 11.2
figure 11.1
man 10.8
vintage 10.8
catholic 10.7
fashion 10.6
god 10.5
portrait 10.4
building 10.3
church 10.2
landmark 9.9
temple 9.7
holy 9.6
faith 9.6
world 9.3
style 8.9
statues 8.9
sacred 8.8
outfit 8.7
heritage 8.7
spiritual 8.6
decoration 8.4
memorial 8.4
makeup 8.2
tourist 8.2
detail 8
adult 7.9
facade 7.9
black 7.8
spirituality 7.7
carving 7.5
retro 7.4
exterior 7.4
structure 7.4
sketch 7.3

Google
created on 2019-11-07

Microsoft
created on 2019-11-07

text 97.4
clothing 97
person 94.2
man 91.1
woman 80.6
footwear 74.6
drawing 65.1
human face 58.9
old 56.3
posing 51

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 54.8%
Surprised 45%
Fear 45%
Sad 45%
Calm 55%
Confused 45%
Angry 45%
Happy 45%
Disgusted 45%

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Painting 89.4%

Captions

Microsoft

a vintage photo of a man 87.8%
a vintage photo of a man standing in front of a building 75.2%
a vintage photo of a man and a woman posing for a picture 64.4%

Text analysis

Amazon

eine
FO1201