Human Generated Data

Title

Patrick Byrne, the Blind Irish Harper

Date

April 1, 1845

People

Artist: Hill & Adamson, British (Scottish), active 1843-1848

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of David Becker, Mrs. Phyllis Lambert, and Mr. and Mrs. Herbert W. Pratt, P1973.56

Machine Generated Data

Tags

Amazon
created on 2022-05-21

Harp 99.1
Musical Instrument 99.1
Lyre 91.6
Leisure Activities 91.6
Painting 87.1
Art 87.1
Person 84.6
Human 84.6

Clarifai
created on 2023-10-29

people 99.7
art 99.7
portrait 99.6
one 99.4
wear 99
harp 98.8
adult 98.7
music 98.1
painting 97.8
woman 97.2
vintage 95.4
man 95.4
print 94.9
veil 94.2
sepia 94.2
antique 93
old 92.5
sepia pigment 92.3
gown (clothing) 92.3
illustration 91.8

Imagga
created on 2022-05-21

harp 100
stringed instrument 52.1
musical instrument 34.9
statue 28.1
sculpture 26
monk 24
dress 23.5
religion 23.3
hair 18.2
old 17.4
fashion 17.3
catholic 16.5
lady 16.2
portrait 16.2
church 15.7
god 15.3
art 15
face 14.9
architecture 14.8
model 14.8
religious 14.1
pretty 14
culture 13.7
stone 13.5
holy 13.5
history 13.4
attractive 13.3
support 13.1
ancient 13
sexy 12.8
faith 12.4
person 12.4
adult 12.3
people 12.3
building 11.9
elegance 11.8
saint 11.5
temple 11.5
antique 11.4
sensual 10.9
brunette 10.5
historical 10.4
monument 10.3
device 10.1
makeup 10.1
posing 9.8
style 9.6
bride 9.6
costume 9.4
cute 9.3
city 9.1
sensuality 9.1
closeup 8.8
worship 8.7
cathedral 8.6
spiritual 8.6
make 8.2
clothing 8
black 7.8
sacred 7.8
pray 7.8
figure 7.7
sitting 7.7
spirituality 7.7
outdoor 7.6
traditional 7.5
famous 7.4
detail 7.2
body 7.2

Google
created on 2022-05-21

Microsoft
created on 2022-05-21

text 98.2
person 96.4
clothing 96.2
human face 86.1
painting 82
posing 74.3
woman 57.3
dressed 26.6

Color Analysis

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 100%
Calm 35.1%
Angry 32%
Sad 21.6%
Confused 12.3%
Surprised 6.8%
Fear 6.2%
Happy 0.6%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 87.1%
Person 84.6%

Categories

Imagga

food drinks 80.2%
paintings art 14.8%
interior objects 3.6%