Human Generated Data

Title

Douglas Horton (1891-1968)

Date

1959

People

Artist: William Franklin Draper, American, 1912-2003

Sitter: Douglas Horton, 1891-1968

Classification

Paintings

Credit Line

Harvard University Portrait Collection, Harvard Divinity School, H580

Machine Generated Data

Tags

Amazon
created on 2020-04-23

Painting 95.9
Art 95.9
Human 89
Priest 80.2
Person 78.2
Apparel 76.5
Clothing 76.5
Bishop 65.6
Portrait 62.4
Face 62.4
Photography 62.4
Photo 62.4
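
The label/confidence pairs above follow the shape of an AWS Rekognition DetectLabels response. As an illustrative sketch only (the exact pipeline behind these tags is not documented here), similar output could be produced with boto3; the file name portrait.jpg and the MinConfidence threshold are assumptions.

    import boto3

    # Sketch: image labeling with AWS Rekognition DetectLabels.
    # "portrait.jpg" and MinConfidence=55 are illustrative assumptions.
    client = boto3.client("rekognition")

    with open("portrait.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,
        )

    for label in response["Labels"]:
        # e.g. "Painting 95.9", matching the tag/confidence format above
        print(f"{label['Name']} {label['Confidence']:.1f}")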

Clarifai
created on 2020-04-23

man 99.2
people 99
one 98.1
painting 97.9
portrait 96.9
wear 96.3
gown (clothing) 96.1
art 95.9
cape 93.7
cap 93.4
adult 92.4
lid 92.4
elderly 92.3
accomplishment 92
religion 87
old 84.2
two 82.7
graduation 81.6
ceremony 78.4
veil 77.2

Imagga
created on 2020-04-23

academic gown 100
gown 100
outerwear 77.4
clothing 53.1
man 46.3
male 39
covering 31.2
person 29.8
consumer goods 27.5
people 25.1
adult 22.7
portrait 22
work 18
job 17.7
old 17.4
men 17.2
happy 16.3
professional 15.3
businessman 15
mature 14.9
business 14.6
black 14.5
executive 14.1
hat 13.7
handsome 13.4
worker 13.3
senior 13.1
face 12.8
suit 12.7
smile 12.1
guy 12
religion 11.6
art 10.9
smiling 10.8
uniform 10.6
human 10.5
glasses 10.2
elderly 9.6
standing 9.6
model 9.3
hand 9.1
interior 8.8
home 8.8
labor 8.8
lifestyle 8.7
expression 8.5
attractive 8.4
confident 8.2
grandfather 8.1
sculpture 8.1
success 8
statue 7.9
bible 7.8
boy 7.8
sitting 7.7
holy 7.7
god 7.6
serious 7.6
casual 7.6
figure 7.6
head 7.6
religious 7.5
shirt 7.5
vestment 7.2
working 7.1

Google
created on 2020-04-23

Portrait 92.2
Painting 85.9
Scholar 77.8
Elder 75.2
Art 62.5
Preacher 62.3
Friar 54
Self-portrait 51.9
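
The Google labels above resemble the output of the Cloud Vision API's label detection. A minimal sketch, assuming the google-cloud-vision Python client and an illustrative file name:

    from google.cloud import vision

    # Sketch: label detection with the Cloud Vision API.
    # "portrait.jpg" is an illustrative assumption.
    client = vision.ImageAnnotatorClient()

    with open("portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # score is 0-1; scaled here to match the 0-100 figures above
        print(f"{label.description} {label.score * 100:.1f}")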

Microsoft
created on 2020-04-23

painting 99.4
text 98.5
drawing 94.1
man 93.1
book 91.8
person 66.4
human face 56.8
old 53.1

Face analysis

AWS Rekognition

Age 51-69
Gender Male, 99.7%
Surprised 0.2%
Sad 0.4%
Disgusted 0.1%
Confused 0.3%
Calm 95.1%
Angry 0.1%
Fear 0%
Happy 3.7%

AWS Rekognition

Age 29-45
Gender Male, 66.8%
Angry 0.2%
Fear 94.2%
Disgusted 0.1%
Happy 0.1%
Calm 1%
Sad 3.4%
Surprised 0.7%
Confused 0.2%
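
The two AWS Rekognition face records above (age range, gender, and per-emotion confidences) correspond to the DetectFaces response shape. A minimal sketch with boto3, again with illustrative file and parameter choices:

    import boto3

    # Sketch: face analysis with AWS Rekognition DetectFaces.
    # Attributes=["ALL"] requests age range, gender, and emotions.
    client = boto3.client("rekognition")

    with open("portrait.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")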

Microsoft Cognitive Services

Age 69
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
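
The Google Vision likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) match the fields of the Cloud Vision face detection response. A minimal sketch, assuming the same google-cloud-vision client and file name as before:

    from google.cloud import vision

    # Sketch: face likelihoods with the Cloud Vision API.
    client = vision.ImageAnnotatorClient()

    with open("portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enum values range from VERY_UNLIKELY to VERY_LIKELY
        print("Joy", face.joy_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Surprise", face.surprise_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)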

Feature analysis

Amazon

Painting 95.9%
Person 78.2%

Captions

Microsoft

a man holding a book 47.1%
a man sitting on top of a book 33.2%
a man sitting on a book 32%
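
The Microsoft captions above, each with a confidence score, match the "describe image" operation of Azure's Computer Vision service. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package and placeholder endpoint/key values:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Sketch: image captioning with Azure Computer Vision.
    # Endpoint and key are placeholders; "portrait.jpg" is an illustrative assumption.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    with open("portrait.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        # e.g. "a man holding a book 47.1%"
        print(f"{caption.text} {caption.confidence * 100:.1f}%")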