Human Generated Data

Title

Two Evangelists

Date

c. 1865

People

Artist: Johannes Adam Simon Oertel, American, 1823–1909

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Mrs. William Hayes Fogg, 1895.694.2

Machine Generated Data

Tags

Amazon
created on 2020-04-23

Art 93.9
Human 92.6
Person 92.6
Clothing 88.1
Apparel 88.1
Person 87.7
Painting 74
Cloak 57.1
Fashion 57.1
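The tag lists above pair each machine-generated label with a confidence score. As a hedged sketch of how such output is typically post-processed (the threshold value and helper name are illustrative choices, not part of the record; the label/score pairs are copied from the Amazon list above), low-confidence tags can be filtered out before display:

```python
# Filter machine-generated labels by a minimum confidence score.
# The (label, score) pairs are copied from the Amazon list above;
# the 80.0 threshold is an illustrative value, not part of the record.

AMAZON_LABELS = [
    ("Art", 93.9), ("Human", 92.6), ("Person", 92.6), ("Clothing", 88.1),
    ("Apparel", 88.1), ("Person", 87.7), ("Painting", 74.0),
    ("Cloak", 57.1), ("Fashion", 57.1),
]

def filter_labels(labels, min_confidence=80.0):
    """Return labels at or above the threshold, highest score first."""
    kept = [(name, score) for name, score in labels if score >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(filter_labels(AMAZON_LABELS))
```

With the threshold at 80.0, the low-scoring "Painting", "Cloak", and "Fashion" tags drop out, leaving the six higher-confidence labels.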

Clarifai
created on 2020-04-23

painting 99.4
people 99.3
wear 96.7
gown (clothing) 96.7
art 96.5
adult 96.4
man 96.3
religion 96.1
woman 93.7
one 93.6
two 93.1
Renaissance 92.3
portrait 91.5
facial hair 90.9
cape 88.7
saint 87.2
facial expression 85.7
child 85.2
son 84.9
veil 82.1

Imagga
created on 2020-04-23

clothing 32.1
wardrobe 30.3
person 26.2
dress 26.2
fashion 24.1
adult 24
people 24
pretty 23.8
portrait 23.3
garment 22.2
jacket 21.1
furniture 19
covering 18.2
attractive 18.2
happy 17.5
lady 16.2
sexy 16.1
smile 15.7
face 15.6
furnishing 15.6
monk 15.4
cloak 15.4
smiling 15.2
cute 15.1
clothes 15
model 14.8
hair 14.3
lifestyle 13.7
casual 13.5
brunette 13.1
culture 12.8
style 12.6
traditional 12.5
costume 12.2
cheerful 12.2
standing 12.2
old 11.8
happiness 11.7
man 11.4
religion 10.7
looking 10.4
women 10.3
church 10.2
hanger 10.1
gorgeous 10
robe 9.8
art 9.8
bathrobe 9.7
black 9.6
couple 9.6
faith 9.6
ethnic 9.5
religious 9.4
gown 9.1
building 8.8
outfit 8.7
expression 8.5
outdoor 8.4
sale 8.3
teen 8.3
fun 8.2
one 8.2
outdoors 8.2
children 8.2
child 7.8
scarf 7.7
outside 7.7
vestment 7.7
trench coat 7.6
skin 7.6
joy 7.5
coat 7.3
shopping 7.3
20s 7.3
makeup 7.3
indoor 7.3
color 7.2
male 7.1
family 7.1
lovely 7.1
posing 7.1
love 7.1

Google
created on 2020-04-23

Painting 84.8
Prophet 68.5
Art 58.1

Microsoft
created on 2020-04-23

painting 97
person 96.2
clothing 91.2
text 81.3
human face 78.2
picture frame 55.1
art 52.5

Face analysis

AWS Rekognition

Age 28-44
Gender Female, 85.9%
Calm 98%
Happy 0.2%
Sad 0.7%
Angry 0.6%
Disgusted 0.1%
Confused 0.3%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 42-60
Gender Male, 97.2%
Happy 0.4%
Surprised 0.7%
Sad 39.5%
Calm 44.3%
Disgusted 1.1%
Fear 0.6%
Confused 9.3%
Angry 4.1%
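Each AWS Rekognition face block above reports one confidence score per emotion, and the face's apparent mood is simply the emotion with the highest score. A minimal sketch of that selection (the scores are copied from the second face above; the helper function is a hypothetical illustration, not an AWS API):

```python
# Pick the dominant emotion from a Rekognition-style emotion score map.
# Scores are copied from the second face analysis above; the function
# is a hypothetical helper for illustration, not an AWS API call.

FACE_2_EMOTIONS = {
    "Happy": 0.4, "Surprised": 0.7, "Sad": 39.5, "Calm": 44.3,
    "Disgusted": 1.1, "Fear": 0.6, "Confused": 9.3, "Angry": 4.1,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    name = max(scores, key=scores.get)
    return name, scores[name]

print(dominant_emotion(FACE_2_EMOTIONS))  # Calm narrowly beats Sad here
```

Note that for this face "Calm" (44.3) only narrowly exceeds "Sad" (39.5), so the single top label hides a fairly ambiguous reading.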

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 92.6%
Painting 74%

Captions

Microsoft

a woman standing in front of a mirror posing for the camera 79.6%
a woman standing in front of a mirror 76%
a woman standing next to a door 75.9%