Human Generated Data

Title

Two Girls

Date

20th century

People

Artist: Raphael Soyer, American, 1899-1987

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of James N. Rosenberg, 1946.8
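
The human-generated record above can also be retrieved programmatically. Below is a minimal sketch in Python against the public Harvard Art Museums API, assuming a valid API key and that the object number 1946.8 from the credit line can be used as a query; the endpoint and field names follow that API's documented object resource, but treat the details as assumptions rather than a tested integration.

```python
# Minimal sketch: look up the "Two Girls" record through the Harvard Art
# Museums API. Requires a (free) API key; the object number 1946.8 is
# taken from the credit line above.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder

resp = requests.get(
    "https://api.harvardartmuseums.org/object",
    params={"q": "objectnumber:1946.8", "apikey": API_KEY},
    timeout=30,
)
resp.raise_for_status()

for record in resp.json().get("records", []):
    # Field names (title, dated, classification, creditline, people)
    # follow the API's object schema.
    print(record.get("title"), "-", record.get("dated"))
    print(record.get("classification"), "|", record.get("creditline"))
    for person in record.get("people") or []:
        print(person.get("role"), ":", person.get("displayname"))
```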

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Painting 99.8
Art 99.8
Human 97.1
Person 97.1
Person 60.9
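
The Amazon tags above have the shape of AWS Rekognition label-detection output (a label name plus a 0-100 confidence score). Below is a minimal sketch of how comparable tags could be produced with boto3; the image path, label cap, and confidence threshold are placeholder assumptions, not the settings behind the data above.

```python
# Minimal sketch: generate image labels with AWS Rekognition (boto3).
# "two_girls.jpg" is a placeholder path; MaxLabels/MinConfidence are
# arbitrary choices, not the museum's actual pipeline settings.
import boto3

rekognition = boto3.client("rekognition")

with open("two_girls.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50.0,
)

# Each label carries a name and a confidence score (0-100), matching the
# "Painting 99.8", "Art 99.8", ... style of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```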

Clarifai
created on 2018-03-16

painting 100
people 99.9
adult 99.9
illustration 99.8
art 99.8
seat 99.7
print 99.7
woman 99.6
two 99.6
furniture 99.6
Renaissance 99
sit 98.7
lithograph 98.3
wear 98.3
man 98
baby 97.6
kneeling 97.6
group 97.5
child 97.5
room 97.4

Imagga
created on 2018-03-16

sculpture 18.1
person 17.9
art 17.3
religion 17
people 16.7
man 16.1
religious 15
adult 15
male 14.9
statue 14.1
mosaic 13.8
old 13.2
wall 13
face 12.1
church 12
temple 11.7
happy 10.6
god 10.5
work 10.3
boa constrictor 10.3
culture 10.3
interior 9.7
portrait 9.7
holy 9.6
spirituality 9.6
faith 9.6
travel 9.2
vintage 9.1
carving 9
fashion 9
lady 8.9
together 8.8
throne 8.8
device 8.7
smiling 8.7
ancient 8.6
child 8.5
human 8.2
boa 8.2
mother 8.1
home 8
building 7.9
indoors 7.9
virgin 7.9
bible 7.8
sacred 7.8
pray 7.8
golden 7.7
saint 7.7
figure 7.7
expression 7.7
stone 7.7
city 7.5
chair 7.4
adolescent 7.4
color 7.2
family 7.1
architecture 7

Google
created on 2018-03-16

art 90.5
painting 90
modern art 71.8
artwork 63.2
human behavior 56.1
hetaira 54.9
girl 51.3
portrait 51.3
watercolor paint 50.5

Microsoft
created on 2018-03-16

person 96.1

Face analysis

AWS Rekognition

Age 16-27
Gender Male, 94.5%
Sad 23.7%
Angry 18%
Happy 16.1%
Calm 14.8%
Disgusted 11.4%
Surprised 8.7%
Confused 7.3%
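
The age range, gender, and emotion scores above match the structure of AWS Rekognition face detection. Below is a minimal sketch with boto3; the image path is a placeholder, and this illustrates the API shape rather than the pipeline that produced these numbers.

```python
# Minimal sketch: face attributes with AWS Rekognition (boto3).
# "two_girls.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("two_girls.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence for readability.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```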

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
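
The Google Vision rows above report likelihoods rather than percentages, which matches the face-annotation fields of the Cloud Vision API. Below is a minimal sketch with the google-cloud-vision client; the file path is a placeholder and application credentials are assumed to be configured.

```python
# Minimal sketch: face likelihoods with the Google Cloud Vision API.
# "two_girls.jpg" is a placeholder path; requires the google-cloud-vision
# package and configured credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("two_girls.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

likelihood_names = (
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
)

for face in response.face_annotations:
    # Each attribute is reported as a likelihood enum, matching the
    # "Surprise Very unlikely", "Joy Very unlikely", ... rows above.
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])
```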

Feature analysis

Amazon

Painting 99.8%
Person 97.1%
