Human Generated Data

Title

Mother and Child

Date

c. 1901

People

Artist: Pablo Ruiz Picasso, Spanish, 1881–1973

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest from the Collection of Maurice Wertheim, Class of 1906, 1951.57

Copyright

© Estate of Pablo Picasso / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Painting 99.7
Art 99.7
Person 67.9
Human 67.9
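The scores beside each tag are confidence percentages. As an illustrative sketch (the helper function and threshold are hypothetical, not part of any vendor API), the Amazon tags above can be filtered to keep only high-confidence labels:

```python
# Hypothetical helper: keep machine-generated tags at or above a confidence
# threshold, ordered from highest to lowest score. The tag names and scores
# come from the Amazon Rekognition list above; the function is illustrative.
def confident_tags(tags, threshold=90.0):
    """Return tag names whose confidence meets the threshold, highest first."""
    return [name for name, score in sorted(tags.items(), key=lambda kv: -kv[1])
            if score >= threshold]

amazon_tags = {"Painting": 99.7, "Art": 99.7, "Person": 67.9, "Human": 67.9}
print(confident_tags(amazon_tags))  # → ['Painting', 'Art']
```

With the default 90% threshold, only "Painting" and "Art" survive; "Person" and "Human" (67.9%) are dropped.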

Clarifai
created on 2018-03-16

painting 100
art 100
religion 99.9
god 99.4
saint 98.6
prayer 98
people 98
book 97.3
illustration 96.7
holy 96.2
aura 96.2
spirituality 95.8
mural 94.4
Madonna 94.1
woman 93.9
one 93.1
belief 92.5
Mary 92.4
worship 91.9
Renaissance 91.4

Imagga
created on 2018-03-16

graffito 51.9
decoration 49.9
art 30.8
religion 25.1
god 20.1
tattoo 20
old 19.5
money 17.9
design 17.6
currency 17
church 16.6
face 16.3
mosaic 15.9
cash 15.6
dollar 14.8
banking 14.7
ancient 14.7
finance 14.4
statue 13.5
bank 13.4
faith 13.4
paper 13.3
religious 13.1
culture 12.8
business 12.7
financial 12.5
golden 12
wealth 11.7
history 11.6
vintage 11.6
holy 11.6
spirituality 11.5
temple 11.4
texture 11.1
close 10.8
antique 10.4
icon 10.3
pattern 10.3
architecture 10.2
bible 9.8
orthodox 9.8
portrait 9.7
one 9.7
museum 9.7
dollars 9.7
us 9.6
bill 9.5
ornament 9.5
savings 9.3
paint 9
gold 9
carving 9
color 8.9
detail 8.8
empire 8.8
sculpture 8.8
banknote 8.7
pray 8.7
hundred 8.7
prayer 8.7
artist 8.7
wall 8.5
grunge 8.5
painter 8.5
famous 8.4
rich 8.4
economy 8.3
investment 8.2
style 8.2
amulet 8.1
symbol 8.1
prophet 7.9
masterpiece 7.9
halo 7.9
byzantine 7.9
finances 7.7
spiritual 7.7
human 7.5
traditional 7.5
monument 7.5
note 7.3
backgrounds 7.3
figure 7.3
colorful 7.2

Google
created on 2018-03-16

Microsoft
created on 2018-03-16

text 97.5
book 96.4
blue 43.4
painting 20.7
leather 16.1

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 86.1%
Calm 28%
Surprised 1.3%
Happy 0.9%
Sad 65.4%
Disgusted 1.1%
Confused 1.9%
Angry 1.4%

AWS Rekognition

Age 4-7
Gender Female, 52.9%
Calm 65.4%
Angry 4.3%
Happy 0.4%
Disgusted 0.9%
Sad 24.9%
Surprised 1.5%
Confused 2.5%
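Each Rekognition face result is a set of emotion confidence scores that need not sum neatly to 100%. A minimal sketch (the function name is hypothetical; the values mirror the first face above) for picking the dominant emotion:

```python
# Illustrative sketch: select the highest-confidence emotion from a
# Rekognition-style score map. Values mirror the first AWS Rekognition
# face listed above; the helper itself is not a vendor API.
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

face1 = {"Calm": 28.0, "Surprised": 1.3, "Happy": 0.9, "Sad": 65.4,
         "Disgusted": 1.1, "Confused": 1.9, "Angry": 1.4}
print(dominant_emotion(face1))  # → ('Sad', 65.4)
```

For the first face this yields "Sad" at 65.4%, matching the scores listed above; for the second face, "Calm" at 65.4% would dominate.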

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.7%
Person 67.9%

Captions

Microsoft

a painting of a person 77.2%
a painting of a person lying on a leather surface 39.6%
a painting of a person lying on the ground 39.5%

Text analysis

Amazon

ninn
rdm
I
MHa