Human Generated Data

Title

Mother and Two Children

Date

1917

People

Artist: Tsugouharu Foujita, Japanese, 1886 - 1968

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of James Naumburg Rosenberg, M15539

Copyright

© Foujita / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Art 89.9
Human 77.1
Person 72.2
Text 72.1
Painting 69.7
Person 63.8
Skin 59.2

Clarifai
created on 2018-04-19

people 99.9
illustration 99.5
art 99.1
painting 98.9
person 97.8
print 95.6
famous 94.4
adult 93.7
woman 90.4
entertainment 90.1
man 89.7
ancient 89.3
visuals 83.2
nude 82.2
antique 81.2
retro 80.4
vintage 78.5
religion 74.5
paper 73.5
artistic 72.5

Imagga
created on 2018-04-19

book jacket 100
jacket 83.1
wrapping 63.1
covering 43.1
graffito 28.3
decoration 27
art 23.5
painter 21.2
grunge 17.9
vintage 16.5
design 15.7
old 14.6
comic book 13.2
stamp 12.9
face 12.8
man 12.8
black 12.6
retro 12.3
icon 11.9
postmark 11.8
postage 11.8
postal 11.8
cartoon 11.6
mail 11.5
character 11.3
people 11.2
pattern 10.9
close 10.8
symbol 10.8
currency 10.8
color 10.6
drawing 10.3
letter 10.1
painting 9.9
collection 9.9
sexy 9.6
animal 9.4
antique 8.7
ancient 8.6
painted 8.6
post 8.6
culture 8.5
male 8.5
money 8.5
wallpaper 8.4
portrait 8.4
fashion 8.3
entertainment 8.3
silhouette 8.3
artwork 8.2
tattoo 8.2
paint 8.1
sketch 8.1
game 8
textured 7.9
colorful 7.9
artistic 7.8
envelope 7.8
set 7.6
writing 7.5
happy 7.5
china 7.5
fun 7.5
funny 7.3
detail 7.2
wealth 7.2
holiday 7.2
financial 7.1

Google
created on 2018-04-19

art 93.7
modern art 87.5
painting 85.6
portrait 77.8
illustration 74.2
visual arts 65.4
paint 64.1
artwork 59.3
drawing 55.7
acrylic paint 55.5

Microsoft
created on 2018-04-19

text 99.9
book 99

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 72.2%

Captions

Microsoft

a close up of a book 48.1%
close up of a book 41.5%
a close up of a book cover 41.4%