Human Generated Data

Title

Women and child with actor print

Date

-

People

Artist: Kitagawa Chikanobu, Japanese, 1807 - 1817

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Mr. and Mrs. Paul Bernat, 1970.115

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Hat 99.2
Clothing 99.2
Apparel 99.2
Person 95.7
Human 95.7
Comics 83.5
Book 83.5
Person 83.3
Text 80.3
Art 74
Drawing 62.2
Manga 57.4
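
These labels match the shape of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how comparable tags could be requested with boto3; the file name and thresholds are illustrative, not taken from the museum's pipeline:

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition")

    # Illustrative file name for the print's image.
    with open("1970.115.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Print label name and confidence, matching the "Hat 99.2" style above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')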

Clarifai
created on 2023-10-29

people 99.7
illustration 99.7
print 99.6
woodcut 99.4
wear 99.1
adult 98.5
art 98.3
retro 98.1
vintage 97.7
one 97
engraving 96.9
man 95.3
antique 93.5
two 93.4
visuals 93.1
veil 91.1
etching 90.6
old 89.6
bill 88.2
portrait 86.1
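
Concepts and confidences like these are what Clarifai's general image-recognition model returns. A hedged sketch of one way to request them through Clarifai's v2 REST API; the access token, model id, user/app ids, and image URL are placeholders, and the payload shape is an assumption rather than the museum's documented pipeline:

    import requests

    PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

    payload = {
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},
        "inputs": [{"data": {"image": {"url": "https://example.org/1970.115.jpg"}}}],
    }

    resp = requests.post(url, json=payload,
                         headers={"Authorization": f"Key {PAT}"}).json()

    # Concept values are 0-1; scale to match the percentages listed above.
    for concept in resp["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')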

Imagga
created on 2022-06-11

sketch 100
drawing 100
representation 81.1
book jacket 39.2
jacket 30.5
wrapping 23.2
art 18.2
retro 18
vintage 17.4
grunge 17
covering 16.5
old 16
paper 14.1
black 13.8
man 12.8
stamp 12.6
design 12.4
money 11.9
pattern 11.6
mail 11.5
artistic 11.3
letter 11
postmark 10.8
currency 10.8
dollars 10.6
philately 9.9
history 9.8
postage 9.8
style 9.6
ancient 9.5
power 9.2
silhouette 9.1
backdrop 9.1
sport 9.1
paint 9.1
envelope 9
shape 8.9
detail 8.9
antique 8.7
cartoon 8
graphic 8
postal 7.8
line 7.7
post 7.6
finance 7.6
poster 7.6
dollar 7.4
clip art 7.4
symbol 7.4
backgrounds 7.3
business 7.3
comic book 7.2
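
Imagga exposes this kind of tagging through its /v2/tags endpoint. A minimal sketch using HTTP basic auth; the key, secret, and image URL are placeholders:

    import requests

    API_KEY, API_SECRET = "YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/1970.115.jpg"},
        auth=(API_KEY, API_SECRET),
    ).json()

    # Imagga confidences are already on a 0-100 scale.
    for tag in resp["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')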

Google
created on 2022-06-11

Sleeve 82.6
Art 80
Rectangle 79.7
Shimada 79.7
Sakko 76.8
Painting 72.3
Font 69.8
Poster 69.2
Illustration 67.9
Drawing 67.9
Visual arts 64.3
Hat 63.7
Creative arts 58.1
Printmaking 54.7
Fictional character 52.8
Line art 51.5
Costume design 51.2
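
Labels such as these correspond to Google Cloud Vision label detection. A short sketch using the google-cloud-vision client library; the credentials setup and file name are assumptions:

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
    client = vision.ImageAnnotatorClient()

    # Illustrative file name for the print's image.
    with open("1970.115.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1; scale to match the percentages listed above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")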

Microsoft
created on 2022-06-11

text 100
book 100
sketch 99.5
drawing 99.2
cartoon 96
outdoor 88.6
illustration 88.3
human face 86.6
person 68.5
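
The Microsoft tags resemble output from Azure's Computer Vision tagging endpoint. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    result = client.tag_image("https://example.org/1970.115.jpg")

    # Confidences are 0-1; scale to match the percentages listed above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")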

Feature analysis

Amazon

Hat 99.2%
Person 95.7%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2022-06-11

a close up of a book 51.7%
a close up of a person holding a book 29.9%
close up of a book 29.8%
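
These captions match the shape of Azure Computer Vision's image description output. A minimal sketch using the same placeholder endpoint and key as the tagging sketch above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    description = client.describe_image(
        "https://example.org/1970.115.jpg", max_candidates=3
    )

    # Each candidate caption carries a 0-1 confidence.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")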

Text analysis

Google

YOUR ( Her
YOUR
(
Her
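
The Google text results follow the pattern of Cloud Vision text detection, where the first annotation holds the full detected string ("YOUR ( Her") and the remaining annotations are the individual tokens. A short sketch; the credentials setup and file name are assumptions:

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
    client = vision.ImageAnnotatorClient()

    with open("1970.115.jpg", "rb") as f:  # illustrative file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First entry: full text block; subsequent entries: individual words/symbols.
    for annotation in response.text_annotations:
        print(annotation.description)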