Human Generated Data

Title

Abstraction

Date

1958

People

Artist: James Rosati, American 1912-1988

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Lois Orswell, 1988.466

Machine Generated Data

Tags

Amazon
created on 2023-10-18

Art 99.8
Chair 99.7
Furniture 99.7
Armchair 97.6
Modern Art 94.2
Drawing 73.3
Person 67.9
Painting 56.1

Clarifai
created on 2023-10-18

no person 99
one 97
people 96.3
art 95.5
old 95
wear 93.3
monochrome 92.9
antique 90.1
retro 89.6
texture 86.7
ancient 84.3
man 82.2
leather 82.1
vintage 80.6
science 79.5
hard 78.5
two 78.4
typography 78.3
desktop 78.2
text 77

Imagga
created on 2018-12-18

totem pole 100
column 83.9
structure 58.2
sculpture 53.4
carving 39.5
statue 38.2
art 27
ancient 26.8
religion 26
old 25.8
culture 25.6
stone 24.5
history 22.4
mask 21.3
temple 19
religious 17.8
traditional 17.5
antique 17.3
bookend 15.7
face 15.6
architecture 15.6
travel 15.5
covering 14.8
decoration 14.7
support 14.7
carved 14.7
disguise 14.5
head 14.3
east 14
wood 13.3
monument 13.1
spiritual 12.5
wooden 12.3
god 11.5
device 11.3
plastic art 11
worship 10.6
figure 10.6
brown 10.3
historic 10.1
oriental 9.4
historical 9.4
object 8.8
spirituality 8.6
china 8.5
decorative 8.3
tourist 8.2
detail 8
building 7.9
museum 7.8
holy 7.7
craft 7.6
human 7.5
tourism 7.4
dollar 7.4
close 7.4
church 7.4
gold 7.4
tradition 7.4
peace 7.3
landmark 7.2
black 7.2
attire 7.2

Google
created on 2018-12-18

Microsoft
created on 2018-12-18

seat 36.5
monochrome 36.5
black and white 36
abstract 6.8
wood 5.6
art 4.8

Feature analysis

Amazon

Person 67.9%
