Human Generated Data

Title

Stick and Paint

Date

1970-1980

People

Artist: Richard Tuttle, American, born 1941

Classification

Sculpture

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Barbara Lee and the Frances F. and John Bowes Fund, 1998.118

Copyright

© Richard Tuttle
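
The catalog fields above are also exposed through the Harvard Art Museums public API (api.harvardartmuseums.org). A minimal sketch, assuming a registered API key and a full-text search on the accession number 1998.118; the field names follow the published /object resource and should be treated as illustrative:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; keys are issued by the Harvard Art Museums

# Full-text search for the object by its accession number.
resp = requests.get(
    "https://api.harvardartmuseums.org/object",
    params={"apikey": API_KEY, "q": "1998.118", "size": 1},
    timeout=30,
)
resp.raise_for_status()

for record in resp.json().get("records", []):
    # Field names (title, dated, classification, creditline) follow the API
    # documentation for the /object resource; treat them as illustrative.
    print(record.get("title"), "|", record.get("dated"), "|", record.get("classification"))
    print(record.get("creditline"))
```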

Machine Generated Data

Tags

Machine-generated labels from several tagging services; the number after each label is that service's confidence score on a 0-100 scale.

Amazon
created on 2022-06-25

Tabletop 97.7
Furniture 97.7
Mouse 96.5
Electronics 96.5
Computer 96.5
Hardware 96.5
Wood 85.6
Sea 71.3
Nature 71.3
Water 71.3
Outdoors 71.3
Ocean 71.3
Table 71
Art 58.2
Plywood 57.5
Blade 57.1
Weapon 57.1
Weaponry 57.1
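
Labels of this kind, each paired with a confidence score, are what an image-labeling service such as Amazon Rekognition returns from its DetectLabels operation. A minimal sketch with boto3, assuming AWS credentials are configured and a local copy of the image (the filename is hypothetical):

```python
import boto3

client = boto3.client("rekognition")

with open("stick_and_paint.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

# Request labels above a confidence floor roughly matching the lowest
# scores listed above, then print them as "Label Confidence" pairs.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```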

Clarifai
created on 2023-10-29

still life 98.5
no person 97
cutout 96.8
wood 96.3
art 95.6
empty 91
paper 91
one 90.8
knife 89.9
flatware 89.8
people 87.2
painting 86.1
furniture 85.7
abstract 85
studio 84.8
tableware 84.6
food 84
room 83.6
conceptual 83.4
family 82.6

Imagga
created on 2022-06-25

knife blade 22
blade 21
device 20.7
sunglass 20.3
object 19
black 17.4
metal 16.1
sunglasses 14.3
cup 14
cutting implement 14
table 13.1
spectacles 12.9
shiny 12.7
tool 12.6
spoon 12.6
lampshade 12.2
computer 12
plate 11.9
container 11.6
light 10.7
silver 10.6
empty 10.5
technology 10.4
restaurant 10.3
glasses 10.2
modern 9.8
food 9.7
business 9.7
shade 9.6
equipment 9.5
lunch 9.4
protective covering 9.3
vessel 9.3
knife 9.3
dinner 9.3
mouse 9.3
design 9
closeup 8.8
optical 8.7
glass 8.6
elegance 8.4
drink 8.3
fashion 8.3
kitchen 8.1
decoration 8
shoe 7.7
shoes 7.7
accessory 7.6
two 7.6
optical instrument 7.5
clean 7.5
close 7.4
meal 7.3
covering 7.3
celebration 7.2
silverware 7.2
steel 7.2
work 7.1

Google
created on 2022-06-25

Microsoft
created on 2022-06-25

wall 98.7
indoor 95.2
design 69.8
mirror 68.9
minimalist 63.6
silver 61.6

Color Analysis

Feature analysis

Amazon

Mouse 96.5%

Categories

Imagga

interior objects 100%

Captions

Microsoft
created on 2022-06-25

a close up of a plate on a table 37.4%
a close up of a plate 37.3%
close up of a plate 35.9%
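
Short captions with confidence scores like these can be produced by the Azure AI Vision describe endpoint. A minimal sketch over the REST API, assuming a provisioned Vision resource; the endpoint, key, and filename are placeholders:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_KEY"  # placeholder

with open("stick_and_paint.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

# POST the raw image bytes and print each caption with its confidence
# converted to a percentage, matching the format above.
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```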

Azure OpenAI

created on 2024-01-27

The image shows a composition on a dual-tone background, with the wall behind being a darker shade than the plain surface on which the objects are placed. There are three distinct items in the picture:

1. To the left, there's a pencil with an eraser, leaning against the wall at an angle. Its length suggests it might be unusually long or an artistic representation rather than a regular-sized pencil.
2. In the center, there appears to be a piece of a white spherical object that is broken. Given its size and shape, it could be a part of a mannequin's head or a similar object.
3. To the right, there is a curved black strip lying flat on the surface. It's hard to discern the material, but it might be made of metal or plastic.

Overall, the setting seems to resemble a minimalist art installation or a creative display. The arrangement and the simplicity suggest that each object is placed with purpose and may hold artistic or symbolic significance.
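
A longer free-form description like the one above can be generated by sending the image to an Azure OpenAI chat deployment with vision support. A minimal sketch using the openai Python package; the endpoint, API version, deployment name, and prompt are placeholders:

```python
import base64
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder
    api_key="YOUR_KEY",                                       # placeholder
    api_version="2024-02-01",                                 # placeholder
)

with open("stick_and_paint.jpg", "rb") as f:  # hypothetical local image file
    image_b64 = base64.b64encode(f.read()).decode("ascii")

completion = client.chat.completions.create(
    model="YOUR_DEPLOYMENT_NAME",  # the Azure deployment name, not a model id
    max_tokens=400,
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe the objects and composition in this image."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)

print(completion.choices[0].message.content)
```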

Anthropic Claude

created on 2024-03-29

The image shows a simple geometric composition on a gray background. There is a yellow pencil standing upright, and two abstract shapes - one white and one black. The white shape appears to be a curved, wedge-like form, while the black shape is an elongated, curved object. The objects are arranged in a minimalist and balanced way, creating an abstract, architectural feel to the overall composition.
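
A description in this style can be requested from the Anthropic Messages API by passing the image as base64 content. A minimal sketch with the official anthropic Python SDK; the model name, prompt, and filename are illustrative:

```python
import base64
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("stick_and_paint.jpg", "rb") as f:  # hypothetical local image file
    image_b64 = base64.standard_b64encode(f.read()).decode("ascii")

message = client.messages.create(
    model="claude-3-opus-20240229",  # illustrative model choice
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64",
                        "media_type": "image/jpeg",
                        "data": image_b64}},
            {"type": "text",
             "text": "Describe this artwork in a short paragraph."},
        ],
    }],
)

print(message.content[0].text)
```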