Human Generated Data

Title

Tool Box I

Date

1966

People

Artist: Jim Dine, American, born 1935

Printer: Kelpra Studio

Publisher: Editions Alecto, London

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, 2006.77.1

Copyright

© Jim Dine / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 98.5
Human 98.5
Person 95.6
Weaponry 75.3
Weapon 75.3
Advertisement 67.4
Collage 67
People 63.8
Poster 63.2
Gun 62.1
Art 62
Person 49.4

Clarifai
created on 2019-10-29

people 98.9
graffiti 97.6
adult 96.9
man 95.2
illustration 93.1
street 92.9
dirty 91.5
art 90.1
group 88.7
war 87.3
vandalism 86.4
wear 86
no person 85.7
print 85.4
military 83.4
vehicle 82.6
woman 82.2
retro 81.8
bill 80.9
aircraft 80.3

Imagga
created on 2019-10-29

comic book 100
print media 44.1
graffito 36.4
decoration 24.9
man 20.2
art 18.3
black 15.6
person 15.2
male 14.9
people 13.9
mask 13.2
military 12.6
grunge 11.9
style 11.9
urban 11.4
adult 11
paint 10.9
painting 10.8
soldier 10.8
design 10.7
graphic 10.2
face 9.9
studio 9.9
retro 9.8
fashion 9.8
human 9.7
fun 9.7
war 9.6
entertainment 9.2
silhouette 9.1
protection 9.1
danger 9.1
music 9
body 8.8
horror 8.7
party 8.6
costume 8.6
wall 8.6
gun 8.4
book jacket 8.3
color 8.3
street 8.3
suit 8.1
graffiti 7.9
camouflage 7.8
rock 7.8
spooky 7.8
play 7.8
shoot 7.7
scary 7.7
fear 7.7
hand 7.6
city 7.5
clip art 7.4
church 7.4
game 7.3
group 7.3
celebration 7.2
religion 7.2
weapon 7.1

Google
created on 2019-10-29

Photograph 96.2
Art 76.7
Poster 69.5
Illustration 66.9
Visual arts 55

Microsoft
created on 2019-10-29

text 99.1
book 93.4
cartoon 93.3
drawing 90.7
art 87.1
person 85.5
poster 77
clothing 63.2
sketch 58
music 51.9

Color Analysis

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 53.2%
Sad 45%
Calm 45%
Surprised 54.9%
Happy 45%
Fear 45%
Disgusted 45%
Confused 45%
Angry 45%

AWS Rekognition

Age 2-8
Gender Female, 53.6%
Disgusted 45%
Sad 45.1%
Fear 45%
Surprised 45.1%
Happy 52.9%
Calm 46.7%
Confused 45%
Angry 45%

Google Vision

Surprise Possible
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Poster 63.2%
Gun 62.1%

Captions

Azure OpenAI

Created on 2024-02-07

The image shows a collage of various black and white photographs that include industrial machinery, a camera, and some animals such as a giraffe. These pictures are arranged in a seemingly random and abstract composition. The predominant colors are monochromes of black, white, and gray, with some areas displaying halftone patterns, common in printed images. The collage possesses a surreal quality due to the unexpected juxtaposition of industrial elements with wildlife.

Anthropic Claude

Created on 2024-03-30

This collage-style image contains a mix of various elements and imagery. In the center, there appears to be a large industrial machine or piece of equipment. Surrounding this central element are various other items, including photographs, abstract shapes, and other visual elements. The overall composition has a surreal, dreamlike quality, blending different objects and imagery in an intriguing and visually striking way. While there are human figures visible in some of the smaller photographs, I will not identify or name any individuals shown, as per the instructions provided.

Text analysis

Google

PE
PE