Human Generated Data

Title

Really (in honor of Felix Gonzalez-Torres)

Date

2000

People

Artist: Jim Hodges, American, born 1957

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, P2003.8.3.A

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 88.6
Human 88.6
Play Area 70.3
Playground 70.3
Art 60.7
Roller Coaster 55.7
Amusement Park 55.7
Coaster 55.7
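
The Amazon labels above are confidence-scored tags from Rekognition label detection. A minimal sketch of how such labels could be requested with boto3 follows; the local file name, label limit, and confidence threshold are illustrative assumptions.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the digitized print.
with open("hodges_really.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

# Each label carries a name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')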

Clarifai
created on 2023-10-25

illustration 96.4
art 96.4
retro 96.2
no person 94.9
fast 93.4
design 93.1
snow 92.3
winter 92.3
abstract 92
vector 91.8
flat 89.1
sport 89
paper 88.7
square 88.1
vintage 87.9
motion 86.6
bill 85.6
old 85.6
dirty 83.8
competition 82.7
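
The Clarifai tags above come from its general image-recognition model. A sketch of a v2 predict request is shown below, assuming an app-scoped API key and a publicly reachable image URL; the model ID, credential, and URL are placeholders, not values from the source.

import requests

CLARIFAI_KEY = "YOUR_API_KEY"  # placeholder credential
IMAGE_URL = "https://example.org/hodges_really.jpg"  # hypothetical image URL

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts are returned with values in 0-1; scale to percentages as above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')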

Imagga
created on 2022-01-09

backboard 34.8
equipment 30.3
device 17.1
design 15.8
technology 13.4
black 10.9
screen 10.9
symbol 10.8
art 10.6
business 10.3
monitor 9.7
modern 9.1
computer 9.1
fun 9
game 8.9
frame 8.9
billboard 8.9
object 8.8
play 8.6
holiday 8.6
money 8.5
box 8.4
electronic 8.4
sport 8.4
display 8.3
silhouette 8.3
light-emitting diode 8.1
currency 8.1
season 7.8
grunge 7.7
sky 7.7
laptop 7.6
container 7.6
finance 7.6
basket 7.6
one 7.5
style 7.4
retro 7.4
cash 7.3
support 7.3
color 7.2
television 7.2
cartoon 7.1
financial 7.1
drawing 7.1
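
The Imagga tags above are confidence-scored concepts from its tagging endpoint. A sketch of a /v2/tags request with the requests library follows; the key/secret pair and image URL are placeholders.

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credential
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/hodges_really.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a confidence score, as listed above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')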

Google
created on 2022-01-09

Working animal 87.4
Rectangle 84.6
Art 81.6
Font 77.9
Painting 76.7
Bag 74.9
Elephant 73.4
Terrestrial animal 70.4
Circle 68.7
Cart 68.3
Pack animal 67.1
Twig 66.7
Visual arts 66.6
Indian elephant 65.9
Drawing 65.8
Bull 65.6
Paper product 64.3
Tree 64.1
Graphics 63.9
Illustration 62.3
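
The Google labels above correspond to Cloud Vision label detection. A minimal sketch using the google-cloud-vision client follows; it assumes application default credentials, and the file name is illustrative.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the digitized print.
with open("hodges_really.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image, max_results=20)

# Scores are returned in 0-1; scale to percentages as in the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")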

Microsoft
created on 2022-01-09

drawing 95.1
text 94.6
art 82.3
painting 78.6
sketch 55.7
gallery 53.7
screenshot 30.2
picture frame 15.7
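
The Microsoft tags here, and the Microsoft captions further below, match what the Azure Computer Vision Analyze operation returns when Tags and Description are requested together. A sketch of the v3.2 REST call follows; the endpoint, key, and image URL are placeholders.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder credential
IMAGE_URL = "https://example.org/hodges_really.jpg"  # hypothetical image URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()
analysis = response.json()

# Tags and generated captions both carry 0-1 confidences.
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in analysis["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')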

Face analysis

AWS Rekognition

Age 16-22
Gender Male, 97.8%
Calm 60%
Sad 24.2%
Happy 5.2%
Fear 3%
Confused 2.3%
Disgusted 2.3%
Angry 1.5%
Surprised 1.4%
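
The age, gender, and emotion estimates above come from Amazon Rekognition face detection with all facial attributes enabled. A minimal sketch with boto3 follows; the file name is illustrative.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the digitized print.
with open("hodges_really.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotion types arrive uppercase (e.g. CALM) with confidence percentages.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')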

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
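
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above read "Very unlikely". A sketch of the face-detection call follows, again assuming application default credentials and an illustrative file name.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the digitized print.
with open("hodges_really.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)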

Feature analysis

Amazon

Person
Person 88.6%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created by unknown on 2022-01-09

a screen shot of a person 74%
a person sitting at a desk 55.8%
a screen shot of a computer 55.7%

Clarifai
created by general-english-image-caption-blip on 2025-05-05

a photograph of a man and woman standing in front of a playground structure -100%

Google Gemini

created by gemini-2.0-flash-lite on 2025-05-05

The image is a photograph presented on a white background. The photograph within the frame depicts a man standing next to a large, abstract structure. The structure appears to be metallic, with curved and segmented parts. It is vaguely reminiscent of a giant crab claw. The man is shirtless, wearing dark shorts. He is gesturing towards the structure with his right hand. The background of the smaller photograph shows a tropical setting with green trees, a beach, and some water. The overall composition gives the impression of a surreal or artistic photograph, with the central image placed within a much larger white space.

created by gemini-2.0-flash on 2025-05-05

The image is a photograph of a photograph. The photograph is in a landscape orientation, and is a snapshot of a shirtless man standing in front of a large, abstract sculpture. The sculpture appears to be made of tubing and is painted in a pattern of brown, white, and blue stripes. The man is wearing a black skirt or shorts. The background of the photograph is a beach scene with palm trees and lush green vegetation. The photograph itself is a small rectangle in the center of a much larger white background. The white background is framed by a thin gray border.
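
The two Gemini descriptions above are free-text captions from multimodal prompting. A minimal sketch using the google-generativeai package follows; the API key, file name, and prompt wording are assumptions, while the model name matches the credit line above.

import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential

model = genai.GenerativeModel("gemini-2.0-flash")
image = Image.open("hodges_really.jpg")  # hypothetical local copy of the print

# Pass the prompt and the image together; the reply is the generated caption.
response = model.generate_content(["Describe this image in detail.", image])
print(response.text)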

Text analysis

Amazon

2000
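
The detected text "2000" is the kind of result Amazon Rekognition text detection returns for inscriptions and printed dates. A minimal sketch with boto3 follows; the file name is illustrative.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the digitized print.
with open("hodges_really.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Report line-level detections with their confidences.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')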