Human Generated Data

Title

Untitled

Date

1998

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.282

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Nature 96
Outdoors 93
Screen 90
Electronics 90
Monitor 90
Display 90
Human 86.1
Apparel 81.6
Clothing 81.6
Snow 70.9
Figurine 63.2
Person 60.2
People 56.1
Wood 56

Clarifai
created on 2018-11-05

people 98.3
exhibition 97.9
museum 97.5
blur 97.1
landscape 97.1
wear 95.7
movie 95.6
one 95.3
screen 95.2
music 95
man 93.3
adult 93.3
portrait 92.7
technology 91.9
indoors 89.5
television 89.3
moment 89.2
light 89
picture frame 89
painting 88.9

Imagga
created on 2018-11-05

silhouette 38.1
man 30.3
cornet 30.2
sunset 28.8
brass 28.5
black 22.4
wind instrument 20.7
sky 20.4
sun 18.8
people 18.4
sport 18.1
device 17.9
person 17.8
male 17.7
tripod 14.7
horn 14.5
musical instrument 14.3
sunrise 13.1
support 12.4
dark 11.7
active 11.7
model 11.7
recreation 11.7
lighting 11.3
body 11.2
men 11.2
danger 10.9
equipment 10.7
fun 10.5
gun 10.4
evening 10.3
clouds 10.1
rack 10.1
apparatus 9.9
shadow 9.9
mountain 9.8
mask 9.7
summer 9.6
dusk 9.5
adventure 9.5
play 9.5
weapon 9.5
light 9.4
orange 9.2
leisure 9.1
adult 9.1
fashion 9
outdoors 9
sexy 8.8
instrumentality 8.8
horror 8.7
silhouettes 8.7
standing 8.7
lifestyle 8.7
outdoor 8.4
sports 8.3
human 8.3
freedom 8.2
protection 8.2
exercise 8.2
music 8.1
horizon 8.1
activity 8.1
athlete 8
to 8
boy 7.8
sea 7.8
cloud 7.7
dawn 7.7
chemical 7.7
gas 7.7
hand 7.7
attractive 7.7
jump 7.7
extreme 7.7
health 7.6
passion 7.5
action 7.5
sensuality 7.3
metal 7.2
fitness 7.2
women 7.1
posing 7.1
statue 7.1
love 7.1
happiness 7.1
player 7

Google
created on 2018-11-05

poster 73.8
art 57.3
picture frame 57.1
darkness 53.1

Microsoft
created on 2018-11-05

monitor 99.9
electronics 99.3
indoor 97.9
screen 96.8
television 96.3
wall 96.2
display 95.2
flat 65.1
computer 58.1
set 33.4
picture frame 13.9
entertainment center 11.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 58.4%
Confused 4.7%
Disgusted 3.1%
Calm 51.2%
Sad 26%
Surprised 3.5%
Happy 3.6%
Angry 8%

Feature analysis

Amazon

Monitor 90%
Person 60.2%

Categories

Captions

Azure OpenAI

Created on 2024-11-21

The image shows a photograph with a dark, out-of-focus background and a slightly overexposed top edge, indicative of it being a Polaroid or similar instant film photo. The subject is a small, metallic figurine placed on a surface that resembles sand. The figurine is centered and appears to depict a humanoid character with exaggerated features such as a broad chest and muscular limbs, possibly representing a warrior or a character from fiction or mythology. The image is well-lit, with a focus primarily on the figurine, creating a sense of depth. There's a warm tone to the lighting, enhancing the sepia-like quality of the sand or surface beneath the figurine. The bottom of the photograph includes a white border where a signature and the year "1988" have been written, suggesting that this is the work of an artist or photographer who has signed and dated their creation. The physical photo looks to be in good condition with no apparent signs of aging or damage.

Anthropic Claude

Created on 2024-11-21

The image shows a dark, moody photograph of what appears to be a small figurine or sculpture of a human-like character. The figure is silhouetted against a hazy, reddish-orange background. The figure is standing upright, with its arms slightly raised, suggesting a dramatic or expressive pose. The lighting and composition create a sense of mystery and tension in the image.

Text analysis

Amazon

-2
1985

Google

8b1
8b1