Human Generated Data

Title

Untitled

Date

1989

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.289

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Figurine 87.8
Pet 80.3
Cat 80.3
Animal 80.3
Mammal 80.3
Honey Bee 77
Bee 77
Invertebrate 77
Insect 77
Dragon 67.5
Photography 59.4
Photo 59.4

Clarifai
created on 2018-11-05

no person 95.5
people 95.3
sunset 94.7
nature 93.6
dawn 93
man 92.5
woman 91.9
water 90.2
old 89.6
landscape 89.2
exhibition 88
desktop 87.8
portrait 87.8
adult 87.6
art 87.5
one 86.5
wood 86.3
picture frame 85.8
dark 85.8
vintage 84.5

Imagga
created on 2018-11-05

candle 44.3
pretty 24.5
body 24
spa 23.3
adult 23.3
fireplace 23
perfume 22.6
source of illumination 22.1
portrait 22
attractive 21.7
person 21.4
model 20.2
people 19
skin 18.8
toiletry 18.5
healthy 18.3
care 18.1
fashion 18.1
flower 17.7
wellness 17.4
health 17.4
lady 17
sexy 16.9
relaxation 16.7
human 16.5
face 16.3
luxury 16.3
smiling 15.9
treatment 15.6
massage 15.4
hair 15.1
lifestyle 14.5
happy 14.4
sensual 13.6
therapy 13.2
studio 12.9
make 12.7
hand 12.1
smile 12.1
clean 11.7
holding 11.6
cute 11.5
hands 11.3
cosmetics 11.2
bride 11.2
sensuality 10.9
lips 10.2
relax 10.1
natural 10
dress 9.9
black 9.8
one 9.7
style 9.6
skincare 9.5
slim 9.2
pink 9.2
brunette 8.7
flowers 8.7
fresh 8.5
wellbeing 8.4
fire 8.4
purity 8.3
gorgeous 8.2
looking 8
home 8
salon 8
dessert 7.9
women 7.9
day 7.8
color 7.8
eyes 7.7
sitting 7.7
aromatherapy 7.7
cake 7.5
lying 7.5
flame 7.5
fun 7.5
feminine 7.5
food 7.4
makeup 7.3
cheerful 7.3
celebration 7.2

Google
created on 2018-11-05

Microsoft
created on 2018-11-05

monitor 98.1
indoor 92.3
screen 91.3
television 90.9
flat 58.5
display 33.2
picture frame 6.1

Color Analysis

Feature analysis

Amazon

Cat 80.3%
Honey Bee 77%

Categories

Captions

Azure OpenAI

created on 2024-11-18

This image appears to be a photograph of a staged scene. In the foreground, there is a figurine of a person seated, holding what seems to be a jug or vessel. This figure is detailed, with visible patterns on the clothing and appears to be in a contemplative or resting pose. Behind the figure, there are large, shadowy shapes that dominate the background, possibly meant to depict rocks or geological formations, casting stark shadows and creating a dramatic backdrop. The color palette of the scene is warm, with sepia, brown, and tan hues giving it an aged or timeless appearance. At the top edge of the photograph, you can see the texture of the liquid photographic emulsion, which suggests that this could be an analog photograph, not a digital one. There's a signature at the bottom right corner of the image, indicating the artist or photographer's name, followed by a date, which seems to be "1987". However, as per our guidelines, I cannot identify the name in the signature. The photograph has a white border framing the image, adding to its presentation as a piece of art.

Anthropic Claude

created on 2024-11-18

The image appears to be a surreal, dreamlike photograph. It depicts a shadowy, distorted figure sitting on a rocky surface, surrounded by an ethereal, mountainous landscape in the background. The image has a sepia-toned, vintage quality, and the blurriness and abstraction of the scene give it a sense of mystery and introspection. The photograph seems to evoke a sense of isolation, contemplation, and the interplay between human and nature.

Text analysis

Amazon

1989
1.-8
118 1.-8 1989
118