Human Generated Data

Title

Maya

Date

1942-1943, printed 1987

People

Artist: Alexander Hammid, American, 1907-2004

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.240

Copyright

© Alexander Hammid

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Finger 92.6
Person 90.4
Human 90.4
Face 80.7
Female 63.4
Photo 56.7
Photography 56.7
Portrait 56.7

Clarifai
created on 2018-02-10

people 99.1
portrait 98.9
one 98.8
adult 98.4
woman 96.5
dark 95.5
model 93.8
fashion 92.8
girl 91.7
facial expression 91.5
man 91.3
wear 90.7
face 86.5
light 85.1
veil 84.7
indoors 84.5
glamour 84.2
monochrome 84.2
music 82.7
shadow 82.6

Imagga
created on 2018-02-10

portrait 37.6
harmonica 37.3
black 35
face 34.1
person 33.8
adult 29.8
man 26.9
model 25.7
dark 25.1
male 24.2
expression 23.1
human 22.5
attractive 22.4
eyes 20.7
people 20.1
call 19.3
head 18.5
looking 18.4
close 18.3
fashion 18.1
pretty 17.5
hair 17.5
lady 16.3
one 15.7
makeup 15.6
sexy 15.3
studio 15.2
cigarette 14.6
skin 14.1
look 14
telephone 13.9
depression 13.6
radiotelephone 13.3
smoke 13
hand 12.9
eye 12.5
handsome 12.5
guy 12.4
lips 12
men 12
youth 11.9
device 11.8
smoking 11.8
serious 11.5
sensual 10.9
lifestyle 10.9
style 10.4
cute 10.1
sensuality 10
smile 10
brunette 9.6
hands 9.6
happy 9.4
lipstick 9.2
confident 9.1
boy 9.1
nose 9
hope 8.7
casual 8.5
emotion 8.3
microphone 8.3
life 8
women 7.9
mouth 7.9
business 7.9
facial 7.9
pain 7.7
thinking 7.6
elegance 7.6
religious 7.5
closeup 7.4
20s 7.3
make 7.3
cosmetic 7.2
posing 7.1
cool 7.1

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

glass 16.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 92.5%
Confused 6.2%
Sad 39%
Surprised 9.5%
Angry 11.4%
Calm 26.3%
Happy 2.2%
Disgusted 5.3%

Feature analysis

Amazon

Person 90.4%

Captions

Azure OpenAI

created on 2024-11-28

The image features a clear glass filled with a dark liquid, possibly a soft drink or a dark alcoholic beverage. The glass is placed on a surface, with a visible hand grasping it while wearing a clear, plastic glove often used for sanitary purposes. The background is very dark, which limits visibility of any additional context. The lighting mainly emphasizes the glass and the hand interacting with it.

Anthropic Claude

created on 2024-11-27

The image appears to be a black and white portrait photograph. It shows the close-up view of a person's face, with their eyes gazing directly at the camera. The person's hand is covering the lower half of their face, creating an intriguing and somewhat mysterious composition. The high-contrast lighting and the dark background contribute to the dramatic and somber tone of the image.

Meta Llama

created on 2024-11-26

The image is a black-and-white photograph of a woman's face, partially obscured by a glass or other transparent material. The woman has dark hair and is wearing a long-sleeved shirt or dress. Her right hand is raised to her face, with her fingers curled around the glass. The background of the image is dark and indistinct, but it appears to be a studio setting. The overall effect of the image is one of mystery and intrigue, as the woman's face is partially hidden behind the glass. The use of black and white photography adds to the sense of drama and intensity, creating a moody and atmospheric image.