Human Generated Data

Title

Untitled (man in store window with reflections)

Date

1965

People

Artist: Ken Heyman, American, born 1930

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.548

Copyright

© Ken Heyman

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 99.5
Person 99.5
Person 90.5
Finger 89.1
Apparel 83.6
Clothing 83.6
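
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. As a minimal sketch only, assuming a local copy of the photograph and an arbitrary confidence threshold (neither is documented for the pipeline used here), such tags could be retrieved with boto3 like this:

```python
# Minimal sketch: producing label/confidence pairs like the Amazon tags above
# with AWS Rekognition's DetectLabels API. The filename and the 80% threshold
# are assumptions for illustration, not the museum's actual pipeline.
import boto3

rekognition = boto3.client("rekognition")

with open("2011.548.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=80,  # discard labels below 80% confidence
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

The two Person entries above most likely correspond to the Person label itself plus a per-instance detection, which DetectLabels reports under each label's Instances field.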

Clarifai
created on 2018-02-10

people 99
one 96.9
man 96.7
adult 95.9
portrait 95.2
street 91
woman 84.8
two 83.9
wear 83.2
monochrome 83.2
boy 82.1
music 80.1
model 77
recreation 75.6
child 71.8
fashion 69.7
urban 66.2
black and white 65.9
art 64.5
person 63.4

Imagga
created on 2018-02-10

person 30.8
sexy 29.7
fashion 28.6
adult 27.9
model 26.4
attractive 25.9
body 25.6
black 24.8
people 24
portrait 23.9
lady 21.9
hair 19.8
sitting 19.8
pretty 19.6
erotic 17.1
lifestyle 16.6
one 16.4
posing 16
blond 15.3
skin 15.2
clothing 15
brunette 14.8
cute 14.3
human 14.2
man 14.1
style 14.1
sensual 13.6
sensuality 13.6
women 13.4
newspaper 13.3
legs 13.2
studio 12.9
dishwasher 12.8
youth 12.8
casual 12.7
elegance 12.6
face 12.1
looking 12
chair 11.6
male 11.5
child 11
pose 10.9
dress 10.8
sofa 10.7
laptop 10.5
jeans 10.5
fun 10.5
product 10.3
television 10.3
elegant 10.3
naked 9.7
shoes 9.6
seductive 9.6
eyes 9.5
house 9.2
creation 9.2
lovely 8.9
interior 8.8
home 8.8
happy 8.8
love 8.7
desire 8.7
smile 8.5
garment 8.5
business 8.5
passion 8.5
hot 8.4
leg 8.3
computer 8.2
teenager 8.2
work 8.1
look 7.9
nude 7.8
couch 7.7
skirt 7.7
guy 7.6
joy 7.5
leisure 7.5
notebook 7.3
stylish 7.2
office 7.2
cool 7.1
job 7.1
happiness 7

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 99
man 90.3

Color Analysis

Face analysis

AWS Rekognition

Age 30-47
Gender Male, 97.2%
Happy 1.8%
Disgusted 9%
Confused 20.6%
Calm 30.4%
Angry 16.4%
Sad 17%
Surprised 4.8%

Microsoft Cognitive Services

Age 36
Gender Male
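
The age range, gender, and emotion scores in the AWS Rekognition block follow the shape of its DetectFaces response when all facial attributes are requested. A minimal sketch, again assuming a local image file rather than the museum's actual setup:

```python
# Minimal sketch: requesting age range, gender, and emotion estimates with
# AWS Rekognition's DetectFaces API. The filename is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("2011.548.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # one confidence score per emotion class
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```

Note that the emotion values are per-class confidence scores rather than a single prediction, which is why several emotions (Calm, Confused, Sad, Angry) carry non-trivial percentages for the same face.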

Feature analysis

Amazon

Person 99.5%

Captions

Azure OpenAI

Created on 2024-11-28

This is a black and white photograph showing an individual seated inside what appears to be a ticket booth or service window. The person inside the booth is wearing a casual short-sleeved shirt and jeans. Directly in the foreground, part of someone's arm is visible, wearing a jacket with distinctive studded decorations on the sleeves. The composition of the image gives the impression of a casual, possibly candid moment captured at some type of service window where the foreground figure might be awaiting service from the individual inside the booth.

Anthropic Claude

Created on 2024-11-27

The image shows a man sitting in a store window, likely as part of a display or advertisement. He is dressed in casual clothing, including jeans and a jacket, and has a mustache. In the foreground of the image, a hand is visible, gesturing or interacting with the man in the window. The overall scene has a black-and-white aesthetic, giving it a somewhat gritty and industrial feel.

Meta Llama

Created on 2024-11-26

The image depicts a black-and-white photograph of two men in a window, with one man sitting on the sill and the other standing outside. The man inside is wearing a t-shirt, jeans, and sneakers, and is holding a striped scarf. He has dark hair and a mustache, and is looking at the camera with a serious expression. The man outside is wearing a dark jacket with white polka dots on the cuffs and is reaching out to touch the window. The background of the image shows a building with white tiles and a balcony, suggesting that the scene is taking place in an urban setting. The overall atmosphere of the image is one of introspection and contemplation, with the two men appearing to be lost in thought.
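
The three captions above read like the output of sending the photograph to a multimodal model with a short descriptive prompt. As a minimal sketch, assuming the Anthropic Python SDK, an arbitrary current model name, and a local copy of the image (none of which are documented for the pipeline used here):

```python
# Minimal sketch: asking a multimodal Claude model to caption the photograph.
# The model name, prompt wording, and filename are assumptions for illustration.
import base64

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("2011.548.jpg", "rb") as f:
    image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64",
                        "media_type": "image/jpeg",
                        "data": image_b64}},
            {"type": "text",
             "text": "Describe this photograph in one short paragraph."},
        ],
    }],
)

print(message.content[0].text)
```

Equivalent calls against Azure OpenAI or a hosted Llama model differ mainly in how the image is attached to the request; the caption itself is whatever free-form description the model returns.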