Human Generated Data

Title

David as Cindy

Date

1982, printed 2013

People

Artist: Peter Hujar, American, 1934–1987

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2016.163

Copyright

© The Peter Hujar Archive LLC / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Human 99.2
Person 99.2
Clothing 97.5
Sleeve 97.5
Apparel 97.5
Pants 73.9
Man 70.9
Sitting 70.1
Long Sleeve 62.6
Accessory 58.5
Accessories 58.5
Glasses 58.5
People 58.2
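
The Amazon tags above are image labels with confidence scores. As a rough illustration, here is a minimal sketch of how such labels can be requested from Amazon Rekognition with boto3; the file name and thresholds are placeholders, not values taken from this record.

```python
# Minimal sketch: request image labels from Amazon Rekognition via boto3.
# "photo.jpg" and the thresholds are placeholders, not part of this record.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print each label with its confidence, e.g. "Person 99.2"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```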

Clarifai
created on 2018-02-16

people 99.8
adult 99.6
one 99.5
portrait 99.4
man 94.9
wear 94.6
woman 94.5
monochrome 93.4
girl 92.1
music 91.7
facial expression 91
model 90.8
studio 87.4
chair 86.2
musician 85.5
furniture 80.6
actor 78.9
side view 76
two 74.9
room 74.8
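
The Clarifai tags are concept predictions from a general image-recognition model. Below is a rough sketch of one way to request them through Clarifai's v2 REST predict endpoint; the model ID, API key, image URL, and exact payload fields are all placeholders and may differ by account setup and API version.

```python
# Rough sketch only: Clarifai v2 predict endpoint over plain HTTP.
# MODEL_ID, the key, and the image URL are placeholders; exact auth and
# payload fields depend on your Clarifai account and API version.
import requests

MODEL_ID = "general-image-recognition"  # placeholder model identifier
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"
headers = {"Authorization": "Key YOUR_CLARIFAI_KEY",
           "Content-Type": "application/json"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

resp = requests.post(url, headers=headers, json=payload)
resp.raise_for_status()

# Concepts come back with values in 0-1; scale to match the record's percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```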

Imagga
created on 2018-02-16

person 37.7
portrait 35
adult 33.1
fashion 30.9
attractive 28.7
sexy 27.3
model 27.2
people 26.2
pretty 25.9
face 24.2
sunglasses 22.2
hair 22.2
studio 22
happy 21.3
black 21
microphone 20.4
smile 19.3
one 17.9
style 17.8
posing 17.8
cute 16.5
sensuality 15.4
lady 15.4
pose 15.4
mask 15.3
women 15
brunette 14.8
cheerful 14.6
expression 14.5
smiling 14.5
music 14.4
looking 13.6
youth 12.8
gorgeous 12.7
singer 12.6
spectacles 12.1
fun 12
casual 11.9
stylish 11.8
musician 11.7
lifestyle 11.6
sunglass 11.4
rock 11.3
body 11.2
sensual 10.9
make 10.9
performer 10.8
covering 10.8
clothing 10.8
cool 10.7
human 10.5
blond 10.3
happiness 10.2
dress 9.9
holding 9.9
look 9.6
sitting 9.5
guitar 9.4
clothes 9.4
glasses 9.3
elegance 9.2
dark 9.2
hand 9.1
modern 9.1
instrument 9
skin 8.8
jeans 8.6
fashionable 8.5
device 8.5
sound 8.4
joy 8.4
playing 8.2
erotic 8.1
urban 7.9
standing 7.8
professional 7.8
concert 7.8
seductive 7.7
man 7.4
entertainment 7.4
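
The Imagga tags likewise carry confidence values. A minimal sketch of the Imagga v2 tagging endpoint follows, assuming HTTP basic auth with an API key/secret pair and a hosted image URL; all credentials and URLs are placeholders.

```python
# Minimal sketch: Imagga v2 /tags endpoint with HTTP basic auth.
# The key/secret pair and the image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```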

Google
created on 2018-02-16

Microsoft
created on 2018-02-16

wall 96.5
person 94.2
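
The Microsoft tags ("wall", "person") are the kind of output returned by the Azure Computer Vision analyze endpoint. A minimal sketch using the REST API is below; the endpoint, subscription key, and image URL are placeholders.

```python
# Minimal sketch: Azure Computer Vision "analyze" REST call requesting tags.
# Endpoint, subscription key, and image URL are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Confidence is returned in 0-1; scale to match the record's percentages.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```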

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 98.7%
Happy 4%
Sad 2.1%
Surprised 13.9%
Confused 31.2%
Calm 40.6%
Angry 3.8%
Disgusted 4.3%
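
The age range, gender, and emotion scores above are face attributes of the kind returned by Rekognition's DetectFaces operation. A minimal sketch with boto3 follows; the file name is a placeholder.

```python
# Minimal sketch: face attributes (age range, gender, emotions) from
# Amazon Rekognition DetectFaces via boto3. "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotion types come back upper-case, e.g. "CALM 40.6"
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```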

Microsoft Cognitive Services

Age 31
Gender Male
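
The age and gender estimate above is the sort of result the Azure Face API detect endpoint returned when face attributes were requested. A rough sketch is below; Microsoft has since restricted access to age and gender attributes, and the endpoint, key, and image URL are placeholders.

```python
# Rough sketch: Azure Face API detect call requesting age and gender.
# Endpoint, key, and image URL are placeholders; these attributes have
# since been restricted, so this mirrors the older behaviour only.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')
```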

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
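
The Google Vision rows are face-detection likelihoods (surprise, anger, sorrow, joy, headwear, blur). A minimal sketch with the google-cloud-vision client library follows, assuming default application credentials; the file path is a placeholder.

```python
# Minimal sketch: face-detection likelihoods from the Google Cloud Vision API.
# Assumes default application credentials; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each likelihood is an enum such as VERY_UNLIKELY or LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```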

Feature analysis

Amazon

Person 99.2%
Glasses 58.5%

Captions

Azure OpenAI

Created on 2024-11-17

The image shows a person standing against a pale-colored wall. The individual is wearing a white crew neck t-shirt paired with dark-colored jeans that are slightly lowered to reveal the waistband of their undergarments. The setting includes a piece of modern-style furniture to the right, which seems to have a dark surface with light-colored legs. Hanging on the wall, there's an object with a dark, elongated shape that appears to be some sort of brush or duster with a string or handle. The color palette of the picture is monochrome, suggesting it could be a black and white photograph. The lighting and composition convey a candid or casual atmosphere.
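
The caption above was produced by a vision-capable model served through Azure OpenAI. A minimal sketch of requesting such a caption with the openai Python SDK's AzureOpenAI client follows; the deployment name, endpoint, API version, and image file are placeholders.

```python
# Minimal sketch: asking a vision-capable Azure OpenAI deployment to caption
# an image. Deployment name, endpoint, API version, and file are placeholders.
import base64
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="YOUR_KEY",
    api_version="2024-06-01",
    azure_endpoint="https://YOUR_RESOURCE.openai.azure.com",
)

with open("photo.jpg", "rb") as f:
    b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="YOUR_GPT4O_DEPLOYMENT",  # Azure deployment name (placeholder)
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this photograph."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
        ],
    }],
)

print(response.choices[0].message.content)
```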

Anthropic Claude

Created on 2024-11-17

The image depicts a man wearing a white t-shirt and glasses, standing in a room. He appears to have a serious expression on his face and is looking slightly upward. The background is mostly white, with some indistinct objects or furniture visible in the background.
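
A caption of this kind can be requested in outline with the anthropic Python SDK's Messages API, passing the image as base64 content. The model name and file path below are placeholders; the API key is read from the environment.

```python
# Minimal sketch: image captioning with the Anthropic Messages API.
# Model name and file path are placeholders; ANTHROPIC_API_KEY is read
# from the environment.
import base64
import anthropic

client = anthropic.Anthropic()

with open("photo.jpg", "rb") as f:
    b64 = base64.b64encode(f.read()).decode()

message = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64",
                        "media_type": "image/jpeg",
                        "data": b64}},
            {"type": "text",
             "text": "Describe this photograph in a few sentences."},
        ],
    }],
)

print(message.content[0].text)
```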