Human Generated Data

Title

Untitled

Date

1978-1979, printed 1999

People

Artist: David Wojnarowicz, American, 1954–1992

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.449

Copyright

© Courtesy of the Estate of David Wojnarowicz and P·P·O·W, New York

Machine Generated Data

Tags

Amazon
created on 2019-03-29

Person 99.6
Human 99.6
Person 99.4
Person 98.6
Person 96.2
Restaurant 92.3
Sitting 91.9
Person 90.6
Cafe 83.9
Crowd 76.8
Food 75.2
Meal 75.2
Clothing 72.8
Apparel 72.8
People 66.5
Cafeteria 66.5
Audience 57.9
Food Court 56.8
Person 48.2
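
The label/confidence pairs above have the shape of an Amazon Rekognition DetectLabels response. A minimal sketch of how such tags are typically retrieved follows; the file name, region, and thresholds are placeholders for illustration, not details taken from this record.

# Hypothetical sketch: pulling label/confidence pairs like those listed above
# from Amazon Rekognition. File name, region, and thresholds are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder path, not the actual image file
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=40,
    )

for label in response["Labels"]:
    # Prints e.g. "Person 99.6", "Restaurant 92.3"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')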

Clarifai
created on 2018-02-09

people 99.8
group 98.4
adult 97.2
group together 96.8
woman 96.5
man 95.7
many 94.6
wear 94.5
monochrome 92.2
street 89.9
music 89.6
recreation 87.9
several 86.9
outfit 83.4
administration 82.1
child 79.2
crowd 77.9
bar 77.3
furniture 77.2
vehicle 76.1

Imagga
created on 2018-02-09

shop 100
barbershop 100
man 34.2
establishment 28.8
people 26.8
male 22.7
adult 22.6
happy 18.2
business 17.6
person 17.5
restaurant 17.2
smiling 16.6
lifestyle 16.6
indoors 14.9
sitting 13.7
portrait 13.6
color 13.3
couple 13.1
smile 12.8
cheerful 12.2
hairdresser 11.9
casual 11.9
office 11.5
building 11.5
job 10.6
men 10.3
love 10.3
professional 10.1
20s 10.1
attractive 9.8
work 9.4
enjoyment 9.4
face 9.2
horizontal 9.2
indoor 9.1
hand 9.1
pretty 9.1
old 9.1
businessman 8.8
happiness 8.6
room 8.6
adults 8.5
togetherness 8.5
friends 8.4
fashion 8.3
worker 8.1
suit 8.1
group 8.1
family 8
looking 8
working 7.9
black 7.8
window 7.6
relaxation 7.5
fun 7.5
city 7.5
vintage 7.4
holding 7.4
style 7.4
teamwork 7.4
executive 7.4
home 7.2
women 7.1

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 98.5
outdoor 96.6
people 62.2
store 48
crowd 2.4

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 89.3%
Angry 2.9%
Disgusted 0.6%
Happy 0.3%
Calm 87.4%
Surprised 1.3%
Sad 5.4%
Confused 2%

AWS Rekognition

Age 35-55
Gender Female, 88.4%
Sad 1.1%
Calm 0.8%
Confused 0.4%
Disgusted 0.8%
Happy 96%
Angry 0.6%
Surprised 0.4%

AWS Rekognition

Age 26-43
Gender Female, 58%
Happy 4.4%
Sad 43.8%
Confused 2.9%
Surprised 3.2%
Calm 26.3%
Disgusted 5.3%
Angry 14.2%

AWS Rekognition

Age 26-43
Gender Female, 89%
Happy 13.8%
Angry 8.4%
Confused 4.4%
Surprised 9.2%
Sad 27.8%
Disgusted 5.8%
Calm 30.6%

AWS Rekognition

Age 26-43
Gender Female, 60.7%
Angry 2.6%
Disgusted 1.6%
Confused 1.4%
Surprised 1.1%
Calm 73.6%
Sad 18.7%
Happy 1.1%

AWS Rekognition

Age 4-9
Gender Female, 98.2%
Angry 3.6%
Surprised 3.5%
Calm 18.8%
Sad 58.8%
Disgusted 4.8%
Happy 9%
Confused 1.3%

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 43
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
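
The age ranges, gender estimates, and emotion percentages above match the structure of Amazon Rekognition's DetectFaces response (AgeRange, Gender, Emotions); Microsoft Cognitive Services and Google Vision expose comparable face attributes through their own APIs. A minimal Rekognition sketch, with the file name and output formatting as assumptions:

# Hypothetical sketch: per-face age, gender, and emotion estimates via
# Amazon Rekognition DetectFaces. The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 26, "High": 43}
    gender = face["Gender"]       # e.g. {"Value": "Male", "Confidence": 89.3}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # e.g. "Calm 87.4%", "Sad 5.4%"
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')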

Feature analysis

Amazon

Person 99.6%

Captions

Azure OpenAI

Created on 2024-11-28

This black and white image appears to capture passengers sitting inside a subway train. The interior of the train is marked with extensive graffiti; scribbles and tags are visible on the walls and even on the backs of the seats. One of the windows has a posted sign which mentions "Uncle's blessing." The passengers are seated, with one slouching slightly and another sitting straight. There is a slight motion blur, suggesting the photo may have been taken while the train was in motion. It exudes an urban, gritty vibe, indicative of subway systems in many cities during periods when vandalism was more rampant.

Anthropic Claude

Created on 2024-11-27

The image appears to be a black and white photograph of a subway car or other public transportation setting. There are several people visible, including a man and a woman who seem to be sitting together. There is graffiti and text visible on the walls and surfaces around them. The scene has a gritty, urban feel to it.

Meta Llama

Created on 2024-11-26

The image depicts a black-and-white photograph of a subway car, with a group of people seated inside. The subway car is adorned with graffiti and advertisements, and the people are dressed in casual attire. The atmosphere appears to be relaxed, with some individuals engaged in conversation or reading. The overall mood of the image is one of everyday life in the city, capturing a moment in time on the subway.
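
Captions like the three above come from multimodal language models that accept an image plus a text prompt. As one hedged example, a request of this kind to Anthropic's Messages API might look roughly like the sketch below; the model name, prompt, and file handling are assumptions, not the pipeline actually used for this record.

# Hypothetical sketch: asking a multimodal model to caption the photograph.
# Model name, prompt, and file path are placeholders.
import base64
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("photo.jpg", "rb") as f:
    image_b64 = base64.standard_b64encode(f.read()).decode("utf-8")

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # placeholder model name
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64", "media_type": "image/jpeg", "data": image_b64}},
            {"type": "text", "text": "Describe this photograph in one short paragraph."},
        ],
    }],
)
print(message.content[0].text)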

Text analysis

Amazon

leading
with
Uncle's
The
that comes with
comes
that
only
FLATBUSH AVD
Uncle's blessinne
BIMYF The only leading t
blessinne
t
Co
rehe
BIMYF
YQK
O

Google

The only le hat comes Uncies FLATBUSH AV
The
only
hat
Uncies
AV
le
comes
FLATBUSH
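
The word fragments in the two lists above are optical character recognition output over the graffiti and signage in the photograph: Amazon's list has the shape of a Rekognition DetectText response, and Google's comes from a comparable Cloud Vision text-detection feature. A minimal Rekognition sketch, with the file name assumed:

# Hypothetical sketch: extracting words like "FLATBUSH AVD" and "Uncle's"
# with Amazon Rekognition DetectText. The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":  # responses also include LINE groupings
        print(detection["DetectedText"], round(detection["Confidence"], 1))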