Human Generated Data

Title

Untitled

Date

1978-1979, printed 1999

People

Artist: David Wojnarowicz, American, 1954-1992

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.449

Copyright

© Courtesy of the Estate of David Wojnarowicz and P·P·O·W, New York

Machine Generated Data

Tags

Amazon
created on 2019-03-29

Person 99.6
Human 99.6
Person 99.4
Person 98.6
Person 96.2
Restaurant 92.3
Sitting 91.9
Person 90.6
Cafe 83.9
Crowd 76.8
Meal 75.2
Food 75.2
Clothing 72.8
Apparel 72.8
People 66.5
Cafeteria 66.5
Audience 57.9
Food Court 56.8
Person 48.2
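The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. As a minimal, illustrative sketch (using values hardcoded from this record rather than a live API call, since the image and credentials are not part of the record), this is how such output might be filtered to the higher-confidence labels:

```python
# Illustrative only: filter label/confidence pairs like those listed above.
# The sample data is copied from this record; a real request would go through
# boto3's Rekognition client and its detect_labels method.
labels = [
    ("Person", 99.6), ("Restaurant", 92.3), ("Sitting", 91.9),
    ("Cafe", 83.9), ("Crowd", 76.8), ("Meal", 75.2),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # ['Person', 'Restaurant', 'Sitting']
```

With the default 90% threshold, only the first three labels survive; lowering the threshold admits progressively less certain tags such as "Cafe" and "Crowd".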

Clarifai
created on 2018-02-09

people 99.8
group 98.4
adult 97.2
group together 96.8
woman 96.5
man 95.7
many 94.6
wear 94.5
monochrome 92.2
street 89.9
music 89.6
recreation 87.9
several 86.9
outfit 83.4
administration 82.1
child 79.2
crowd 77.9
bar 77.3
furniture 77.2
vehicle 76.1

Imagga
created on 2018-02-09

barbershop 100
shop 100
man 34.2
establishment 28.8
people 26.8
male 22.7
adult 22.6
happy 18.2
business 17.6
person 17.5
restaurant 17.2
smiling 16.6
lifestyle 16.6
indoors 14.9
sitting 13.7
portrait 13.6
color 13.3
couple 13.1
smile 12.8
cheerful 12.2
hairdresser 11.9
casual 11.9
office 11.5
building 11.5
job 10.6
men 10.3
love 10.3
professional 10.1
20s 10.1
attractive 9.8
work 9.4
enjoyment 9.4
face 9.2
horizontal 9.2
indoor 9.1
hand 9.1
pretty 9.1
old 9.1
businessman 8.8
happiness 8.6
room 8.6
adults 8.5
togetherness 8.5
friends 8.4
fashion 8.3
worker 8.1
suit 8.1
group 8.1
family 8
looking 8
working 7.9
black 7.8
window 7.6
relaxation 7.5
fun 7.5
city 7.5
vintage 7.4
holding 7.4
style 7.4
teamwork 7.4
executive 7.4
home 7.2
women 7.1

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 98.5
outdoor 96.6
people 62.2
store 48
crowd 2.4

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 89.3%
Angry 2.9%
Disgusted 0.6%
Happy 0.3%
Calm 87.4%
Surprised 1.3%
Sad 5.4%
Confused 2%

AWS Rekognition

Age 35-55
Gender Female, 88.4%
Sad 1.1%
Calm 0.8%
Confused 0.4%
Disgusted 0.8%
Happy 96%
Angry 0.6%
Surprised 0.4%

AWS Rekognition

Age 26-43
Gender Female, 58%
Happy 4.4%
Sad 43.8%
Confused 2.9%
Surprised 3.2%
Calm 26.3%
Disgusted 5.3%
Angry 14.2%

AWS Rekognition

Age 26-43
Gender Female, 89%
Happy 13.8%
Angry 8.4%
Confused 4.4%
Surprised 9.2%
Sad 27.8%
Disgusted 5.8%
Calm 30.6%

AWS Rekognition

Age 26-43
Gender Female, 60.7%
Angry 2.6%
Disgusted 1.6%
Confused 1.4%
Surprised 1.1%
Calm 73.6%
Sad 18.7%
Happy 1.1%

AWS Rekognition

Age 4-9
Gender Female, 98.2%
Angry 3.6%
Surprised 3.5%
Calm 18.8%
Sad 58.8%
Disgusted 4.8%
Happy 9%
Confused 1.3%

Microsoft Cognitive Services

Age 43
Gender Female

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

Cyd Charisse, Arthur Rimbaud sitting in front of a store 90.1%
Cyd Charisse, Arthur Rimbaud standing in front of a store 90%
Cyd Charisse, Arthur Rimbaud sitting and standing in front of a store 89.9%

Text analysis

Amazon

leading
with
Uncle's
The
that comes with
comes
that
only
FLATBUSH AVD
Uncle's blessinne
BIMYF The only leading t
blessinne
t
Co
rehe
BIMYF
YQK
O

Google

AV
hat
le
Uncies
The
only
comes
FLATBUSH
The only le hat comes Uncies FLATBUSH AV