Human Generated Data

Title

Occupying Wall Street, December 3, 2011

Date

2011

People

Artist: Accra Shepp, American, born 1962

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2019.314.9

Copyright

© Accra Shepp

Machine Generated Data

Tags (label name with confidence score, 0-100)

Amazon
created on 2023-07-06

Clothing 100
Hat 99.9
Cap 99.9
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 98.3
Man 98.3
Male 98.3
Person 98.3
Glove 97.7
Person 97.7
Male 96.6
Adult 96.6
Man 96.6
Person 96.6
Person 96.1
Shoe 95.9
Footwear 95.9
Overcoat 95.8
Person 95.5
Glove 94.2
Shoe 90.9
Face 89
Head 89
Accessories 85.3
Bag 85.3
Handbag 85.3
Jacket 73.3
Jacket 69.7
Coat 62.4
Photography 57.9
Portrait 57.9
Beanie 56.2
Lady 56
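
The Amazon tags above have the shape of AWS Rekognition label-detection output: a label name paired with a 0-100 confidence score. A minimal sketch of how such labels can be requested with boto3 follows; the bucket name, object key, and MinConfidence threshold are illustrative assumptions, not the museum's actual pipeline.

```python
# Hedged sketch: Rekognition-style image labels via boto3.
# Bucket and object key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=50,  # drop labels scored below 50
)

# Each label pairs a name with a 0-100 confidence score, the same
# shape as the "Clothing 100 / Hat 99.9" pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```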

Clarifai
created on 2023-10-13

people 99.8
street 98.3
man 97.9
adult 97.3
portrait 96.2
one 95.9
coat 94.5
monochrome 92.8
two 90
outerwear 89.4
veil 88.3
wear 88.2
administration 87.5
woman 87
lid 86.7
overcoat 85.2
book series 85.1
elderly 84.3
religion 82.5
writer 81.7

Imagga
created on 2023-07-06

statue 56.4
person 26.7
man 23.6
adult 21.8
people 20.1
male 20
city 19.9
mask 17.1
clothing 16.4
street 14.7
building 14.3
covering 14.2
portrait 13.6
dress 13.5
travel 13.4
black 13.3
face 12.8
sculpture 12.5
business 12.1
suit 12
women 11.9
urban 11.4
fashion 11.3
human 11.2
men 11.2
culture 11.1
architecture 10.9
bag 10.7
outdoor 10.7
pretty 10.5
wind instrument 10.2
model 10.1
happy 10
one 9.7
walking 9.5
smile 9.3
holding 9.1
businessman 8.8
world 8.7
tourist 8.7
musical instrument 8.7
corporate 8.6
walk 8.6
attractive 8.4
old 8.4
outdoors 8.3
costume 8.2
lady 8.1
active 8.1
hat 8.1
religion 8.1
posing 8
performer 8
job 8
disguise 7.9
standing 7.8
sax 7.8
jeans 7.6
sport 7.6
clothes 7.5
alone 7.3
landmark 7.2
looking 7.2
love 7.1

Google
created on 2023-07-06
(no labels recorded)

Microsoft
created on 2023-07-06

outdoor 99.3
person 98.8
black and white 97.2
jacket 95.7
street 95.3
clothing 92.2
coat 88.8
monochrome 88.2
human face 74.6
smile 69.9
way 43.5
sidewalk 27.4

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 100%
Calm 70.8%
Surprised 23.2%
Fear 6.7%
Sad 3.2%
Happy 2.1%
Confused 2%
Angry 1.7%
Disgusted 1%
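
The age range, gender, and emotion percentages above match the structure of Rekognition's detect_faces response when all attributes are requested. A minimal sketch, assuming a local image file rather than whatever input the museum actually used:

```python
# Hedged sketch: Rekognition face attributes (age range, gender,
# emotions) via boto3; the file path is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    # Emotions come back as a list of {Type, Confidence} entries,
    # matching the Calm/Surprised/... percentages listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```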

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
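
Google Vision reports face attributes as five-step likelihood ratings (VERY_UNLIKELY through VERY_LIKELY), which is what the Very unlikely / Very likely values in the two blocks above reflect; two blocks appear because two faces were detected. A minimal sketch with the google-cloud-vision client, assuming a hypothetical Cloud Storage URI:

```python
# Hedged sketch: Google Cloud Vision face detection, which reports
# each attribute as a Likelihood enum. The image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "gs://example-bucket/photo.jpg"

response = client.face_detection(image=image)

# One annotation per detected face, each with the six ratings above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```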

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Glove 97.7%
Shoe 95.9%
Handbag 85.3%
Jacket 73.3%
Coat 62.4%

Text analysis

Amazon

THIS
THIS SPACE
SPACE
OCCUPIED
DATA PLAZA
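
Rekognition's text detection returns both whole lines and individual words, which is why overlapping strings such as THIS, SPACE, and THIS SPACE can all appear above. A minimal sketch, again with placeholder bucket and key names:

```python
# Hedged sketch: Rekognition text detection via boto3; each result
# is tagged LINE or WORD. Bucket and key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")
```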

Google

PTY
THIS
SPACE
OCCUPIED
DO
PTY PLAZA THIS SPACE OCCUPIED DO
PLAZA