Human Generated Data

Title

Occupying Wall Street, October 15, 2011

Date

2011

People

Artist: Accra Shepp, American, born 1962

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2019.314.2

Copyright

© Accra Shepp

Machine Generated Data

Tags

Amazon
created on 2023-07-06

Person 99.5
Clothing 99.4
Footwear 99.4
Shoe 99.4
Shoe 99.2
Accessories 98.1
Bag 98.1
Photography 97.7
Person 97.7
Adult 97.7
Male 97.7
Man 97.7
Shoe 97.4
Face 96.4
Head 96.4
Portrait 96.4
Shoe 95.9
Person 95.8
Adult 95.8
Male 95.8
Man 95.8
Person 95
Adult 95
Male 95
Man 95
Handbag 90
Person 87.3
Person 82.3
Coat 81.9
Handbag 81.6
Person 75.2
People 72
Handbag 63.4
Text 62
Boot 57.4
Purse 57.3
Blackboard 56.7
Sneaker 56.6
City 56.2
Road 56.2
Street 56.2
Urban 56.2
Box 56.1
Flag 55.4
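
The Amazon tags above follow the label/confidence format returned by Amazon Rekognition's label detection. The following is a minimal sketch of how such pairs could be retrieved with boto3; the image file name, region, and MinConfidence threshold are illustrative assumptions, not values taken from this record.

import boto3

# Sketch: retrieving label/confidence pairs with Amazon Rekognition via boto3.
# The file name and MinConfidence threshold are placeholder assumptions.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("occupying_wall_street_2011.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly matches the lowest confidences listed above
)

# Print "Label Confidence" pairs, similar to the list in this record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')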

Clarifai
created on 2023-10-13

people 99.7
street 99.6
monochrome 98.5
woman 96.8
wear 96.8
adult 95.6
man 95.5
one 95.1
stock 93.2
portrait 92.7
group 92
many 91.8
outerwear 91
shopping 89.2
fashion 87.6
two 86.9
child 86.5
war 85.7
boy 85.5
mannequin 84.4

Imagga
created on 2023-07-06

robe 51.9
garment 49.9
clothing 47.6
black 36.6
covering 28.1
people 27.9
person 26.2
man 24.2
adult 21.4
male 21.3
mask 21.1
fashion 19.6
portrait 19.4
dress 19
model 18.7
clothes 16.9
dark 16.7
attractive 16.1
consumer goods 15.9
face 15.6
pretty 14.7
lady 13.8
sexy 13.6
looking 13.6
style 12.6
posing 11.5
women 11.1
business 10.9
lifestyle 10.8
night 10.7
elegant 10.3
elegance 10.1
gorgeous 10
holding 9.9
hair 9.5
indoors 8.8
brunette 8.7
standing 8.7
party 8.6
adults 8.5
costume 8.4
hand 8.4
pose 8.2
wet suit 8
nightlife 7.8
happy 7.5
city 7.5
one 7.5
silhouette 7.4
20s 7.3
make 7.3
body 7.2
love 7.1
disguise 7
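
The Imagga tags use the same tag/confidence pattern. Imagga exposes image tagging through a REST endpoint; the sketch below assumes the publicly documented /v2/tags endpoint with HTTP Basic authentication, and the credentials, image URL, and response shape are assumptions rather than details from this record.

import requests

# Sketch: fetching tag/confidence pairs from Imagga's /v2/tags endpoint.
# API_KEY, API_SECRET, and the image URL are placeholder assumptions.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
image_url = "https://example.org/occupying_wall_street_2011.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')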

Google
created on 2023-07-06

Microsoft
created on 2023-07-06

clothing 98.6
person 96.8
black and white 94.1
street 91.7
footwear 84.7
woman 83.2
human face 78.2
monochrome 70.3
girl 63.3
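
The Microsoft tags resemble output from Azure's Computer Vision image-analysis API. A minimal sketch against the v3.2 REST analyze endpoint is given below; the endpoint host, subscription key, and image URL are assumptions, and Azure reports confidences on a 0-1 scale, so they are rescaled to match the percentages shown above.

import requests

# Sketch: requesting image tags from Azure Computer Vision (v3.2 REST API).
# ENDPOINT, KEY, and the image URL are placeholder assumptions.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"
image_url = "https://example.org/occupying_wall_street_2011.jpg"  # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": image_url},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')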

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 49-57
Gender Female, 100%
Happy 85.4%
Surprised 6.7%
Fear 6.7%
Calm 4.7%
Sad 3.1%
Confused 1.8%
Disgusted 1.6%
Angry 1%

AWS Rekognition

Age 4-10
Gender Female, 78.6%
Calm 79.2%
Fear 8%
Happy 7.3%
Surprised 6.6%
Sad 3.1%
Disgusted 2.6%
Angry 2%
Confused 0.8%

AWS Rekognition

Age 6-16
Gender Female, 99.3%
Calm 56.7%
Sad 28.6%
Fear 7%
Surprised 6.7%
Happy 6%
Confused 5.3%
Disgusted 4.8%
Angry 3.1%
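
The age ranges, gender estimates, and emotion percentages above match the face-detail fields returned by Amazon Rekognition's face detection. A minimal sketch of how those fields could be read with boto3 follows; the image file name is an illustrative assumption.

import boto3

# Sketch: reading age range, gender, and emotion confidences from
# Amazon Rekognition face detection. The file name is a placeholder.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("occupying_wall_street_2011.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')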

Microsoft Cognitive Services

Age 54
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
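
The Google Vision rows report likelihoods ("Very unlikely" through "Very likely") rather than percentages, which corresponds to the likelihood enums returned by the Cloud Vision face-detection API. Below is a minimal sketch assuming the google-cloud-vision Python client and a hypothetical local image file.

from google.cloud import vision

# Sketch: reading per-face likelihoods from the Google Cloud Vision API.
# The file name is a placeholder assumption.
client = vision.ImageAnnotatorClient()

with open("occupying_wall_street_2011.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

likelihood = vision.Likelihood  # UNKNOWN, VERY_UNLIKELY, ..., VERY_LIKELY
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)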

Feature analysis

Amazon

Person 99.5%
Shoe 99.4%
Adult 97.7%
Male 97.7%
Man 97.7%
Handbag 90%
Coat 81.9%
Box 56.1%

Categories

Text analysis

Amazon

the
helped
to
fair
99%
of
and
grow
I
chance
I want a fair
want
One of
a
One
in the U.S. I helped
earn and grow
U.S.
in
chance to work.
earn
work.
to build
build
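
The mix of single words and longer fragments above is consistent with Amazon Rekognition's text detection, which returns both WORD and LINE detections for text found in the image (here, the protester's sign). A minimal boto3 sketch follows; the image file name is a placeholder assumption.

import boto3

# Sketch: extracting detected text (words and lines) from an image with
# Amazon Rekognition. The file name is a placeholder assumption.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("occupying_wall_street_2011.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; the record above mixes both.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])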

Google

1 One of the 99% I want a fair chance to work. arn and grow in the US. I helped to build
1
One
of
the
99
%
I
want
a
fair
chance
to
work
.
arn
and
grow
in
US
helped
build
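
The Google result shows one full line of recognized text followed by individual tokens, which matches the structure of the Cloud Vision text-detection response: the first annotation contains all detected text, and the remaining annotations are individual words. A minimal sketch, under the same assumptions as the earlier Google example, is below.

from google.cloud import vision

# Sketch: OCR with the Google Cloud Vision API. The first text annotation
# holds the full recognized text; the rest are individual words.
# The file name is a placeholder assumption.
client = vision.ImageAnnotatorClient()

with open("occupying_wall_street_2011.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)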