Human Generated Data

Title

Occupying Wall Street, March 17, 2012

Date

2012

People

Artist: Accra Shepp, American, born 1962

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2019.314.16

Copyright

© Accra Shepp

Machine Generated Data

Tags (each tag is listed with its confidence score on a 0–100 scale)

Amazon
created on 2023-07-06

Clothing 100
Footwear 99.6
Adult 99.4
Female 99.4
Person 99.4
Woman 99.4
Accessories 99
Bag 99
Handbag 99
Shoe 99
Adult 98.5
Person 98.5
Male 98.5
Man 98.5
Shoe 97.9
Adult 97.5
Person 97.5
Male 97.5
Man 97.5
Handbag 97.3
Hat 97
Overcoat 96.9
Adult 96.9
Female 96.9
Person 96.9
Woman 96.9
Bride 96.9
Wedding 96.9
Shoe 95.9
Shoe 95.7
Hat 92.6
Hat 91.2
Shoe 90.8
Coat 88.6
Shoe 87.9
Face 85.3
Head 85.3
Adult 84.8
Person 84.8
Male 84.8
Man 84.8
High Heel 82.3
Shoe 76.2
Boot 70.7
Person 69.7
Long Sleeve 69.1
Sleeve 69.1
Person 67.8
Glove 63.8
Shoe 63.8
Fashion 60.8
Person 60.4
Jacket 58.8
Dress 57.9
Photography 57.4
Portrait 57.4
Purse 56.5
Riding Boot 55.7
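
A minimal sketch of how label tags like those above can be retrieved from AWS Rekognition, assuming boto3 is installed with valid AWS credentials and that "photo.jpg" is a hypothetical local copy of the photograph:

```python
# Minimal sketch: label detection with AWS Rekognition (boto3).
# Assumes boto3 is installed and AWS credentials are configured;
# "photo.jpg" is a hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55.0)

# Each label carries a name and a 0-100 confidence, matching entries
# such as "Clothing 100" and "Footwear 99.6" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```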

Clarifai
created on 2023-10-13

people 99.8
street 98.8
wear 98.5
wedding 97.7
group 97.4
dress 97.4
woman 97.1
adult 96.5
monochrome 93.4
man 92
many 92
portrait 91.5
fashion 88.1
several 86.9
model 85.5
group together 85.4
actress 84.9
outfit 83.5
girl 81
two 80.4
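
A hedged sketch of how concept tags like those above might be requested from Clarifai's v2 REST API; the endpoint, model ID, and response shape are assumptions, and the API key and image URL are placeholders:

```python
# Hedged sketch: concept tagging via Clarifai's v2 REST API.
# The endpoint, model ID, and response shape are assumptions based on
# Clarifai's public v2 API; the key and image URL are placeholders.
import requests

CLARIFAI_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder credential
MODEL_ID = "general-image-recognition"          # assumed public "general" model ID
IMAGE_URL = "https://example.org/photo.jpg"     # placeholder image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; scaling by 100 gives scores in
# the same style as "people 99.8" above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```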

Imagga
created on 2023-07-06

dress 41.6
groom 40.8
bride 36.2
wedding 35
couple 34
people 32.4
happy 28.8
adult 28.5
man 28.2
attractive 27.3
love 26.8
fashion 25.6
person 25.2
happiness 24.3
portrait 23.9
smile 22.1
male 22
marriage 21.8
pretty 21.7
smiling 20.3
married 20.1
women 19.8
lady 19.5
lifestyle 18.8
coat 18.5
two 17.8
domestic 17.4
bouquet 17.1
day 15.7
together 14.9
flowers 14.8
gown 14.6
outside 14.5
garment 14.5
sexy 14.5
romantic 14.3
attendant 14.2
outdoors 14.2
clothing 14.1
brunette 13.9
outdoor 13.8
veil 13.7
cute 13.6
outfit 13.6
summer 13.5
joy 13.4
model 13.2
tuxedo 12.8
bridal 12.6
trench coat 12.6
suit 12.4
business 12.1
face 12.1
celebration 12
city 11.6
family 11.6
holding 11.6
human 11.2
expression 11.1
elegance 10.9
cheerful 10.6
talking 10.5
wife 10.4
relationship 10.3
professional 10.2
black 10.1
street 10.1
raincoat 10.1
husband 10
wed 9.8
romance 9.8
handsome 9.8
engagement 9.6
style 9.6
diversity 9.6
looking 9.6
tie 9.5
fashionable 9.5
clothes 9.4
costume 9.3
businesswoman 9.1
park 9.1
posing 8.9
mother 8.8
urban 8.7
ceremony 8.7
building 8.7
standing 8.7
elegant 8.6
life 8.5
casual 8.5
modern 8.4
emotion 8.3
20s 8.2
pose 8.2
new 8.1
success 8
businessman 7.9
hair 7.9
flower 7.7
boutique 7.7
walking 7.6
fur coat 7.6
group 7.3
shop 7.2
holiday 7.2
spring 7.1
work 7.1
jacket 7
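
A hedged sketch of how tags like those above might be requested from Imagga's v2 tagging endpoint; the endpoint and response shape are assumptions, and the credentials and image URL are placeholders:

```python
# Hedged sketch: image tagging via Imagga's v2 REST API.
# The endpoint and response shape are assumptions based on Imagga's
# public API docs; key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"           # placeholder credential
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"     # placeholder credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, matching
# entries such as "dress 41.6" and "groom 40.8" above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```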

Google
created on 2023-07-06

Hairstyle 95.1
Photograph 94.4
White 92.2
Leg 90.9
Black 90.2
Fashion 88.2
Black-and-white 87.9
Style 84.3
Waist 82.4
Fashion design 78.6
People 77.9
Road 77.2
Monochrome 77.1
Formal wear 77
Monochrome photography 75.9
Snapshot 74.3
Flooring 73.8
Event 73.8
Human leg 73.6
Suit 73.2
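
A minimal sketch of label detection with the Google Cloud Vision client library, assuming google-cloud-vision (v2 or later) is installed, application credentials are configured, and "photo.jpg" is a hypothetical local copy of the photograph:

```python
# Minimal sketch: label detection with Google Cloud Vision.
# Assumes google-cloud-vision is installed and application default
# credentials are configured; "photo.jpg" is a hypothetical local path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scaled by 100 they read like "Hairstyle 95.1" above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```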

Microsoft
created on 2023-07-06

dress 98.9
person 98.9
wedding dress 97.3
clothing 95.5
bride 94.7
woman 91.5
black and white 85
wedding 81.7
suit 79.8
smile 77.9
man 71.7
footwear 65.8
people 61.9
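
A hedged sketch of image tagging with the Azure Computer Vision SDK; the endpoint, key, and local path are placeholders, and the result shape is an assumption based on the SDK's tag_image_in_stream operation:

```python
# Hedged sketch: image tagging with the Azure Computer Vision SDK
# (azure-cognitiveservices-vision-computervision). Endpoint, key, and
# local path are placeholders; the result shape is an assumption.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
with open("photo.jpg", "rb") as f:          # hypothetical local image
    result = client.tag_image_in_stream(f)

# Confidences are 0-1; scaled by 100 they read like "dress 98.9" above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```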

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 97.8%
Calm 77.7%
Happy 18.9%
Surprised 6.5%
Fear 6%
Sad 2.3%
Confused 0.8%
Disgusted 0.7%
Angry 0.6%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Calm 91.2%
Surprised 7.3%
Fear 5.9%
Angry 3%
Sad 2.3%
Confused 2%
Disgusted 0.6%
Happy 0.3%
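
A minimal sketch of how age-range, gender, and emotion estimates like the two AWS Rekognition blocks above can be obtained, assuming configured AWS credentials and a hypothetical local file "photo.jpg":

```python
# Minimal sketch: face attributes (age range, gender, emotions) with
# AWS Rekognition. Assumes configured AWS credentials; "photo.jpg" is a
# hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions carry 0-100 confidences, e.g. "Calm 77.7%" above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```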

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
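
A minimal sketch of how per-face likelihood ratings like the Google Vision blocks above can be obtained, assuming configured Google Cloud credentials and a hypothetical local file "photo.jpg":

```python
# Minimal sketch: per-face likelihoods (joy, anger, sorrow, surprise,
# headwear, blur) with Google Cloud Vision face detection. Assumes
# configured credentials; "photo.jpg" is a hypothetical local path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY or VERY_LIKELY,
    # rendered above as "Very unlikely" / "Very likely".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```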

Feature analysis

Amazon

Adult 99.4%
Female 99.4%
Person 99.4%
Woman 99.4%
Handbag 99%
Shoe 99%
Male 98.5%
Man 98.5%
Hat 97%
Bride 96.9%
Coat 88.6%
Glove 63.8%
Jacket 58.8%

Text analysis

Amazon

I
I I Y
O
S
IN
the
Y
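
A minimal sketch of text detection with AWS Rekognition, which returns short fragments like those above when signs in the scene are only partly legible; assumes configured AWS credentials and a hypothetical local file "photo.jpg":

```python
# Minimal sketch: text detection with AWS Rekognition. Assumes configured
# AWS credentials; "photo.jpg" is a hypothetical local image path.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Detections come back as LINE and WORD entries with confidences;
# printing the lines gives fragments like those listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```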

Google

Toin RATER NINY VER NEW YO PAKAW YOUS NEW YORK URNA MÜMUNDO M
Toin
RATER
NINY
VER
NEW
YO
PAKAW
YOUS
YORK
URNA
MÜMUNDO
M
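
A minimal sketch of OCR with Google Cloud Vision text detection, where the first annotation is the full detected string (as in the first line above) and the remaining annotations are individual tokens; assumes configured credentials and a hypothetical local file "photo.jpg":

```python
# Minimal sketch: OCR with Google Cloud Vision text detection. Assumes
# configured credentials; "photo.jpg" is a hypothetical local path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation holds the full detected text; the rest are the
# individual tokens, as in the Google list above.
for annotation in response.text_annotations:
    print(annotation.description)
```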