Human Generated Data

Title

Occupying Wall Street, September 16, 2012

Date

2012

People

Artist: Accra Shepp, American, born 1962

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2019.314.20

Copyright

© Accra Shepp

Machine Generated Data

Tags

Each label below is paired with the generating service's confidence score on a 0-100 scale.

Amazon
created on 2023-07-06

Accessories 99.6
Bag 99.6
Handbag 99.6
Footwear 99.3
Shoe 99.3
Clothing 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Person 99.2
Female 99.2
Teen 99.2
Girl 99.2
Shoe 99.2
Shoe 99.1
Pants 98.8
Jeans 98.8
Handbag 98.3
Female 98.2
Woman 98.2
Adult 98.2
Person 98.2
Shoe 98.2
Hat 97.3
Shoe 97.1
Coat 97.1
Person 96.9
Adult 96
Male 96
Man 96
Person 96
Camera 95.9
Electronics 95.9
Handbag 95.2
Person 95
Shoe 94.9
Person 94.3
Glasses 91.5
Person 91
Shoe 90
Hat 89.4
Person 89.2
Person 88
Shoe 87.6
Glasses 87.5
Shoe 84.7
Face 83.9
Head 83.9
Jeans 83.5
Shoe 81.8
Photography 80.2
Machine 76.5
Wheel 76.5
Shoe 69.2
Person 65.2
Handbag 62.1
Bracelet 58.1
Jewelry 58.1
Stroller 58
Cap 56.8
Overcoat 56.7
Sun Hat 56.7
Transportation 56.6
Vehicle 56.6
Photographer 55.7
Portrait 55.1
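
The label/score pairs above match the output shape of AWS Rekognition's DetectLabels API. Below is a minimal sketch of how such a list could be produced with boto3; the filename and the MinConfidence threshold are illustrative assumptions, not part of this record.

```python
import boto3

# Assumes AWS credentials are already configured in the environment.
client = boto3.client("rekognition")

# Hypothetical filename; the record does not include the image path.
with open("occupying_wall_street.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55.0,  # assumed cutoff; the lowest score above is 55.1
)

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```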

Clarifai
created on 2023-10-13

people 99.7
street 99.4
woman 97.6
group 95.1
adult 95.1
many 94.5
wear 91.9
man 91.4
road 89.6
group together 88.8
administration 87.8
child 87.4
portrait 87.4
police 84.9
war 84.3
crowd 83
monochrome 80.3
face disguise 79.8
boy 79.5
music 78.3
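
Clarifai concepts like these can be requested over its v2 predict REST endpoint. A hedged sketch using the requests library follows; the API key, model id, and image URL are all placeholders.

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder credential
MODEL_ID = "general-image-recognition"       # assumed general-model id
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 value, shown above as percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```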

Imagga
created on 2023-07-06

pedestrian 86.3
people 27.9
man 27.5
person 26.8
male 24.1
helmet 20.1
adult 19.4
women 19
equipment 18.2
men 16.3
two 16.1
soldier 15.6
war 15.5
military 15.4
weapon 15.3
sport 15.1
training 14.8
breastplate 14.2
vehicle 13.8
street 13.8
uniform 13.4
happy 13.2
armor 13.1
outdoor 13
sexy 12.8
armor plate 12.3
smile 12.1
fashion 12.1
safety 12
girls 11.8
active 11.7
portrait 11.6
mask 11.5
machine 11.3
human 11.2
gun 11.2
motor scooter 11.1
protection 10.9
city 10.8
army 10.7
wheeled vehicle 10.7
gym 10.5
urban 10.5
clothing 10.4
strength 10.3
smiling 10.1
transportation 9.9
couple 9.6
shield 9.5
lifestyle 9.4
model 9.3
action 9.3
attractive 9.1
exercise 9.1
warrior 8.8
fight 8.7
jacket 8.7
clothes 8.4
health 8.3
protective covering 8.3
group 8.1
handsome 8
body 8
camouflage 7.9
plate 7.8
black 7.8
bike 7.8
exercising 7.7
pretty 7.7
shoes 7.7
horse 7.6
walking 7.6
power 7.6
strong 7.5
one 7.5
style 7.4
competition 7.3
metal 7.2
fitness 7.2
face 7.1
ride 7.1
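
Imagga exposes its tagger as a REST endpoint authenticated with an API key/secret pair over HTTP basic auth. A minimal sketch, assuming placeholder credentials and a placeholder image URL:

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder
    auth=("api_key", "api_secret"),  # placeholder key/secret pair
    timeout=30,
)
response.raise_for_status()

# Each tag arrives with a 0-100 confidence and a per-language tag name.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```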

Google
created on 2023-07-06

Footwear 98.2
Wheel 96.8
Tire 96.7
Shoe 96
Photograph 94.3
White 92.2
Black 90.1
Human 88.9
Black-and-white 86.7
Gesture 85.3
Style 84.2
Sneakers 80.4
Motor vehicle 79.3
Rolling 78.3
Road 78
Luggage and bags 76.8
Eyewear 76.4
Monochrome photography 75.7
Monochrome 75.5
Bag 74.9
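
Scores like these come from Google Cloud Vision label detection, which reports 0-1 floats (rendered above as percentages). A sketch with the google-cloud-vision client library; the filename is a placeholder and credentials are assumed to be configured via GOOGLE_APPLICATION_CREDENTIALS.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical filename; the record does not include the image path.
with open("occupying_wall_street.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Label scores are 0-1 floats; multiply by 100 to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```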

Microsoft
created on 2023-07-06

person 100
outdoor 99
street 98.5
black and white 97.9
clothing 97.6
monochrome 88.2
footwear 82.3
people 80.6
woman 77.3
standing 76.3
group 65.4
city 61.1
man 53
jeans 52.6
posing 38
crowd 0.8
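
The Microsoft tags correspond to the Azure Computer Vision tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

# tag_image takes a public image URL and returns tag/confidence pairs (0-1).
result = client.tag_image("https://example.com/photo.jpg")  # placeholder URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```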

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 60.5%
Calm 96.9%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 1%
Confused 0.9%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 54-64
Gender Male, 100%
Sad 100%
Calm 10.7%
Surprised 6.3%
Fear 6%
Disgusted 0.3%
Confused 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Female, 99.6%
Calm 89.1%
Surprised 7.4%
Fear 6.2%
Sad 4%
Angry 2.2%
Happy 0.8%
Disgusted 0.6%
Confused 0.2%

AWS Rekognition

Age 0-3
Gender Female, 96.1%
Sad 100%
Fear 7.5%
Surprised 6.3%
Calm 1.1%
Angry 0.5%
Disgusted 0.5%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 52-60
Gender Male, 99.2%
Angry 59.3%
Calm 10.2%
Surprised 8.7%
Happy 8.7%
Fear 7.8%
Sad 4.7%
Confused 4%
Disgusted 3.4%

AWS Rekognition

Age 23-33
Gender Female, 99.1%
Calm 80.6%
Sad 10.7%
Surprised 6.4%
Fear 6%
Disgusted 4.1%
Angry 2%
Happy 0.4%
Confused 0.4%

AWS Rekognition

Age 16-22
Gender Female, 68.6%
Disgusted 54%
Surprised 14.4%
Fear 14.1%
Calm 9.4%
Angry 4.1%
Sad 4%
Happy 1.6%
Confused 1.1%
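
The seven blocks above are per-face results from AWS Rekognition's DetectFaces API with all attributes requested: an estimated age range, a gender guess with its confidence, and a score for each emotion. A sketch follows; the filename is again a placeholder.

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename; the record does not include the image path.
with open("occupying_wall_street.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

# Print one block per detected face, mirroring the layout above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```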

Microsoft Cognitive Services

Age 34
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
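
Google Vision reports face attributes as Likelihood enum values (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the rows above read "Very unlikely" or "Very likely". A sketch with the same client library and placeholder filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical filename; the record does not include the image path.
with open("occupying_wall_street.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries one Likelihood enum per attribute.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```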

Feature analysis

Amazon

Handbag 99.6%
Shoe 99.3%
Man 99.3%
Person 99.3%
Adult 99.3%
Male 99.3%
Female 99.2%
Teen 99.2%
Girl 99.2%
Jeans 98.8%
Woman 98.2%
Hat 97.3%
Coat 97.1%
Camera 95.9%
Glasses 91.5%
Wheel 76.5%
Bracelet 58.1%

Text analysis

Amazon

M

Google

P STATIon NOMMIERATION MIDTEN Nations VARES CA IPMA
P
STATIon
NOMMIERATION
MIDTEN
Nations
VARES
CA
IPMA
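
Both text results are raw OCR output: the lone "M" is Rekognition's only detection, and the Google list shows that API's usual structure, a full transcript first followed by the individual tokens (the noisy strings are typical low-confidence reads of signs in the crowd). A hedged sketch of both calls, with a placeholder filename:

```python
import boto3
from google.cloud import vision

# Hypothetical filename; the record does not include the image path.
with open("occupying_wall_street.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon: word/line detections, each with a type and 0-100 confidence.
rekognition = boto3.client("rekognition")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}")

# Google: the first annotation is the full transcript, the rest are tokens,
# matching the full-string-then-tokens layout above.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```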