Human Generated Data

Title

Women watch demonstrators performing reenactment on sidewalk, Veterans Against War demonstration, Washington DC

Date

1978

People

Artist: Leonard Freed, American, 1929–2006

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Doug and Joan Hansen, 2021.553

Copyright

© Leonard Freed/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2023-01-09

Clothing 100
Path 99.5
Sidewalk 99.5
Person 99.3
Man 99.3
Adult 99.3
Male 99.3
Shoe 99
Footwear 99
Person 98.9
Man 98.9
Adult 98.9
Male 98.9
Handbag 98.8
Bag 98.8
Accessories 98.8
Person 98.6
Man 98.6
Adult 98.6
Male 98.6
Person 98.6
Male 98.6
Child 98.6
Boy 98.6
Person 98.3
Person 98.1
Person 98.1
Man 98.1
Adult 98.1
Male 98.1
Shoe 98
Person 97.7
Person 97.7
Shoe 97.6
Person 97.6
Adult 97.6
Woman 97.6
Female 97.6
Shoe 97.6
Handbag 97.5
Person 97.2
Person 97.1
Glasses 96.6
Shoe 95.9
Shoe 95.7
City 95.7
Shoe 95.6
Shoe 93
Shoe 92.1
Shoe 91.6
Shoe 91.5
Hat 91.4
Shoe 91.2
Coat 90.9
Shoe 90.8
Shoe 87.8
Handbag 84.8
Shoe 84.3
Glasses 82.5
Overcoat 81.9
Wheel 80.7
Machine 80.7
Car 77.9
Vehicle 77.9
Transportation 77.9
Hat 77.6
Street 73.9
Urban 73.9
Road 73.9
Handbag 70.3
Hat 68.8
Glasses 65.9
Hat 65
Car 63.7
Person 60.4
Glasses 60.2
Traffic Light 58.9
Light 58.9
Walking 57.5
Pants 57.1
Person 56.7
Dress 55.7
People 55.5

Clarifai
created on 2023-10-13

people 100
group together 99.3
group 98.8
adult 98
several 97.4
woman 95.8
many 95.1
wear 95.1
child 95.1
administration 94.7
man 94.1
street 94.1
military 94
outfit 93.1
five 93.1
three 93
two 92.9
four 92
soldier 90.1
leader 90

Imagga
created on 2023-01-09

people 27.9
sword 24.8
weapon 22.8
world 22
man 21.5
person 20.5
city 19.9
adult 19.7
sport 18.8
male 16.4
urban 14.9
clothing 14.6
women 14.2
active 14.1
shop 13.9
men 13.7
group 12.9
street 12.9
team 12.5
business 11.5
holiday 11.5
athlete 11.4
performer 11.3
legs 11.3
fashion 11.3
happy 11.3
lifestyle 10.8
player 10.4
portrait 10.4
black 10.3
play 10.3
action 10.2
model 10.1
shopping 10.1
girls 10
dress 9.9
costume 9.9
family 9.8
bags 9.7
interior 9.7
standing 9.6
walking 9.5
child 9.4
tradition 9.2
traditional 9.1
suit 9
gift 8.6
walk 8.6
stand 8.5
bag 8.5
ball 8.5
travel 8.4
clothes 8.4
hand 8.4
mother 8.3
indoor 8.2
dancer 8.2
style 8.2
game 8
celebration 8
happiness 7.8
mall 7.8
motion 7.7
attractive 7.7
winter 7.7
two 7.6
hat 7.6
human 7.5
fun 7.5
leisure 7.5
silhouette 7.4
dance 7.3
window 7.3
competition 7.3
children 7.3
present 7.3
success 7.2
smile 7.1
uniform 7.1

Google
created on 2023-01-09

Microsoft
created on 2023-01-09

person 99.4
outdoor 98.9
clothing 98.6
footwear 96.6
way 80.9
black and white 73.4
sidewalk 71
man 69.2

Color Analysis

Face analysis

AWS Rekognition

Age 60-70
Gender Female, 99.7%
Confused 71.2%
Calm 27.2%
Surprised 6.6%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 71-81
Gender Male, 100%
Calm 80.1%
Confused 18.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.5%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 16-24
Gender Male, 50.2%
Calm 76.4%
Happy 9.7%
Sad 8.1%
Fear 6.6%
Surprised 6.5%
Disgusted 0.9%
Angry 0.7%
Confused 0.3%

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 54.1%
Confused 35.3%
Surprised 6.7%
Fear 6%
Sad 5.5%
Angry 1.8%
Disgusted 0.6%
Happy 0.2%

AWS Rekognition

Age 21-29
Gender Female, 80.5%
Calm 66.8%
Sad 63.7%
Surprised 6.5%
Fear 6%
Angry 1.1%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%

Microsoft Cognitive Services

Age 67
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 60
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Man 99.3%
Adult 99.3%
Male 99.3%
Shoe 99%
Handbag 98.8%
Child 98.6%
Boy 98.6%
Woman 97.6%
Female 97.6%
Glasses 96.6%
Hat 91.4%
Coat 90.9%
Wheel 80.7%
Car 77.9%

Categories

Text analysis

Amazon

RED
VICE
H
TANDINI
SI
ESERVED

Google

NO STANDING 17-9.30 AM A RESERVED 930AM-4PM PARKING BY PENT ONLY
NO
STANDING
17-9.30
AM
A
RESERVED
930AM
-
4PM
PARKING
BY
PENT
ONLY