Human Generated Data

Title

Disabled musician walking on sidewalk, New York City

Date

1963

People

Artist: Leonard Freed, American, 1929-2006

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 2014.455

Copyright

© Leonard Freed/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Apparel 99.8
Shoe 99.8
Footwear 99.8
Clothing 99.8
Human 99.5
Person 99.5
Person 99.4
Person 99.3
Person 98.8
Person 98.5
Overcoat 94.2
Coat 94.2
Sunglasses 91.6
Accessories 91.6
Accessory 91.6
Person 85.9
Musical Instrument 81.4
Person 79.4
Hat 79.2
Shoe 78.8
Gong 77.2
Shoe 76.1
Suit 70.5
Person 70
Musician 57.1

Clarifai
created on 2018-02-10

people 100
group together 99.4
group 98.8
adult 98.3
many 97.8
wear 97.6
veil 96.8
military 96.7
war 96.3
man 96.2
administration 96.1
uniform 95.3
outfit 95.1
several 94.7
soldier 92.5
police 91.5
leader 90.2
elderly 85.9
four 85.8
lid 84

Imagga
created on 2018-02-10

shield 40.3
helmet 35
armor 34.3
man 29.5
people 23.4
weapon 22
soldier 19.5
male 19.2
person 17.4
clothing 16.8
military 16.4
sword 15.6
war 15.4
men 14.6
covering 13.9
hat 13.9
mask 13.7
army 13.6
uniform 13.4
protection 12.7
warrior 12.7
brass 12.3
gun 12
old 11.8
work 11.8
adult 11.7
safety 11
history 10.7
industrial 10
city 10
religion 9.9
statue 9.8
portrait 9.7
metropolitan 9.5
culture 9.4
industry 9.4
travel 9.1
hand 9.1
danger 9.1
megaphone 9
trombone 8.8
urban 8.7
ancient 8.6
plate 8.6
shop 8.5
equipment 8.3
tourism 8.2
job 8
disaster 7.8
traditional 7.5
sport 7.4
tradition 7.4
historic 7.3
device 7.3
business 7.3
metal 7.2

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 100
outdoor 93.5
people 92.3
group 82.2
standing 78.7
crowd 31.3

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 48-68
Gender Male, 98.7%
Disgusted 8.4%
Sad 2.8%
Calm 18.7%
Surprised 9.6%
Happy 49.8%
Confused 4%
Angry 6.9%

AWS Rekognition

Age 60-90
Gender Male, 82.1%
Disgusted 1.4%
Confused 0.7%
Happy 1%
Angry 1.1%
Calm 92.8%
Surprised 0.8%
Sad 2.1%

AWS Rekognition

Age 30-47
Gender Male, 95%
Disgusted 1.5%
Sad 13.2%
Angry 9.4%
Calm 48.6%
Surprised 13.4%
Happy 1.2%
Confused 12.8%

AWS Rekognition

Age 26-43
Gender Male, 94.6%
Disgusted 6%
Confused 3.7%
Angry 69%
Surprised 1.7%
Happy 1.9%
Sad 14.6%
Calm 3.1%

AWS Rekognition

Age 35-52
Gender Male, 74.1%
Happy 5.8%
Surprised 28%
Angry 14.5%
Confused 9.6%
Disgusted 2.2%
Calm 26.1%
Sad 13.7%

AWS Rekognition

Age 45-63
Gender Male, 54.2%
Disgusted 45.1%
Sad 54.5%
Calm 45.1%
Surprised 45%
Happy 45.1%
Confused 45.1%
Angry 45.1%

AWS Rekognition

Age 20-38
Gender Male, 50.7%
Surprised 45.5%
Happy 45.1%
Confused 45.5%
Angry 46.5%
Disgusted 45.3%
Calm 46.1%
Sad 51%

AWS Rekognition

Age 26-43
Gender Female, 60.4%
Calm 34.4%
Happy 2.5%
Angry 20.5%
Surprised 5.6%
Sad 14.2%
Confused 6.8%
Disgusted 15.9%

Microsoft Cognitive Services

Age 66
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Shoe 99.8%
Person 99.5%
Sunglasses 91.6%
Hat 79.2%