Human Generated Data

Title

14th St. (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2837

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.4
Human 99.4
Clothing 99.3
Hat 99.3
Apparel 99.3
Person 99.1
Person 98.8
Person 98.4
Person 96
Face 93
Person 89.9
Person 89.1
Pedestrian 88.7
People 79.3
Crowd 76.1
Coat 71.4
Overcoat 71.4
Suit 65.8
Police 58.1
Cap 57.5
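Each tag above pairs a label with a confidence score (a percentage). A minimal sketch, assuming the tags are stored as plain "label score" lines like those listed here, that parses them into structured (label, confidence) records; the function name is hypothetical, not part of any vendor API:

```python
# Parse "label confidence" lines into (label, confidence) tuples.
# Labels may contain spaces ("group together"), so only the last
# whitespace-separated token is treated as the score.

def parse_tags(lines):
    """Return (label, confidence) tuples, highest confidence first."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, score = line.rsplit(None, 1)  # last token is the score
        tags.append((label, float(score)))
    return sorted(tags, key=lambda t: t[1], reverse=True)

sample = [
    "people 99.9",
    "group together 97.7",
    "transportation system 83.6",
]
print(parse_tags(sample))
# → [('people', 99.9), ('group together', 97.7), ('transportation system', 83.6)]
```

Splitting from the right keeps multi-word labels intact, which a naive `line.split()` would break apart.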

Clarifai
created on 2018-03-16

people 99.9
group 99.1
many 98.4
group together 97.7
adult 97.6
administration 96.3
man 95.1
street 94.7
vehicle 94.2
leader 93.4
police 92.2
several 90.3
wear 88.7
woman 88.4
veil 87.7
crowd 85.3
transportation system 83.6
outfit 83.3
war 82.7
two 81.8

Imagga
created on 2018-03-16

man 38.9
male 32.6
hat 32.5
person 25.8
people 24.5
old 23.7
men 21.4
portrait 16.8
adult 14.9
clothing 14.5
black 14.4
grandfather 14.1
worker 13.7
city 13.3
looking 12.8
face 12.8
cowboy hat 12.8
one 12.7
senior 12.2
guy 12.1
industry 11.9
vintage 11.9
building 11.8
work 11.8
job 11.5
occupation 11
architecture 10.9
headdress 10.7
posing 10.7
street 10.1
hand 9.9
shop 9.8
outdoors 9.7
statue 9.5
culture 9.4
beard 9.1
industrial 9.1
passenger 9
history 8.9
factory 8.8
lifestyle 8.7
business 8.5
attractive 8.4
uniform 8.3
leisure 8.3
covering 8.3
world 8.3
equipment 8.2
retro 8.2
religion 8.1
engineer 8
love 7.9
couple 7.8
antique 7.8
mysterious 7.8
construction 7.7
two 7.6
painter 7.6
helmet 7.6
head 7.6
dark 7.5
mature 7.4
style 7.4
safety 7.4
suit 7.3
time 7.3
pose 7.2

Google
created on 2018-03-16

Microsoft
created on 2018-03-16

person 99.7
outdoor 86.7
old 81.7
black 80.6
posing 80.3
people 70.5
white 67.1
dressed 33.4
crowd 0.9

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 88.3%
Calm 66.1%
Confused 2.7%
Surprised 2.2%
Disgusted 6.3%
Happy 1.1%
Angry 9.5%
Sad 12.2%

AWS Rekognition

Age 48-68
Gender Male, 98.3%
Calm 92.1%
Surprised 1.9%
Angry 1.2%
Sad 1.3%
Disgusted 0.7%
Confused 1.5%
Happy 1.3%

AWS Rekognition

Age 16-27
Gender Male, 53.8%
Sad 46.3%
Surprised 45.5%
Disgusted 46.4%
Confused 46.4%
Calm 47.4%
Happy 45.6%
Angry 47.4%

AWS Rekognition

Age 60-90
Gender Female, 52.7%
Disgusted 45.6%
Sad 47.8%
Calm 45.9%
Surprised 45.8%
Happy 45.4%
Confused 45.6%
Angry 48.9%

AWS Rekognition

Age 27-44
Gender Female, 50.3%
Confused 49.6%
Sad 49.8%
Calm 49.8%
Happy 49.5%
Surprised 49.5%
Angry 49.7%
Disgusted 49.6%

AWS Rekognition

Age 20-38
Gender Female, 53.5%
Calm 46.9%
Surprised 45.5%
Happy 47.5%
Sad 47.2%
Disgusted 45.6%
Confused 45.8%
Angry 46.5%
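Each AWS Rekognition record above lists a confidence per emotion; the reported emotion for a face is simply the one with the highest score. A minimal sketch (a hypothetical helper, not the Rekognition API itself) using the first face record above:

```python
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Emotion confidences from the first AWS Rekognition face record above.
face1 = {"Calm": 66.1, "Confused": 2.7, "Surprised": 2.2,
         "Disgusted": 6.3, "Happy": 1.1, "Angry": 9.5, "Sad": 12.2}
print(dominant_emotion(face1))  # → ('Calm', 66.1)
```

Note that in several of the later records the scores cluster tightly (e.g. all near 45-50%), so the dominant emotion there carries little signal.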

Microsoft Cognitive Services

Age 72
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Hat 99.3%