Human Generated Data

Title

Untitled (demonstration, New York City)

Date

c. 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5594

Machine Generated Data

Tags

Amazon
created on 2019-11-11

Human 99.4
Person 99.4
Person 99.4
Person 99.2
Person 99.2
Person 98.4
Person 97.5
Person 97.4
Apparel 91.8
Hat 91.8
Clothing 91.8
Person 91.5
Crowd 86.9
Person 79.8
Helmet 77.1
People 73.8
Military 72.4
Military Uniform 70.2
Police 69.1
Person 62.6
Person 61.7
Pedestrian 58.6
Officer 58.1
Parade 55.3
Fireman 55.2
Person 49.6

Clarifai
created on 2019-11-11

people 99.9
group 99
many 98.7
group together 98.3
adult 98.2
administration 98
man 96.8
military 96.7
wear 95
leader 94
war 94
police 91.6
soldier 91.4
crowd 91.1
uniform 90.6
woman 88.5
veil 87.1
offense 85.8
law 85.7
outfit 85.4

Imagga
created on 2019-11-11

old 21.6
man 20.9
statue 19
people 18.4
religion 17.9
sculpture 17.7
architecture 16.4
monument 15.9
travel 14.8
metropolitan 14.3
room 14.3
person 14.3
historical 14.1
ancient 13.8
art 13.2
male 13
kin 12.2
history 11.6
religious 11.2
home 11.2
classroom 11.2
tourism 10.7
world 10.7
sitting 10.3
life 10.2
historic 10.1
adult 9.9
building 9.9
worker 9.8
men 9.4
culture 9.4
stone 9.3
vintage 9.2
tourist 9.2
business 9.1
human 9
group 8.9
antique 8.8
indoors 8.8
couple 8.7
women 8.7
god 8.6
house 8.3
church 8.3
grandfather 8
looking 8
clothing 8
monk 7.9
pray 7.7
temple 7.7
casual 7.6
happy 7.5
senior 7.5
traditional 7.5
city 7.5
work 7.4
inside 7.4
love 7.1
day 7.1

Google
created on 2019-11-11

Microsoft
created on 2019-11-11

person 98.5
clothing 97.8
text 95.7
outdoor 90.8
man 85.4
human face 85.1
group 75.1
people 72.1
woman 70
old 59.4
crowd 1.5

Face analysis

AWS Rekognition

Age 29-45
Gender Female, 99.6%
Sad 0.1%
Happy 97.8%
Surprised 0.1%
Fear 0%
Angry 0%
Calm 1.9%
Disgusted 0%
Confused 0.1%

AWS Rekognition

Age 22-34
Gender Male, 89.5%
Angry 9.9%
Calm 17.5%
Disgusted 5.7%
Confused 33.4%
Sad 16.2%
Surprised 6%
Fear 10.2%
Happy 1.1%

AWS Rekognition

Age 26-40
Gender Female, 54.7%
Surprised 45.1%
Happy 45%
Fear 45%
Angry 45.5%
Confused 45.2%
Calm 53.6%
Sad 45.6%
Disgusted 45.1%

AWS Rekognition

Age 23-37
Gender Female, 52.6%
Disgusted 45%
Sad 45.3%
Confused 45.1%
Happy 45%
Calm 52.1%
Fear 45%
Angry 47.2%
Surprised 45.2%

AWS Rekognition

Age 22-34
Gender Male, 53.8%
Surprised 45.4%
Happy 45.1%
Disgusted 45.1%
Calm 50.2%
Fear 45.2%
Sad 45.7%
Angry 48.2%
Confused 45.1%

AWS Rekognition

Age 23-35
Gender Female, 53.5%
Sad 45.1%
Surprised 45%
Happy 45.7%
Disgusted 45%
Angry 45%
Confused 45%
Fear 45%
Calm 54.2%

AWS Rekognition

Age 37-55
Gender Male, 52.9%
Angry 45.4%
Calm 50.4%
Happy 45.1%
Fear 45.8%
Surprised 46%
Disgusted 45.6%
Sad 46.4%
Confused 45.3%

AWS Rekognition

Age 29-45
Gender Male, 50.4%
Angry 49.5%
Sad 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 50.4%
Fear 49.5%
Calm 49.5%
Confused 49.5%

AWS Rekognition

Age 24-38
Gender Female, 50.3%
Calm 49.7%
Fear 49.5%
Disgusted 49.7%
Angry 49.8%
Surprised 49.5%
Sad 49.6%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 7-17
Gender Female, 52.2%
Happy 45.3%
Angry 51.2%
Calm 46.4%
Confused 45.3%
Disgusted 45.6%
Sad 45.6%
Fear 45.4%
Surprised 45.3%

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Hat 91.8%
Helmet 77.1%