Human Generated Data

Title

Untitled (Artists' Union demonstration?, New York City)

Date

1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5590

Machine Generated Data

Tags

Amazon
created on 2019-11-11

Person 99.5
Human 99.5
Person 98.7
Person 97.6
Person 94.8
Hat 93.1
Apparel 93.1
Clothing 93.1
Person 90.1
Person 88.5
Person 80.2
Scarecrow 64.8
Home Decor 57.7
Outdoors 56.1
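The Amazon labels above carry confidence scores, and repeated labels (e.g. "Person" for each detected figure) are common. A minimal sketch of post-processing these scores — the 90.0 threshold is an illustrative choice, not part of the record:

```python
# Amazon label predictions copied from the list above, as (name, confidence) pairs.
labels = [
    ("Person", 99.5), ("Human", 99.5), ("Person", 98.7), ("Person", 97.6),
    ("Person", 94.8), ("Hat", 93.1), ("Apparel", 93.1), ("Clothing", 93.1),
    ("Person", 90.1), ("Person", 88.5), ("Person", 80.2),
    ("Scarecrow", 64.8), ("Home Decor", 57.7), ("Outdoors", 56.1),
]

def confident_labels(labels, threshold=90.0):
    """Return unique label names at or above the confidence threshold,
    preserving the original (descending-confidence) order."""
    seen = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(confident_labels(labels))  # → ['Person', 'Human', 'Hat', 'Apparel', 'Clothing']
```

Lower-confidence labels such as "Scarecrow" (64.8) are typical vision-model noise and drop out at any reasonable threshold.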

Clarifai
created on 2019-11-11

people 99.7
street 97.2
adult 96.7
one 95.8
monochrome 94.4
group 94.3
man 92.8
woman 92.7
two 91.4
group together 90.2
music 88.9
three 85.1
wear 85
theater 83.8
outfit 83.3
vehicle 79.9
administration 79.1
city 79
portrait 77.3
child 76.5

Imagga
created on 2019-11-11

statue 70.3
column 54.5
sculpture 52
architecture 39.4
building 28.3
monument 27.1
art 26.5
stone 26.4
ancient 25.1
old 23
religion 21.5
history 20.6
famous 20.5
historic 20.2
travel 19
marble 18.6
city 18.3
culture 18
tourism 17.4
carving 16.7
temple 16.6
landmark 16.3
facade 15.9
structure 15.4
religious 15
fountain 14.2
historical 14.1
antique 13.9
exterior 12.9
figure 12.1
god 11.5
church 11.1
palace 10.8
memorial 10.1
carved 9.8
heritage 9.7
detail 9.7
buildings 9.5
tourist 9.4
ruler 8.9
baroque 8.8
architectural 8.7
cathedral 8.6
place 8.4
man 8.1
roman 7.8
catholic 7.8
bronze 7.7
spirituality 7.7
traditional 7.5
symbol 7.4
water 7.3
university 7.3
face 7.1
wall 7

Google
created on 2019-11-11

Microsoft
created on 2019-11-11

text 99.2
weapon 84
statue 59.5
person 59.1
clothing 55.2
man 51.1

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 53.4%
Angry 45.2%
Sad 45.3%
Happy 45.5%
Disgusted 45.8%
Fear 45%
Calm 52.6%
Confused 45.2%
Surprised 45.4%
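Rekognition scores each face against eight emotions; in the block above, most scores sit near a flat 45% baseline and only Calm stands out. A minimal sketch of reading off the dominant emotion, using the scores listed for this face:

```python
# Emotion confidences for the face above (AWS Rekognition, copied verbatim).
emotions = {
    "Angry": 45.2, "Sad": 45.3, "Happy": 45.5, "Disgusted": 45.8,
    "Fear": 45.0, "Calm": 52.6, "Confused": 45.2, "Surprised": 45.4,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

name, score = dominant_emotion(emotions)
print(name, score)  # → Calm 52.6
```

The narrow spread between Calm (52.6%) and the rest means the "dominant" call is weak evidence, which is typical for small faces in a crowd photograph.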

AWS Rekognition

Age 22-34
Gender Male, 54.9%
Fear 45%
Confused 45%
Calm 55%
Angry 45%
Happy 45%
Surprised 45%
Sad 45%
Disgusted 45%

AWS Rekognition

Age 33-49
Gender Male, 54.7%
Fear 45%
Calm 54.9%
Happy 45%
Sad 45.1%
Surprised 45%
Disgusted 45%
Angry 45%
Confused 45%

AWS Rekognition

Age 24-38
Gender Male, 53.8%
Disgusted 45%
Calm 54.5%
Surprised 45%
Fear 45%
Angry 45.1%
Confused 45%
Sad 45.3%
Happy 45.1%

AWS Rekognition

Age 33-49
Gender Male, 96.6%
Confused 1.4%
Fear 0.6%
Surprised 0.6%
Happy 19.3%
Calm 57.9%
Angry 1.2%
Sad 17.6%
Disgusted 1.3%

AWS Rekognition

Age 13-25
Gender Female, 50.1%
Angry 49.6%
Fear 49.5%
Disgusted 49.6%
Calm 50%
Sad 49.6%
Confused 49.6%
Happy 49.6%
Surprised 49.5%

AWS Rekognition

Age 28-44
Gender Male, 50.4%
Surprised 49.5%
Calm 50.3%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.6%
Fear 49.5%
Confused 49.5%

AWS Rekognition

Age 13-23
Gender Male, 50.2%
Fear 49.5%
Disgusted 49.5%
Sad 49.5%
Angry 50.3%
Happy 49.5%
Calm 49.6%
Surprised 49.5%
Confused 49.5%

Microsoft Cognitive Services

Age 15
Gender Male

Microsoft Cognitive Services

Age 25
Gender Male

Microsoft Cognitive Services

Age 22
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
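Unlike Rekognition's percentages, Google Vision reports likelihood buckets ("Very unlikely" through "Very likely"; the API itself uses enum values such as VERY_UNLIKELY). A sketch of comparing the three faces above by mapping buckets to ordinal ranks — the integer mapping is an illustrative convention, not part of the API record:

```python
# Likelihood buckets in order from least to most likely.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def likelihood_rank(label):
    """Map a likelihood bucket to an integer rank (0 = Very unlikely)."""
    return LIKELIHOOD.index(label)

# Headwear detections for the three Google Vision faces above.
headwear = ["Very likely", "Very likely", "Very unlikely"]
print([likelihood_rank(h) for h in headwear])  # → [4, 4, 0]
```

Ordinal ranks make the buckets sortable and comparable across faces without pretending the model produced calibrated probabilities.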

Feature analysis

Amazon

Person 99.5%
Hat 93.1%

Captions

Microsoft
created on 2019-11-11

a statue of a person 63.8%
a woman sitting on a statue of a man 39.7%
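Microsoft's captioning returns ranked candidate sentences; a record like this typically keeps only the top-scoring one. A minimal sketch of that selection, using the two candidates above:

```python
# Microsoft caption candidates with confidences (copied from above).
captions = [
    ("a statue of a person", 63.8),
    ("a woman sitting on a statue of a man", 39.7),
]

# Keep the highest-confidence candidate as the display caption.
best = max(captions, key=lambda c: c[1])
print(best[0])  # → a statue of a person
```

Both candidates misread the photograph (a demonstration, not a statue), a useful reminder that the machine-generated fields are unverified model output, not curatorial description.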