Human Generated Data

Title

Untitled (Washington Square North, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American (1898–1969)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.39

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 99.8
Person 99.8
Person 99.6
Person 99.3
Person 99.3
Person 97.1
Person 96.8
Footwear 96.7
Shoe 96.7
Clothing 96.7
Apparel 96.7
Shoe 94.4
Person 93.9
Person 89.3
Building 88.7
Shoe 86.8
Shoe 85.2
Person 84.5
Person 79
Factory 78.2
Person 74.5
People 74
Person 68.4
Person 68.1
Shoe 67.9
Person 65.1
Crowd 59.9
Sitting 59.2
Person 58.3
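
The label/confidence pairs above are the kind of output AWS Rekognition's label-detection API returns. A minimal sketch of such a call, assuming boto3 with configured AWS credentials and a hypothetical local image file (the filename is an assumption, not part of this record):

import boto3

# Hypothetical sketch: send image bytes to Amazon Rekognition and print
# label names with confidence scores, similar to the tag list above.
client = boto3.client("rekognition")

with open("washington_square_north.jpg", "rb") as f:  # filename is an assumption
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # drop low-confidence labels
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")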

Clarifai
created on 2018-03-23

people 100
group 99.8
many 99.7
group together 99.5
adult 97.9
vehicle 97.2
several 96.4
man 96.1
administration 94
transportation system 93.1
wear 90
leader 89.9
child 89.6
two 89.2
woman 88
military 87.2
street 87.2
three 85.8
war 83.6
police 82.1

Imagga
created on 2018-03-23

statue 21.7
city 19.1
sculpture 15.7
art 13.8
architecture 13.4
man 13.4
old 13.2
black 12.6
building 12.2
ancient 12.1
people 11.7
religion 11.6
history 11.6
stone 11.1
culture 11.1
traditional 10.8
vintage 10.7
travel 10.6
person 10.2
mask 10.1
tourism 9.9
holiday 9.3
fountain 9.2
dirty 9
landmark 9
antique 8.9
drawing 8.8
uniform 8.8
male 8.5
monument 8.4
tradition 8.3
street 8.3
historic 8.2
military 7.7
historical 7.5
fun 7.5
clothing 7.4
symbol 7.4
carousel 7.4
decoration 7.4
new 7.3
sketch 7.3
ride 7.2
day 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

outdoor 97.6
person 97.6

Color Analysis

Face analysis

AWS Rekognition

Age 27-44
Gender Female, 51.3%
Disgusted 45.3%
Calm 49.9%
Happy 45.1%
Angry 46%
Sad 48.5%
Confused 45.2%
Surprised 45.1%

AWS Rekognition

Age 35-52
Gender Male, 54.5%
Confused 45.5%
Angry 46.5%
Surprised 45.3%
Calm 47.1%
Happy 46.1%
Disgusted 45.3%
Sad 49.2%

AWS Rekognition

Age 35-52
Gender Female, 52.4%
Surprised 45.3%
Happy 45.2%
Disgusted 45.2%
Calm 45.3%
Sad 53.1%
Angry 45.5%
Confused 45.4%

AWS Rekognition

Age 19-36
Gender Female, 52%
Calm 49.9%
Confused 45.5%
Surprised 45.6%
Disgusted 45.5%
Happy 45.1%
Angry 45.9%
Sad 47.5%
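
The age ranges, gender estimates, and emotion percentages listed above correspond to the face attributes AWS Rekognition reports. A minimal sketch of that call, assuming the same hypothetical image file and configured boto3 credentials:

import boto3

# Hypothetical sketch: request full face attributes (age range, gender,
# emotions) from Amazon Rekognition, matching the fields listed above.
client = boto3.client("rekognition")

with open("washington_square_north.jpg", "rb") as f:  # filename is an assumption
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")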

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely
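
The "Very unlikely" / "Possible" ratings above are Google Cloud Vision's likelihood values for per-face attributes. A minimal sketch of retrieving them, assuming the google-cloud-vision client library and the same hypothetical image file:

from google.cloud import vision

# Hypothetical sketch: run Google Cloud Vision face detection and print the
# likelihood ratings (surprise, anger, sorrow, joy, headwear, blur) per face.
client = vision.ImageAnnotatorClient()

with open("washington_square_north.jpg", "rb") as f:  # filename is an assumption
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)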

Feature analysis

Amazon

Person 99.8%
Shoe 96.7%