Human Generated Data

Title

Untitled (Hong Kong)

Date

March 3, 1960-March 13, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5416

Machine Generated Data

Tags

Amazon
created on 2019-04-07 (tag and confidence in %; see the sketch after this list)

Human 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.7
Person 99.4
Market 98.7
Pedestrian 97.4
Clothing 95.4
Apparel 95.4
Bazaar 93.3
Shop 93.3
Shoe 91.8
Footwear 91.8
Shoe 91.4
Shoe 91.2
Person 85.9
Shorts 85.5
Person 81
Shoe 79.9
Shoe 76.2
People 74
Urban 70
Shoe 68.8
Person 68.6
Shoe 64.5
Plant 62
Tree 61.4
Overcoat 60.1
Coat 60.1
Person 59.8
Pants 58.1
Shopping 56.3
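
A minimal sketch of how a label list like the one above could be produced with the AWS Rekognition detect_labels API follows. It assumes the boto3 SDK, configured AWS credentials, and a hypothetical local copy of the photograph named untitled_hong_kong.jpg; it is not the museum's actual tagging pipeline.

```python
# Sketch only: list labels with confidence scores, as in the Amazon tags above.
# Assumptions (not from the record): boto3 credentials are configured and
# untitled_hong_kong.jpg is a local copy of the photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_hong_kong.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the list above bottoms out around 56%
)

# Each label is a name plus a confidence score in percent.
for label in sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True):
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```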

Clarifai
created on 2018-03-23 (tag and confidence in %)

people 100
many 99.8
group 99.8
group together 99.2
adult 98.2
street 94.1
merchant 93.5
man 93.3
vehicle 92.5
wear 92.3
crowd 91
several 90.6
woman 90.4
child 90
war 89.8
military 88.8
administration 85.8
transportation system 82.7
soldier 79.3
outfit 78.6

Imagga
created on 2018-03-23 (tag and confidence in %)

man 25.5
people 22.3
danger 20
person 19.1
male 15.6
dance 14.3
clothing 14.1
soldier 13.7
private 13.3
uniform 13.1
sport 12.9
men 12.9
protection 12.7
adult 12.4
military 11.6
art 11.1
world 10.9
black 10.8
mask 10.6
dark 10
camouflage 9.9
dirty 9.9
outdoor 9.9
destruction 9.8
accident 9.8
toxic 9.8
nuclear 9.7
gas 9.6
risk 9.6
pedestrian 9.5
light 9.4
happy 9.4
safety 9.2
industrial 9.1
active 9
color 8.9
radioactive 8.8
radiation 8.8
army 8.8
protective 8.8
symbol 8.7
chemical 8.7
war 8.7
dangerous 8.6
grunge 8.5
smoke 8.4
human 8.2
park 8.2
outdoors 8.2
performer 8.1
recreation 8.1
stalker 7.9
holiday 7.9
day 7.8
couple 7.8
gun 7.8
cultural 7.8
portrait 7.8
summer 7.7
protect 7.7
culture 7.7
silhouette 7.4
tradition 7.4
group 7.2
success 7.2
businessman 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23 (tag and confidence in %)

person 99.5
outdoor 93.8
old 58.4
crowd 22.7

Color analysis

Face analysis

AWS Rekognition

Age 20-38
Gender Female, 54.8%
Sad 47.6%
Disgusted 45%
Confused 45.1%
Surprised 45%
Calm 52.2%
Angry 45.1%
Happy 45%

AWS Rekognition

Age 26-43
Gender Male, 54.5%
Angry 46.1%
Disgusted 47.3%
Happy 45.2%
Sad 46%
Calm 47.5%
Confused 47.2%
Surprised 45.7%

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Disgusted 50%
Calm 49.6%
Confused 49.5%
Surprised 49.5%
Sad 49.8%
Happy 49.5%
Angry 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.1%
Happy 49.6%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Angry 49.6%
Calm 49.6%
Sad 50.3%

AWS Rekognition

Age 20-38
Gender Female, 50.1%
Calm 49.8%
Confused 49.6%
Surprised 49.6%
Disgusted 49.6%
Happy 49.6%
Angry 49.6%
Sad 49.7%

AWS Rekognition

Age 48-68
Gender Female, 50.1%
Disgusted 49.6%
Confused 49.5%
Calm 50%
Surprised 49.6%
Angry 49.6%
Sad 49.6%
Happy 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Surprised 49.6%
Disgusted 49.6%
Confused 49.5%
Sad 49.9%
Angry 49.9%
Happy 49.5%
Calm 49.6%

AWS Rekognition

Age 14-25
Gender Male, 50.4%
Happy 49.6%
Disgusted 49.5%
Surprised 49.5%
Calm 50%
Angry 49.6%
Sad 49.8%
Confused 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50%
Confused 49.5%
Sad 49.5%
Surprised 49.5%
Calm 49.5%
Happy 49.5%
Disgusted 50.3%
Angry 49.6%

Microsoft Cognitive Services

Age 24
Gender Female
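
The AWS Rekognition blocks above report, for each detected face, an estimated age range, a gender guess with confidence, and emotion scores. A minimal sketch of how such estimates could be requested with the detect_faces API is below, under the same assumptions as the earlier sketch; it is not necessarily how these numbers were generated.

```python
# Sketch only: per-face age range, gender, and emotion estimates, as in the
# AWS Rekognition face blocks above. Same assumptions as the earlier sketch.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_hong_kong.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates to the
# default response of bounding boxes and landmarks.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```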

Feature analysis

Amazon

Person 99.8%
Shoe 91.8%
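
The per-object confidences above (Person, Shoe) match the Instances field that detect_labels returns for labels it can localize with bounding boxes. A short sketch, under the same assumptions as before:

```python
# Sketch only: per-instance confidences for localizable labels such as Person
# and Shoe, as in the feature analysis above. Same assumptions as before.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_hong_kong.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    # Instances is empty for abstract labels; populated ones carry a
    # bounding box and a per-instance confidence in percent.
    for instance in label["Instances"]:
        print(f'{label["Name"]} {instance["Confidence"]:.1f}%', instance["BoundingBox"])
```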

Text analysis

Amazon

RDR
EMANS
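
The two fragments above are raw OCR output, left as recorded. A minimal sketch of how detected text of this kind could be read with the detect_text API follows, under the same assumptions as the earlier sketches.

```python
# Sketch only: read text fragments from the photograph, as in the text
# analysis above. Same assumptions as the earlier sketches.
import boto3

rekognition = boto3.client("rekognition")

with open("untitled_hong_kong.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections group words that appear on the same line; WORD detections
# are individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')
```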