Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26–February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2400

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Each tag below is paired with the service's confidence score on a 0–100 scale.

Amazon
created on 2019-04-07

Person 99.8
Human 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.6
Person 99.6
Person 99.2
Person 98.8
Person 97.1
Crowd 93.9
Person 93.3
Apparel 87.8
Clothing 87.8
Person 83.9
Audience 81.8
People 81.1
Person 74.3
Person 67
Parade 60
Text 59
Plant 57.6
Festival 55.3

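The label tags above resemble the output of the AWS Rekognition DetectLabels API. A minimal sketch of how such tag/confidence pairs could be produced with boto3 follows; the region and local file name are assumptions for illustration, not taken from this record.

import boto3

# Assumed region and file name; purely illustrative.
client = boto3.client("rekognition", region_name="us-east-1")

with open("cremation_ceremony_java.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55,  # tags below roughly this score are not listed above
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "tag  score" pairs shown in this section.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
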
Clarifai
created on 2018-03-23

people 100
many 99.9
group 99.7
group together 99.1
adult 98.6
crowd 98.1
war 97.3
military 97.1
administration 96.7
man 95.3
soldier 93.1
leader 93.1
child 91.2
woman 88.6
wear 88.2
skirmish 88
vehicle 87.6
ceremony 82.8
boy 82.2
spectator 81.7

Imagga
created on 2018-03-23

man 21.5
people 19.5
person 16.6
male 15.2
silhouette 14.1
old 13.2
world 13.1
outdoor 13
city 12.5
walking 12.3
black 12.1
adult 12.1
travel 12
protection 11.8
sunset 11.7
men 11.2
soldier 10.7
building 10.5
industrial 10
group 9.7
military 9.7
summer 9.6
walk 9.5
statue 9.5
spectator 9.1
life 8.6
clothing 8.5
street 8.3
holding 8.2
musical instrument 8.2
weapon 8.2
danger 8.2
dirty 8.1
mask 8.1
child 8.1
religion 8.1
water 8
art 7.9
destruction 7.8
portrait 7.8
nuclear 7.8
sky 7.7
outdoors 7.7
sculpture 7.6
sport 7.6
two 7.6
landscape 7.4
tourism 7.4
historic 7.3
history 7.2
sunlight 7.1
grass 7.1
sea 7

Google
created on 2018-03-23

(no tags recorded for this service)

Microsoft
created on 2018-03-23

outdoor 99.9
sky 99.5
person 98.6
tree 97.6
group 91.1
people 81.3
old 79.4
crowd 42

Color Analysis

(no color data captured in this record)

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Calm 49.7%
Disgusted 49.6%
Angry 49.9%
Happy 49.5%
Confused 49.6%
Sad 49.7%
Surprised 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Confused 49.6%
Surprised 49.6%
Happy 49.6%
Angry 49.6%
Disgusted 49.6%
Sad 50%
Calm 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Calm 49.8%
Angry 49.5%
Happy 49.5%
Disgusted 49.6%
Sad 50.1%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 29-45
Gender Male, 50.3%
Sad 49.6%
Calm 49.6%
Confused 49.5%
Angry 49.9%
Happy 49.6%
Disgusted 49.7%
Surprised 49.5%

AWS Rekognition

Age 30-47
Gender Female, 50.2%
Calm 49.5%
Disgusted 49.5%
Angry 49.6%
Happy 49.6%
Confused 49.5%
Surprised 49.5%
Sad 50.2%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Surprised 49.6%
Disgusted 49.6%
Confused 49.6%
Sad 49.6%
Angry 49.8%
Happy 49.6%
Calm 49.8%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Angry 49.6%
Surprised 49.5%
Happy 49.6%
Sad 49.6%
Calm 49.8%
Disgusted 49.9%
Confused 49.5%

AWS Rekognition

Age 35-52
Gender Male, 50.2%
Happy 49.8%
Confused 49.5%
Angry 49.6%
Disgusted 49.6%
Sad 49.7%
Calm 49.7%
Surprised 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Sad 50.4%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Calm 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Surprised 49.6%
Sad 49.6%
Happy 49.5%
Disgusted 49.7%
Angry 49.8%
Calm 49.8%
Confused 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Angry 49.6%
Calm 50.1%
Happy 49.6%
Disgusted 49.6%
Surprised 49.6%
Sad 49.6%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Angry 49.6%
Disgusted 49.5%
Happy 49.6%
Sad 49.8%
Calm 49.9%
Confused 49.5%
Surprised 49.6%

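The per-face age ranges, gender estimates, and emotion scores above are the kind of output returned by the AWS Rekognition DetectFaces API. A minimal sketch with boto3 follows; the region and file name are assumptions for illustration.

import boto3

# Assumed region and file name; purely illustrative.
client = boto3.client("rekognition", region_name="us-east-1")

with open("cremation_ceremony_java.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# Each detected face reports an age range, a gender guess with a
# confidence score, and a score per emotion, as listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
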
Feature analysis

Amazon

Person 99.8%