Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2403

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label followed by a 0-100 confidence score)

Amazon
created on 2019-04-07

Human 99.9
Person 99.9
Person 99.5
Person 99.2
Person 99.2
Person 98.9
Person 98.7
Person 98.1
Person 98
Person 97.6
Person 96.9
Person 94.7
Person 94.7
Person 94.1
Person 92.7
Person 91.9
Apparel 86.8
Clothing 86.8
Person 79.6
Person 76.1
Person 74.1
Market 74
Person 73.2
Bazaar 68.9
Shop 68.9
People 68.3
Theme Park 64.6
Amusement Park 64.6
Tree 62.1
Plant 62.1
Outdoors 61.5
Person 61.4
Crowd 56
Person 46.8
Person 42.3
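
The label-and-score pairs above have the shape of AWS Rekognition DetectLabels output, which reports confidence on a 0-100 scale. Below is a minimal sketch of such a call using boto3; the filename and the MinConfidence threshold are illustrative assumptions, not values recorded with this image.

import boto3

# Minimal sketch: label detection with AWS Rekognition (boto3).
# "photo.jpg" and MinConfidence=40 are placeholder assumptions.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,
    )

for label in response["Labels"]:
    # Confidence is reported on a 0-100 scale, as in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")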

Clarifai
created on 2018-03-23

people 100
group 99.2
many 99
adult 98.3
group together 97.7
man 96.1
woman 94.1
child 93.5
crowd 87.7
music 87.3
recreation 86.1
several 84.3
wear 82.5
musician 82
home 81.7
administration 80.5
tent 79.6
family 78.9
vehicle 76
military 74.8
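
Clarifai's general concept model returns values in the 0-1 range; the list above renders them as percentages. A sketch of a v2 REST prediction follows, assuming Clarifai's public "general" model; the API key and image URL are placeholders.

import requests

# Sketch of a Clarifai v2 prediction request. The key and image URL are
# placeholders; aaa03c23b3724a16a56b629203edc62c is the id of Clarifai's
# public "general" concept model.
resp = requests.post(
    "https://api.clarifai.com/v2/models/aaa03c23b3724a16a56b629203edc62c/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores are 0-1; scale to match the percentages above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")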

Imagga
created on 2018-03-23

stage 74.2
platform 55.3
parasol 35.8
canvas tent 19.7
outdoor 17.6
culture 15.4
park 14.6
religion 14.3
travel 14.1
traditional 13.3
architecture 12.5
umbrella 12.5
person 12.3
people 12.3
tree 11.5
sky 11.5
temple 11.4
night 10.6
outdoors 10.4
old 10.4
religious 10.3
holiday 10
light 10
color 10
seller 9.8
building 9.7
summer 9.6
water 9.3
forest 8.7
ancient 8.6
tent 8.3
tradition 8.3
city 8.3
tourism 8.2
vacation 8.2
lady 8.1
recreation 8.1
art 7.8
tropical 7.7
house 7.5
dark 7.5
leisure 7.5
landscape 7.4
sun 7.2
celebration 7.2
colorful 7.2
history 7.2
trees 7.1
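
Imagga's tagging endpoint yields tag/confidence pairs like those above. The sketch below assumes the v2 /tags endpoint with HTTP basic authentication; the credentials and image URL are placeholders.

import requests

# Sketch of an Imagga v2 tagging request; credentials and the image URL
# are placeholder assumptions.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")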

Google
created on 2018-03-23

(no tags recorded)

Microsoft
created on 2018-03-23

tree 100
outdoor 100
person 99.2
people 92
group 83.6
old 72.8
white 61.3
crowd 59.9
dressed 36.9
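
The Microsoft tags above match the Tags feature of the Azure Computer Vision analyze endpoint, which returns confidences in 0-1 (shown above as percentages). In the sketch below, the region, subscription key, API version, and image URL are illustrative assumptions.

import requests

# Sketch of an Azure Computer Vision "analyze" request for image tags.
# Region, key, API version, and image URL are placeholder assumptions.
resp = requests.post(
    "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_VISION_KEY",
        "Content-Type": "application/json",
    },
    json={"url": "https://example.com/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")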

Color Analysis

(no color data recorded)

Face analysis

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Calm 53.5%
Disgusted 45.7%
Sad 45.3%
Angry 45.3%
Confused 45.1%
Surprised 45.1%
Happy 45.1%

AWS Rekognition

Age 26-43
Gender Female, 54.2%
Disgusted 45.1%
Angry 48.1%
Calm 50.6%
Sad 45.7%
Happy 45.1%
Confused 45.2%
Surprised 45.3%

AWS Rekognition

Age 17-27
Gender Male, 54.6%
Angry 45.1%
Calm 47.2%
Happy 45%
Disgusted 45%
Surprised 45.1%
Sad 52.3%
Confused 45.3%

AWS Rekognition

Age 20-38
Gender Male, 50.3%
Sad 49.6%
Surprised 49.5%
Calm 49.6%
Happy 49.5%
Angry 49.7%
Disgusted 50.1%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Sad 49.7%
Calm 50.1%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%

AWS Rekognition

Age 12-22
Gender Female, 50.2%
Disgusted 49.7%
Happy 49.6%
Angry 49.5%
Surprised 49.6%
Sad 50%
Calm 49.5%
Confused 49.6%

AWS Rekognition

Age 19-36
Gender Female, 50.2%
Angry 49.6%
Disgusted 49.5%
Calm 49.5%
Happy 49.5%
Sad 50.2%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Happy 49.5%
Angry 49.7%
Sad 50.1%
Disgusted 49.5%

AWS Rekognition

Age 20-38
Gender Male, 50.2%
Sad 49.8%
Disgusted 49.7%
Surprised 49.6%
Confused 49.6%
Angry 49.6%
Happy 49.6%
Calm 49.7%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Surprised 45.3%
Happy 45.1%
Disgusted 45.1%
Calm 48%
Sad 51%
Angry 45.2%
Confused 45.3%

AWS Rekognition

Age 35-52
Gender Male, 50.1%
Sad 49.8%
Angry 49.7%
Happy 49.5%
Calm 49.7%
Surprised 49.5%
Disgusted 49.7%
Confused 49.6%

AWS Rekognition

Age 27-44
Gender Female, 50.4%
Sad 50.1%
Calm 49.6%
Angry 49.6%
Surprised 49.5%
Confused 49.5%
Happy 49.6%
Disgusted 49.6%

AWS Rekognition

Age 29-45
Gender Female, 50.4%
Disgusted 49.5%
Calm 49.5%
Happy 49.5%
Angry 49.5%
Sad 50.4%
Confused 49.5%
Surprised 49.5%
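
Each "AWS Rekognition" block above describes one detected face: an estimated age range, a gender guess with its confidence, and a per-emotion confidence score. This is the shape of Rekognition's DetectFaces output when all facial attributes are requested; the sketch below assumes boto3 and a placeholder filename.

import boto3

# Minimal sketch: per-face attributes with AWS Rekognition DetectFaces.
# "photo.jpg" is a placeholder filename.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back upper-case, e.g. "CALM", "SAD".
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")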

Microsoft Cognitive Services

Age 18
Gender Female
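
The single age/gender estimate above is the kind of result the Azure Face API's detect operation returns when age and gender attributes are requested. In the sketch below, the region, key, and image URL are placeholders.

import requests

# Sketch of an Azure Face API detect request for age and gender.
# Region, key, and image URL are placeholder assumptions.
resp = requests.post(
    "https://westus.api.cognitive.microsoft.com/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_FACE_KEY",
        "Content-Type": "application/json",
    },
    json={"url": "https://example.com/photo.jpg"},
)
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")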

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
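
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which matches the wording above. A minimal sketch with the google-cloud-vision client follows; the filename is a placeholder.

from google.cloud import vision

# Minimal sketch: face detection with the google-cloud-vision client.
# "photo.jpg" is a placeholder filename.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)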

Feature analysis

Amazon

Person 99.9%