Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2399

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 99.9
Person 99.9
Person 99.6
Person 99.6
Person 99.6
Person 99.4
Person 99.4
Person 99.1
Person 98.4
Person 97.3
Construction 96.2
Person 93.9
Dinosaur 93.1
Animal 93.1
Reptile 93.1
Person 92.8
Building 84.1
Bird 83.5
Clothing 74.2
Apparel 74.2
Urban 74
People 70.1
Person 68.4
Scaffolding 60.8
Helmet 55.9
Person 54.7
Person 54.4
Person 45.8

Clarifai
created on 2018-03-23

people 100
group 99.7
group together 99.6
adult 99.5
many 98.3
war 98.1
man 97.9
several 96.2
vehicle 96.1
military 95.7
skirmish 94.3
wear 94
transportation system 93.9
soldier 93.5
railway 92.8
child 92.7
woman 91.3
weapon 91.2
four 91
administration 89.7

Imagga
created on 2018-03-23

shopping cart 49.8
handcart 39.9
wheeled vehicle 30.4
building 21.4
container 19.7
man 16.1
old 16
structure 15.4
sky 15.3
people 15
conveyance 14.4
vintage 14
grunge 13.6
person 12.9
construction 12.8
black 12.6
dark 12.5
outdoor 12.2
light 12
industry 11.9
industrial 11.8
work 11.3
men 11.2
house 10.9
crosspiece 10.3
wall 10.3
architecture 10.1
device 10.1
danger 10
city 10
silhouette 9.9
landscape 9.7
water 9.3
art 9.1
life 9
destruction 8.8
urban 8.7
helmet 8.7
brace 8.5
travel 8.4
factory 8.4
protection 8.2
dirty 8.1
metal 8
steel 7.9
fence 7.8
disaster 7.8
antique 7.8
explosion 7.7
concrete 7.6
outdoors 7.5
sunset 7.2
history 7.1
male 7.1
night 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

outdoor 98.4
person 98.2
group 80.6
people 75.2
old 46.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Confused 49.5%
Angry 49.6%
Surprised 49.6%
Calm 50%
Happy 49.6%
Disgusted 49.7%
Sad 49.6%

AWS Rekognition

Age 23-38
Gender Female, 50.1%
Sad 49.7%
Disgusted 49.6%
Surprised 49.5%
Calm 49.5%
Angry 50.1%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 45-66
Gender Female, 50.1%
Happy 49.5%
Disgusted 49.5%
Calm 50.4%
Surprised 49.5%
Angry 49.5%
Sad 49.5%
Confused 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Sad 49.9%
Confused 49.5%
Happy 49.6%
Disgusted 49.6%
Angry 49.6%
Surprised 49.6%
Calm 49.7%

AWS Rekognition

Age 35-52
Gender Male, 50.4%
Sad 49.6%
Surprised 49.6%
Confused 49.6%
Happy 49.5%
Calm 50%
Disgusted 49.5%
Angry 49.6%

AWS Rekognition

Age 23-38
Gender Male, 50.1%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 50.5%
Calm 49.5%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 60-90
Gender Female, 50.3%
Happy 49.5%
Confused 49.5%
Sad 49.9%
Angry 49.6%
Calm 49.6%
Disgusted 49.7%
Surprised 49.5%

Feature analysis

Amazon

Person 99.9%
Dinosaur 93.1%
Bird 83.5%