Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26–February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2395

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Hat 100
People 94
Face 93
Head 93
Person 91.3
Adult 91.3
Female 91.3
Woman 91.3
Person 90.7
Baby 90.7
Person 88.2
Adult 88.2
Male 88.2
Man 88.2
Sun Hat 87.2
Person 85.4
Person 84.7
Person 83.5
Person 82.2
Person 78.7
Adult 78.7
Male 78.7
Man 78.7
Person 76.6
Photography 61.2
Portrait 56.5
Crowd 56.5
Person 56.3
Architecture 56.2
Building 56.2
Outdoors 56.2
Shelter 56.2
Market 56
Sombrero 55.6
Animal 55.4
Bull 55.4
Mammal 55.4

Clarifai
created on 2018-05-10

people 99.9
group 99.3
many 98.9
war 96.8
group together 95.8
adult 95.7
administration 94.8
military 94.2
child 93.3
soldier 90.7
man 89.4
skirmish 88.4
wear 87.6
several 87.5
crowd 87.2
vehicle 85.5
injury 83.3
woman 82.8
leader 80.9
monochrome 80.4

Imagga
created on 2023-10-05

hat 27.1
cowboy hat 17.9
people 16.7
man 16.1
person 15.3
headdress 15.2
adult 15
clothing 13.2
portrait 12.9
male 12.1
cowboy 11.6
black 10.8
outdoors 10.4
face 9.9
expression 9.4
farm 8.9
happy 8.8
look 8.8
hair 8.7
ranch 8.6
sitting 8.6
outside 8.6
smile 8.5
two 8.5
tree 8.5
field 8.4
pen 8.2
close 8
love 7.9
sport 7.6
bull 7.6
world 7.6
hand 7.6
horse 7.5
fashion 7.5
human 7.5
city 7.5
animals 7.4
brown 7.4
transport 7.3
covering 7.2
looking 7.2
cattle 7.2
eye 7.1
women 7.1
rural 7
autumn 7

Google
created on 2018-05-10

Face analysis

AWS Rekognition

Age 9-17
Gender Female, 100%
Sad 99.9%
Calm 14.3%
Surprised 6.7%
Fear 6.4%
Confused 4%
Happy 0.6%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 18-24
Gender Female, 100%
Sad 99.9%
Calm 16.5%
Surprised 6.5%
Fear 6.1%
Happy 1.2%
Confused 1%
Angry 0.8%
Disgusted 0.8%

AWS Rekognition

Age 23-33
Gender Female, 99.6%
Sad 99.2%
Surprised 20.2%
Fear 9.7%
Calm 5.1%
Confused 4.8%
Angry 2.6%
Disgusted 1.6%
Happy 1.4%

AWS Rekognition

Age 7-17
Gender Male, 59.9%
Fear 97.9%
Surprised 6.3%
Sad 2.2%
Angry 0.4%
Calm 0.4%
Disgusted 0.1%
Happy 0.1%
Confused 0%

Microsoft Cognitive Services

Age 24
Gender Female

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.3%
Adult 91.3%
Female 91.3%
Woman 91.3%
Baby 90.7%
Male 88.2%
Man 88.2%
