Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5036

Machine Generated Data

Tags (label followed by model confidence score)

Amazon
created on 2023-10-06

Back 100
Body Part 100
Architecture 100
Building 100
Outdoors 100
Shelter 100
People 99.7
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Person 97.7
Adult 97.7
Male 97.7
Man 97.7
Person 97.2
Person 97
Person 97
Adult 97
Male 97
Man 97
Person 96.9
Person 96.7
Countryside 96.7
Hut 96.7
Nature 96.7
Rural 96.7
Person 95.5
Bus Stop 94.1
Person 93.5
Crowd 92.7
Person 92.1
Person 91.5
Person 91.1
Person 90.4
Person 89.4
Person 86.9
Person 82.4
Face 81.6
Head 81.6
Person 80.7
Person 80.3
Person 79.8
Person 79.6
Person 79.6
Person 75.9
Clothing 73.8
Hat 73.8
Person 73.6
Person 67.1
Person 66.2
Person 58.5
Animal 57.5
Bull 57.5
Mammal 57.5
Photography 55.7
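
Each line above pairs a label with a confidence score, which matches the shape of output from AWS Rekognition's DetectLabels API. As a non-authoritative sketch only, the following shows how such tags could be produced with boto3, assuming AWS credentials are already configured; the image file name is hypothetical.

# Sketch only: generate label/confidence tags like those above.
# Assumes configured AWS credentials; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_cremation_ceremony.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns {"Labels": [{"Name": ..., "Confidence": ...}, ...]}
# with confidences on a 0-100 scale, like the scores listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score shown above is 55.7
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')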

Clarifai
created on 2018-05-10

people 99.8
group 98.4
many 98
adult 96.8
man 95.5
child 94.3
war 93.8
crowd 92.7
group together 92.1
woman 86.5
military 85.6
music 84.5
administration 83.6
wear 81.7
monochrome 81.2
soldier 78.1
skirmish 74.1
recreation 70.1
boy 66.2
street 65.5

Imagga
created on 2023-10-06

landscape 29
truck 28.2
vehicle 25.5
field 25.1
sky 21.7
rural 21.1
garbage truck 20.3
motor vehicle 20.2
machine 18.2
tractor 16.2
transportation 16.1
farm 16.1
grass 15.8
agriculture 15.8
industry 15.4
tree 14.8
old 14.6
industrial 14.5
summer 13.5
farming 13.3
country 13.2
passenger 12.9
cloud 12.9
environment 12.3
crop 12.2
countryside 11.9
wheeled vehicle 11.8
road 11.7
trees 11.6
car 11.4
transport 11
structure 10.9
track 10.8
outdoor 10.7
autumn 10.5
scenic 10.5
harvest 10.3
smoke 10.2
hay 9.9
military uniform 9.9
travel 9.9
farmland 9.7
uniform 9.5
equipment 9.3
city 9.1
fall 9.1
farmer 8.9
forest 8.7
scene 8.7
heavy 8.6
rusty 8.6
land 8.5
wheel 8.5
hill 8.4
grain 8.3
outdoors 8.2
horizon 8.1
meadow 8.1
plant 7.8
machinery 7.8
steam 7.8
military vehicle 7.7
locomotive 7.6
power 7.6
factory 7.5
snow 7.4
yellow 7.3
danger 7.3
dirty 7.2
scenery 7.2
spectator 7.2
feed 7.1
work 7.1
stall 7
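
Every provider block in this section is a flat list of "label score" lines. Below is a minimal, self-contained sketch for parsing such a block into (label, score) pairs and filtering by a threshold; the sample lines are taken from the Imagga list above, and the 20-point cutoff is illustrative.

# Sketch only: parse "label score" lines into (label, score) pairs.
# rpartition splits on the last space, so multi-word labels survive.
raw = """landscape 29
truck 28.2
garbage truck 20.3
military uniform 9.9"""

def parse_tags(block: str) -> list[tuple[str, float]]:
    tags = []
    for line in block.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

high_confidence = [(label, score) for label, score in parse_tags(raw) if score >= 20]
print(high_confidence)  # [('landscape', 29.0), ('truck', 28.2), ('garbage truck', 20.3)]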

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

crowd 1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 65.3%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.4%
Happy 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 22-30
Gender Female, 98.3%
Calm 84.9%
Sad 14.8%
Surprised 6.3%
Fear 5.9%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 21-29
Gender Male, 96.4%
Calm 85.9%
Happy 7.7%
Surprised 6.7%
Fear 6.1%
Sad 2.8%
Angry 1.6%
Confused 0.8%
Disgusted 0.4%

AWS Rekognition

Age 18-24
Gender Male, 90.7%
Calm 56.1%
Happy 40.2%
Surprised 6.8%
Fear 6.2%
Sad 2.4%
Confused 0.4%
Disgusted 0.4%
Angry 0.4%

AWS Rekognition

Age 19-27
Gender Female, 77.6%
Calm 70.3%
Happy 16.4%
Surprised 7.1%
Fear 6.9%
Sad 3.6%
Disgusted 2.8%
Confused 1.4%
Angry 1.3%

AWS Rekognition

Age 20-28
Gender Male, 93.3%
Sad 99.8%
Calm 16.2%
Fear 6.6%
Surprised 6.5%
Angry 6.1%
Disgusted 1.9%
Confused 1.3%
Happy 1.1%

AWS Rekognition

Age 6-14
Gender Female, 89%
Sad 100%
Surprised 6.4%
Fear 6%
Angry 1.5%
Happy 0.8%
Calm 0.5%
Confused 0.5%
Disgusted 0.4%
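
The per-face blocks above (an age range, a gender estimate, and a set of emotion scores) match the shape of AWS Rekognition DetectFaces output when all facial attributes are requested. A minimal sketch under the same assumptions as before (configured credentials, hypothetical file name); note that the emotion confidences are scored independently and need not sum to 100, consistent with the figures above.

# Sketch only: per-face attributes like the blocks above.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_cremation_ceremony.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion types arrive uppercase (e.g. "CALM"); each score is 0-100
    # and the scores are not normalized to sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')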

Feature analysis

Amazon

Person 98.4%
Adult 98.4%
Male 98.4%
Man 98.4%