Human Generated Data

Title

Untitled (cremation ceremony, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5037

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Adult 98.5
Female 98.5
Person 98.5
Woman 98.5
Adult 98.4
Person 98.4
Male 98.4
Man 98.4
Person 98.3
Adult 98.2
Person 98.2
Male 98.2
Man 98.2
Person 98.1
Person 98
Adult 97.9
Person 97.9
Male 97.9
Man 97.9
Adult 97.7
Person 97.7
Male 97.7
Man 97.7
Art 96.4
Person 94.2
Person 91.5
Painting 89.8
Person 85.4
People 84.2
Plant 81.9
Tree 81.9
Adult 80.3
Female 80.3
Person 80.3
Woman 80.3
Bride 80.3
Wedding 80.3
Footwear 74.5
Shoe 74.5
Outdoors 69.6
Head 69.5
Adult 67.9
Female 67.9
Person 67.9
Woman 67.9
Bride 67.9
Face 67.2
Person 67.2
Shorts 64.9
Amusement Park 57.9
Drawing 57.8
Fun 57
Theme Park 57
Hat 56.1
Nature 55.5
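Labels such as Person and Adult repeat above at slightly different scores; in Amazon Rekognition label output, each repeat typically corresponds to a separate detected instance of that label in the image. A minimal sketch, in plain Python with a hypothetical subset of the pairs listed above, collapsing the list to the highest confidence per label:

```python
# Collapse repeated (label, confidence) detections to the top score per label.
# The sample pairs below are a hypothetical subset of the Amazon tags above.
detections = [
    ("Person", 98.5), ("Person", 98.4), ("Person", 98.3),
    ("Adult", 98.5), ("Adult", 98.4),
    ("Art", 96.4),
]

best = {}
for label, conf in detections:
    # Keep only the strongest detection seen for each label.
    best[label] = max(conf, best.get(label, 0.0))

# Print labels sorted by descending confidence, matching the list format above.
for label, conf in sorted(best.items(), key=lambda kv: -kv[1]):
    print(f"{label} {conf}")
```

This reduces the per-instance list to one line per distinct label while preserving the highest score, which is how a deduplicated tag summary for this record could be produced.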

Clarifai
created on 2018-05-10

people 100
group 99.6
group together 99.2
many 98.7
adult 96.8
man 96.2
administration 95.2
child 94.3
woman 92.5
military 92.1
war 90.5
crowd 90.3
vehicle 86.3
campsite 86
soldier 84.6
several 81.6
music 81.5
leader 81.4
furniture 80.3
recreation 79

Imagga
created on 2023-10-06

loom 32.5
cart 30.2
musical instrument 29.5
shopping 29.3
textile machine 26.1
shopping cart 25.8
device 24.6
machine 23.9
percussion instrument 22.9
marimba 21
shop 20.8
buy 20.6
basket 16.8
supermarket 16.8
handcart 16.1
trolley 15.8
stringed instrument 15.2
sale 14.8
retail 14.2
market 14.2
people 13.4
store 13.2
person 13
work 12.9
man 12.8
adult 12.3
container 12.3
metal 12.1
wheeled vehicle 12
sitting 12
male 11.3
outdoors 11.2
old 10.4
happy 10
wicker 9.8
business 9.7
portrait 9.7
buying 9.6
purchase 9.6
push 9.5
water 9.3
grocery 8.7
lifestyle 8.7
trade 8.6
empty 8.6
two 8.5
outdoor 8.4
pretty 8.4
object 8.1
family 8
pushcart 7.9
product 7.6
wheel 7.5
cheerful 7.3
smiling 7.2
park 7.1
women 7.1
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

old 96.4
group 85.1
people 75
posing 67.1
vintage 49.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 92.6%
Calm 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%
Confused 0%

AWS Rekognition

Age 13-21
Gender Female, 98.9%
Calm 66.7%
Surprised 10.8%
Sad 8.2%
Fear 6.7%
Angry 6%
Confused 5.9%
Happy 1.3%
Disgusted 1.2%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Calm 56%
Surprised 40.6%
Angry 11.2%
Fear 6.1%
Sad 3.5%
Disgusted 1.1%
Happy 0.4%
Confused 0.2%

AWS Rekognition

Age 18-24
Gender Male, 79%
Sad 99.6%
Calm 20.2%
Surprised 7.4%
Fear 6%
Confused 5.4%
Happy 2.8%
Angry 2.5%
Disgusted 1.6%

AWS Rekognition

Age 21-29
Gender Male, 99.6%
Calm 58%
Sad 18.1%
Surprised 14.5%
Angry 7.3%
Fear 6.2%
Happy 5.1%
Disgusted 1%
Confused 0.8%

AWS Rekognition

Age 18-24
Gender Male, 92%
Calm 68.1%
Disgusted 9.2%
Surprised 7.1%
Fear 7%
Angry 6.6%
Happy 6.2%
Sad 4.3%
Confused 0.7%

AWS Rekognition

Age 13-21
Gender Female, 84.1%
Fear 52.1%
Surprised 43%
Disgusted 7.4%
Angry 6.4%
Sad 6.3%
Happy 3.9%
Calm 2.3%
Confused 1.5%

AWS Rekognition

Age 7-17
Gender Female, 62%
Surprised 40.4%
Fear 34.4%
Sad 10.5%
Confused 8.8%
Calm 8.7%
Happy 6%
Angry 3.3%
Disgusted 2.7%

Feature analysis

Amazon

Adult 98.5%
Female 98.5%
Person 98.5%
Woman 98.5%
Male 98.4%
Man 98.4%
Bride 80.3%
Shoe 74.5%
Shorts 64.9%
Hat 56.1%