Human Generated Data

Title

Untitled (Borobudur, Java)

Date

January 26–February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5118

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Archaeology 100
Baby 92.7
Person 92.7
Person 91.7
Baby 91.1
Person 91.1
Face 90.7
Head 90.7
Baby 89
Person 89
Baby 88.7
Person 88.7
Art 70.7
Person 70.3
Goa Gajah 69.5
Landmark 69.5
Altar 56.2
Architecture 56.2
Building 56.2
Church 56.2
Prayer 56.2

Clarifai
created on 2018-05-10

people 100
group 99.5
man 98.9
many 98.6
art 98.3
engraving 97.7
furniture 96.4
seat 95.8
adult 95.6
antique 93.8
illustration 93.2
old 92.1
vintage 92.1
leader 91.8
print 91.5
chair 90.9
portrait 88.6
room 87.6
sit 86.4
retro 86.3

Imagga
created on 2023-10-06

brass 95.3
memorial 91.5
structure 61.1
sculpture 45.8
stone 33.9
architecture 31.2
statue 30.9
temple 27.6
religion 26.9
ancient 26.8
history 25.9
art 24.7
old 24.4
culture 23.9
carving 23.4
monument 23.3
travel 19.7
cemetery 17.4
religious 15
famous 14.9
historic 14.7
building 14.5
landmark 14.4
carved 13.7
money 13.6
god 13.4
cash 12.8
antique 12.5
traditional 12.5
bill 12.4
currency 11.7
heritage 11.6
tourism 11.5
spirituality 11.5
figure 11.5
financial 10.7
wall 10.7
decoration 10.5
historical 10.3
dollar 10.2
finance 10.1
banking 10.1
head 10.1
gravestone 10
bank 10
wealth 9.9
holy 9.6
spiritual 9.6
capital 9.5
east 9.3
face 9.2
ornate 9.1
statues 8.9
detail 8.8
savings 8.4
city 8.3
marble 7.9
worship 7.7
dollars 7.7
us 7.7
architectural 7.7
rich 7.4
vintage 7.4
close 7.4
exterior 7.4
investment 7.3
business 7.3
paper 7.1

Google
created on 2018-05-10

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Angry 31.3%
Sad 21.7%
Fear 15.9%
Happy 12.7%
Disgusted 8.4%
Surprised 8.2%
Calm 7.9%
Confused 2%

AWS Rekognition

Age 26-36
Gender Male, 97.7%
Calm 95.7%
Surprised 7.2%
Fear 6%
Sad 2.2%
Confused 1.6%
Happy 0.3%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 74.6%
Sad 10.8%
Surprised 7.3%
Fear 6.5%
Happy 5.3%
Confused 2.4%
Disgusted 1.1%
Angry 1%

AWS Rekognition

Age 26-36
Gender Female, 99.8%
Calm 52.4%
Fear 43.8%
Surprised 6.8%
Sad 3.5%
Confused 3%
Angry 1.1%
Disgusted 0.8%
Happy 0.5%

AWS Rekognition

Age 23-31
Gender Male, 93.7%
Calm 81%
Sad 12.6%
Surprised 6.7%
Fear 6%
Happy 1.4%
Angry 1.3%
Disgusted 0.9%
Confused 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Baby 92.7%
Person 92.7%

Categories

Imagga

paintings art 99.5%