Human Generated Data

Title

Untitled (Borobudur, Java)

Date

January 26, 1960 - February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2265

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Building 97.3
Architecture 97.3
Head 96.3
Art 95.4
Statue 95.4
Sculpture 95.4
Brick 92.5
Rock 88.4
Temple 87
Worship 80.9
Shrine 79
Hydrant 74
Fire Hydrant 74
Archaeology 70.3
Indoors 61.6
Face 57.9
Buddha 57.8
Housing 57
Monastery 56.6
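Tag lists like the one above are typically derived from an AWS Rekognition DetectLabels response by filtering on a confidence threshold. The sketch below is hypothetical: the response dictionary mirrors Rekognition's documented output shape, but the helper function, threshold value, and sample entries are illustrative assumptions, not part of this record.

```python
# Hypothetical sketch: turning a Rekognition-style DetectLabels response
# into a "Name score" tag list like the one above. The response shape
# follows Rekognition's documented format; values are illustrative.

def labels_to_tags(response, min_confidence=55.0):
    """Return (name, confidence) pairs above a threshold, highest first."""
    tags = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(tags, key=lambda t: t[1], reverse=True)

sample_response = {
    "Labels": [
        {"Name": "Building", "Confidence": 97.3},
        {"Name": "Statue", "Confidence": 95.4},
        {"Name": "Monastery", "Confidence": 56.6},
        {"Name": "Plant", "Confidence": 40.2},  # dropped by the threshold
    ]
}

for name, score in labels_to_tags(sample_response):
    print(f"{name} {score}")
```

A lower threshold explains entries such as "Monastery 56.6" surviving in the list above while weaker candidates were cut.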

Clarifai
created on 2018-03-23

sculpture 99.4
people 98.6
religion 98.5
art 98.4
statue 98.4
Buddha 97.9
one 97.7
no person 96.7
ancient 96.2
monochrome 95.9
temple 94.5
old 94.2
portrait 93.3
travel 91.8
stone 89
architecture 88.2
spirituality 87.5
two 87.3
adult 86.7
wat 86.1

Imagga
created on 2018-03-23

spider web 37.9
fountain 33.1
structure 29.8
cobweb 29.3
web 25.8
old 23
tree 22.2
landscape 20.8
stone 18.7
snow 18.6
travel 18.3
forest 18.3
park 16.5
trees 15.1
temple 15
water 14.7
dirty 14.5
architecture 14.2
grunge 13.6
winter 13.6
natural 12.7
art 12.6
texture 12.5
autumn 12.3
ancient 12.1
spider 11.8
fall 11.8
river 11.6
outdoor 11.5
ice 11.3
black 10.9
tourism 10.7
ruins 10.7
environment 10.7
textured 10.5
woods 10.5
sun 10.5
rock 10.4
weather 10.4
pattern 10.3
vintage 9.9
mountain 9.8
wall 9.8
cold 9.5
light 9.4
season 9.4
foliage 9.2
branch 9.1
religion 9
outdoors 9
sky 8.9
sculpture 8.8
ruin 8.8
artistic 8.7
trap 8.6
dark 8.4
statue 8.3
window 8.2
backgrounds 8.1
wet 8
close 8
tourist 7.8
color 7.8
stream 7.6
building 7.5
historical 7.5
sunrise 7.5
rough 7.3
fantasy 7.2
colorful 7.2
history 7.2
grass 7.1
cemetery 7.1
surface 7.1
rural 7

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

old 59.5

Face analysis

Amazon

AWS Rekognition

Age 57-77
Gender Male, 69.5%
Surprised 0.3%
Confused 0.3%
Disgusted 0.1%
Calm 94.8%
Angry 0.8%
Sad 3.4%
Happy 0.2%

AWS Rekognition

Age 4-7
Gender Female, 50.1%
Happy 49.6%
Disgusted 49.7%
Angry 49.6%
Surprised 49.5%
Sad 49.7%
Calm 49.9%
Confused 49.5%
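Rekognition's DetectFaces API reports a confidence score per emotion, and a summary usually keeps only the highest-scoring one. When the scores cluster near 50% across every category, as in the second face above, the detection is typically uncertain. The sketch below is a hypothetical illustration: the list format follows Rekognition's documented FaceDetail shape, and the values echo the first face reported above.

```python
# Hypothetical sketch: picking the dominant emotion from a
# Rekognition-style FaceDetail "Emotions" list. Values mirror the
# first face above; the helper is illustrative, not from this record.

def dominant_emotion(emotions):
    """Return the emotion entry with the highest confidence."""
    return max(emotions, key=lambda e: e["Confidence"])

face_emotions = [
    {"Type": "CALM", "Confidence": 94.8},
    {"Type": "SAD", "Confidence": 3.4},
    {"Type": "ANGRY", "Confidence": 0.8},
    {"Type": "SURPRISED", "Confidence": 0.3},
]

top = dominant_emotion(face_emotions)
print(f"{top['Type']} {top['Confidence']}%")
```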

Feature analysis

Captions

Microsoft

a person standing in front of a building 56.6%
an old photo of a person 56.5%
an old photo of a person 53.5%