Human Generated Data

Title

Untitled (east entrance gate, Angkor Thom, Cambodia)

Date

February 28, 1960 - March 1, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5301

Machine Generated Data

Tags (each tag is followed by its confidence score, 0-100)

Amazon
created on 2023-10-05

Archaeology 100
Architecture 100
Building 100
Monastery 100
Cross 99.9
Symbol 99.9
Person 98
Art 91.1
Tomb 65.3
Outdoors 62.9
Rock 57.1
Plant 56.2
Tree 56.2
Gravestone 56.2
Wall 56.1
Sculpture 56.1
Statue 56.1
Slate 56
Vegetation 55.8
Painting 55.6
Ruins 55.6
Bunker 55.2
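
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. As an illustration only (the museum's actual pipeline is not documented here, and the file name and threshold below are assumptions), a minimal boto3 sketch that yields tags in this shape:

    # Illustrative sketch: label/confidence pairs as returned by Amazon
    # Rekognition DetectLabels. File name and MinConfidence are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("shahn_angkor_thom.jpg", "rb") as image_file:  # placeholder file name
        response = rekognition.detect_labels(
            Image={"Bytes": image_file.read()},
            MaxLabels=25,
            MinConfidence=55,  # roughly the lowest score shown above
        )

    for label in response["Labels"]:
        print(f'{label["Name"]:<15} {label["Confidence"]:.1f}')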

Clarifai
created on 2018-05-10

people 99.8
adult 98.9
one 97.3
wear 97.2
man 95.9
art 93.3
woman 92.5
group 91.1
two 89.6
portrait 88.1
painting 85.1
window 84
picture frame 83.6
facial expression 81.8
old 81.3
vintage 81.2
furniture 77.6
desktop 77.3
retro 76.8
music 74.5
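
The concepts above date from 2018 and resemble the output of Clarifai's general recognition model. A rough sketch using the legacy Clarifai 2.x Python client that was current at the time (an assumption; the API key and image URL are placeholders, and this is not necessarily how the record was produced):

    # Illustrative sketch: concept/confidence pairs from Clarifai's general
    # model via the legacy 2.x Python client. Key and URL are placeholders.
    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_CLARIFAI_API_KEY")
    model = app.public_models.general_model

    response = model.predict_by_url(url="https://example.org/placeholder.jpg")
    for concept in response["outputs"][0]["data"]["concepts"]:
        # Clarifai returns values in 0-1; scale to match the 0-100 scores above
        print(f'{concept["name"]:<20} {concept["value"] * 100:.1f}')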

Imagga
created on 2023-10-05

hair 17.4
person 16.6
attractive 15.4
vessel 14.7
pretty 14.7
outdoor 14.5
portrait 14.2
adult 14.2
man 13.4
people 13.4
wall 13.3
sexy 12.8
toilet tissue 12.4
lifestyle 12.3
seat 11.7
black 11.4
face 11.4
happy 11.3
outdoors 11.2
sitting 11.2
support 11.1
chair 10.9
lonely 10.6
eyes 10.3
tree 10.1
alone 10
dirty 9.9
dress 9.9
park 9.9
tissue 9.9
landscape 9.7
summer 9.6
shovel 9.6
device 9.6
cute 9.3
relaxation 9.2
wood 9.2
child 9.1
old 9.1
paper 9
women 8.7
sad 8.7
water 8.7
male 8.6
smile 8.5
casual 8.5
skin 8.5
nice 8.2
body 8
art 7.8
bench 7.7
snow 7.6
relax 7.6
fashion 7.5
human 7.5
one 7.5
bathtub 7.3
lady 7.3
gorgeous 7.2
rest 7.2
building 7.1
clothing 7.1
love 7.1
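
Imagga exposes its auto-tagging as a REST endpoint, so tag/confidence pairs in the shape listed above can be fetched with a single authenticated request. A minimal sketch, assuming the public /v2/tags endpoint and placeholder credentials:

    # Illustrative sketch: Imagga /v2/tags returns tag/confidence pairs like
    # those above. Credentials and image URL are placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/placeholder.jpg"},
        auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
    )
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]:<20} {entry["confidence"]:.1f}')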

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Female, 99.7%
Fear 76.8%
Calm 16.6%
Surprised 10.5%
Happy 5.5%
Angry 4%
Sad 3.7%
Disgusted 3.2%
Confused 0.9%
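
The age range, gender, and emotion scores above correspond to the face attributes reported by Amazon Rekognition's DetectFaces operation. A minimal boto3 sketch (the file name is a placeholder, and this is not necessarily the pipeline used for this record):

    # Illustrative sketch: face attributes (age range, gender, emotions) as
    # returned by Amazon Rekognition DetectFaces. File name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("shahn_angkor_thom.jpg", "rb") as image_file:
        response = rekognition.detect_faces(
            Image={"Bytes": image_file.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')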

Feature analysis

Amazon

Person 98%

Captions

Microsoft
created on 2018-05-10

an old photo of a person 52.7%
a person sitting in front of a window 28.6%
a dirty window 28.5%
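
The captions with confidence scores above match the output of Microsoft's Computer Vision "describe image" operation. A hedged sketch using the azure-cognitiveservices-vision-computervision client (endpoint, key, and image URL are placeholders; the 2018-era service this record came from may have been called differently):

    # Illustrative sketch: image captions with confidences from Microsoft's
    # Computer Vision describe-image API. Endpoint, key, and URL are placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_COMPUTER_VISION_KEY"),
    )

    result = client.describe_image("https://example.org/placeholder.jpg", max_candidates=3)
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")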