Human Generated Data

Title

In Front of Ramases Palace, Thebes (panorama)

Date

1890s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Dr. Robert Drapkin, 2.2002.2712

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-08

Soil 99.4
Person 98.9
Human 98.9
Person 98.2
Person 97.8
Person 97.3
Archaeology 97.1
Person 94.9
Person 94.7
Person 94.3
Person 92.4
Person 87.8
Painting 87.6
Art 87.6
Person 82
Person 81.2
Building 65.5
People 61.7
Person 60.8
Bunker 57.9
Crypt 56.8
Architecture 55.6
Clothing 55.1
Apparel 55.1
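
The values above appear to be Amazon Rekognition label-detection confidence scores in percent. A minimal sketch of how such tags could be reproduced with boto3, assuming default AWS credentials; the region, file name, and thresholds are illustrative assumptions, not part of the museum record:

```python
import boto3

# Hypothetical reproduction of the label run above; region, file name,
# and thresholds are assumptions, not taken from the record.
client = boto3.client("rekognition", region_name="us-east-1")

with open("2.2002.2712.jpg", "rb") as f:  # assumed local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # the list above has 24 entries
        MinConfidence=55.0,  # lowest score above is 55.1
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```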

Clarifai
created on 2023-10-25

people 99.8
adult 97.9
man 97.8
woman 95.5
wear 95.2
child 95.1
group 94.8
art 92.8
sepia 91.2
two 90.9
cavalry 86.8
boy 86.4
group together 86
many 85.9
transportation system 83.9
sepia pigment 83.7
military 82.2
soldier 80.9
desert 80.6
travel 78.3
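
Clarifai's scores also read as percentages, presumably from its general image-recognition model. A hedged sketch against what I understand to be Clarifai's v2 predict REST endpoint; the model ID, API key handling, and image URL are all assumptions:

```python
import requests

# Assumptions: the default general model ID, an app-scoped API key in
# CLARIFAI_KEY, and a public URL for the image. Concept values come back
# in the 0-1 range and are scaled to percent to match the list above.
MODEL_ID = "general-image-recognition"
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key CLARIFAI_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```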

Imagga
created on 2022-01-08

old 48.8
grunge 48.6
wall 45.6
ancient 39.8
antique 38.4
texture 37.5
aged 36.2
vintage 35.6
graffito 32.1
grave 31.7
grungy 28.5
pattern 24.6
dirty 24.4
retro 23.8
stucco 23.6
decoration 23.1
paper 22.8
textured 22.8
empty 22.4
architecture 21.9
material 21.5
stone 21.1
rough 21
brown 19.9
frame 19.2
damaged 19.1
blank 18.9
worn 18.2
structure 18.1
weathered 18.1
decay 17.4
surface 16.8
historic 16.5
design 16.3
parchment 16.3
page 15.8
brick 15.2
paint 14.5
border 14.5
stained 14.4
aging 14.4
building 14.3
art 14.2
canvas 14.2
artistic 13.9
rock 13.9
detail 13.7
torn 13.6
stain 13.5
concrete 13.4
travel 13.4
decorative 13.4
space 13.2
wallpaper 13
construction 12
stains 11.7
burnt 11.7
history 11.6
textures 11.4
backgrounds 11.4
document 11.1
color 11.1
materials 10.8
sand 10.7
crumpled 10.7
dirt 10.5
memorial 10.2
house 10
backdrop 9.9
manuscript 9.8
facade 9.7
building material 9.7
cardboard 9.6
obsolete 9.6
old fashioned 9.5
culture 9.4
cement 9.4
grain 9.2
dark 9.2
grime 8.8
text 8.7
spot 8.6
yellow 8.6
monument 8.4
gray 8.1
relic 7.9
ragged 7.8
fracture 7.8
sepia 7.8
crack 7.8
your 7.7
brass 7.7
sheet 7.5
drawing 7.5
landscape 7.4
tourism 7.4
exterior 7.4
letter 7.3
landmark 7.2
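
Imagga returns similar tag/confidence pairs from its v2 tagging endpoint. A minimal sketch assuming API-key/secret basic auth and a public image URL (all placeholders):

```python
import requests

# Placeholders: substitute real Imagga credentials and an actual image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```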

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.1
person 92.7
text 84.9
clothing 74.7
group 65.9
man 58.7
old 56.2
several 14.6
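
The Microsoft tags most likely come from Azure Computer Vision's Tag operation, though the record does not say so. A sketch using the azure-cognitiveservices-vision-computervision SDK, assuming an existing Azure resource; the endpoint and key are placeholders, and the SDK's 0-1 confidences are scaled here to match the percentages above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders for a real Azure Computer Vision resource and image URL.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_CV_KEY"),
)
analysis = client.tag_image("https://example.org/image.jpg")
for tag in analysis.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```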

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 87.9%
Calm 82.7%
Sad 8.2%
Happy 4.2%
Disgusted 1.5%
Angry 1.1%
Fear 1%
Confused 0.8%
Surprised 0.4%

AWS Rekognition

Age 30-40
Gender Male, 99.1%
Calm 60.4%
Sad 17.1%
Disgusted 11.1%
Happy 4.6%
Fear 3.2%
Confused 1.3%
Angry 1.3%
Surprised 0.8%

AWS Rekognition

Age 16-24
Gender Female, 60.7%
Calm 85.4%
Disgusted 4.5%
Happy 2.9%
Fear 2.3%
Surprised 1.5%
Confused 1.3%
Angry 1%
Sad 1%

AWS Rekognition

Age 22-30
Gender Female, 92.2%
Calm 96%
Surprised 1.2%
Happy 0.9%
Disgusted 0.6%
Confused 0.5%
Sad 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 21-29
Gender Male, 78%
Calm 82.8%
Sad 8%
Disgusted 2%
Angry 1.6%
Surprised 1.5%
Fear 1.5%
Confused 1.3%
Happy 1.3%
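
Each block above corresponds to one detected face: an estimated age range, a gender guess with its confidence, and per-emotion scores that sum to roughly 100%. A sketch of how these fields could be pulled from Rekognition's face API with boto3 (region and file name are assumptions, as before):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("2.2002.2712.jpg", "rb") as f:  # assumed local copy of the image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 23, "High": 31}
    gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 87.9}
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emo in emotions:
        print(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
```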

Feature analysis

Amazon

Person 98.9%
Painting 87.6%
