Human Generated Data

Title

Luxor (panorama)

Date

1890s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Dr. Robert Drapkin, 2.2002.2716

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Architecture 99.6%
Building 99.6%
Person 99.1%
Human 99.1%
Person 98.6%
Temple 98.2%
Worship 98%
Pillar 97.7%
Parthenon 97.7%
Column 97.7%
Shrine 97.7%
Person 87.3%
Ruins 74.3%

Imagga
created on 2022-01-08

picket fence 100%
fence 100%
barrier 95%
obstruction 61.7%
structure 42.1%
architecture 36.2%
sky 31.3%
stone 27.6%
landscape 26.8%
travel 26.1%
history 23.3%
tourism 22.3%
column 21.7%
ancient 20.8%
old 19.5%
building 19.3%
temple 18.2%
landmark 18.1%
memorial 16.3%
statue 16.2%
grass 15.8%
summer 14.8%
tree 14.6%
culture 14.6%
gravestone 14.3%
city 14.2%
famous 14%
ruins 13.7%
coast 13.5%
water 13.4%
clouds 12.7%
scenic 12.3%
beach 11.8%
sea 11.7%
outdoors 11.2%
historic 11%
columns 10.8%
ruin 10.7%
wall 10.5%
sun 10.5%
skyline 10.5%
coastline 10.4%
cloud 10.4%
ocean 10%
scenery 9.9%
roman 9.8%
field 9.2%
park 9.1%
horizon 9%
sunlight 8.9%
forest 8.7%
day 8.6%
classical 8.6%
buildings 8.5%
outdoor 8.4%
shore 8.4%
urban 7.9%
sand 7.9%
archeology 7.9%
marble 7.8%
past 7.8%
destinations 7.7%
garden 7.6%
historical 7.5%
wood 7.5%
hill 7.5%
vacation 7.4%
freedom 7.3%
countryside 7.3%
country 7%

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

sky 90.4%
building 82.9%
colonnade 8.4%

Face analysis

Amazon

AWS Rekognition

Age 16-22
Gender Female, 92%
Calm 97.4%
Confused 0.8%
Sad 0.8%
Happy 0.4%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 99.4%
Happy 56.7%
Sad 19.8%
Calm 16.3%
Disgusted 2.4%
Angry 2.1%
Surprised 1.1%
Confused 0.8%
Fear 0.7%

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a close up of a bridge 51.4%
a close up of an old building 51.3%
a large building 51.2%