Human Generated Data

Title

Untitled (Caryatid Porch, Erechtheum, Acropolis)

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2445

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Architecture 99
Building 99
Temple 98.9
Worship 97.9
Shrine 97.3
Person 95.9
Human 95.9
Pillar 94.4
Parthenon 94.4
Column 94.4
Person 94.4
Person 91.4
Ruins 86.3
Person 81.4
Person 72.2
Person 62
Person 59.5

Clarifai
created on 2023-10-25

people 99.5
temple 98
archaeology 96.9
Parthenon 96.7
art 96.3
acropolis 96.1
ancient 95.7
ruin 95.2
print 94.4
group 94.1
no person 93.1
two 92.3
adult 92.1
building 91.9
monochrome 91.1
old 86.7
engraving 86.2
religion 86.2
theater 85.5
column 83.9

Imagga
created on 2021-12-15

column 100
arch 64
architecture 48.5
statue 45.5
triumphal arch 40.8
memorial 39.6
monument 37.4
history 36.7
ancient 36.4
travel 34.6
landmark 34.4
stone 34
structure 30.4
famous 29.8
tourism 29.8
temple 29.3
old 27.2
city 25.8
historic 24.8
building 24.7
culture 19.7
marble 19.4
place 18.7
historical 17.9
ruin 17.5
roman 16.8
tourist 16.5
sky 16
ruins 15.6
columns 14.7
arc 13.8
buildings 13.3
town 13
construction 12.9
landscape 12.7
sculpture 12.4
past 11.6
architectural 11.6
classical 11.5
archeology 10.8
park 10.7
gate 10.7
vacation 10.7
antique 10.5
night 9.8
england 9.5
day 9.4
clouds 9.3
facade 9.2
religion 9
acropolis 8.9
church 8.3
traditional 8.3
street 8.3
civilization 7.9
rock 7.8
wall 7.7
attraction 7.7
capital 7.6
outdoors 7.5
ornate 7.3
road 7.2
scenic 7
urban 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

old 86.6
building 80.6
black and white 75.3
white 68.5
sketch 65.4
text 63.3
drawing 59.2
ruin 41.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 9-19
Gender Male, 65.7%
Calm 82.6%
Fear 5.1%
Surprised 3.1%
Happy 2.9%
Sad 2.5%
Disgusted 1.9%
Confused 1%
Angry 0.9%

AWS Rekognition

Age 22-34
Gender Male, 94.2%
Sad 52.7%
Calm 23%
Happy 12.2%
Angry 5.4%
Disgusted 3.8%
Confused 1.4%
Surprised 0.8%
Fear 0.8%

AWS Rekognition

Age 25-39
Gender Female, 77.9%
Sad 75.2%
Calm 19.7%
Confused 3.4%
Angry 0.6%
Happy 0.5%
Disgusted 0.3%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 38-56
Gender Female, 62.2%
Calm 43.5%
Happy 34.1%
Sad 9.4%
Fear 7.4%
Angry 2.8%
Surprised 2.1%
Confused 0.4%
Disgusted 0.2%

AWS Rekognition

Age 50-68
Gender Female, 74.9%
Calm 39.5%
Happy 37.4%
Sad 13.3%
Disgusted 6.5%
Confused 1.3%
Angry 1%
Fear 0.7%
Surprised 0.4%

Feature analysis

Amazon

Person 95.9%

Categories

Captions

Text analysis

Amazon

17
Times
Coryaltus Times
Coryaltus

Google

17
17