Human Generated Data

Title

Harvard Tercentenary, 18 September 1936

Date

1937

People

Artist: Waldo Peirce, American, 1884 - 1970

Classification

Paintings

Credit Line

Harvard University Portrait Collection, Gift of Mrs. Bingham in memory of her husband, William J. Bingham, 1981, L107

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 100
Painting 100
Person 95.2
Person 94.4
Person 93.9
Person 87.9
Person 84.5
Person 83.4
Person 81.3
Concert 80.2
Crowd 80.2
Person 74.6
Person 74.6
Altar 74
Architecture 74
Church 74
Prayer 74
Building 72.7
Person 71.3
Crypt 70.4
Person 69.7
Face 64.6
Head 64.6
Indoors 63
Drawing 57.5
Stage 56.7
Theater 56.4
Mural 55.6
Audience 55.3
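
The Amazon tags above are the kind of label/confidence pairs returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such a list could be generated, assuming locally configured AWS credentials, an illustrative region, and a hypothetical local copy of the image (Python, boto3):

    import boto3

    # Assumes AWS credentials are configured locally; the region is illustrative.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local file name for the painting's image.
    with open("harvard_tercentenary_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,        # upper bound on returned labels
        MinConfidence=55.0,  # drop low-confidence labels, similar to the list above
    )

    # Print "Label confidence" pairs, e.g. "Art 100.0", "Person 95.2"
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

The MinConfidence threshold is illustrative; Rekognition only returns labels scoring at or above it, which is why the list above stops in the mid-50s.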

Clarifai
created on 2018-05-10

people 99.4
no person 99.1
art 98.1
group 96.6
architecture 94.3
one 93.8
many 92.8
sculpture 92.4
travel 91.5
religion 91.4
administration 89.5
theater 89.3
column 88.9
adult 87.9
building 86.1
statue 85.3
military 84.2
ancient 84.2
man 83.4
two 82.9

Imagga
created on 2023-10-06

facade 49.2
architecture 47.9
building 39.5
landmark 29.8
stone 25.3
statue 24.5
cathedral 23.5
old 23
church 22.2
column 21.5
history 21.5
monument 20.5
famous 20.5
city 20
ancient 19
sculpture 18.3
tourism 18.1
structure 17.3
religion 17
travel 16.9
historic 15.6
architectural 15.4
arch 13.7
art 12.8
tourist 12.7
memorial 12.3
buildings 12.3
historical 12.2
marble 12
exterior 11.1
columns 10.8
organ 10.7
capital 10.4
window 10.4
wind instrument 10
keyboard instrument 9.9
carved 9.8
balcony 9.8
musical instrument 9.7
university 9.5
sky 8.9
antique 8.9
detail 8.8
decoration 8.8
temple 8.8
attraction 8.6
wall 8.6
united 8.6
religious 8.4
national 8.2
symbol 8.1
light 8
night 8
classical 7.6
vintage 7.4
style 7.4
ornate 7.3
tower 7.2
drawing 7.1
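
The Imagga tags above pair an English keyword with a confidence score. A minimal sketch of how a similar list could be requested, assuming Imagga's v2 /tags endpoint, a publicly reachable image URL, and placeholder API credentials (Python, requests):

    import requests

    # Placeholder credentials and image URL; Imagga's v2 tagging endpoint
    # authenticates with HTTP basic auth (API key and secret).
    API_KEY = "your_imagga_api_key"
    API_SECRET = "your_imagga_api_secret"
    IMAGE_URL = "https://example.org/harvard_tercentenary_1936.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score, e.g. "facade 49.2".
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')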

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

old 70.9
white 64
altar 57.5

Face analysis

Amazon

AWS Rekognition

Age 7-17
Gender Female, 79.1%
Sad 81.9%
Fear 19.2%
Disgusted 18.8%
Calm 11.5%
Surprised 6.8%
Angry 5.3%
Happy 4.6%
Confused 1.4%

AWS Rekognition

Age 13-21
Gender Female, 92%
Calm 36.8%
Angry 21.2%
Confused 10.8%
Sad 9.5%
Fear 8.3%
Surprised 8.3%
Disgusted 8.1%
Happy 3%

AWS Rekognition

Age 19-27
Gender Female, 93.7%
Fear 82.4%
Angry 19.5%
Calm 10.3%
Surprised 6.5%
Sad 3.6%
Happy 0.7%
Disgusted 0.6%
Confused 0.4%

AWS Rekognition

Age 19-27
Gender Female, 60.8%
Fear 59%
Surprised 24.2%
Confused 10.3%
Sad 8%
Disgusted 6.5%
Angry 3.2%
Calm 3.2%
Happy 2.4%

AWS Rekognition

Age 22-30
Gender Female, 84.6%
Angry 86.7%
Calm 11.9%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Disgusted 0.5%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 14-22
Gender Male, 74.4%
Sad 65.8%
Calm 43.9%
Angry 17.9%
Surprised 6.9%
Fear 6.2%
Disgusted 2.6%
Confused 1.2%
Happy 1%

AWS Rekognition

Age 20-28
Gender Male, 82.3%
Calm 73.5%
Fear 11.2%
Sad 7%
Surprised 6.9%
Angry 2.7%
Happy 1.7%
Confused 0.9%
Disgusted 0.9%

AWS Rekognition

Age 22-30
Gender Female, 95.8%
Sad 99.9%
Fear 24.5%
Surprised 6.4%
Angry 0.5%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%
Calm 0.2%

AWS Rekognition

Age 22-30
Gender Male, 80.3%
Angry 62.7%
Fear 13.5%
Sad 11.8%
Surprised 7.5%
Disgusted 3.2%
Calm 3%
Confused 1.5%
Happy 0.9%

AWS Rekognition

Age 19-27
Gender Female, 69.6%
Calm 76.7%
Angry 20.4%
Surprised 7%
Fear 6%
Sad 2.3%
Confused 0.4%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Male, 66%
Sad 99.7%
Calm 15.3%
Fear 7.7%
Surprised 6.9%
Disgusted 5.6%
Confused 2.7%
Angry 2.2%
Happy 1.9%
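
Each block above reports one detected face with an estimated age range, gender, and per-emotion confidences, matching the shape of the AWS Rekognition DetectFaces response. A minimal sketch of how these values could be produced, assuming the same hypothetical local image file and AWS credentials as before (Python, boto3):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local file name for the painting's image.
    with open("harvard_tercentenary_1936.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotion confidences are independent scores, so they need not sum to 100%.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')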

Feature analysis

Amazon

Person 95.2%
Building 72.7%
