Human Generated Data

Title

Untitled (Borobudur, Java)

Date

January 26, 1960 - February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5110

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Archaeology 100
Baby 96.5
Person 96.5
Altar 95.9
Architecture 95.9
Building 95.9
Church 95.9
Prayer 95.9
Baby 95.8
Person 95.8
Baby 94.4
Person 94.4
Baby 92.9
Person 92.9
Baby 91.4
Person 91.4
Face 89.1
Head 89.1
Baby 86.1
Person 86.1
Baby 81.4
Person 81.4
Person 67.7
Art 60.3
Goa Gajah 58.3
Landmark 58.3
Painting 57.8
Temple 55.4
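
The label list above matches the output format of Amazon Rekognition's DetectLabels operation (a label name plus a confidence score from 0 to 100). A minimal sketch of how such labels can be requested with boto3 follows; the filename, region, and thresholds are illustrative assumptions, not details of the museum's actual pipeline.

```python
# Hypothetical sketch: image labels via Amazon Rekognition DetectLabels.
# Assumes AWS credentials are configured; "borobudur.jpg" stands in for a
# local copy of the photograph and is not a real path from this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("borobudur.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55.0,  # roughly the lowest confidence listed above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```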

Clarifai
created on 2018-05-10

people 100
group 99.3
child 99
many 98
military 98
art 97.1
man 96.1
war 95.7
soldier 94.4
sit 92.7
adult 91.9
antique 91.7
engraving 91.1
memory 90.8
centennial 90.5
wear 90.1
old 90
portrait 88.9
group together 87.8
uniform 87.7

Imagga
created on 2023-10-06

ancient 35.4
sculpture 34.5
stone 31.1
old 29.9
statue 29.8
history 29.5
travel 26.7
temple 26.6
architecture 25.8
monument 24.3
memorial 24
religion 23.3
structure 22.6
art 22.4
culture 22.2
carving 20.4
brass 20.2
tourism 19.8
landmark 16.2
heritage 15.4
famous 13.9
rock 13
historic 12.8
grave 12.6
ruins 12.6
wall 12.6
building 11.8
fountain 11.7
ruin 11.7
traditional 11.6
spirituality 11.5
close 11.4
carved 10.7
army 10.7
antique 10.2
past 9.7
china 9.4
east 9.3
tourist 9.2
decoration 9.1
ruined 8.8
soldier 8.8
god 8.6
historical 8.5
black 8.4
place 8.4
sky 8.3
vintage 8.3
man 8.1
detail 8
civilization 7.8
cemetery 7.8
religious 7.5
city 7.5
landscape 7.4
stall 7.3
aged 7.2
bronze 7.1
marble 7
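
The Imagga scores above (tag word plus confidence) resemble responses from Imagga's public image-tagging REST endpoint. The sketch below is an assumption about that API's shape rather than a documented part of this record; the endpoint path, credentials, and image URL are placeholders.

```python
# Hypothetical sketch: image tags via Imagga's REST tagging endpoint.
# The endpoint path and response shape are assumptions based on Imagga's
# public v2 API; API_KEY, API_SECRET, and IMAGE_URL are placeholders.
import requests

API_KEY = "your-api-key"
API_SECRET = "your-api-secret"
IMAGE_URL = "https://example.org/borobudur.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```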

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 94.5
old 74.9
posing 66.9

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 79.7%
Disgusted 70.3%
Calm 25.9%
Surprised 6.7%
Fear 6%
Sad 2.4%
Angry 1%
Confused 0.6%
Happy 0.2%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Sad 80.7%
Calm 61.3%
Surprised 6.5%
Fear 6%
Angry 0.5%
Disgusted 0.3%
Happy 0.2%
Confused 0.2%

AWS Rekognition

Age 25-35
Gender Male, 97.4%
Happy 23.8%
Calm 20.7%
Angry 19.5%
Surprised 13%
Sad 8.3%
Confused 7.5%
Fear 7.2%
Disgusted 6%

AWS Rekognition

Age 23-33
Gender Male, 53.5%
Sad 96.2%
Confused 27.7%
Calm 15.5%
Surprised 6.8%
Fear 6.2%
Angry 2.9%
Disgusted 1.6%
Happy 1.1%

AWS Rekognition

Age 26-36
Gender Female, 87.8%
Calm 94.9%
Surprised 6.7%
Fear 6.2%
Sad 2.3%
Happy 1.8%
Disgusted 0.5%
Confused 0.5%
Angry 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Sad 100%
Surprised 6.4%
Fear 6.1%
Calm 4.4%
Disgusted 1%
Confused 0.5%
Angry 0.3%
Happy 0.3%

AWS Rekognition

Age 26-36
Gender Male, 80.5%
Sad 92.3%
Calm 32.6%
Disgusted 10.2%
Surprised 9.6%
Fear 6%
Happy 4%
Angry 1.8%
Confused 1.5%
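
The per-face age ranges, gender estimates, and emotion scores above are in the format returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal, hedged sketch follows; the filename and region are illustrative, not taken from this record.

```python
# Hypothetical sketch: face attributes via Amazon Rekognition DetectFaces.
# Attributes=["ALL"] is needed to receive age range, gender, and emotions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("borobudur.jpg", "rb") as f:  # illustrative local filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```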

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
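
The Google Vision entries above use the API's likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) for surprise, anger, sorrow, joy, headwear, and blur. A minimal sketch with the google-cloud-vision client is shown below; credentials setup and the filename are assumed, and enum access may differ slightly between client versions.

```python
# Hypothetical sketch: face likelihoods via the Google Cloud Vision API.
# Assumes application default credentials and the google-cloud-vision
# package; "borobudur.jpg" is an illustrative local filename.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("borobudur.jpg", "rb") as f:
    image = vision.Image(content=f.read())

faces = client.face_detection(image=image).face_annotations

for face in faces:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)
```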

Feature analysis

Amazon

Baby 96.5%
Person 96.5%