Human Generated Data

Title

Untitled (Borobudur, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5087

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Archaeology 100
Person 97.3
Art 95.7
Person 94.8
Baby 94.8
Person 93.8
Baby 93.8
Face 91.1
Head 91.1
Painting 80.1
Person 73.1
Person 58.5
Architecture 57.9
Building 57.9
Wall 57.9
Sculpture 57.3
Statue 57.3
Monastery 55.9

Clarifai
created on 2018-05-10

people 100
group 99
man 98.9
many 98.1
soldier 97.2
military 96.6
war 95.8
adult 95.3
group together 95.2
engraving 93.9
print 91.9
child 91.9
art 88.4
antique 87.6
army 86.7
leader 85.4
vintage 82.4
woman 81.5
several 81.2
old 81.2

Imagga
created on 2023-10-05

sculpture 26.5
old 25.8
statue 24.7
antique 21
ancient 20.7
snow 20.6
history 19.7
art 18.8
vintage 18.2
sketch 17.8
stone 17.4
architecture 17.2
culture 16.2
drawing 15.4
grunge 15.3
monument 14.9
bench 14.4
sky 14
people 13.9
travel 13.4
park bench 13.3
building 12.8
religion 12.5
city 12.5
winter 11.9
tourism 11.5
tree 11.5
outdoor 11.5
representation 11.4
decoration 11.2
landscape 11.2
landmark 10.8
sepia 10.7
retro 10.6
texture 10.4
famous 10.2
structure 9.9
outdoors 9.7
textured 9.6
grave 9.6
man 9.4
historic 9.2
park 9.1
seat 9
color 8.9
marble 8.9
temple 8.7
brick 8.7
sand 8.6
historical 8.5
person 8.4
figure 8.3
kin 8.3
aged 8.1
design 8.1
cemetery 8
season 7.8
empty 7.7
old fashioned 7.6
memorial 7.6
house 7.5
frame 7.5
brown 7.4
wall 7.3
detail 7.2
black 7.2
trees 7.1
paper 7.1
grandfather 7
scenic 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 88.4
black 66.3
old 59.7
posing 45.6
stone 5.5

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 100%
Sad 79.2%
Calm 36%
Happy 17.1%
Surprised 6.8%
Fear 6.2%
Angry 4.3%
Disgusted 3.8%
Confused 0.6%

AWS Rekognition

Age 26-36
Gender Male, 100%
Sad 99.9%
Disgusted 9.6%
Fear 7.4%
Surprised 6.7%
Calm 5.8%
Confused 3.8%
Angry 3%
Happy 0.4%

AWS Rekognition

Age 25-35
Gender Female, 88.1%
Calm 51%
Angry 11.2%
Disgusted 10.5%
Surprised 9.8%
Fear 7.2%
Confused 7%
Sad 6.9%
Happy 3%

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.3%
Baby 94.8%

Categories

Imagga

paintings art 93.9%
people portraits 4.3%