Human Generated Data

Title

Untitled (studio portrait of two well-dressed older women)

Date

1910s-1920s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3883

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Military 99.4
Human 99.2
Person 99.2
Military Uniform 99.2
Person 98.3
Officer 88.7
Armored 88.4
Army 88.4
People 84.1
Soldier 81.7
Person 55.3
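
These tags follow the output shape of Amazon Rekognition's DetectLabels operation (label name plus a confidence score). Below is a minimal sketch of how comparable tags could be generated with boto3; the file path and minimum-confidence threshold are assumptions, not values recorded with this object.

```python
# Sketch: Rekognition-style labels for a local image file.
# Assumes configured AWS credentials; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed threshold; the museum pipeline's setting is not recorded
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')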

Clarifai
created on 2019-11-10

people 99.7
wear 95.7
man 95
retro 94.5
adult 93.8
child 92.9
portrait 92
military 91.5
two 90.2
art 89.4
war 89.3
woman 88.9
monochrome 88.6
group 88.3
outfit 88.3
uniform 87.6
vintage 85.6
soldier 84.6
one 84.6
music 83
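
Clarifai tags of this form (concept name plus confidence) are typically produced by its general-concepts model. The sketch below uses the legacy clarifai 2.x Python client that was current around the 2019 date stamped above; the API key and file path are placeholders, and the client and model choice are assumptions.

```python
# Sketch: Clarifai-style concept tags via the legacy clarifai 2.x client (assumption).
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")   # placeholder key
model = app.public_models.general_model     # general concepts model

response = model.predict_by_filename("photo.jpg")  # placeholder path
for concept in response["outputs"][0]["data"]["concepts"]:
    # Clarifai reports confidence on a 0-1 scale; scale to match the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```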

Imagga
created on 2019-11-10

statue 56.3
memorial 40.6
cemetery 39.9
sculpture 39.4
stone 39.1
gravestone 38.5
structure 29.3
architecture 27.4
art 24.1
monument 23.4
ancient 23.4
marble 21.3
building 20.6
history 20.6
culture 19.7
old 19.5
travel 19
city 18.3
tourism 18.2
fountain 17.9
historic 16.5
landmark 16.3
religion 16.1
historical 14.1
antique 13.9
god 13.4
kin 12.5
palace 11.6
detail 11.3
decoration 11
tourist 10.9
man 10.8
roman 10.7
famous 10.2
traditional 10
column 9.8
portrait 9.7
heritage 9.7
sky 9.6
religious 9.4
face 9.2
outdoor 9.2
figure 8.2
carving 8
mythology 7.9
people 7.8
male 7.8
catholic 7.8
museum 7.8
golden 7.7
spirituality 7.7
old fashioned 7.6
capital 7.6
world 7.6
head 7.6
vintage 7.5
church 7.4
dress 7.2
romantic 7.1
day 7.1
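
Imagga exposes its tagger as a REST endpoint. A hedged sketch using the publicly documented v2 tagging URL is below; the credentials and file path are placeholders, and the endpoint and response shape should be checked against Imagga's current documentation.

```python
# Sketch: Imagga-style tags via its REST tagging endpoint (assumed v2 URL).
import requests

resp = requests.post(
    "https://api.imagga.com/v2/tags",
    auth=("API_KEY", "API_SECRET"),          # placeholder credentials
    files={"image": open("photo.jpg", "rb")},  # placeholder path
)
for item in resp.json()["result"]["tags"]:
    # Imagga reports confidence on a 0-100 scale, as in the list above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```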

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

clothing 98.1
person 97.7
text 96.3
man 85.4

Color Analysis

Face analysis

AWS Rekognition

Age 51-69
Gender Female, 54.7%
Surprised 45%
Disgusted 45%
Fear 45%
Sad 45%
Calm 54.9%
Confused 45%
Angry 45%
Happy 45%

AWS Rekognition

Age 32-48
Gender Female, 52.9%
Fear 45%
Disgusted 45%
Angry 45%
Sad 45%
Happy 45%
Surprised 45%
Confused 45%
Calm 55%
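
Each of the two blocks above matches the per-face schema of Rekognition's DetectFaces response: an age range, a gender guess with confidence, and a confidence value per emotion. A minimal sketch, assuming configured AWS credentials and a placeholder file path:

```python
# Sketch: per-face attributes in the style of the two AWS Rekognition entries above.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```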

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
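
The ratings above correspond to Google Cloud Vision face annotations, which report enum likelihoods (Very unlikely through Very likely) rather than numeric scores. A sketch using the google-cloud-vision client library; the file path is a placeholder and the exact constructor names may vary by library version.

```python
# Sketch: likelihood-style face annotations as reported by Google Cloud Vision.
# Assumes the google-cloud-vision library and application default credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```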

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 95.8%
interior objects 3.8%

Captions

Microsoft
created on 2019-11-10

an old photo of a man 78.5%
old photo of a man 76.5%
a black and white photo of a man 67.3%
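
Captions like these, each with a confidence score, are what the Azure Computer Vision describe operation returns. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file path are placeholders.

```python
# Sketch: Microsoft-style image captions via the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

with open("photo.jpg", "rb") as f:  # placeholder path
    analysis = client.describe_image_in_stream(f, max_candidates=3)

for caption in analysis.captions:
    # The SDK reports confidence on a 0-1 scale; scale to match the list above.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```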