Human Generated Data

Title

Untitled (view of mausoleum "Honore Mercier")

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6022

Machine Generated Data

Tags (label with confidence score, 0-100)

Amazon
created on 2019-11-16

Person 99.5
Human 99.5
Person 99.2
Military Uniform 98.3
Military 98.3
Person 97.6
Army 94
Armored 94
People 92.3
Footwear 87.9
Apparel 87.9
Shoe 87.9
Clothing 87.9
Soldier 87.4
Shoe 84
Shoe 72
Officer 57.9
Troop 56
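
The Amazon tags above have the shape of AWS Rekognition label-detection output: a label name plus a confidence score on a 0-100 scale. As a minimal sketch of how such tags could be produced with boto3 (the region and image file name are assumptions; the record does not include the source image):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

# Hypothetical local copy of the photograph.
with open("mausoleum.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # cap on the number of labels returned
        MinConfidence=50.0,  # drop low-confidence labels
    )

for label in response["Labels"]:
    # Prints e.g. "Person 99.5", matching the format of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")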

Clarifai
created on 2019-11-16

people 99.9
group 97.9
adult 97.4
man 97.3
group together 97.1
street 97
monochrome 96.2
leader 93.9
administration 92.9
many 92.4
cemetery 92.4
war 91.9
woman 91.5
military 91.1
portrait 89.9
soldier 88.1
child 87.1
wear 86.8
three 85.4
two 85.1
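
The Clarifai tags are consistent with output from Clarifai's public general model, whose concepts carry 0-1 scores. A hedged sketch against the v2 REST API as documented around 2019; the API key, model ID, and image URL below are placeholders, not values from this record:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # general-model ID from 2019-era docs

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/mausoleum.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores are 0-1; scale to match the 0-100 figures above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")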

Imagga
created on 2019-11-16

fountain 52.9
structure 51
cemetery 45.9
architecture 42.5
statue 35.6
sculpture 30.9
building 30.6
history 29.5
monument 29
old 26.5
stone 25
religion 24.2
city 24.1
landmark 21.7
memorial 21.5
tourism 20.6
historic 20.2
culture 19.7
famous 19.5
ancient 19
art 17.7
church 17.6
travel 16.9
marble 16.5
column 16.1
historical 15.1
temple 13.5
god 12.4
window 12.1
tourist 12
palace 11.9
religious 11.2
facade 11
sky 10.8
arch 10.1
traditional 10
cathedral 9.8
house 9.7
outdoors 9.7
architectural 9.6
antique 9.5
exterior 9.2
gravestone 9.2
decoration 8.7
faith 8.6
vintage 8.3
catholic 7.8
university 7.7
heritage 7.7
world 7.7
style 7.4
street 7.4
carving 7.3
trees 7.1
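
Imagga's tagging endpoint returns English tag names with 0-100 confidences, matching the list above. A sketch against the v2 REST API using the requests library; the key/secret pair and image URL are placeholders:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/mausoleum.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # Imagga uses HTTP basic auth
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")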

Google
created on 2019-11-16

(no tags recorded)

Microsoft
created on 2019-11-16

man 95.7
text 93.8
standing 91.3
black and white 89.2
clothing 89
person 88.1
grave 75.2
posing 74.4
monochrome 69.7
cemetery 68.7
old 53.7
tree 50.2
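
The Microsoft tags resemble Azure Computer Vision tag results, which the SDK reports as 0-1 confidences (scaled to percentages here). A sketch with the azure-cognitiveservices-vision-computervision package; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),  # placeholder key
)

result = client.tag_image("https://example.org/mausoleum.jpg")

for tag in result.tags:
    # Azure reports confidence as 0-1; scale to match the figures above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")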

Face analysis

AWS Rekognition

Age 12-22
Gender Male, 54%
Disgusted 45%
Calm 46.7%
Fear 45%
Happy 53.2%
Confused 45%
Sad 45%
Surprised 45%
Angry 45%

AWS Rekognition

Age 22-34
Gender Male, 54.5%
Fear 45%
Sad 45%
Happy 54.9%
Disgusted 45%
Calm 45%
Confused 45%
Angry 45%
Surprised 45%

AWS Rekognition

Age 24-38
Gender Male, 54.7%
Disgusted 45.1%
Calm 50.4%
Confused 45.1%
Sad 45.1%
Happy 49.3%
Angry 45.1%
Fear 45%
Surprised 45.1%
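
Each AWS Rekognition block above is one detected face from the detect-faces API with full attributes requested: an estimated age range, a gender call with its confidence, and a confidence score per emotion. A sketch with boto3, reusing the assumed file name and region from the label example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("mausoleum.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, not just bounding boxes
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")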

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male
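
The Microsoft age and gender estimates match what the Azure Face API returned when those attributes were requested (both attributes have since been retired from the service). A sketch using the azure-cognitiveservices-vision-face SDK of that era; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),  # placeholder key
)

faces = face_client.face.detect_with_url(
    "https://example.org/mausoleum.jpg",
    return_face_attributes=["age", "gender"],
)

for face in faces:
    print(f"Age {face.face_attributes.age:.0f}, Gender {face.face_attributes.gender}")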

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely
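
The Google Vision blocks report likelihood buckets rather than numeric scores, one block per detected face. A sketch with the google-cloud-vision client; the image file name is a placeholder, and credentials are assumed to come from the GOOGLE_APPLICATION_CREDENTIALS environment variable:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("mausoleum.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood buckets: UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)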

Feature analysis

Amazon

Person 99.5%
Shoe 87.9%
