Human Generated Data

Title

Untitled (deceased woman in coffin surrounded by women in white)

Date

c. 1945

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3589

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99.4
Human 99.4
Person 99
Person 98.6
Person 98.2
Person 97.9
Person 97.7
Person 97.5
Funeral 96.7
Person 96.7
Person 96.7
Person 88.9
Tomb 88.4
Crowd 75.5
Clothing 70.1
Apparel 70.1
Tombstone 68.8
People 66.8
Text 60.3
Overcoat 57.9
Coat 57.9
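
The "Label Confidence" pairs above follow the shape of an AWS Rekognition `detect_labels` response. As a minimal sketch, the flattening could look like the following; the `response` dictionary here is hypothetical sample data standing in for what `boto3`'s `rekognition.detect_labels(Image=..., MinConfidence=...)` would return:

```python
# Hypothetical Rekognition-style response; real output comes from
# boto3: client("rekognition").detect_labels(Image=..., MinConfidence=...).
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Funeral", "Confidence": 96.7},
        {"Name": "Tomb", "Confidence": 88.4},
    ]
}

def flatten_labels(resp):
    """Return 'Name Confidence' lines, confidence rounded to one decimal."""
    return [f"{lab['Name']} {round(lab['Confidence'], 1)}" for lab in resp["Labels"]]

for line in flatten_labels(response):
    print(line)  # e.g. "Person 99.4"
```

The listing in this record drops trailing zeros (e.g. "Person 99"), which would need an extra formatting step.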

Imagga
created on 2022-02-04

cemetery 32.6
dark 20
silhouette 19.9
black 16.2
night 16
man 15.4
light 15.4
old 13.9
person 13.6
horror 13.6
people 13.4
stall 12.6
evil 11.7
art 11.4
symbol 11.4
male 10.6
mystery 10.6
wall 10.3
window 10.2
religion 9.9
scary 9.7
clothing 9.7
grunge 9.4
industrial 9.1
dirty 9
darkness 8.8
fear 8.7
men 8.6
evening 8.4
fantasy 8.1
spooky 7.8
musical instrument 7.6
landscape 7.4
protection 7.3
sun 7.2
metal 7.2
color 7.2
dress 7.2
outfit 7.2
sunset 7.2
spectator 7.1

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

grave 96.2
text 95.5
cemetery 93.2
black and white 89.7
funeral 84.2
clothing 79.7
person 74
street 50.4
clothes 16.7

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 94.7%
Calm 43.7%
Sad 31.8%
Surprised 18.8%
Confused 2.2%
Happy 1.1%
Disgusted 0.9%
Angry 0.9%
Fear 0.7%

AWS Rekognition

Age 38-46
Gender Male, 97.7%
Calm 99.6%
Sad 0.1%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0%
Surprised 0%
Happy 0%

AWS Rekognition

Age 47-53
Gender Male, 92.6%
Sad 66.7%
Fear 28%
Calm 3.7%
Disgusted 0.7%
Angry 0.3%
Confused 0.3%
Happy 0.2%
Surprised 0.1%

AWS Rekognition

Age 33-41
Gender Female, 74.5%
Calm 68.1%
Sad 29.7%
Happy 0.6%
Disgusted 0.5%
Fear 0.4%
Confused 0.4%
Surprised 0.2%
Angry 0.1%

AWS Rekognition

Age 35-43
Gender Female, 94%
Sad 73.8%
Calm 21.6%
Surprised 2.4%
Fear 0.7%
Confused 0.6%
Angry 0.4%
Happy 0.4%
Disgusted 0.2%

AWS Rekognition

Age 41-49
Gender Male, 94.1%
Calm 94.7%
Sad 2.4%
Confused 1%
Angry 0.6%
Fear 0.5%
Surprised 0.4%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 26-36
Gender Female, 75.7%
Calm 88%
Sad 7.2%
Confused 1.5%
Happy 1.4%
Surprised 0.7%
Fear 0.6%
Disgusted 0.4%
Angry 0.3%
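
Each face block above lists an age range plus per-emotion confidences summing to roughly 100%. A minimal sketch of picking the dominant emotion from one face record; the `face` dictionary is hypothetical sample data mirroring what `boto3`'s `rekognition.detect_faces(..., Attributes=["ALL"])` would return:

```python
# Hypothetical Rekognition detect_faces-style record; real values come
# from boto3: client("rekognition").detect_faces(..., Attributes=["ALL"]).
face = {
    "AgeRange": {"Low": 34, "High": 42},
    "Emotions": [
        {"Type": "CALM", "Confidence": 43.7},
        {"Type": "SAD", "Confidence": 31.8},
        {"Type": "SURPRISED", "Confidence": 18.8},
    ],
}

def dominant_emotion(face):
    """Return the highest-confidence emotion label and its score."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('CALM', 43.7)
```

Note that low-margin results like this first face (Calm 43.7% vs. Sad 31.8%) are effectively ambiguous, which is why the record lists the full distribution rather than a single label.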

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people standing in front of a building 68.8%
a group of people standing in front of a store 55.4%
a group of people in front of a building 55.3%

Text analysis

Amazon

VIVA
212
E
os
N
II
SIGM E VIVA
N we 212
SIGM
Juckhart
we

Google

SICME
212
VIVA
Luckhart
SICME E VIVA N 212 Luckhart
E
N