Human Generated Data

Title

Untitled (first communion studio portrait of girl with dress and veil holding bible)

Date

c. 1905-1910, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5975

Machine Generated Data

Tags (each label is followed by the model's confidence score, in percent)

Amazon
created on 2019-11-16

Clothing 100
Apparel 100
Person 98.7
Human 98.7
Person 93.7
Sun Hat 85.5
Pet 80.8
Animal 80.8
Cat 80.8
Mammal 80.8
Hat 78.3
Suit 63
Coat 63
Overcoat 63
Finger 58.4
Plant 57.3
Sleeve 57
Veil 56.1
Long Sleeve 55.2
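
Each line above pairs a label with the service's confidence score (0-100). A minimal sketch, with hypothetical helper names, of parsing such a list and keeping only high-confidence labels:

```python
# Hypothetical sketch: parse "label score" lines like the tag list above
# and keep only labels at or above a confidence threshold.

def parse_tags(raw: str) -> list[tuple[str, float]]:
    """Split each line into (label, confidence); the score is the last token."""
    tags = []
    for line in raw.strip().splitlines():
        label, score = line.rsplit(maxsplit=1)
        tags.append((label, float(score)))
    return tags

def filter_tags(tags: list[tuple[str, float]], threshold: float = 80.0):
    """Keep tags whose confidence meets the threshold."""
    return [(label, score) for label, score in tags if score >= threshold]

# A few lines copied from the Amazon tag list above.
sample = """\
Clothing 100
Sun Hat 85.5
Cat 80.8
Hat 78.3
Veil 56.1
"""

high_confidence = filter_tags(parse_tags(sample))
# → [('Clothing', 100.0), ('Sun Hat', 85.5), ('Cat', 80.8)]
```

Note that `rsplit(maxsplit=1)` splits on the last whitespace only, so multi-word labels such as "Sun Hat" survive intact.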

Clarifai
created on 2019-11-16

people 99.8
monochrome 97.4
woman 96.5
two 94.5
adult 94.2
wear 94
man 93.5
veil 93.1
music 92
wedding 86.7
outfit 86.6
child 85.8
room 85
group 84.8
street 81.1
portrait 81
one 80.3
actress 80.2
dress 80.1
actor 79.5

Imagga
created on 2019-11-16

covering 36.3
cloak 36
man 26.9
people 26.8
male 24.8
black 24.1
person 23.6
hat 22.3
adult 21.1
clothing 17.6
business 17
portrait 16.8
businessman 15
office 13.9
men 13.7
sitting 12.9
piano 12.9
fashion 12.1
model 11.7
silhouette 11.6
hand 11.4
modern 11.2
alone 11
cowboy hat 10.9
dark 10.8
dress 10.8
face 10.6
couple 10.4
one 10.4
grand piano 10.3
window 10.2
headdress 10
stringed instrument 9.9
attractive 9.8
job 9.7
interior 9.7
garment 9.6
building 9.6
suit 9.5
musical instrument 9.2
back 9.2
human 9
fun 9
worker 8.8
love 8.7
work 8.6
travel 8.4
professional 8.3
percussion instrument 8.3
holding 8.2
keyboard instrument 8.2
room 8.2
music 8.2
computer 8
chair 8
working 7.9
robe 7.9
women 7.9
architecture 7.8
old 7.7
iron 7.6
light 7.3
home appliance 7.3
cheerful 7.3
indoor 7.3
laptop 7.3
pose 7.2
lifestyle 7.2
looking 7.2
home 7.2
smile 7.1
table 7

Google
created on 2019-11-16

(no tags listed)

Microsoft
created on 2019-11-16

clothing 96.1
text 95.5
wall 95.4
person 95.2
indoor 93.3
wedding dress 92.3
black 88.2
black and white 83.4
bride 82.8
dress 82.8
window 81.2
woman 75.9
white 63.6
old 58.8

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 44-62
Gender Female, 76.2%
Surprised 0%
Fear 12.6%
Happy 0%
Sad 85.8%
Calm 1%
Disgusted 0.1%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 24-38
Gender Female, 53.6%
Disgusted 45%
Sad 45.1%
Fear 45%
Angry 45.7%
Confused 45.2%
Happy 45.1%
Calm 53.8%
Surprised 45.1%
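
Per-emotion readings like the one above can be collapsed to a single dominant emotion by taking the highest-scoring label. A minimal sketch using the scores from this reading:

```python
# Emotion scores (percent) copied from the second AWS Rekognition reading above.
emotions = {
    "Disgusted": 45.0, "Sad": 45.1, "Fear": 45.0, "Angry": 45.7,
    "Confused": 45.2, "Happy": 45.1, "Calm": 53.8, "Surprised": 45.1,
}

# The dominant emotion is simply the key with the largest score.
dominant = max(emotions, key=emotions.get)
# → 'Calm'
```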

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Cat 80.8%
Hat 78.3%

Captions

Microsoft

a black and white photo of a person 89.2%
a person sitting in front of a window 76.8%
a black and white photo of a person 76.7%