Human Generated Data

Title

Untitled (Phyllis Moore with her African-American nurse Clara)

Date

1910s

People

Artist: C. Bennette Moore, American, 1879-1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12383

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.5
Chair 99.2
Human 90.5
Person 87.9
Painting 81.4
Art 81.4
People 80.9
Face 79.3
Person 78.4
Portrait 66.8
Photography 66.8
Photo 66.8
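
The Rekognition labels above are the kind of output returned by Amazon's DetectLabels API. A minimal sketch of how such tags could be regenerated with boto3, assuming a placeholder S3 location for the digitized photograph (the bucket and key below are illustrative, not part of this record):

```python
# Minimal sketch: image labels from Amazon Rekognition via boto3.
# The S3 bucket and object key are placeholders, not part of this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.12383.jpg"}},
    MaxLabels=20,
    MinConfidence=60,
)

# Print each label name with its confidence, mirroring the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```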

Clarifai
created on 2023-10-26

people 99.7
portrait 99.6
child 99.3
sepia 99.2
son 98.6
baby 97.1
two 96.8
retro 96.1
sepia pigment 95.8
offspring 95.5
sit 94.8
wear 94.5
family 94.3
vintage 94.3
adult 93.5
facial expression 93.3
woman 93.1
man 93
affection 92.5
seat 91.5

Imagga
created on 2022-01-22

mother 76
parent 67.8
child 52.7
family 45.4
father 43.2
home 35.9
happy 35.1
male 34.9
dad 30.6
man 30.3
people 29.6
couple 28.8
love 26.8
together 25.4
daughter 23.9
adult 23.3
portrait 22
son 21.5
happiness 21.2
smiling 21
senior 19.7
smile 19.3
husband 19.1
kid 18.6
lifestyle 18.1
old 17.4
boy 17.4
baby 16.6
casual 16.1
kin 15.9
indoors 15.8
wife 15.2
couch 14.5
married 14.4
cute 14.4
sitting 13.8
fun 13.5
childhood 13.4
children 12.8
joy 12.5
elderly 12.5
interior 12.4
face 12.1
bonding 11.7
affectionate 11.6
person 11.5
sofa 11.5
loving 11.5
looking 11.2
brother 11.2
mature 11.2
domestic 11.2
two 10.2
leisure 10
holding 9.9
caring 9.8
attractive 9.8
living room 9.8
grandmother 9.8
grandfather 9.8
cheerful 9.8
little 9.7
retired 9.7
affection 9.7
relationship 9.4
playing 9.1
room 9.1
care 9.1
grandma 9
parents 8.8
newborn 8.8
bride 8.6
youth 8.5
togetherness 8.5
relaxed 8.5
human 8.3
indoor 8.2
aged 8.2
dress 8.1
life 7.9
women 7.9
sister 7.8
two people 7.8
play 7.8
men 7.7
infant 7.7
30s 7.7
retirement 7.7
pair 7.6
wedding 7.4
offspring 7.2
black 7.2
romantic 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.4
baby 98.8
human face 98.2
clothing 97
toddler 96.9
person 95.7
child 94.3
old 89.2
smile 79.3
posing 72.4
boy 57.6
vintage 39.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 92.5%
Calm 58%
Happy 10.8%
Confused 9.3%
Sad 7.5%
Surprised 4.2%
Fear 4.2%
Angry 3.7%
Disgusted 2.3%

AWS Rekognition

Age 0-6
Gender Male, 95.1%
Calm 34.6%
Disgusted 22.9%
Happy 21%
Angry 6.2%
Confused 5.6%
Fear 5.4%
Sad 2.6%
Surprised 1.7%
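
The two face records above (age range, gender, and per-emotion confidences) match the shape of output returned by Amazon Rekognition's DetectFaces API. A minimal sketch, again using a placeholder image location:

```python
# Minimal sketch: face attributes from Amazon Rekognition DetectFaces.
# The S3 location is a placeholder, not part of this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.12383.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back as TYPE/Confidence pairs; sort by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```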

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 1
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
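
Google Vision reports categorical likelihoods (Very unlikely through Very likely) rather than percentages. A minimal sketch of retrieving these face annotations with the google-cloud-vision Python client, assuming a placeholder local file path for the image:

```python
# Minimal sketch: face likelihoods from the Google Cloud Vision API.
# The local file path is a placeholder, not part of this record.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.12383.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries likelihood enums such as VERY_UNLIKELY or LIKELY.
for face in response.face_annotations:
    for attr in ("joy", "anger", "sorrow", "surprise", "headwear", "blurred"):
        likelihood = getattr(face, f"{attr}_likelihood")
        print(attr.capitalize(), vision.Likelihood(likelihood).name)
```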

Feature analysis

Amazon

Person 87.9%
Painting 81.4%

Categories

Imagga

paintings art 89.4%
people portraits 10.5%