Human Generated Data

Title

Untitled (mother and child)

Date

c. 1905

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.804

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 97.5
Human 97.5
Newborn 96.5
Baby 96.5
Clothing 86.7
Apparel 86.7
Floor 64.8
Photography 64.5
Photo 64.5
Portrait 64.5
Face 64.5
Female 56.3

Clarifai
created on 2023-10-26

people 99.3
one 98.8
child 98.3
portrait 98.1
girl 97.7
sepia 97.6
wear 97.1
two 93.9
woman 93.8
sepia pigment 93
son 90.5
adult 90.1
art 89.5
vintage 88.1
dog 88
family 87.9
nude 87.4
beach 85.9
retro 85.4
old 83.8

Imagga
created on 2022-01-22

sketch 75.3
drawing 54.5
representation 46.2
sexy 28.9
portrait 23.9
body 23.2
attractive 22.4
sand 22.4
adult 21.3
skin 21.2
pretty 21
fashion 19.6
lady 19.5
model 18.7
hair 18.2
person 18.1
people 17.8
face 17.8
relaxation 17.6
sensuality 17.3
spa 17
vessel 16
groom 15.4
bathtub 14.8
bath 14.2
shower 14.2
sensual 13.6
make 13.6
love 13.4
clean 13.4
soil 13
cute 12.9
water 12.7
erotic 12.3
feminine 12.1
man 12.1
human 12
health 11.8
care 11.5
posing 10.7
style 10.4
cosmetics 10.3
lifestyle 10.1
beach 10.1
makeup 10.1
earth 10
male 10
smile 10
vintage 9.9
blond 9.9
wet 9.8
nude 9.7
washbasin 9.6
bride 9.6
women 9.5
elegance 9.2
wedding 9.2
bathroom 9.1
relaxing 9.1
dress 9
retro 9
happy 8.8
naked 8.7
antique 8.7
head 8.4
old 8.4
pure 8.3
treatment 8.3
fun 8.2
gorgeous 8.2
art 8.1
brunette 7.8
sexual 7.7
basin 7.7
healthy 7.6
lying 7.5
paper 7.3

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 96.9
wall 95.6
drawing 94.1
sketch 94.1
human face 94
woman 91.4
text 85.5
clothing 82.8
wedding dress 80.9
girl 75.1
painting 62.3
bride 59.7
portrait 50.9
hair 41.2

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 99.7%
Sad 59.4%
Calm 13.6%
Confused 9.6%
Happy 8.9%
Fear 3.1%
Surprised 2%
Angry 1.8%
Disgusted 1.6%

AWS Rekognition

Age 7-17
Gender Female, 77.3%
Calm 96.5%
Happy 2.7%
Disgusted 0.2%
Confused 0.2%
Sad 0.1%
Fear 0.1%
Surprised 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%

Categories

Imagga

paintings art 100%

Captions