Human Generated Data

Title

Grace and Ruth Moore

Date

c. 1905

People

Artist: C. Bennette Moore, American, 1879-1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.803

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.9
Person 99.4
Human 99.4
Person 97.7
Cradle 66.5
Photography 63.6
Photo 63.6
Face 63.6
Portrait 63.6
Crib 59.1
Clothing 55.6
Apparel 55.6

Clarifai
created on 2023-10-26

people 99.6
child 99.1
portrait 98.8
retro 98
wear 97.7
sepia 96.9
two 96.9
art 95.3
family 95.3
son 95
baby 92.8
old 91.9
offspring 91.8
documentary 91.7
sepia pigment 91.7
man 91
nostalgia 89.5
facial expression 88.9
woman 88.3
adult 87.5

Imagga
created on 2022-01-22

child 27.2
old 25.8
vintage 20.7
envelope 19.9
dad 19.7
father 18.4
antique 18.2
paper 17.4
parent 16.5
grunge 16.2
retro 15.6
wall 14.5
flower 13.8
portrait 13.6
aged 13.6
decoration 13.4
card 13.1
ancient 12.1
happy 11.9
texture 11.8
container 11.8
people 11.7
kin 11.2
happiness 11
design 10.7
mother 10.3
business 10.3
culture 10.3
floral 10.2
page 10.2
money 10.2
man 10.1
face 9.9
financial 9.8
bouquet 9.8
family 9.8
world 9.4
blank 9.4
frame 9.3
art 9.2
dirty 9
one 9
smiling 8.7
leaf 8.6
smile 8.5
head 8.4
fun 8.2
product 8.2
valentine 8.2
dress 8.1
currency 8.1
detail 8
newspaper 8
holiday 7.9
black 7.8
creation 7.8
album 7.8
pretty 7.7
tree 7.7
worn 7.6
finance 7.6
savings 7.5
symbol 7.4
brown 7.4
banking 7.3
letter 7.3
religion 7.2
childhood 7.2
material 7.1
romantic 7.1
love 7.1
kid 7.1
textured 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

room 99.9
gallery 99.9
scene 99.8
wall 99.6
clothing 99.2
person 98.8
woman 88.1
text 85.5
human face 84.5
smile 70.9
black 69.6
white 66.4
old 62
posing 61.3
vintage clothing 56.9
photograph 53.7
vintage 35.6
envelope 26.4
picture frame 10.2

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 84.4%
Happy 99.8%
Calm 0%
Confused 0%
Surprised 0%
Angry 0%
Disgusted 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 1-7
Gender Female, 100%
Calm 91.9%
Sad 5.8%
Angry 0.7%
Fear 0.4%
Confused 0.4%
Surprised 0.4%
Disgusted 0.3%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

1637
New