Human Generated Data

Title

Untitled (Eugenie Stoll Ragan and her children)

Date

c. 1955

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21795

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Indoors 98.2
Room 94.9
Fireplace 93.1
Furniture 92.5
Chair 92.5
Human 91.7
Person 91.7
Living Room 90.2
Person 87.7
Person 81.3
Chair 68.4
Apparel 62.1
Shoe 62.1
Footwear 62.1
Clothing 62.1
Female 60.3
Girl 60.3
Bedroom 58.9
Photography 57.3
Photo 57.3
Chair 56.8
Couch 56.7
People 56.7
Hearth 55.6

Imagga
created on 2022-03-11

chair 31
room 23.6
architecture 20.5
seat 20.1
furniture 18.9
sculpture 18.1
building 17.9
world 17.5
interior 16.8
history 16.1
monument 15.9
rocking chair 15.9
art 15.3
tourism 14.8
travel 14.8
old 14.6
statue 14.3
home 13.6
decoration 12.3
people 12.3
symbol 12.1
ancient 12.1
column 12
sitting 12
historic 11.9
person 11.5
marble 11.4
famous 11.2
luxury 11.1
culture 11.1
arch 11
house 10.9
religion 10.7
man 10.7
design 10.7
antique 10.6
stone 10.4
style 10.4
wall 10.3
upright 10.1
table 10.1
tourist 10.1
structure 10
city 10
piano 9.9
keyboard instrument 9.8
indoors 9.7
musical instrument 9.6
living 9.5
training 9.2
business 9.1
vintage 9.1
home appliance 9.1
landmark 9
male 8.5
floor 8.4
traditional 8.3
fireplace 8.3
silhouette 8.3
stringed instrument 8.3
sewing machine 8.2
device 8.1
memorial 7.8
scene 7.8
office 7.8
classroom 7.7
percussion instrument 7.6
nation 7.6
elegance 7.6
historical 7.5
sport 7.4
vacation 7.4
ornate 7.3
icon 7.1
portrait 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

fireplace 99.7
text 99.2
indoor 91.2
furniture 88.6
black and white 74.6
old 45.4

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 99.4%
Surprised 73.7%
Calm 19.5%
Happy 2.5%
Disgusted 1.5%
Confused 1.2%
Fear 0.9%
Angry 0.4%
Sad 0.3%

AWS Rekognition

Age 23-33
Gender Female, 88.9%
Calm 95.9%
Sad 2.3%
Angry 0.4%
Happy 0.3%
Confused 0.3%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Fireplace 93.1%
Chair 92.5%
Person 91.7%
Shoe 62.1%

Captions

Microsoft

a man sitting in front of a building 55.1%
a man sitting in front of a store 46%
a man sitting at a table in front of a building 45.9%