Human Generated Data

Title

Untitled (mother and baby)

Date

c. 1910

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.808

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 97.9
Human 97.9
Art 94.9
Clothing 80.6
Apparel 80.6
Painting 75.8
Hat 62.9

Clarifai
created on 2023-10-26

paper 99.2
retro 99.1
portrait 98.9
wear 98.9
sepia pigment 98.6
art 98.1
old 98.1
painting 97.8
dirty 97.3
antique 97.2
vintage 97
sepia 97
people 96.3
nostalgia 95.5
cardboard 93.4
one 92.7
parchment 91.4
picture frame 91.2
man 90.6
blank 89.7

Imagga
created on 2022-01-22

book jacket 75.3
jacket 59.6
wrapping 44.5
covering 30.6
product 28.6
old 27.9
vintage 27.3
retro 23.8
ancient 23.4
creation 22.3
envelope 22.1
newspaper 21.5
art 21.5
paper 21.4
stamp 18.2
sketch 17.9
book 17.5
letter 17.4
antique 17.3
grunge 17
mail 16.3
drawing 16
aged 15.4
postmark 14.8
postage 13.8
card 13.7
money 13.6
culture 12.8
cash 12.8
currency 12.6
texture 12.5
philately 11.8
postal 11.8
sculpture 11.5
design 11.3
detail 11.3
symbol 10.8
old fashioned 10.5
representation 10.4
empty 10.3
monument 10.3
dollar 10.2
banking 10.1
decoration 10.1
frame 10
wealth 9.9
history 9.8
portrait 9.7
statue 9.6
bill 9.5
famous 9.3
binding 9.3
container 9.1
financial 8.9
circa 8.9
printed 8.9
united 8.6
face 8.5
finance 8.4
head 8.4
landmark 8.1
bank 8.1
shows 7.9
architecture 7.8
dollars 7.7
wallpaper 7.7
post 7.6
historical 7.5
pattern 7.5
decorative 7.5
savings 7.5
element 7.4
close 7.4
economy 7.4
closeup 7.4
cover 7.4
brown 7.4
artwork 7.3
message 7.3
material 7.1
travel 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

gallery 97.7
scene 97.1
room 96.5
clothing 96.3
person 95.6
human face 92.7
text 92.6
drawing 88.4
sketch 83.1
man 67.6
old 44.2
picture frame 8.6

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 97.6%
Happy 80.6%
Calm 14.2%
Sad 1.6%
Surprised 0.9%
Disgusted 0.7%
Angry 0.6%
Fear 0.6%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%

Categories

Imagga

paintings art 97.9%

Captions

Microsoft
created on 2022-01-22

an old photo of a box 50.6%
a close up of a box 50.5%
old photo of a box 44.7%

Text analysis

Amazon

N.d.
Sr