Human Generated Data

Title

Untitled (couple with baby)

Date

c. 1910

People

Artist: C. Bennette Moore, American, 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.807

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.9
Human 98.9
Person 98.3
Person 95.8
Text 93.1
Art 89.6
Painting 80.6
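
The Amazon tags above pair each label with a confidence score. As a point of reference only, the following sketch shows how label/confidence output of this shape can be produced with AWS Rekognition's DetectLabels API via boto3; it is not the museum's actual pipeline, and the image file name, region, and confidence threshold are placeholder assumptions (AWS credentials are expected in the environment).

    # Minimal sketch, not the museum's pipeline: label/confidence pairs like the
    # Amazon tags above via AWS Rekognition's DetectLabels API. The image path,
    # region, and threshold are placeholder assumptions.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=80,  # keep only labels scored at 80 or above
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")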

Clarifai
created on 2023-10-26

paper 99.2
retro 99
painting 98.7
wear 98.5
art 98.5
sepia pigment 98.4
nostalgia 97.4
people 97.3
portrait 97.3
cardboard 97.3
vintage 96.4
old 96.3
sepia 95.7
antique 95.1
man 94.8
moment 94.2
dirty 94
blank 93.1
two 92.7
memory 91.9

Imagga
created on 2022-01-22

book jacket 87.3
jacket 68.9
wrapping 51.6
envelope 37.3
covering 35.7
currency 33.2
money 32.3
vintage 29.9
cash 28.4
paper 27.5
stamp 26
sketch 26
old 24.4
drawing 23.6
bank 22.4
dollar 22.3
mail 22
retro 21.3
banking 21.2
finance 21.1
postmark 19.7
postage 19.7
bill 19
financial 17.8
letter 17.4
dollars 17.4
container 17.1
business 17
savings 16.8
postal 16.7
exchange 16.2
wealth 16.2
philately 14.8
ancient 14.7
representation 14.6
banknote 14.6
close 13.7
bills 13.6
one 13.4
antique 13
grunge 12.8
binding 12.8
book 12.5
post 12.4
economy 12.1
note 12
circa 11.8
collection 11.7
art 11.7
pay 11.5
loan 11.5
symbol 11.5
design 10.9
printed 10.8
market 10.7
hundred 10.7
notes 10.6
united 10.5
rich 10.2
product 10.2
investment 10.1
global 10
aged 10
face 9.9
states 9.7
card 9.5
stamps 8.9
shows 8.9
banknotes 8.8
pound 8.7
payment 8.7
notebook 8.2
message 8.2
creation 8.1
closeup 8.1
history 8.1
twenty 7.9
funds 7.8
queen 7.8
us 7.7
finances 7.7
great 7.7
international 7.6
texture 7.6
head 7.6
sign 7.5
man 7.4
object 7.3
portrait 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 91.4
clothing 91.3
human face 90.8
gallery 90.7
text 89.8
room 74.4
scene 71.6
man 61.3
handwriting 60
drawing 55.5
old 52.6
envelope 50.4
painting 15.1
picture frame 13.9

Color Analysis

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 92.7%
Calm 99.8%
Angry 0.1%
Confused 0%
Sad 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Female, 97%
Happy 66.3%
Calm 26.7%
Sad 2%
Disgusted 1.5%
Surprised 0.9%
Confused 0.9%
Angry 0.9%
Fear 0.8%

AWS Rekognition

Age 0-3
Gender Male, 99.4%
Sad 65.5%
Calm 25.1%
Angry 2.6%
Fear 2.3%
Confused 2%
Disgusted 1%
Happy 0.9%
Surprised 0.5%
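
Each AWS Rekognition block above reports, for one detected face, an estimated age range, a gender guess with confidence, and a ranked emotion distribution. A minimal sketch of how such per-face attributes can be requested, with the same placeholder file name and credential assumptions as before, uses the DetectFaces API with all facial attributes enabled:

    # Minimal sketch: per-face age range, gender, and emotion estimates via
    # AWS Rekognition's DetectFaces API with all facial attributes requested.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include AgeRange, Gender, Emotions, ...
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unordered; list them from most to least confident.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")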

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
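
The Google Vision entries report likelihood buckets (Very unlikely through Very likely) rather than percentages. A comparable sketch using the google-cloud-vision client library, again with a placeholder file name and assuming application credentials are already configured, prints the same six fields per detected face:

    # Minimal sketch using the google-cloud-vision client library.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
        image = vision.Image(content=f.read())

    faces = client.face_detection(image=image).face_annotations

    # Google Vision reports likelihood buckets rather than percentages.
    likelihood = ("Unknown", "Very unlikely", "Unlikely",
                  "Possible", "Likely", "Very likely")

    for face in faces:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])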

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2022-01-22

an old photo of a sign 57.2%
a close up of a sign 57.1%
a sign for a photo 49.7%

Text analysis

Amazon

St.
Museu
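
The fragments above ("St." and "Museu") are words detected in the image by the Amazon text-analysis pass. A short sketch of the corresponding call, with the same placeholder image and credential assumptions as the earlier Rekognition examples, uses the DetectText API and prints the word-level detections:

    # Minimal sketch: word-level text detections like those above via AWS
    # Rekognition's DetectText API.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Detections come back as LINE and WORD entries; print the words only.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])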