Human Generated Data

Title

Daniel Chamberlain Payne, Boston (1837-1868)

Date

1858

People

Artist: John Adams Whipple, American, 1822-1891

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Arthur S. Eldredge, 2.2002.2032

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 98.6
Person 98.6
Tie 98.4
Accessory 98.4
Accessories 98.4
Text 97
Art 84.3
Painting 75.7
Tie 73.3
Drawing 71.4
Handwriting 68.6

Imagga
created on 2022-01-08

currency 38.6
money 37.4
cash 31.1
finance 30.4
mug shot 24.8
financial 23.2
bank 22.7
dollar 21.3
banking 21.1
coin 21
photograph 20.3
close 20
one 19.4
representation 19
museum 18.6
us 18.3
old 17.4
wealth 17
art 16.8
business 16.4
pay 16.3
bill 16.2
dollars 15.4
vintage 14.5
coins 14.5
creation 14.4
depository 13.9
bust 13.8
paper 13.5
black 12.8
planet 12.6
object 12.4
ancient 12.1
rich 12.1
economy 12
retro 11.5
man 11.4
metal 11.3
portrait 11
antique 10.8
facility 10.7
finances 10.6
exchange 10.5
human 10.5
historical 10.3
sculpture 10.3
savings 10.2
closeup 10.1
symbol 10.1
hundred 9.7
face 9.2
male 9.2
person 9.2
collection 9
history 8.9
funds 8.8
success 8
numismatics 7.9
banknotes 7.8
bills 7.8
banknote 7.8
states 7.7
investment 7.3
plastic art 7.2
concepts 7.1

Google
created on 2022-01-08

Face 98.3
Eyebrow 93.7
Handwriting 82.6
Collar 81.5
Oval 81.3
Art 77.7
Tints and shades 76.4
Font 75.9
Facial hair 75.8
Vintage clothing 74.9
Signature 72.5
Moustache 72.2
Collectable 70.5
Suit 66.3
Sleeve 65.6
Autograph 65.4
Paper product 60.4
History 59.9
Blazer 59.9
Visual arts 58.5

Microsoft
created on 2022-01-08

text 98.5
human face 97
book 94.9
person 94.8
handwriting 93.8
man 91.2
clothing 86.9
old 73.8
portrait 60.2
drawing 56.3
vintage 42.5

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 99.2%
Calm 98.7%
Sad 0.7%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Tie 98.4%

Captions

Microsoft

a vintage photo of a person 67.8%

Text analysis

Amazon

Daniel
Daniel G. Payne
Payne
G.
11.
1837
Fref. 11. 1837
Boston
Fref.

Google

11.18
Brita
Feh.
87
Tagne
Damil G. Tagne Brita Feh. 11.18 87
Damil
G.