Human Generated Data

Title

Lyman Beecher (1775-1863)

Date

c. 1855

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 100
Painting 100
Face 100
Head 100
Photography 100
Portrait 100
Person 99.3
Adult 99.3
Male 99.3
Man 99.3

Clarifai
created on 2018-05-11

people 100
print 99.8
art 99.8
painting 99.7
one 99.5
illustration 99.4
portrait 99.2
adult 99.1
old 97.5
antique 97.3
leader 96.7
engraving 96.6
man 96.1
side view 95.7
wear 95.2
vintage 94.7
writer 94.3
scientist 93.8
politician 91.6
elderly 90.9

Imagga
created on 2023-10-06

call 44.5
currency 30.5
money 29.8
dollar 26
cash 25.6
finance 22
bill 21.9
one 21.6
banking 20.2
business 19.4
bank 18.8
dollars 18.3
painter 18.2
paper 18.2
wealth 17.9
financial 17.8
close 17.1
art 16.6
us 16.4
portrait 16.2
statue 15.4
religion 15.2
savings 14.9
man 14.1
banknote 13.6
god 13.4
blackboard 13.2
culture 12.8
face 12.8
hundred 12.6
rich 12.1
sculpture 12
old 11.8
telephone 11.5
pay 11.5
church 11.1
closeup 10.8
male 10.6
loan 10.5
book jacket 10.4
symbol 10.1
franklin 9.8
history 9.8
banknotes 9.8
finances 9.6
exchange 9.5
architecture 9.4
religious 9.4
head 9.2
person 9.1
temple 8.9
concepts 8.9
success 8.8
stone 8.8
bills 8.7
holy 8.7
profit 8.6
people 8.4
investment 8.2
jacket 8.1
masterpiece 7.9
ancient 7.8
adult 7.8
golden 7.7
pay-phone 7.7
artist 7.7
spirituality 7.7
notes 7.7
serious 7.6
historical 7.5
commerce 7.5
vintage 7.4
economy 7.4
note 7.3

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

book 92.5
text 91.1
bus 90
indoor 87.3
old 65.9
open 48.1

Face analysis

AWS Rekognition

Age 66-76
Gender Male, 98.4%
Calm 65%
Sad 62.4%
Surprised 6.3%
Fear 5.9%
Angry 3.2%
Disgusted 0.6%
Confused 0.4%
Happy 0.1%

Microsoft Cognitive Services

Age 70
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Adult 99.3%
Male 99.3%
Man 99.3%