Human Generated Data

Title

Untitled (man with beard, bust-length, front view)

Date

c. 1854-mid 1860s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Janet and Daniel Tassel, 2007.219.54.27

Machine Generated Data

Tags (confidence scores on a 0-100 scale)

Amazon
created on 2019-11-05

Person 98.9
Human 98.9
Art 94.2
Painting 88.4
Wood 61.6
Face 57.1
Portrait 57.1
Photography 57.1
Photo 57.1
Drawing 55.9
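
Labels of this kind are the output of AWS Rekognition's DetectLabels operation. The sketch below is illustrative only, not the pipeline used to generate this record; the image filename, region, and credential setup are assumptions.

```python
# Minimal sketch: requesting image labels from AWS Rekognition with boto3.
# Assumes AWS credentials are configured; the image filename is hypothetical.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("2007.219.54.27.jpg", "rb") as image_file:
    response = client.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=10,
    )

for label in response["Labels"]:
    # Confidence is reported on a 0-100 scale, matching the scores listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```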

Clarifai
created on 2019-11-05

people 100
adult 99.7
portrait 99.7
print 99.6
one 99.6
art 99.4
painting 99.1
facial hair 98.9
man 98.8
mustache 98.2
wear 97.9
monarch 97.9
leader 97.9
engraving 97.6
illustration 96.8
royalty 96.7
veil 92.9
administration 92.7
writer 91.4
scientist 90.8

Imagga
created on 2019-11-05

stringed instrument 29.5
lute 29.1
device 24.5
musical instrument 23.2
currency 20.6
money 20.4
dollar 18.5
old 18.1
cash 16.4
paper 15.8
harp 15.4
art 14.8
one 14.2
finance 13.5
bill 13.3
face 12.8
business 12.7
close 12.5
vintage 12.5
financial 12.4
portrait 12.3
wealth 11.6
bank 11.6
symbol 11.4
antique 11.4
banking 11
frame 10.8
closeup 10.8
banknote 10.7
pick 10.2
gold 9.8
man 9.4
culture 9.4
letter 9.2
stamp 8.7
icon 8.7
dollars 8.7
mail 8.6
post 8.6
black 8.4
people 8.4
church 8.3
note 8.3
retro 8.2
person 8
postmark 7.9
postage 7.9
wall 7.7
pay 7.7
grunge 7.6
texture 7.6
exchange 7.6
head 7.5
painter 7.5
savings 7.4
rich 7.4
economy 7.4
inside 7.3
design 7.3
aged 7.2
office 7.2
adult 7.1
phonograph record 7.1
male 7.1

Google
created on 2019-11-05

Picture frame 92.1
Gentleman 83.6
Portrait 79.9
Painting 79.5
Facial hair 75.1
Art 58.1
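
Scores in this style can be obtained from the Google Cloud Vision label-detection endpoint. The sketch below is a generic illustration; the filename is a placeholder and credentials are assumed to be configured via a service account.

```python
# Minimal sketch: label detection with the google-cloud-vision client library.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the image path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2007.219.54.27.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Scores are floats in [0, 1]; scaling by 100 matches the figures listed above.
    print(f"{label.description} {label.score * 100:.1f}")
```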

Microsoft
created on 2019-11-05

text 98.3
human face 98
person 97.7
wall 97
clothing 96.1
man 95.4
mirror 94.7
reflection 68.1
sign 67.5
old 62.6
picture frame 29.2

Face analysis

AWS Rekognition

Age 44-62
Gender Male, 99.5%
Surprised 0%
Fear 0%
Happy 0%
Confused 0.2%
Angry 0.1%
Sad 8.8%
Disgusted 0%
Calm 90.7%
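
Age range, gender, and emotion estimates like those above correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. The following is a sketch under the same assumptions as before (configured credentials, placeholder filename).

```python
# Minimal sketch: face attributes (age range, gender, emotions) via AWS Rekognition.
# Credentials and the image filename are assumptions for illustration only.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("2007.219.54.27.jpg", "rb") as image_file:
    response = client.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 44, "High": 62}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 99.5}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion confidences are on a 0-100 scale, as in the list above.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```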

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
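
The Google Vision entries are likelihood labels rather than percentages; they map to the Likelihood enum returned by face detection in the Cloud Vision API. A short sketch follows, with the same placeholder filename and credential assumptions as above.

```python
# Minimal sketch: face-detection likelihoods with google-cloud-vision.
# The image path is a placeholder; credentials are assumed to be configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2007.219.54.27.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value such as VERY_UNLIKELY,
    # corresponding to the "Very unlikely" labels shown above.
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)
```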

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 71.7%
food drinks 26.8%