Human Generated Data

Title

Belle Boyd

Date

19th century

People

Artist: Campbell & Ecker, American, active mid- to late-19th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Kenyon C. Bolton III Fund, 2019.152

Machine Generated Data

Tags

Amazon
created on 2019-10-04

Human 99
Person 99
Text 95.5
Book 78.7
Novel 75.8
Page 73.5
Art 66.8
Diary 65.9
Face 57.9

Clarifai
created on 2019-10-04

people 99.5
one 98.6
adult 98.1
print 97.2
art 96.9
illustration 96.6
portrait 96.5
wear 94.9
retro 94
book bindings 92.9
painting 92.6
man 91.6
antique 90.2
leader 88.5
engraving 88.2
woman 86.7
text 86.4
profile 85.3
vintage 80.5
old 79.8

Imagga
created on 2019-10-04

portrait 26.5
money 18.7
hair 18.2
currency 18
face 17.8
sexy 16.9
cash 16.5
model 16.3
old 15.3
body 15.2
black 15.1
paper 14.9
vintage 14.9
adult 14.9
fashion 14.3
bill 14.3
close 14.3
person 14.1
pretty 14
dollar 13.9
art 13.9
banking 13.8
attractive 13.3
people 12.8
antique 12.8
skin 12.7
child 12.6
ancient 12.1
human 12
bookmark 11.9
bank 11.8
erotic 11.3
one 11.2
grunge 11.1
finance 11
retro 10.7
exchange 10.5
style 10.4
sculpture 10.3
rich 10.2
economy 10.2
head 10.1
dress 9.9
financial 9.8
lady 9.7
business 9.7
banknote 9.7
closeup 9.4
historical 9.4
cute 9.3
wealth 9
book jacket 8.8
man 8.7
brunette 8.7
naked 8.7
clothing 8.6
statue 8.6
figure 8.5
male 8.5
card 8.3
makeup 8.2
sensual 8.2
bow tie 8.1
symbol 8.1
religion 8.1
representation 7.8
payment 7.7
culture 7.7
expression 7.7
pay 7.7
studio 7.6
savings 7.5
necktie 7.4
note 7.4
blond 7.3
covering 7.3
history 7.2
posing 7.1
market 7.1
stucco 7
look 7

Google
created on 2019-10-04

Microsoft
created on 2019-10-04

human face 98.8
person 97.9
drawing 95.3
clothing 93.7
sketch 88.7
woman 87.4
text 85.4
handwriting 65.2
portrait 62
painting 53.1
photograph 52.5
picture frame 27.5

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 93.3%
Calm 99.3%
Sad 0.1%
Angry 0.1%
Disgusted 0%
Happy 0.3%
Fear 0%
Surprised 0.1%
Confused 0.1%

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Microsoft

an old picture of a person 57.6%
a person posing for the camera 57.5%
a close up of a womans face 57.4%

Text analysis

Amazon

Souisvitie:
Ecker
CCeampbel's
Hy
CCeampbel's S Ecker Souisvitie: Hy
S

Google

amphellcker
amphellcker