Human Generated Data

Title

Bill-Sub-Chief to Cap' Jack

Date

1866-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.4077

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Apparel 99.4
Clothing 99.4
Coat 99.4
Person 99
Human 99
Art 95.2
Advertisement 81.4
Poster 78.7
Art Gallery 73.2
Drawing 70.1
Painting 63.2

Clarifai
created on 2019-11-10

no person 93.1
people 92.2
illustration 90.7
man 89.5
business 88.4
paper 87.4
retro 87.3
art 87.1
indoors 86.3
ancient 85.8
woman 85.5
spirituality 85
design 84.8
Buddha 83.9
travel 82.8
old 82.2
architecture 82.2
symbol 81.3
religion 81.2
antique 80

Imagga
created on 2019-11-10

old 22.3
vintage 21.7
gold 21.4
art 18.9
light bulb 18.8
paper 17.3
decoration 16.7
design 16.4
electric lamp 15.1
retro 14.7
golden 14.6
frame 13.3
antique 13.2
lamp 13.1
classic 13
ancient 13
amulet 12.4
device 12.1
envelope 11.7
currency 11.7
symbol 11.4
texture 11.1
fastener 11
container 10.9
card 10.8
charm 10.8
architecture 10.5
grunge 10.2
money 10.2
covering 9.9
royal 9.7
ornament 9.5
floral 9.4
finance 9.3
buckle 9.3
letter 9.2
cash 9.1
mask 9.1
black 9
religion 9
bank 8.9
object 8.8
close 8.6
culture 8.5
stamp 8.5
face 8.5
religious 8.4
crown 8.4
elegance 8.4
dollar 8.3
one 8.2
pattern 8.2
style 8.2
brown 8.1
wealth 8.1
history 8
detail 8
celebration 8
interior 8
structure 7.8
source of illumination 7.8
gift 7.7
entrance 7.7
luxury 7.7
blank 7.7
wall 7.7
mail 7.7
museum 7.6
post 7.6
restraint 7.5
banking 7.3
metal 7.2
stucco 7.1
financial 7.1
icon 7.1
wooden 7

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

text 99.5
drawing 98.5
clothing 98.3
sketch 96.9
person 96.8
man 90.4
human face 68.4
painting 66.5
white 61.9

Face analysis

Amazon

Google

AWS Rekognition

Age 23-37
Gender Male, 53.6%
Happy 45.1%
Confused 45.2%
Disgusted 45%
Surprised 45.1%
Calm 52.5%
Sad 45.4%
Fear 45.1%
Angry 46.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Coat 99.4%
Person 99%

Captions

Microsoft

a close up of a box 69.8%
a black and white photo of a box 52.9%
close up of a box 52.8%

Text analysis

Amazon

Jack
Cap
to
BILL-Sub-CLief to Cap Jack
BILL-Sub-CLief

Google

to
BILL-Sub-Clief to Cap Jack
BILL-Sub-Clief
Jack
Cap