Human Generated Data

Title

Races, Negroes: United States. Virginia. Hampton. Hampton Normal and Industrial School

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.2427

Machine Generated Data

Tags

Amazon
created on 2019-06-04

Apparel 99.4
Clothing 99.4
Person 99.3
Human 99.3
Coat 99
Overcoat 99
Person 98.8
Suit 98.8
Person 98.7
Face 90
Head 84.9
Accessory 84.3
Accessories 84.3
Tie 84.3
Tuxedo 81.3
Text 79.1
Attorney 72.5
Photo 71
Photography 71
Portrait 70.4
Advertisement 69.3
Poster 69.3
Man 63.2
Sitting 57.3

Clarifai
created on 2019-06-04

people 99.9
group 98.8
adult 98.4
portrait 98.2
man 96.8
leader 96.6
administration 94.5
wear 93.8
art 93.5
print 92.9
one 92.8
woman 92
music 91.9
four 90.4
three 90
two 89.9
retro 89.4
facial expression 86.8
many 86.5
indoors 86.1

Imagga
created on 2019-06-04

mug shot 95.9
photograph 77.5
representation 61.2
creation 41.2
business 21.9
paper 20.4
money 18.7
man 18.1
male 17.7
old 16.7
bill 16.2
currency 16.2
envelope 15.5
portrait 14.9
black 14
cash 13.7
one 13.4
vintage 13.2
office 12.9
bank 12.6
person 12.6
book 12.5
financial 12.5
people 12.3
bow tie 12.2
face 12.1
dollar 12.1
finance 11.8
necktie 11.7
banking 11
adult 11
symbol 10.8
art 10.7
holding 10.7
close 10.3
economy 10.2
happy 10
wealth 9.9
history 9.8
sign 9.8
blank 9.4
board 9.3
smile 9.3
box 9
home 8.8
design 8.7
post 8.6
expression 8.5
head 8.4
savings 8.4
investment 8.3
businessman 7.9
book jacket 7.9
stamp 7.9
work 7.8
banknote 7.8
empty 7.7
dollars 7.7
grunge 7.7
professional 7.6
manager 7.4
note 7.4
object 7.3
sketch 7.3
looking 7.2
drawing 7.2
container 7.1

Google
created on 2019-06-04

Photograph 96.5
Snapshot 82
Collection 75.8
Art 72.1
Portrait 67.3
Event 62.7
History 62.6
Photography 62.4
Gentleman 57.3

Microsoft
created on 2019-06-04

scene 99.6
wall 99.3
room 99.2
gallery 98.7
human face 96.1
man 96.1
indoor 94.8
person 92.8
drawing 85.4
clothing 83.9
posing 66.5
picture frame 61.6
old 58.1
painting 53.8
different 35.1

Color Analysis

Face analysis
AWS Rekognition

Age 29-45
Gender Female, 69.8%
Confused 2.8%
Happy 2.2%
Angry 5.3%
Sad 46.9%
Calm 33.7%
Disgusted 6.4%
Surprised 2.7%

AWS Rekognition

Age 48-68
Gender Male, 98.9%
Sad 12.8%
Disgusted 15.1%
Angry 14.1%
Surprised 3.8%
Calm 41.1%
Confused 9.7%
Happy 3.5%

AWS Rekognition

Age 48-68
Gender Male, 97.7%
Calm 70.6%
Confused 2%
Sad 5%
Angry 6.1%
Happy 1.3%
Disgusted 13.3%
Surprised 1.6%

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Suit 98.8%
Tie 84.3%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

cwy)

Google

SAMUEL CHAPMAN ARMSTRONG Fedr of1tmpto t BOOKER T. WAsaIENGTON, HAMPTON GRADUATE ) Fsde Tskne lstit