Human Generated Data

Title

Races, Negroes: United States. Virginia. Hampton. Hampton Normal and Industrial School: Training in Scientific Agriculture: Learning Household Handicrafts at Hampton. This is often called the "Gumption Class."

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.3537.4

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Person 99.2
Human 99.2
Person 99.2
Person 98.5
Person 97.9
Person 92.4
Building 89
Factory 88.5
Workshop 58.3
Assembly Line 57.5
Lab 55

Clarifai
created on 2019-06-05

people 100
group 99.8
adult 99.4
print 98.6
group together 98.6
vehicle 98.2
man 97.5
administration 97.2
watercraft 96.8
two 96.1
child 96
furniture 95.8
piano 95.1
woman 93.4
leader 93.3
medical practitioner 93.1
instrument 93.1
war 92.6
music 92.4
musician 90.9

Imagga
created on 2019-06-05

marimba 100
musical instrument 100
percussion instrument 100
vibraphone 27.8
man 17.5
people 17.3
device 16.3
chair 16
interior 15
male 14.2
sitting 13.7
business 13.4
room 13.3
couple 12.2
old 11.8
silhouette 11.6
indoors 11.4
table 11.2
men 11.2
adult 11
house 10.9
black 10.8
building 10.3
women 10.3
glass 10.1
holding 9.9
family 9.8
home 9.6
lifestyle 9.4
stringed instrument 9.2
person 8.9
office 8.8
day 8.6
floor 8.4
inside 8.3
smiling 8
modern 7.7
togetherness 7.5
style 7.4
light 7.3
window 7.3
cheerful 7.3
dress 7.2
holiday 7.2
love 7.1
businessman 7.1
travel 7
architecture 7

Google
created on 2019-06-05

Microsoft
created on 2019-06-05

indoor 96.1
drawing 96.1
wall 95.4
sketch 93.5
person 87.3
clothing 78.3
black 76.7
chair 72.1
old 70.4
painting 62
working 59.4
furniture 56.8

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 97.2%
Sad 52.8%
Calm 33%
Confused 5.7%
Angry 5.3%
Disgusted 1.6%
Surprised 0.8%
Happy 0.8%

AWS Rekognition

Age 48-68
Gender Female, 63.2%
Sad 94.6%
Happy 4.2%
Calm 0.4%
Angry 0.3%
Disgusted 0.2%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 38-59
Gender Male, 51.2%
Sad 50.8%
Calm 48.1%
Confused 45.4%
Angry 45.3%
Disgusted 45.1%
Happy 45.1%
Surprised 45.1%

AWS Rekognition

Age 35-52
Gender Female, 51.8%
Happy 49.8%
Disgusted 46.6%
Sad 46.2%
Calm 46%
Angry 45.7%
Surprised 45.4%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Male, 54.3%
Calm 47.3%
Angry 47.1%
Sad 46.9%
Confused 46.3%
Happy 45.8%
Surprised 45.8%
Disgusted 45.7%

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%