Human Generated Data

Title

Charity, Hospitals: United States. New York. New York City. Harlem Hospital, Manhattan: Harlem Hospital, Manhattan (New York City Almshouse System): Part of House Staff

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.830.1

Machine Generated Data

Tags

Amazon
created on 2019-06-07

Person 99
Human 99
Person 98.9
Person 98.7
Person 94.5
Apparel 82.6
Clothing 82.6
People 81.2
Sitting 80.8
Face 70.5
Outdoors 67.8
Porch 60.9
Suit 58.8
Coat 58.8
Overcoat 58.8

Clarifai
created on 2019-06-07

people 99.9
adult 98.4
group 98.2
administration 97.5
man 96.1
group together 95.9
leader 95.6
woman 95
two 92.2
several 88.8
three 87.8
many 86.6
four 85.4
wear 82.5
sit 81.9
five 81.6
chair 80.9
portrait 79.5
music 78.9
street 78.7

Imagga
created on 2019-06-07

kin 72.4
statue 23.4
sculpture 22.2
architecture 19.6
ancient 18.2
monument 17.7
old 17.4
history 16.1
art 15.1
man 14.1
world 13.7
currency 13.5
religion 13.4
stone 13.3
building 13
dollar 13
historic 12.8
tourism 12.4
people 12.3
antique 12.1
money 11.9
cash 11.9
marble 11.6
portrait 11
business 10.9
bank 10.9
city 10.8
face 10.7
religious 10.3
culture 10.3
male 9.9
vintage 9.9
travel 9.9
love 9.5
closeup 9.4
famous 9.3
finance 9.3
banking 9.2
one 9
groom 8.9
family 8.9
person 8.7
god 8.6
historical 8.5
church 8.3
room 8.3
landmark 8.1
newspaper 7.8
government 7.8
hundred 7.7
bill 7.6
memorial 7.3
group 7.3
catholic 7.2
paper 7.1

Google
created on 2019-06-07

Microsoft
created on 2019-06-07

clothing 99
person 98.8
outdoor 95.9
smile 92.7
man 84.7
human face 84.5
woman 83.5
old 60.9
people 59.8
posing 50.1
crowd 0.8

Color Analysis

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 95.5%
Calm 34.7%
Confused 2.8%
Happy 3.5%
Disgusted 1.6%
Sad 51.6%
Angry 3.6%
Surprised 2.2%

AWS Rekognition

Age 48-68
Gender Male, 79.3%
Happy 6.1%
Sad 16.8%
Surprised 3%
Angry 8.1%
Calm 43.3%
Disgusted 19.8%
Confused 2.9%

AWS Rekognition

Age 26-43
Gender Male, 95%
Sad 6.5%
Calm 72.6%
Disgusted 7.1%
Confused 4.3%
Surprised 2.2%
Happy 2%
Angry 5.3%

AWS Rekognition

Age 30-47
Gender Female, 50.9%
Sad 54%
Confused 45.1%
Calm 45.4%
Angry 45.1%
Disgusted 45.1%
Happy 45.4%
Surprised 45%

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Text analysis

Amazon

18

Google

3 18
3
18