Human Generated Data

Title

Charity, Organizations: United States. Massachusetts. Boston. Publicity for Social Work: Booklets: The Co-operative Workrooms for Handicapped Women

Date

1920s

People

Artist: Unidentified Artist

Classification

Archival Material

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.32.16

Machine Generated Data

Tags

Amazon
created on 2019-06-06

Person 99.4
Human 99.4
Building 98.9
Factory 98.6
Assembly Line 98.5
Person 97.3
Person 91.6
Sewing 77.5
Person 72.7
Person 69.8

Clarifai
created on 2019-06-06

people 100
adult 99.6
group 99.5
man 97.7
many 97.2
group together 96.6
administration 96.5
monochrome 95
leader 94.5
woman 93.2
war 91.6
several 91.3
military 90.5
sit 88.2
child 86.9
one 86.5
merchant 86.3
two 86
four 85.1
commerce 84.9

Imagga
created on 2019-06-06

architecture 20.3
industry 19.6
iron lung 19.4
building 19.1
old 18.1
house 17.5
city 17.4
respirator 16.5
factory 14.7
industrial 14.5
device 14.4
man 14.1
equipment 13.4
travel 13.4
construction 12
street 12
breathing device 11.8
machine 11.6
sky 11.5
town 11.1
person 11
manufacturing 10.7
antique 10.6
metal 10.5
iron 10.4
people 10
vintage 9.9
history 9.8
steel 9.7
structure 9.6
stone 9.6
light 9.3
modern 9.1
ancient 8.6
work 8.6
business 8.5
design 8.4
tourism 8.2
window 8.2
landscape 8.2
dirty 8.1
day 7.8
black 7.8
male 7.8
scene 7.8
roof 7.7
gas 7.7
winter 7.7
power 7.5
outdoors 7.5
smoke 7.4
floor 7.4
life 7.2
snow 7.1
working 7.1
part 7

Google
created on 2019-06-06

Stock photography 59.4
History 54.1
Art 50.2

Microsoft
created on 2019-06-06

drawing 96.5
clothing 87.4
person 85.3
sketch 80.7
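
Each service above pairs a tag with a confidence score on a 0–100 scale. A minimal sketch of thresholding such output to keep only high-confidence tags, using a hypothetical subset of the Rekognition labels listed above:

```python
# Minimal sketch: filter machine-generated tags by confidence score.
# The (label, confidence) pairs below are a hypothetical subset copied
# from the Amazon Rekognition list above.
labels = [
    ("Person", 99.4),
    ("Building", 98.9),
    ("Factory", 98.6),
    ("Assembly Line", 98.5),
    ("Sewing", 77.5),
]

def confident_labels(pairs, threshold=90.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # → ['Person', 'Building', 'Factory', 'Assembly Line']
```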

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 50%
Angry 49.6%
Calm 49.5%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Disgusted 49.5%
Sad 50.4%

AWS Rekognition

Age 38-59
Gender Female, 50.5%
Disgusted 49.5%
Happy 49.7%
Confused 49.5%
Sad 49.9%
Angry 49.5%
Surprised 49.5%
Calm 49.7%

AWS Rekognition

Age 17-27
Gender Female, 50.3%
Disgusted 49.5%
Surprised 49.6%
Sad 49.8%
Confused 49.6%
Calm 49.9%
Happy 49.6%
Angry 49.6%

AWS Rekognition

Age 26-43
Gender Female, 53.7%
Calm 53.3%
Angry 45.2%
Disgusted 45.1%
Happy 45.2%
Confused 45.2%
Sad 45.8%
Surprised 45.2%

AWS Rekognition

Age 23-38
Gender Male, 50.4%
Surprised 49.5%
Angry 49.8%
Confused 49.5%
Sad 49.8%
Calm 49.5%
Disgusted 49.9%
Happy 49.5%

AWS Rekognition

Age 19-36
Gender Female, 50.2%
Angry 49.6%
Confused 49.6%
Disgusted 49.6%
Surprised 49.6%
Calm 49.6%
Sad 49.8%
Happy 49.7%

AWS Rekognition

Age 19-36
Gender Female, 52%
Confused 45.5%
Sad 46%
Angry 45.6%
Disgusted 47.2%
Happy 45.5%
Surprised 45.4%
Calm 49.8%
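
In each face record above, the emotion scores are nearly uniform (around 49–50%), so the reported emotion is simply whichever score edges out the rest. A minimal sketch of picking the dominant emotion from one such record, with scores copied from the first face analysis above:

```python
# Minimal sketch: pick the dominant emotion from a Rekognition-style
# face record. Scores are copied from the first face analysis above.
emotions = {
    "Angry": 49.6, "Calm": 49.5, "Confused": 49.5, "Happy": 49.5,
    "Surprised": 49.5, "Disgusted": 49.5, "Sad": 50.4,
}

# max() over the dict keys, ranked by their score values
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Sad
```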

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

The
Handicapped
The Co-operative Workrooms
Co-operative
Workrooms
Women
tor

Google

The Co-operative Workrooms for Handicapped Women
The
Co-operative
Workrooms
for
Handicapped
Women