Human Generated Data

Title

Two Women, Ware, Massachusetts Factory

Date

20th century

People

Artist: Jerry Liebling, American, 1924–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, National Endowment for the Arts Grant, P1983.8

Machine Generated Data

Tags (label, confidence score in %)

Amazon
created on 2022-02-26

Wood 99.8
Machine 99.1
Person 98.9
Human 98.9
Person 98
Plywood 97.9
Lathe 96.1
Workshop 95.2
Person 83.2
Building 82.9
Wheel 60.2
Factory 57.4
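Each machine-generated line above pairs a label with a confidence score, read as a percentage. A minimal sketch (not any service's actual API, and using only a hand-copied subset of the tags above) of how such label/score lines could be parsed and filtered by a confidence threshold:

```python
# Abbreviated subset of the Amazon tag list above, as raw "Label score" lines.
labels = """Wood 99.8
Machine 99.1
Person 98.9
Wheel 60.2
Factory 57.4"""

def parse_tags(text):
    """Split 'Label score' lines into (label, float score) pairs.

    rpartition on the last space keeps multi-word labels
    (e.g. 'raw material 96') intact.
    """
    tags = []
    for line in text.splitlines():
        name, _, score = line.rpartition(" ")
        tags.append((name, float(score)))
    return tags

def confident(tags, threshold=90.0):
    """Keep only labels at or above the given confidence (percent)."""
    return [name for name, score in tags if score >= threshold]

print(confident(parse_tags(labels)))  # ['Wood', 'Machine', 'Person']
```

The same parsing applies unchanged to the Clarifai and Imagga lists below, since they share the "label score" layout.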

Clarifai
created on 2023-10-29

grinder 99.6
industry 99.3
production 99.2
people 98.9
one 97.3
machine 97
raw material 96
adult 95.9
indoors 95.1
group 94.3
man 92.8
worker 92.1
artisan 92
skill 89.9
woman 89.7
bench 89.1
dig 89
loom 88.8
saw 86.6
metalwork 86.4

Imagga
created on 2022-02-26

factory 77.2
machine 66.6
sawmill 62.1
power saw 60.9
power tool 45.8
plant 40.7
industry 30.7
industrial 29.9
tool 27
building complex 24.7
man 24.2
work 22.7
equipment 21.6
steel 20.3
metal 20.1
worker 18.7
working 18.5
power 18.5
loom 17.9
job 17.7
mechanic 17.6
mechanical 17.5
machinery 16.3
structure 16.1
male 15.6
textile machine 15.6
engineering 15.2
device 14.8
person 14.8
manufacturing 14.6
workshop 14.1
tools 13.3
iron 13.1
circular saw 12.7
building 12.7
labor 12.6
repair 12.4
car 12
construction 12
occupation 11.9
wood 11.7
heavy 11.4
wheel 11.3
men 11.2
people 11.1
old 11.1
safety 11
inside 11
interior 10.6
skill 10.6
engine 10.6
indoors 10.5
technology 10.4
modern 9.8
business 9.7
profession 9.6
vehicle 9.3
energy 9.2
enterprise 8.8
manufacture 8.8
production 8.7
pipe 8.7
lathe 8.7
gear 8.7
transportation 8.1
handsome 8
lifestyle 7.9
wooden 7.9
engineer 7.9
garage 7.9
helmet 7.7
professional 7.6
sit 7.6
shop 7.5
environment 7.4
training 7.4
indoor 7.3
room 7.3
protection 7.3
dirty 7.2
cut 7.2
adult 7.1
carpenter 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

indoor 97
floor 92.7
person 76.9
clothing 76.2
woman 62.5
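Because four services tagged the same photograph, their vocabularies can be compared directly. A hedged illustration using a hand-abbreviated, lowercased subset of the tag lists above (not the full lists, so counts are only indicative):

```python
from itertools import combinations

# Lowercased subsets of each service's tags from the lists above.
amazon = {"wood", "machine", "person", "workshop", "factory", "wheel"}
clarifai = {"grinder", "industry", "machine", "people", "loom", "woman"}
imagga = {"factory", "machine", "sawmill", "loom", "worker", "wheel"}
microsoft = {"indoor", "floor", "person", "clothing", "woman"}

# Labels every service reports (full consensus).
consensus = amazon & clarifai & imagga & microsoft

# Labels shared by at least three of the four services.
triple = set()
for a, b, c in combinations([amazon, clarifai, imagga, microsoft], 3):
    triple |= a & b & c

print(sorted(consensus), sorted(triple))  # [] ['machine']
```

On this subset no label survives all four services, while "machine" is the only three-way agreement, which matches the industrial subject of the photograph.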

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 99%
Calm 85.6%
Sad 6.1%
Angry 3.5%
Confused 2.6%
Surprised 0.9%
Disgusted 0.4%
Happy 0.4%
Fear 0.3%
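The emotion scores above behave like a distribution over percentages. A small sketch reducing them to the single most likely emotion and checking that the scores sum to roughly 100 (values copied from the list above):

```python
# AWS Rekognition-style emotion confidences for the detected face, in percent.
emotions = {
    "Calm": 85.6, "Sad": 6.1, "Angry": 3.5, "Confused": 2.6,
    "Surprised": 0.9, "Disgusted": 0.4, "Happy": 0.4, "Fear": 0.3,
}

# The emotion with the highest confidence.
dominant = max(emotions, key=emotions.get)

# Confidences should cover (nearly) the whole distribution.
total = sum(emotions.values())

print(dominant, round(total, 1))  # Calm 99.8
```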

Feature analysis

Amazon

Person
Person 98.9%

Categories