Human Generated Data

Title

Untitled (Pennsylvania?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4281.4

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Gravel 99.5
Road 99.5
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Machine 96.4
Spoke 96.4
Person 94.9
Wheel 94.5
Outdoors 89.3
Soil 89.3
Nature 77.6
Head 74
Face 67.1
Coal 57.3
Field 57
Garden 57
Gardener 57
Gardening 57
Tarmac 56.9
Mining 56.7
Alloy Wheel 56.4
Car 56.4
Car Wheel 56.4
Tire 56.4
Transportation 56.4
Vehicle 56.4
Clothing 56.4
Hat 56.4
Worker 56.4
Architecture 55.4
Building 55.4
Factory 55.4
Agriculture 55.2
Countryside 55.2
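
The labels above follow the shape of Amazon Rekognition's label-detection output: a label name plus a confidence score. As an illustration only, a minimal boto3 sketch that would produce this kind of list is shown below; the bucket and object key are placeholders, not the museum's actual storage, and the museum's real pipeline is not documented here.

```python
import boto3

# Assumed setup: the photograph is stored in an S3 bucket; bucket/key are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "P1970_4281_4.jpg"}},
    MaxLabels=50,        # cap on how many labels are returned
    MinConfidence=55.0,  # drop labels below this score, similar to the cutoff in the list above
)

# Print each label with its confidence, mirroring the "Label  score" lines above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```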

Clarifai
created on 2018-05-10

people 99.7
adult 97.6
one 95.2
man 93.9
woman 90
group 89.6
group together 89.1
veil 89
two 87.8
wear 86.8
music 84.5
vehicle 84.4
portrait 83
street 83
art 82.6
child 80.9
musician 80
outfit 79.5
monochrome 77
actor 76.3

Imagga
created on 2023-10-06

man 18.1
water 18
machine 17.7
chain saw 16.3
outdoors 15.8
tool 14.8
rock 13.9
sand 13.9
industry 13.7
vehicle 13.3
power saw 13.3
person 13.2
machinist 13
stone 12.8
travel 12.7
adult 12.3
people 12.3
work 11.4
beach 11
sky 10.8
outdoor 10.7
old 10.4
power tool 10.2
park 9.9
worker 9.8
river 9.8
landscape 9.7
country 9.7
tractor 9.6
love 9.5
action 9.3
tree 9.2
field 9.2
industrial 9.1
working 8.8
rural 8.8
car 8.7
device 8.6
construction 8.6
male 8.5
two 8.5
structure 8.3
environment 8.2
wheeled vehicle 8.2
bulldozer 8.1
road 8.1
to 8
day 7.8
happiness 7.8
summer 7.7
attractive 7.7
safety 7.4
equipment 7.4
protection 7.3
black 7.2
wet 7.2

Microsoft
created on 2018-05-10

posing 36.4

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 91.2%
Sad 72.4%
Calm 53.1%
Confused 10.4%
Surprised 6.4%
Fear 6%
Happy 1.3%
Disgusted 0.7%
Angry 0.5%
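
The age range, gender, and emotion scores above match the structure of Rekognition's face-detection output. A hedged sketch of how such an analysis could be requested is shown below; again, the bucket and key are placeholder names.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] asks for age range, gender, and emotion estimates,
# not just the default bounding box and pose information.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "P1970_4281_4.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back as a list of type/confidence pairs, highest first when sorted.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```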

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Wheel 94.5%

Categories

Imagga

paintings art 99.1%

Captions

Microsoft
created on 2018-05-10

a boy standing in front of a building 30.2%

Text analysis

Amazon

College
Art
and
University
of
(Harvard
Fellows
President and Fellows of Harvard College (Harvard University Art Museums)
Museums)
Harvard
President
P1970.4281
P1970.4281 1.0004
1.0004
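
The full credit line and the single-word fragments above resemble Rekognition's text-detection output, which returns both LINE and WORD detections for the stamp on the print. A minimal sketch, with the same placeholder bucket and key as above:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "P1970_4281_4.jpg"}},
)

# Rekognition returns whole lines ("LINE") and individual words ("WORD"),
# which is why both the complete credit line and single tokens appear above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```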

Google

@ President and Fellows of Harvard College (Harvard University Art Museums) P1970.4281.0004
@
and
of
President
Fellows
Harvard
College
(Harvard
University
Art
Museums)
P1970.4281.0004