Human Generated Data

Title

Untitled (Jersey Homesteads, New Jersey)

Date

1939

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3579

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Male 98.9
Person 98.9
Boy 98.9
Child 98.9
Person 98.8
Architecture 98.6
Building 98.6
Outdoors 98.6
Shelter 98.6
Person 98.4
Adult 98
Male 98
Man 98
Person 98
Nature 91.3
Clothing 86.9
Hat 86.9
Person 85.8
People 85.1
Face 80.4
Head 80.4
War 69.9
Countryside 69.6
Car 69
Transportation 69
Vehicle 69
Machine 60.4
Wheel 60.4
Rural 57.3
Factory 56.5
Worker 56.1
Hut 55.9
Coat 55.2
Manufacturing 55.2

Clarifai
created on 2018-05-10

people 100
adult 99.5
group 99.3
group together 98.1
war 97.8
military 97
many 96.8
man 95
child 94.5
administration 94
soldier 93.2
two 93.1
vehicle 93
several 91.6
woman 90.9
one 89.8
wear 88.6
home 88.4
waste 88
skirmish 87.4

Imagga
created on 2023-10-06

machine 55.9
thresher 42.1
farm machine 34.9
plow 32.3
stretcher 31.4
tool 29.7
factory 28.7
industry 27.3
vehicle 26.8
industrial 25.4
litter 25
tractor 24.5
machinery 23.5
work 22
device 21.9
dirt 21
conveyance 20.3
heavy 20
construction 19.7
equipment 19.6
wheel 18.9
track 18.8
power 18.5
bulldozer 17.9
old 16.7
earth 16.5
transport 16.4
transportation 16.1
plant 15.3
farm 15.2
grass 14.2
rural 14.1
building 13.5
steel 13.3
shovel 12.9
sky 12.8
danger 12.7
military 12.6
working 12.4
digging 11.8
structure 11.5
metal 11.3
yellow 11.3
iron 11.2
smoke 11.2
land 11.1
car 10.9
field 10.9
excavator 10.9
tree 10.8
bucket 10.7
steam 10.7
engine 10.6
outdoors 10.5
ground 10.4
building complex 10.2
man 10.1
dig 9.9
gun 9.8
agriculture 9.7
sand 9.6
site 9.4
fire 9.4
truck 9.2
safety 9.2
environment 9.1
dirty 9
tire 9
landscape 8.9
hoe 8.9
hydraulic 8.9
loader 8.9
soldier 8.8
war 8.7
build 8.5
locomotive 8
mover 7.9
scoop 7.9
destruction 7.8
disaster 7.8
soil 7.8
outside 7.7
train 7.7
pollution 7.7
farming 7.6
backhoe 7.6
action 7.4
farmer 7.4
back 7.3
protection 7.3
road 7.2
black 7.2
hay 7.2
engineer 7.2
worker 7.1
job 7.1
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 96.4
old 70.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 91%
Happy 62.4%
Calm 14.4%
Fear 9.8%
Surprised 8%
Confused 3.8%
Angry 3.6%
Sad 3.1%
Disgusted 1.7%

AWS Rekognition

Age 19-27
Gender Male, 99.2%
Calm 60.8%
Happy 37.7%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 56.5%
Sad 41.6%
Fear 10.9%
Surprised 6.5%
Confused 2.9%
Angry 2.4%
Disgusted 1.7%
Happy 1.7%

AWS Rekognition

Age 25-35
Gender Male, 79.9%
Surprised 70.6%
Calm 57.6%
Fear 5.9%
Sad 2.2%
Disgusted 0.3%
Angry 0.1%
Happy 0.1%
Confused 0.1%

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Boy 98.9%
Child 98.9%
Car 69%
Wheel 60.4%