Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1902

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-06

Carpenter 100
Wood 99.3
Person 98.9
Worker 98.8
Person 97.7
Adult 97.7
Male 97.7
Man 97.7
Construction 97.4
Person 95.1
Person 90.1
Adult 90.1
Male 90.1
Man 90.1
Clothing 79.2
Hat 79.2
Animal 76.8
Horse 76.8
Mammal 76.8
Water 76.1
Waterfront 76.1
People 57.5
Lumber 55.8
Oilfield 55.7
Outdoors 55.7
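
Label/confidence pairs like the ones above are the shape of output returned by AWS Rekognition's label detection. The following is a minimal sketch only, assuming configured AWS credentials and a hypothetical local copy of the photograph ("greenbelt_1936.jpg"); it is not the museum's actual pipeline.

    # Illustrative sketch: file name and thresholds are assumptions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("greenbelt_1936.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,
            MinConfidence=55,  # the list above bottoms out around 55%
        )

    for label in response["Labels"]:
        # Prints lines such as "Carpenter 100.0" or "Wood 99.3".
        print(f"{label['Name']} {label['Confidence']:.1f}")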

Clarifai
created on 2018-05-11

people 99.6
group together 99.2
many 97.2
group 97.2
adult 95.8
man 94.9
vehicle 91.8
recreation 87
military 85.8
war 85.6
crowd 85.3
competition 84.8
administration 78.9
wear 77.8
woman 77.7
several 76
transportation system 75.3
soldier 74.2
watercraft 74
action 72.1
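
Concepts like these come from Clarifai's general image-recognition model. A hedged sketch against Clarifai's v2 REST API follows; the API key, image URL, and model identifier are placeholder assumptions, and concept values are returned on a 0-1 scale.

    # Illustrative sketch: key, URL, and model id are placeholders.
    import requests

    CLARIFAI_KEY = "YOUR_CLARIFAI_API_KEY"
    IMAGE_URL = "https://example.org/greenbelt_1936.jpg"

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Scale 0-1 values to percentages to match the list above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")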

Imagga
created on 2023-10-06

stretcher 55.1
litter 43.2
conveyance 42.5
shopping cart 24.9
sky 20.4
shopping 20.2
metal 19.3
cart 18.5
handcart 17.7
building 17.5
wheeled vehicle 16.2
buy 15
industry 14.5
sale 13.9
steel 13.8
construction 13.7
structure 13.5
business 13.4
market 13.3
shop 13.1
urban 13.1
industrial 12.7
supermarket 12.7
water 12.7
people 12.3
empty 12
park 11.9
trolley 11.8
power 11.7
architecture 11.7
retail 11.4
store 11.3
work 11.1
man 10.8
tower 10.7
tract 10.6
container 10.3
chair 10
city 10
outdoors 9.7
rural 9.7
outdoor 9.2
basket 8.9
equipment 8.8
machine 8.7
gas 8.7
bridge 8.6
outside 8.6
push 8.6
fence 8.5
old 8.4
metallic 8.3
transport 8.2
technology 8.2
vehicle 8
women 7.9
device 7.8
pipe 7.8
purchase 7.7
fuel 7.7
trade 7.6
two 7.6
clouds 7.6
stile 7.6
energy 7.6
commercial 7.5
commerce 7.5
brass 7.4
yellow 7.3
support 7.3
holiday 7.2
river 7.1
day 7.1
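
Imagga exposes its auto-tagging through a REST endpoint. A minimal sketch, assuming hypothetical API credentials and a hosted copy of the image:

    # Illustrative sketch: key, secret, and image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/greenbelt_1936.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    resp.raise_for_status()

    for entry in resp.json()["result"]["tags"]:
        # Prints lines such as "stretcher 55.1".
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")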

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

black 88.2
outdoor 85.3
standing 82.4
old 54.5
vintage 26.3
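
Tags of this kind are produced by the Azure Computer Vision service. A hedged sketch against its REST tagging endpoint; the resource endpoint, subscription key, and image URL are placeholder assumptions, and confidences are returned on a 0-1 scale.

    # Illustrative sketch: endpoint, key, and image URL are placeholders.
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_VISION_KEY"

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/json"},
        json={"url": "https://example.org/greenbelt_1936.jpg"},
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # Scale 0-1 confidences to percentages to match the list above.
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")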

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 89.1%
Calm 84.8%
Sad 8.2%
Surprised 6.7%
Fear 6%
Confused 2%
Angry 1.6%
Disgusted 0.5%
Happy 0.2%
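
The age range, gender, and emotion estimates above match the shape of AWS Rekognition's face-detection output. A hedged boto3 sketch, again assuming a hypothetical local copy of the photograph:

    # Illustrative sketch: assumes configured AWS credentials and a
    # hypothetical local file.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("greenbelt_1936.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # e.g. "Calm 84.8%", "Sad 8.2%", ...
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")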

Feature analysis

Amazon

Person 98.9%
Adult 97.7%
Male 97.7%
Man 97.7%
Horse 76.8%

Categories