Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1888

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Machine 100
Spoke 100
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Wheel 97.5
Clothing 96.2
Outdoors 95.6
Person 94.3
Adult 88.8
Male 88.8
Man 88.8
Person 88.8
Wood 80.7
Water 80.4
Waterfront 80.4
Nature 77
Architecture 65
Building 65
Factory 65
Manufacturing 65
Face 64.1
Head 64.1
Transportation 64
Vehicle 64
Hat 58.9
Carpenter 57.4
Coat 56.8
Amusement Park 56.1
Construction 55.9
Cap 55.8
Axle 55.7
Bicycle 55.6
Oilfield 55.6
Shelter 55.4
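
The label-and-confidence pairs above match the output shape of an image-labeling API such as Amazon Rekognition's DetectLabels, where each label carries a confidence score on a 0-100 scale. A minimal sketch of how tags like these could be generated follows; the S3 bucket and file names are placeholder assumptions, not part of the museum record.

    # Minimal sketch: generate label tags for an image with AWS Rekognition.
    # The S3 bucket and object key below are placeholders, not real assets.
    import boto3

    rekognition = boto3.client("rekognition")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_greenbelt.jpg"}},
        MaxLabels=50,        # cap on the number of labels returned
        MinConfidence=55.0,  # drop labels below roughly 55%, as in the list above
    )

    # Print "<label> <confidence>" pairs, mirroring the tag list format.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')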

Clarifai
created on 2018-05-11

people 99.8
adult 98.2
transportation system 97
vehicle 96.8
man 95.2
group 93.1
two 91
group together 87.6
one 86.3
child 85
nostalgia 83.2
woman 82.2
war 82.2
driver 81.7
monochrome 81.5
recreation 81.1
cart 80.7
chair 80.5
military 80
wear 78

Imagga
created on 2023-10-06

machine 22.9
park 20
old 19.5
building 18.4
wheeled vehicle 15.7
sky 15.3
structure 15.3
work 14.4
tree 14.2
tract 13.7
architecture 13.4
device 13.2
vacation 13.1
wheel 13
wicker 12.7
travel 12.7
factory 12.6
water 12
construction 12
industry 11.9
landscape 11.9
wagon 11.5
rural 11.5
house 10.9
vintage 10.7
outdoor 10.7
industrial 10
outdoors 9.7
equipment 9.7
vehicle 9.5
mechanical device 9.4
iron 9.3
holiday 9.3
steel 9.1
snow 9.1
transportation 9
river 8.9
trees 8.9
sun 8.9
metal 8.8
country 8.8
cold 8.6
mechanism 8.5
power 8.4
container 8.1
product 7.8
plant 7.7
winter 7.7
city 7.5
countryside 7.3
transport 7.3

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98.4
tree 98.3
black 75.8
pulling 30.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Sad 100%
Surprised 6.9%
Happy 6%
Fear 5.9%
Confused 2.2%
Disgusted 2%
Angry 1.4%
Calm 0.2%
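
The age range, gender, and emotion scores above follow the shape of Amazon Rekognition's DetectFaces response. A hedged sketch of how such a face analysis could be requested is shown below; the image location is again a placeholder assumption.

    # Minimal sketch: face analysis with AWS Rekognition DetectFaces.
    # The S3 bucket and object key are placeholders, not the actual museum asset.
    import boto3

    rekognition = boto3.client("rekognition")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_greenbelt.jpg"}},
        Attributes=["ALL"],  # include age range, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')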

Feature analysis

Amazon

Adult 98.1%
Male 98.1%
Man 98.1%
Person 98.1%
Wheel 97.5%
Hat 58.9%