Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1883

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 98.3
Person 98
Adult 98
Male 98
Man 98
Person 98
Person 97.8
Machine 97.8
Wheel 97.8
Person 97.6
Construction 94.3
Wheel 93.1
Water 92.5
Waterfront 92.5
Person 91
Spoke 87.9
Outdoors 85.6
Arch 85
Architecture 85
Oilfield 78
Wheel 77.9
Wheel 70.2
Person 67.7
Transportation 66.8
Vehicle 66.8
Head 61.8
Worker 60.5
Person 57.3
Building 56.8
Factory 56.8
Weapon 55.4

Clarifai
created on 2018-05-11

people 99.9
vehicle 99.8
group together 99.6
group 99.4
transportation system 99.2
adult 98.6
cavalry 98
many 97
wagon 97
carriage 96
man 95.2
military 95.2
driver 94.7
war 93.7
soldier 91.7
several 91.5
skirmish 87.2
two 86.9
cart 85
railway 83.3

Imagga
created on 2023-10-06

architecture 25.1
sky 23.6
device 22.8
park 22.5
city 22.4
machine 21.2
thresher 20.5
building 19.5
travel 19
cannon 16.9
farm machine 16.4
landscape 16.4
history 16.1
tract 16.1
statue 16.1
old 16
silhouette 14.1
tourism 12.4
monument 12.1
gun 12.1
landmark 10.8
outdoor 10.7
sculpture 10.6
pollution 10.6
outdoors 10.5
smoke 10.2
church 10.2
man 10.1
house 10.1
weapon 10.1
male 9.9
urban 9.6
locomotive 9.6
people 9.5
historical 9.4
famous 9.3
industrial 9.1
vehicle 8.8
outside 8.6
culture 8.5
capital 8.5
industry 8.5
wheeled vehicle 8.5
structure 8.5
horizontal 8.4
sunset 8.1
transportation 8.1
tower 8.1
roof 8.1
sun 8.1
water 8
bulldozer 7.9
chimney 7.8
steam locomotive 7.8
tree 7.8
construction 7.7
musical instrument 7.4
symbol 7.4
environment 7.4
vacation 7.4
historic 7.3
metal 7.2
weathercock 7.1
night 7.1
factory 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

old 88.9
black 78.7
white 77.9
group 58.3

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 98.3%
Calm 34.2%
Sad 27.4%
Happy 24.6%
Confused 12.9%
Surprised 7.4%
Fear 6.3%
Disgusted 3.9%
Angry 1.2%

AWS Rekognition

Age 24-34
Gender Female, 93.9%
Calm 90.9%
Fear 6.4%
Surprised 6.4%
Sad 3.9%
Happy 1.2%
Disgusted 0.9%
Angry 0.8%
Confused 0.3%

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Wheel 97.8%