Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1850

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-06

Clothing 100
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Water 99
Waterfront 99
Person 98.9
Person 98.7
Wood 97.7
Person 96.4
Outdoors 86.7
Bicycle 86.4
Transportation 86.4
Vehicle 86.4
Hat 79.6
Machine 79.6
Wheel 79.6
Wheel 77.2
Wheel 76.5
Hardhat 70.3
Helmet 70.3
Sun Hat 57.9
Carpenter 56.9
Worker 56.7
Cap 56.7
Pants 56.2
Wheel 56.1
Spoke 56
Construction 55.9
Face 55.2
Head 55.2
Coat 55.1

Clarifai
created on 2018-05-11

people 100
group together 99.3
adult 98.9
vehicle 98.1
group 97.8
man 97.7
many 95.3
transportation system 95.3
military 95.2
war 92.8
soldier 92.6
uniform 90.7
police 90
two 88.9
administration 88.7
woman 85.6
wear 85.6
outfit 83.5
veil 82.6
street 79.8

Imagga
created on 2023-10-06

factory 48.6
industry 29
plant 24.5
industrial 23.6
man 22.2
work 22
building 21.5
machine 21.5
worker 21.1
construction 19.7
building complex 19
working 17.7
structure 17.2
people 16.7
job 15
equipment 15
city 15
old 14.6
machinery 14.6
steel 14.2
site 13.1
men 12.9
machinist 12.8
helmet 12.6
person 12.1
male 12.1
vehicle 11.8
power 11.8
house 11.7
labor 11.7
uniform 11.5
builder 11.3
business 10.9
heavy 10.5
device 10.3
safety 10.1
occupation 10.1
transportation 9.9
metal 9.7
architecture 9.4
iron 9.3
street 9.2
dirty 9
to 8.8
urban 8.7
dirt 8.6
tool 8.5
action 8.3
protection 8.2
mechanic 7.8
marimba 7.7
repair 7.7
fire 7.5
percussion instrument 7.4
engineer 7.2
road 7.2
home 7.2
activity 7.2
adult 7.1
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

tree 98.1
outdoor 97.8
person 95.7
standing 81.1
black 73.1
posing 67

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 91.8%
Sad 99.9%
Fear 12.4%
Surprised 7%
Calm 5.9%
Disgusted 2.3%
Confused 1.6%
Angry 1.5%
Happy 1.2%

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Bicycle 86.4%
Hat 79.6%
Wheel 79.6%
Helmet 70.3%

Categories

Imagga

paintings art 95.3%
people portraits 3.4%

Captions

Text analysis

Amazon

65