Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1873

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

War 99.3
People 98.5
Transportation 98.4
Truck 98.4
Vehicle 98.4
Person 98.1
Person 97.6
Person 96.6
Person 95.3
Person 95.2
Machine 94.9
Wheel 94.9
Person 94.9
Person 94.5
Person 93.8
Wheel 93.2
Person 92.5
Person 91
Person 90.1
Person 89.7
Person 83.4
Accessories 81.7
Bag 81.7
Handbag 81.7
Person 81.3
Person 79.1
Car 74.7
Smoke 74.5
Person 73.8
Person 73.7
Bus 72.1
Person 69.2
Car 68.5
Person 67.5
Person 64.6
Person 64.5
Outdoors 62.1
Clothing 60.6
Hat 60.6
Person 59
Footwear 58.5
Shoe 58.5
Construction 57.7
Construction Crane 57.7
Road 57.5
Person 55.8
License Plate 55.7
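
These name-and-score pairs have the shape of Amazon Rekognition label detection output: a label name paired with a confidence score from 0 to 100. As a minimal sketch of how such tags could be produced (assuming configured AWS credentials and a hypothetical local file name; an illustration, not the museums' actual pipeline):

    # Sketch: image tagging with Amazon Rekognition's DetectLabels.
    import boto3

    client = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("branchville_md_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # the lowest score listed above is 55.7
    )

    # Each result carries a name and a 0-100 confidence score,
    # mirroring rows such as "License Plate 55.7".
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")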

Clarifai
created on 2018-05-11

people 99.8
vehicle 99.6
group together 98.7
transportation system 98.2
many 97.6
group 96.5
car 95.9
adult 95.1
police 94.3
driver 93.9
truck 92
street 91.6
war 91.3
road 91
military 90.4
man 89.8
crowd 89
uniform 88.7
administration 87.3
soldier 86.1

Imagga
created on 2023-10-06

vehicle 100
steamroller 54.2
truck 51
motor vehicle 37.6
conveyance 36.7
machine 36
wheeled vehicle 28.3
tractor 28.2
equipment 27.6
transportation 26
machinery 25.3
transport 24.7
work 23.6
farm 23.2
half track 22.7
industry 22.2
industrial 21.8
sky 21
tow truck 20.3
military vehicle 20.2
car 19.9
landscape 19.3
grass 19
road 19
tracked vehicle 18.5
rural 18.5
agriculture 18.4
dirt 18.1
field 17.6
heavy 16.2
wheel 16.1
golf equipment 15.5
tire 14.9
construction 14.5
outdoor 13.8
power 13.4
drive 13.2
driving 12.6
auto 12.4
ground 12.3
site 12.2
sports equipment 11.6
summer 11.6
working 11.5
land 11.1
farmer 11
danger 10.9
automobile 10.5
old 10.5
farming 10.4
earth 10.1
engine 9.6
yellow 9.3
lorry 9.1
bulldozer 8.8
move 8.6
business 8.5
harvest 8.5
environment 8.2
job 8
trailer 7.9
dig 7.9
sand 7.9
hay 7.8
soil 7.8
scene 7.8
cargo 7.8
track 7.7
container 7.6
device 7.6
pickup 7.5
house 7.5
outdoors 7.5
action 7.4
man 7.4
street 7.4
speed 7.3
metal 7.2
tool 7.2
activity 7.2
male 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.4
sky 99
old 54.3
golfcart 19.6
car 14.6

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Happy 95.7%
Surprised 6.5%
Fear 6%
Sad 2.2%
Angry 2.1%
Disgusted 0.5%
Calm 0.5%
Confused 0.2%

AWS Rekognition

Age 16-22
Gender Male, 95.3%
Calm 53.4%
Happy 21.9%
Angry 14.3%
Surprised 7%
Fear 6.1%
Sad 4.5%
Disgusted 1.5%
Confused 1.4%

AWS Rekognition

Age 43-51
Gender Female, 70.7%
Angry 90%
Surprised 8.9%
Fear 6.5%
Sad 2.3%
Disgusted 1.9%
Calm 0.7%
Happy 0.4%
Confused 0.2%

AWS Rekognition

Age 12-20
Gender Male, 84.2%
Calm 51.2%
Disgusted 18%
Angry 9.8%
Surprised 9.2%
Confused 7.6%
Fear 6.3%
Happy 4.8%
Sad 3.2%
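
Each face record above pairs an estimated age range and gender with a confidence score for each of eight emotions. This matches the shape of Rekognition face detection output when all facial attributes are requested. A minimal sketch under the same assumptions as the labeling example (configured credentials, hypothetical file name):

    # Sketch: face analysis with Amazon Rekognition's DetectFaces.
    import boto3

    client = boto3.client("rekognition")

    with open("branchville_md_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # needed for age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotion types come back uppercase (e.g. "HAPPY") with a
        # per-emotion confidence, as in the "Happy 95.7%" rows above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")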

Feature analysis

Amazon

Truck 98.4%
Person 98.1%
Wheel 94.9%
Handbag 81.7%
Car 74.7%
Bus 72.1%
Hat 60.6%
Shoe 58.5%
