Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1891

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

War 99.5
People 99.3
Transportation 98.2
Truck 98.2
Vehicle 98.2
Person 96.5
Person 96.4
Person 96.3
Person 96.2
Person 95.3
Person 94.3
Person 94.1
Machine 94.1
Wheel 94.1
Person 94
Wheel 93.8
Person 93.8
Person 92.1
Person 88.8
Person 86.1
Car 85
Person 77.4
Face 76.2
Head 76.2
Person 72.9
Person 72.6
Person 72.5
Person 70.7
Outdoors 69.3
Person 69
Person 67.5
Person 66.8
Smoke 64.3
Road 62.3
Person 57.9
Person 57.1
License Plate 57.1
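
These tag/score pairs have the shape of output from AWS Rekognition's DetectLabels API: each label name paired with a confidence score in percent. A minimal sketch of how such tags could be regenerated with boto3, assuming configured AWS credentials and a hypothetical local filename:

    # Minimal sketch: Rekognition-style tags for a local image file.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical filename; any local copy of the photograph would do.
    with open("shahn_branchville_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # the tags above bottom out near 57
    )

    # Each label carries a name and a percent confidence, matching the
    # "War 99.5", "People 99.3", ... rows above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")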

Clarifai
created on 2018-05-11

people 99.7
vehicle 99.3
group together 98.6
many 97.9
transportation system 96.4
group 95.9
police 95.2
military 94.1
war 93.1
adult 93
car 92.2
crowd 91
truck 90.4
driver 89.2
administration 88.5
man 87.6
uniform 87.5
soldier 87
road 86.2
street 85.5
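
These concepts match the output of Clarifai's general prediction model. A minimal sketch using the legacy clarifai 2.x Python client (the client current when these tags were created in 2018); the API key and image URL are placeholders, and the exact client interface is an assumption:

    # Minimal sketch, assuming the legacy clarifai 2.x client.
    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_CLARIFAI_KEY")  # placeholder key
    model = app.public_models.general_model

    # Hypothetical image URL standing in for the photograph.
    response = model.predict_by_url("https://example.org/shahn_branchville_1936.jpg")

    # Concepts carry a name and a 0-1 value; scale to the percentages above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")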

Imagga
created on 2023-10-06

vehicle 84
truck 59.3
motor vehicle 39.8
steamroller 35.3
machine 30.8
tow truck 29.7
tractor 27
farm 26.8
wheeled vehicle 26
equipment 25.3
car 24
conveyance 22.4
landscape 22.3
field 21.8
transportation 21.5
machinery 21.4
work 21.2
rural 21.2
sky 21.1
agriculture 20.2
road 19.9
transport 19.2
industrial 19.1
industry 18.8
tire 18.4
half track 18
military vehicle 17.9
plow 17.9
grass 17.4
wheel 15.2
tracked vehicle 14.8
dirt 14.3
heavy 14.3
farming 13.3
land 13
farmer 12.8
drive 12.3
tool 12
construction 12
danger 11.8
power 11.8
golf equipment 11.8
summer 11.6
working 11.5
auto 11.5
outdoor 11.5
driving 10.6
automobile 10.5
ground 10.4
trailer 9.7
harvest 9.4
site 9.4
countryside 9.1
environment 9
sports equipment 8.9
wheels 8.8
hay 8.8
agricultural 8.8
old 8.4
lorry 8.3
earth 8.2
4x4 7.9
sand 7.9
soil 7.8
scene 7.8
cargo 7.8
engine 7.7
safety 7.4
yellow 7.3
trees 7.1
job 7.1
steel 7.1
day 7.1
country 7
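
These tags match the shape of results from Imagga's v2 tagging endpoint. A minimal sketch against https://api.imagga.com/v2/tags with HTTP Basic auth; the key, secret, image URL, and exact response shape are assumptions based on Imagga's documented REST API:

    # Minimal sketch, assuming Imagga's v2 /tags REST endpoint.
    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
    API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
    IMAGE_URL = "https://example.org/shahn_branchville_1936.jpg"  # hypothetical

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Tags arrive as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}},
    # flattening to the "vehicle 84", "truck 59.3", ... rows above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")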

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
sky 99.2
people 62
old 47.8
golfcart 34.1
car 16.1
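
These tags have the shape of output from Microsoft's Computer Vision image-tagging API. A minimal sketch using the azure-cognitiveservices-vision-computervision client, which postdates this 2018 record; the endpoint, key, and image URL are placeholders:

    # Minimal sketch: image tagging via Azure Computer Vision.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # placeholder key
    )

    # Hypothetical image URL standing in for the photograph.
    result = client.tag_image("https://example.org/shahn_branchville_1936.jpg")

    # Confidences are 0-1; scale to the percent-style rows above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")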

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 100%
Angry 38%
Happy 20.1%
Sad 15.3%
Disgusted 11.5%
Fear 9%
Surprised 7.8%
Calm 3.2%
Confused 2.8%

AWS Rekognition

Age 24-34
Gender Male, 100%
Sad 95.7%
Calm 46.5%
Surprised 6.3%
Fear 5.9%
Happy 2.7%
Angry 1.8%
Confused 0.3%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Male, 90.9%
Calm 86.2%
Fear 8.2%
Surprised 6.5%
Happy 3.3%
Sad 2.5%
Angry 2.3%
Disgusted 1.3%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Male, 97%
Calm 70.6%
Surprised 9.3%
Angry 7.6%
Happy 6.8%
Fear 6.1%
Disgusted 5.5%
Sad 3.2%
Confused 1%
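
The age/gender/emotion blocks above match the shape of Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local filename:

    # Minimal sketch: per-face attributes via Rekognition DetectFaces.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical filename; any local copy of the photograph would do.
    with open("shahn_branchville_1936.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        # Emotions come back unsorted; sort to match the descending lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")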

Feature analysis

Amazon

Truck 98.2%
Person 96.5%
Wheel 94.1%
Car 85%
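
These feature rows correspond to DetectLabels results that also carry located instances (bounding boxes) in the image. A minimal sketch of extracting them with boto3, under the same assumptions as the tag sketch above:

    # Minimal sketch: labels with bounding-box instances via DetectLabels.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical filename; any local copy of the photograph would do.
    with open("shahn_branchville_1936.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        # Labels with located instances are the detectable "features";
        # report the best instance confidence, as the summary rows do.
        instances = label.get("Instances") or []
        if instances:
            best = max(inst["Confidence"] for inst in instances)
            print(f"{label['Name']} {best:.1f}%")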

Text analysis

Amazon

2020
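
The detected string "2020" matches the shape of Rekognition's DetectText output, which returns line- and word-level detections. A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local filename:

    # Minimal sketch: text-in-image detection via Rekognition DetectText.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical filename; any local copy of the photograph would do.
    with open("shahn_branchville_1936.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Rekognition returns LINE and WORD detections; keep the lines.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])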