Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1869

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2023-10-06

Transportation 99.5
Truck 99.5
Vehicle 99.5
Person 98.8
Person 97.7
Adult 97.7
Male 97.7
Man 97.7
Person 96.1
Person 95.7
Machine 95.3
Wheel 95.3
Wheel 95.2
Person 91.3
Person 91.2
Person 91.2
Person 90
Person 88.7
Outdoors 87.9
Wheel 87.7
Person 86.4
Tire 78.5
Person 74.2
Face 70.9
Head 70.9
Nature 66.3
People 57.8
Spoke 57.1
Alloy Wheel 55.8
Car 55.8
Car Wheel 55.8
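
The Amazon tags above match the output shape of the AWS Rekognition DetectLabels API, which returns label names with confidence scores on a 0-100 scale. A minimal boto3 sketch of how such scores could be reproduced; the image file name is a hypothetical local copy of the photograph:

    # Sketch: label detection with AWS Rekognition via boto3.
    # Assumes AWS credentials are configured in the environment;
    # "shahn_branchville_1936.jpg" is a hypothetical file name.
    import boto3

    client = boto3.client("rekognition")
    with open("shahn_branchville_1936.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest score listed above is 55.8
        )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

The same call also returns per-instance bounding boxes, which is the likely source of the object scores in the Feature analysis section below.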

Clarifai
created on 2018-05-11

vehicle 99.2
transportation system 98.5
people 98.3
group together 97.8
truck 94.8
group 94.6
war 93.5
many 92.6
military 90.6
driver 89.1
car 89
street 88.3
adult 87.9
soldier 87.5
wagon 87
man 86.8
train 85.3
road 84.9
army 83.1
administration 82.6
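
The creation date suggests Clarifai's v2 prediction API, whose general model returns concepts with probabilities from 0 to 1; multiplying by 100 would give the values shown. A hedged sketch against the v2 REST endpoint; the API key, model ID, and image URL are placeholders, not values from this record:

    # Sketch: concept prediction with the Clarifai v2 REST API.
    # YOUR_API_KEY, GENERAL_MODEL_ID, and the image URL are placeholders.
    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/GENERAL_MODEL_ID/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/shahn.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))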

Imagga
created on 2023-10-06

vehicle 100
truck 77.6
military vehicle 67.8
half track 65
tracked vehicle 52.2
machine 43.1
wheeled vehicle 38.8
car 37
transportation 34.1
motor vehicle 34
transport 27.4
road 27.1
conveyance 24.3
auto 23
automobile 21.1
concrete mixer 20.5
tractor 19.1
wheel 18.9
machinery 18.5
fire engine 18.2
old 18.1
industrial 16.4
drive 16.1
garbage truck 15.4
work 14.9
sky 14.7
device 14.6
dirt 14.3
tire 13.4
farm 13.4
heavy 13.4
rural 13.2
driving 12.6
equipment 12.3
industry 12
landscape 11.9
grass 11.9
tow truck 11
danger 10.9
wheels 10.8
military 10.6
travel 10.6
rusty 10.5
summer 10.3
vintage 9.9
working 9.7
lorry 9.7
engine 9.6
construction 9.4
power 9.2
land 9.2
field 9.2
farmer 9
4x4 8.9
abandoned 8.8
agriculture 8.8
cargo 8.7
yellow 8.6
outdoor 8.4
retro 8.2
trailer 7.9
load 7.9
sand 7.9
motor 7.8
tree 7.7
move 7.7
race 7.6
speed 7.3
trailer truck 7.3
metal 7.2
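
Imagga exposes tagging through a REST endpoint authenticated with an API key and secret over HTTP Basic auth. A sketch of the v2 tags call; the credentials and image URL are placeholders:

    # Sketch: tagging with the Imagga v2 REST API.
    # API_KEY / API_SECRET and the image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/shahn.jpg"},
        auth=("API_KEY", "API_SECRET"),
    )
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], tag["confidence"])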

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.7
sky 99.3
transport 84.8
old 64.1
military vehicle 58.3
pulling 34.4
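
Microsoft's tags are consistent with the Computer Vision "analyze" operation, which returns tag names with confidences from 0 to 1. A sketch against the REST endpoint; the resource host, subscription key, and API version are assumptions (the 2018 run would have used an earlier version):

    # Sketch: image tagging with the Azure Computer Vision analyze call.
    # ENDPOINT, SUBSCRIPTION_KEY, and the image URL are placeholders.
    import requests

    resp = requests.post(
        "https://ENDPOINT.cognitiveservices.azure.com/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "SUBSCRIPTION_KEY"},
        json={"url": "https://example.org/shahn.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))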

Color Analysis

Face analysis

Amazon

AWS Rekognition, face 1

Age 37-45
Gender Male, 99.7%
Calm 98%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Angry 0.5%
Confused 0.2%
Disgusted 0.1%
Happy 0%

AWS Rekognition, face 2

Age 23-33
Gender Male, 92.6%
Fear 80.6%
Surprised 18.8%
Calm 11%
Sad 5.2%
Angry 3.2%
Confused 1%
Disgusted 0.8%
Happy 0.7%

AWS Rekognition, face 3

Age 22-30
Gender Male, 97.4%
Surprised 94.4%
Sad 11.1%
Happy 10.6%
Fear 6.3%
Calm 5.4%
Disgusted 2.5%
Confused 2%
Angry 1.9%

AWS Rekognition, face 4

Age 52-60
Gender Male, 99.9%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition, face 5

Age 25-35
Gender Male, 99.1%
Calm 46.6%
Fear 35.2%
Sad 11%
Surprised 7%
Angry 2.9%
Disgusted 2.1%
Happy 2%
Confused 0.6%

AWS Rekognition, face 6

Age 23-33
Gender Female, 86.4%
Calm 88.3%
Fear 7.7%
Surprised 6.4%
Angry 4%
Sad 2.4%
Disgusted 1.2%
Happy 0.7%
Confused 0.3%
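
Each block above corresponds to one detected face: Rekognition's DetectFaces API, with all attributes requested, returns an estimated age range, a gender estimate with its confidence, and a score for each emotion. A minimal boto3 sketch; the file name is again hypothetical:

    # Sketch: face attribute analysis with AWS Rekognition DetectFaces.
    import boto3

    client = boto3.client("rekognition")
    with open("shahn_branchville_1936.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')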

Feature analysis

Amazon

Truck 99.5%
Person 98.8%
Adult 97.7%
Male 97.7%
Man 97.7%
Wheel 95.3%

Categories

Imagga

cars vehicles 96.6%
paintings art 3.2%
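
The category pair suggests Imagga's v2 categorizer endpoint, which scores an image against a fixed set of categories. A sketch; the choice of the "personal_photos" categorizer, the credentials, and the URL are all assumptions:

    # Sketch: categorization with the Imagga v2 categorizer endpoint.
    # "personal_photos" is an assumed categorizer ID; credentials and
    # the image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.org/shahn.jpg"},
        auth=("API_KEY", "API_SECRET"),
    )
    for cat in resp.json()["result"]["categories"]:
        print(cat["name"]["en"], cat["confidence"])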

Text analysis

Amazon

HADE
DE
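
The two detected strings are consistent with Rekognition's DetectText API, which returns OCR results as line- and word-level detections. A minimal boto3 sketch that prints line-level text:

    # Sketch: text (OCR) detection with AWS Rekognition DetectText.
    import boto3

    client = boto3.client("rekognition")
    with open("shahn_branchville_1936.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])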