Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1921

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Transportation 99.2
Truck 99.2
Vehicle 99.2
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Machine 96.7
Wheel 96.7
Person 96.4
Person 95.2
Person 95.1
Person 94.2
Wheel 94.1
Person 91.9
Person 91.4
Person 89.5
Person 87.4
People 87.3
Person 79.8
Accessories 75.9
Bag 75.9
Handbag 75.9
Person 75.7
Person 73.8
Person 71.6
Clothing 70.7
Hat 70.7
Face 67.7
Head 67.7
Car 65.4
Outdoors 65.3
Coat 57.1
Armored 55.8
Half Track 55.8
Military 55.8
Worker 55.5

Clarifai
created on 2018-05-11

vehicle 99.7
people 98.9
group together 98.2
transportation system 97.7
war 97.5
truck 96.2
military 95.8
military vehicle 95
group 92.6
soldier 90.7
driver 90.7
adult 90.3
skirmish 90.1
car 89.7
man 89.1
many 88.8
army 83.5
tank 83.3
several 79.5
weapon 78.1

Imagga
created on 2023-10-06

half track 100
military vehicle 100
vehicle 100
tracked vehicle 97.6
truck 66.5
wheeled vehicle 58
conveyance 52.4
car 39.5
transportation 37.7
transport 30.2
wheel 29.3
tow truck 28.7
machine 27.9
motor vehicle 25
road 24.4
machinery 24.4
tractor 23.3
heavy 22.9
equipment 22.9
tire 21.6
industry 21.4
driving 20.3
power 20.2
auto 20.1
work 19.7
wheels 19.6
industrial 18.2
engine 17.3
dirt 16.3
drive 16.1
old 16.1
motor 15.5
farm 15.2
dirty 14.5
automobile 14.4
agriculture 14.1
sky 14.1
construction 13.7
cargo 13.6
speed 12.8
tires 12.8
war 12.6
field 12.6
race 12.4
rural 12.4
working 11.5
metal 11.3
lorry 11.3
land 11.1
4x4 10.9
load 10.8
farmer 10.8
sport 10.7
military 10.6
yellow 10.6
object 10.3
pickup 9.8
diesel 9.8
landscape 9.7
tank 9.5
grass 9.5
earth 9.2
gun 8.8
dust 8.8
army 8.8
jeep 8.7
rusty 8.6
dangerous 8.6
adventure 8.6
farming 8.5
vintage 8.3
retro 8.2
activity 8.1
job 8
off road 7.9
bumper 7.9
agricultural 7.8
outdoor 7.7
danger 7.3
steel 7.1
summer 7.1
trailer 7.1
travel 7.1

Microsoft
created on 2018-05-11

truck 99.3
sky 98.9
outdoor 98.4
transport 83.3
military vehicle 65.2
old 65.1

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Male, 95.7%
Calm 90.4%
Surprised 7.1%
Fear 6%
Angry 3.2%
Sad 2.7%
Disgusted 1.6%
Happy 1.1%
Confused 0.3%

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 94.5%
Surprised 6.4%
Fear 6.1%
Sad 3.1%
Happy 0.8%
Disgusted 0.5%
Angry 0.5%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Female, 92.7%
Sad 99.7%
Calm 21.3%
Fear 7.1%
Surprised 6.4%
Happy 4.8%
Confused 2.7%
Disgusted 0.5%
Angry 0.5%

AWS Rekognition

Age 11-19
Gender Female, 61.2%
Sad 85%
Calm 29.7%
Angry 16.9%
Disgusted 7.8%
Surprised 7.1%
Fear 6.4%
Happy 3.1%
Confused 0.6%

Feature analysis

Amazon

Truck 99.2%
Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Wheel 96.7%
Handbag 75.9%
Hat 70.7%
Car 65.4%

Categories

Imagga

cars vehicles 99.8%