Of algorithms, automated trucks, and ethics

Dec. 10, 2015
When you start getting down into the weeds of autonomous vehicle technology, some rather strange – and potentially scary – discussions start to take place.

For example, picture this scenario: A truck in autonomous mode traveling a heavily congested roadway suddenly comes upon completely stopped traffic. To the right is a passenger car; dead ahead is a transit bus; and on the left, a school bus.

What does the vehicle do?

Does it have enough time to slam on the brakes? If not, and it must maneuver to avoid rear-ending a packed transit bus, does it go left or right? Go left, and it hits a school bus loaded with children. Yet maybe, since the school bus is a bigger vehicle, it can better absorb the crash and cause only injuries – for if the truck swerves right into the passenger car, the chances of a fatality are much higher.

Then again, the children in the school bus don’t have seat belts. Does that elevate the risk of serious injuries or even fatalities?

Wow. Talk about a difficult ethical problem – one not easily solved by mathematics.

The Massachusetts Institute of Technology (MIT) delved into this problematic topic with a rather provocative headline back in October: Why Self-Driving Cars Must Be Programmed to Kill.

Experts gathered here at the Texas Motor Speedway for the North American Automated Trucking Conference also touched on this tricky subject.

Stephan Keese, senior partner with Roland Berger Strategy Consultants, posed those kinds of dilemmas, asking if automated trucks need to be programmed with algorithms that automatically choose the larger vehicle in case of a crisis situation. If so, how might laws need to be changed to reflect such crisis decision making?

“Should the driver even be allowed to overrule a vehicle in such a case? This is one of the reasons why the least of the hurdles facing truck automation are on the technology side,” he said. “We’re still early in the development of legal and ethical requirements for automated trucks.”
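The kind of crisis-decision rule Keese describes – when a collision is unavoidable, steer toward the larger vehicle on the theory that it can better absorb the impact – could be sketched, purely hypothetically, like this (all function names and weight figures below are illustrative assumptions, not drawn from any real automated-truck system):

```python
# Hypothetical sketch of a "choose the larger vehicle" crisis rule.
# The maneuvers and gross weights are illustrative only.

def choose_evasive_maneuver(obstacles):
    """When braking cannot prevent a collision, pick the maneuver whose
    obstacle has the greatest gross weight (in kg), on the assumption
    that a heavier vehicle absorbs the crash with less risk of fatality.

    `obstacles` maps a maneuver ("straight", "left", "right") to the
    gross weight of the vehicle in that path.
    """
    return max(obstacles, key=obstacles.get)

# The scenario from the article: transit bus ahead, school bus to the
# left, passenger car to the right (weights are rough illustrations).
scenario = {
    "straight": 13_000,  # loaded transit bus
    "left": 11_000,      # school bus
    "right": 1_500,      # passenger car
}

print(choose_evasive_maneuver(scenario))  # prints "straight"
```

Even this toy version shows why the legal questions follow immediately: the rule encodes a value judgment (weight as a proxy for survivability) that, as Keese notes, regulators have not yet addressed.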

However, Bill Kahn – principal engineer and manager of advanced concepts for Peterbilt Motors Co. – stressed that those types of decisions will probably never have to be made, simply because machines can react faster than humans in such “crisis situations” on the highway.

“We’ll just slam on the brakes and stop,” he explained, noting that current research indicates it takes one to two seconds for a human driver to “hit the brakes,” whereas a machine can do so in 1/100th of a second.
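Kahn’s point can be made concrete with simple arithmetic: at highway speed, the vehicle covers a fixed distance during the reaction delay before the brakes are even applied. (The 65-mph speed below is an illustrative assumption; the one-to-two-second and 1/100th-of-a-second figures are the ones he cited.)

```python
# Reaction distance covered before braking begins, comparing the
# human and machine reaction times cited by Kahn. The 65-mph highway
# speed is an assumption for illustration.

speed_mph = 65
speed_fps = speed_mph * 5280 / 3600  # ~95.3 feet per second

human_delay_s = 1.5     # mid-range of the cited 1-2 seconds
machine_delay_s = 0.01  # the cited 1/100th of a second

human_reaction_dist = speed_fps * human_delay_s      # ~143 ft
machine_reaction_dist = speed_fps * machine_delay_s  # ~1 ft

print(f"Human: {human_reaction_dist:.0f} ft before braking")
print(f"Machine: {machine_reaction_dist:.1f} ft before braking")
```

On those assumptions, the machine starts braking more than 140 feet sooner than a human driver would – which is the crux of Kahn’s argument that the ethical dilemma may rarely arise in practice.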

Kahn also referenced Google’s experience to date with crashes involving its autonomous cars: that it’s typically other human-controlled vehicles hitting the self-driving cars, not the other way around.

“Those [self-driving] cars get run into – they don’t hit other vehicles,” he said. “We expect [autonomous] trucks will operate in a similar mode.”

About the Author

Sean Kilcarr | Senior Editor
