Of algorithms, automated trucks, and ethics

Dec. 10, 2015

When you start getting down into the weeds of autonomous vehicle technology, some rather strange – and potentially scary – discussions start to take place.

For example, picture this scenario: a truck in autonomous mode traveling a heavily congested roadway suddenly comes upon completely stopped traffic. To the right is a passenger car; dead ahead is a transit bus; and on the left, a school bus.

What does the vehicle do?

Does it have enough time to slam on the brakes? If not, and it must maneuver to avoid rear-ending a packed transit bus, does it go left or right? Go left, and it hits a school bus loaded with children. Yet since the school bus is the bigger vehicle, perhaps it can better absorb the crash and cause only injuries – for if the truck swerves right into the passenger car, the chances of a fatality are much higher.

Then again, the children in the school bus don’t have seat belts. Does that elevate the risk of serious injuries or even fatalities?

Wow. Talk about a difficult ethical problem – one not easily solved by mathematics.

MIT Technology Review delved into this thorny topic back in October with a rather provocative headline: Why Self-Driving Cars Must Be Programmed to Kill.

Experts gathered here at the Texas Motor Speedway for the North American Automated Trucking Conference also touched on this tricky subject.

Stephan Keese, senior partner with Roland Berger Strategy Consultants, posed those kinds of dilemmas, asking whether automated trucks need to be programmed with algorithms that automatically choose the larger vehicle in a crisis. If so, how might laws need to change to reflect such crisis decision-making?

“Should the driver even be allowed to overrule a vehicle in such a case? This is one of the reasons why the least of the hurdles facing truck automation are on the technology side,” he said. “We’re still early in the development of legal and ethical requirements for automated trucks.”
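To make the dilemma concrete, here is a purely hypothetical sketch of what a "choose the least harmful collision" rule might look like in code. Nothing here reflects any actual vehicle software; the Obstacle fields, the mass and occupant figures, and the harm weights are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str        # e.g. "transit_bus", "school_bus", "passenger_car"
    mass_kg: float   # heavier vehicles tend to absorb crash energy better
    occupants: int
    belted: bool     # unbelted occupants raise injury risk

def crash_harm(o: Obstacle) -> float:
    """Invented harm score: per-occupant risk falls as the struck
    vehicle gets heavier, and rises 50% if occupants are unbelted."""
    per_occupant_risk = 10_000 / o.mass_kg
    if not o.belted:
        per_occupant_risk *= 1.5
    return o.occupants * per_occupant_risk

def choose_maneuver(options: dict) -> str:
    # "Pick the larger vehicle" generalizes to: pick the option
    # whose collision scores as least harmful.
    return min(options, key=lambda k: crash_harm(options[k]))

# The scenario from the opening of this article, with made-up numbers:
options = {
    "brake_straight": Obstacle("transit_bus", 18_000, 40, belted=True),
    "swerve_left":    Obstacle("school_bus", 12_000, 30, belted=False),
    "swerve_right":   Obstacle("passenger_car", 1_500, 2, belted=True),
}
print(choose_maneuver(options))  # prints "swerve_right" with these weights
```

Even this toy model shows the trap: with these made-up weights it steers into the passenger car, the very outcome the scenario above flags as most likely to be fatal, and nudging any weight can flip the answer. That is exactly why Keese frames the remaining hurdles as legal and ethical rather than technical.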

However, Bill Kahn – principal engineer and manager of advanced concepts for Peterbilt Motors Co. – stressed that those types of decisions will probably never have to be made, simply because machines can react faster than humans in such “crisis situations” on the highway.

“We’ll just slam on the brakes and stop,” he explained, noting that current research indicates it takes one to two seconds for a human driver to “hit the brakes,” whereas a machine can do so in 1/100th of a second.
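Those reaction-time figures translate directly into distance. A minimal back-of-the-envelope sketch: the 65-mph highway speed here is assumed, while the reaction times are the figures Kahn cites.

```python
# Distance covered before the brakes are even applied, at an
# assumed 65 mph, for the reaction times cited above.

MPH_TO_FPS = 5280 / 3600  # feet per second in one mile per hour

speed_fps = 65 * MPH_TO_FPS  # ~95.3 ft/s at an assumed 65 mph

for label, reaction_s in [("human (1 s)", 1.0),
                          ("human (2 s)", 2.0),
                          ("machine (1/100 s)", 0.01)]:
    distance_ft = speed_fps * reaction_s
    print(f"{label:>18}: {distance_ft:6.1f} ft before braking begins")
```

At highway speed, a one- to two-second human reaction burns roughly 95 to 190 feet before the brakes engage; the machine gives up about one foot. That gap is the basis for Kahn's argument that the ethical fork in the road may simply never be reached.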

Kahn also referenced Google’s experience to date with crashes involving its autonomous cars: it’s typically other, human-controlled vehicles hitting the self-driving cars, not the other way around.

“Those [self-driving] cars get run into – they don’t hit other vehicles,” he said. “We expect [autonomous] trucks will operate in a similar mode.”

About the Author

Sean Kilcarr | Senior Editor
