You, human, with your imprecise, unscientific movements in controlling mobility machines, are becoming obsolete in this digital day and age. And by "mobility machines," we're talking vehicles: the real mobility devices mankind has relied on to expand its physical reach for more than a century.
They've arguably helped change society more in the last 100 years than in the thousand before that. The real question is, what's next?
Rather clunky not long ago and now incredibly miniaturized, computers are becoming capable of controlling those vehicles, including heavy trucks, better than people can in various ways. Some of that control happens mechanically, through servos and actuators moving physical parts, but those components are run and governed electronically.
And most of this is now purely digital information, countless ones and zeros, with digital "brains" syncing up and communicating with each other on demand, and ultimately becoming more unified.
This is machine-to-machine, or M2M, communication, and by no means is it limited to transportation. But in that vein, vehicles and many other things are being wirelessly connected, and it’s quite a rapid progression in the broader scheme of things. Are you ready for it?
The keys to it all are electricity and connectivity. You could think of it as a process that traces its roots to the early and mid-19th century and the advent of the electric telegraph (the first real form of rapid, complex communication over distances), and it took another step with the arrival of the telephone. That added another dimension: live sound.
Suddenly, people could hear someone else in "real time," even from a very long distance, as long as a wire reached from transmitter to receiver. Then over-the-air broadcasting began sending even more information, such as radio and television signals, without any wires at all. Connecting wires have been fading away since the turn of the millennium with the spread and advancement of cellular and wireless communication and information transfer.
The final piece of this puzzle was the Internet and subsequent "Internet of Things," or IoT. The former provided a means of ever more rapid data connectivity around the globe, while the latter continues to grow and enable microcosms of connectivity via short-range wireless communication.
The IoT, perhaps more correctly rendered in the plural as IoTs, can be seen as M2M communication, but it's more like a set of organs playing their roles in a larger M2M "body." As long as you've got power, you can use wireless-transmitting sensors to do all kinds of things: monitor whether a truck trailer's door is shut and locked; check the temperature in the cab, or even the temperature and pressure in tires; watch the road ahead and all around using video, radar and/or laser radar (lidar); monitor a football player's vital signs from inside a helmet.
The possibilities of IoT systems can arguably be called limitless, and many surely have yet to be imagined. But link up all that information to a long-range wireless connection, fold in the Internet, web-based software, and the cloud, and connect the various software systems with application programming interfaces (APIs) or other means, and you can truly allow machines to talk to machines.
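To make that chain concrete, here's a minimal sketch of how a trailer-mounted sensor package might bundle readings into a single M2M message and send it to a fleet's cloud API. The endpoint URL, field names, and the report_trailer_status helper are all hypothetical, invented for illustration rather than drawn from any vendor's actual interface.

```python
import json
import urllib.request

# Hypothetical cloud endpoint; a real fleet platform defines its own API.
TELEMETRY_URL = "https://fleet.example.com/api/v1/telemetry"

def report_trailer_status(truck_id: str, door_closed: bool,
                          cab_temp_c: float, tire_psi: float) -> None:
    """Bundle a few sensor readings into one M2M message and POST it."""
    payload = {
        "truck_id": truck_id,
        "door_closed": door_closed,
        "cab_temp_c": cab_temp_c,
        "tire_psi": tire_psi,
    }
    request = urllib.request.Request(
        TELEMETRY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print("Server acknowledged:", response.status)

# One reading from a (fictional) truck's sensors:
# report_trailer_status("4127", door_closed=True, cab_temp_c=21.5, tire_psi=104.0)
```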
Robots talk to robots
A good example of M2M communication popped up when Fleet Owner went along on a road trip in Mack Trucks' new Anthem highway tractor. The latest trucks are becoming highly technologically advanced and more computer-run, and the Anthem is no exception.
The truck comes standard with Bendix Commercial Vehicle Systems' Wingman Fusion collision avoidance technology. As part of its functions, the system uses radar and video to monitor the road ahead and can apply the brakes if, for example, the truck is closing too quickly on another vehicle. Similarly, Wingman Fusion has an Active Cruise with Braking feature that can slow or stop the truck if it's in cruise control and runs up on a car ahead.
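To illustrate the kind of decision such a system makes, here's a simplified time-to-collision calculation. This is an illustrative sketch only, not Bendix's actual Wingman Fusion logic, and the warning and braking thresholds are assumed values.

```python
def time_to_collision(gap_m: float, own_speed_mps: float,
                      lead_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed.
    Returns infinity when the truck is not closing on the lead vehicle."""
    closing_speed = own_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed

WARN_TTC_S = 4.0    # assumed: alert the driver below 4 seconds
BRAKE_TTC_S = 2.5   # assumed: apply the brakes below 2.5 seconds

def collision_response(gap_m: float, own_speed_mps: float,
                       lead_speed_mps: float) -> str:
    ttc = time_to_collision(gap_m, own_speed_mps, lead_speed_mps)
    if ttc < BRAKE_TTC_S:
        return "apply_brakes"
    if ttc < WARN_TTC_S:
        return "alert_driver"
    return "no_action"

# A truck at 29 m/s (65 mph) closing on a car at 22.4 m/s (50 mph), 40 m ahead:
print(collision_response(40.0, 29.0, 22.4))   # ~6 s to collision -> "no_action"
```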
Mack's mDRIVE automated manual transmission, meanwhile, boasts an optional, computer-controlled Predictive Cruise feature, which stores topographical information from a truck's routes when engaged. As the truck is tooling along in cruise, the Predictive Cruise system might know from stored route data that a long climb is approaching, and the mDRIVE might select a lower gear in advance to optimize fuel efficiency.
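A drastically simplified version of that decision might look like the sketch below. The grade threshold, lookahead distance, and gear logic are invented for illustration and do not reflect Mack's actual mDRIVE calibration.

```python
# Stored route topography: distance along the route (meters) mapped to grade (%).
# In practice the transmission builds this up from prior trips over the route.
route_grades = {0: 0.5, 500: 1.0, 1000: 4.5, 1500: 6.0, 2000: 2.0}

LOOKAHEAD_M = 1000      # how far ahead to scan for terrain (assumed value)
STEEP_GRADE_PCT = 4.0   # grade that triggers a pre-emptive downshift (assumed)

def predictive_gear(position_m: float, current_gear: int) -> int:
    """Downshift ahead of a known climb so the engine stays in its
    efficient band instead of lugging partway up the hill."""
    upcoming = [grade for dist, grade in route_grades.items()
                if position_m <= dist <= position_m + LOOKAHEAD_M]
    if upcoming and max(upcoming) >= STEEP_GRADE_PCT:
        return max(current_gear - 1, 1)   # pre-select one gear lower
    return current_gear

# Cruising at the 200 m mark in 12th gear; the 4.5% climb at 1,000 m is in range.
print(predictive_gear(200.0, 12))   # -> 11
```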
There you have two advanced, software- and hardware-based systems—machines with their own IoTs and complex computer brains—that can sometimes dance a bit awkwardly together. The mDRIVE might be optimizing for a hill ahead just as a passenger car cuts in front of the truck and sets off Wingman Fusion’s collision-avoidance protections.
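Resolving that kind of conflict generally comes down to arbitration: safety-critical commands must outrank efficiency optimizations. Here's a minimal sketch of that priority rule, with invented subsystem and command names.

```python
from dataclasses import dataclass

@dataclass
class Command:
    source: str    # which subsystem issued it
    action: str    # e.g. "apply_brakes", "downshift_for_grade"
    priority: int  # higher wins; collision avoidance outranks fuel economy

def arbitrate(commands: list[Command]) -> Command:
    """Pick the single command the vehicle actually executes."""
    return max(commands, key=lambda c: c.priority)

# The awkward dance: the transmission wants to downshift for the hill just as
# collision avoidance wants to brake for the car that cut in. Braking wins.
pending = [
    Command("predictive_cruise", "downshift_for_grade", priority=1),
    Command("collision_avoidance", "apply_brakes", priority=10),
]
print(arbitrate(pending).action)   # -> "apply_brakes"
```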
Autonomous cars will need to have the same faculties that allow humans to drive effectively, Balasubramanian said. That includes not just a single viewing “eye” but a stereo camera view, just like humans’ two eyes, enabling object detection, recognition and tracking; situational analysis; and the ability to develop and execute a reaction strategy.
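In software terms, those faculties map onto a staged pipeline: stereo depth from two cameras, then detection and tracking, then situational analysis and a planned reaction. The sketch below is a bare-bones illustration under assumed camera parameters, not any particular vendor's perception stack.

```python
from dataclasses import dataclass

@dataclass
class Track:
    label: str          # what the object is (recognition)
    distance_m: float   # how far away it is (from stereo depth)
    approaching: bool   # is it closing on the vehicle? (tracking)

def stereo_depth(left_px: float, right_px: float,
                 focal_px: float = 700.0, baseline_m: float = 0.3) -> float:
    """Classic stereo triangulation: depth = focal length * baseline / disparity.
    Two cameras, like two eyes, are what make the distance estimate possible."""
    disparity = left_px - right_px
    return focal_px * baseline_m / disparity

def plan_reaction(tracks: list[Track]) -> str:
    """Situational analysis boiled down to a single rule for illustration."""
    for t in tracks:
        if t.approaching and t.distance_m < 30.0:
            return f"brake: {t.label} at {t.distance_m:.0f} m"
    return "maintain course"

# A pedestrian seen at pixel column 420 in the left image, 412 in the right:
d = stereo_depth(420.0, 412.0)                              # 26.25 m away
print(plan_reaction([Track("pedestrian", d, approaching=True)]))
```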
Put simply, self-driving vehicles can't detect, say, only 80% of a pedestrian crowd. These vehicles need to get to 100%.
That means gathering and processing an enormous amount of data, so vehicle makers and equipment suppliers may need to approach that information differently. For example, Balasubramanian noted that a stereo camera system might “see” 500,000 3D points or pixels; if it instead detected “stixels”—three-dimensional, vertical bars making up objects and environments—that could reduce the number of 3D data points to 500-1,000.
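The stixel idea amounts to column-wise compression of the depth image: each image column gets summarized by a vertical stick with a distance and a height, rather than hundreds of individual pixels. Here's a rough sketch of that data reduction using a synthetic depth map; the image dimensions and distance threshold are assumptions.

```python
# A stereo depth map: ROWS x COLS per-pixel distances in meters.
# 1,000 x 500 gives the 500,000 3D points mentioned above.
ROWS, COLS = 1000, 500

def make_depth_map():
    """Synthetic scene: a wall 20 m away spanning 100 middle columns,
    with open road (far depth) everywhere else."""
    depth = [[80.0] * COLS for _ in range(ROWS)]
    for r in range(300, ROWS):        # the wall fills the lower rows
        for c in range(200, 300):
            depth[r][c] = 20.0
    return depth

def to_stixels(depth, near_thresh=50.0):
    """Collapse each column into one (column, distance, top_row) stixel
    when it contains a near obstacle: columns instead of pixels."""
    stixels = []
    for c in range(COLS):
        near_rows = [r for r in range(ROWS) if depth[r][c] < near_thresh]
        if near_rows:
            top = min(near_rows)
            stixels.append((c, depth[top][c], top))
    return stixels

stixels = to_stixels(make_depth_map())
print(f"{ROWS * COLS} pixels -> {len(stixels)} stixels")   # 500000 -> 100
```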
Self-driving vehicles' system architecture also will need to improve, which will come down to better processors, sensors and algorithms applied to all the data funneled through. That, at least, has been the trajectory of computing since its inception, and it has only accelerated in the last two decades.
There’s also the issue of all that M2M communication actually producing a vehicle “brain,” usually referred to as artificial intelligence, or AI. Tim Dawkins, an autonomous car specialist with automotive technology research and consulting firm SBD, used the analogy of AI making a vehicle as intelligent as a horse.
“A horse has a basic sense of self-preservation. It won’t walk off a cliff or run into a wall that it comes across,” he said during a recent TU Automotive webinar, AI: The Brain Behind the Autonomous Vehicle. “A horse has, at least on a very basic level, the ability to follow a trail or path with minimal input from the rider. And again, on a basic level, a horse can respond to simple commands—even emotive ones.”
So autonomous vehicles, Dawkins suggested, will require those sorts of capabilities and will need to respond readily to human input. They must not only "perceive" all relevant elements of the surrounding environment and be able to act upon them, but will also have to interact with the humans inside them, something that approaches artificial personality.
Another element of this technological progression, perhaps the first and foremost consideration, is information and data security. As vehicles become more connected and incorporate more sensors and data inputs, they'll also present more avenues for cybersecurity breaches. If cars and commercial trucks can drive themselves, it's imperative that their information can't be stolen and that the vehicles themselves can't be "taken over."
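One basic building block for that kind of protection is authenticating every M2M message, so a forged command can be rejected. Below is a minimal sketch using an HMAC signature over a JSON payload; the shared-key scheme and message fields are illustrative only, and real vehicle security involves much more, from managed keys to intrusion detection.

```python
import hashlib
import hmac
import json

# Shared secret provisioned to vehicle and back end (hypothetical; real
# systems use managed keys and hardware security modules).
SECRET_KEY = b"example-only-not-a-real-key"

def sign_message(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "hmac": tag}

def verify_message(message: dict) -> bool:
    body = json.dumps(message["payload"], sort_keys=True).encode("utf-8")
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    # Constant-time compare prevents timing attacks on the tag.
    return hmac.compare_digest(expected, message["hmac"])

msg = sign_message({"truck_id": "4127", "command": "set_cruise", "mph": 62})
print(verify_message(msg))        # True
msg["payload"]["mph"] = 90        # tampering in transit...
print(verify_message(msg))        # ...is detected: False
```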
That's no small concern, and it might be the area of autonomous vehicles that's most lacking, even before that level of advancement is upon us. A report recently issued by cybersecurity company McAfee, Do You Know Where Your Data Is, found that organizations on average take too long to report data security breaches, largely don't understand the cybersecurity laws that apply to them, and often don't know where their data is stored.
While the technology to enable autonomous vehicles exists right now, more needs to be done before they can handle all the dynamic, complex situations found in the real world. And unless the technology advances with the requisite data protection and security, putting self-driving cars and trucks on the streets will be a “shoot first, ask questions later” move that could bring very serious repercussions.