MOTOR Magazine

A MOTOR Magazine Newsletter
November 17, 2016

Contributed by Bob Chabot
See, Think and Act

When will automated vehicles surpass humans and become the better road users?

When will automated vehicles finally surpass humans and become the better road users? That is a question automakers, suppliers and academic researchers around the world are struggling to answer. It's also driving acquisitions, joint ventures and partnerships. Consider ZF Friedrichshafen AG. The Tier 1 supplier recently acquired TRW Automotive, creating the second largest automotive supplier in the world. It is also conducting joint research with the Institute for Automotive Engineering at Aachen University to make fully automated driving a reality within a decade.

"The acquisition of TRW in May 2015 has been a perfect fit of products and services," shared Dr. Stefan Sommer, ZF CEO. "We call the blend a portfolio of 'intelligent mechanical systems,' which are used the whole way along the autonomous driving functional chain. Our sensors are capable of detecting the surrounding environment. Our powerful control units then analyze and process this information so they can issue appropriate commands to the vehicle's braking, steering, damping or roll stabilization systems in response to real-time circumstances."

"Our portfolio is also unique because it covers the full range of actuators. In short, we are enabling vehicles to see, think and act. While it will be a little while before totally driverless vehicles become a regular feature of our streets, the number of assistance systems used in vehicles is rapidly proliferating — harbingers, as it were, of the automated road traffic of the future. They significantly improve safety by helping drivers perform repetitive driving maneuvers, as well as by making the right decisions in hazardous situations."

"The perceptual abilities of autonomous systems are steadily improving," Sommer noted. "They can now assist, or in some cases, completely replace experienced human drivers. Together with our research partners at the Institute for Automotive Engineering (IAE) at Aachen University, ZF is well on its way to building vehicles that see, think and act."
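ZF hasn't published its control logic, but the "see, think and act" chain Sommer describes can be made concrete with a purely illustrative sketch. Everything here — the `Observation` fields, the 40-meter threshold, the command names — is a hypothetical stand-in, not ZF's design: sensing produces an observation, a control-unit stage turns it into a decision, and an actuator stage carries it out.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Hypothetical fused sensor reading for the vehicle ahead."""
    gap_m: float        # distance to the lead vehicle, meters
    closing_mps: float  # closing speed, m/s (positive = gap shrinking)

def think(obs: Observation, min_gap_m: float = 40.0) -> str:
    """'Think' stage: a control unit turns the sensed picture into a command.
    The 40 m threshold is an arbitrary illustrative value."""
    if obs.gap_m < min_gap_m and obs.closing_mps > 0:
        return "brake"
    return "hold"

def act(command: str) -> None:
    """'Act' stage: dispatch the command to a (stubbed) actuator interface."""
    print(f"actuator command: {command}")

# One pass through the chain: see (sensors) -> think (control unit) -> act (actuators).
act(think(Observation(gap_m=25.0, closing_mps=3.0)))
```

In a real vehicle this loop runs continuously, with many sensors feeding one fused world model rather than a single reading.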

SAE's J3016 standard defines six levels of automated driving as shown above. ZF's academic research partners say drivers are fully responsible for driving for levels 0, 1 and 2. But at level 3, drivers are allowed to be less attentive; that's where every traffic situation needs to be automated and thoroughly tested. (All images — ZF Friedrichshafen AG)
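The division of responsibility the caption describes maps cleanly onto the six J3016 levels. The sketch below encodes that split; the level names follow SAE J3016, while the helper function and its name are my own illustrative shorthand for the rule ZF's researchers cite (driver fully responsible through level 2, allowed to disengage from level 3 up).

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six driving-automation levels defined by SAE J3016."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_must_monitor(level: SAELevel) -> bool:
    """Levels 0-2: the human driver remains fully responsible for monitoring
    traffic. From level 3 up, the driver may be less attentive (within the
    system's design limits), so every traffic situation must be automated."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```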

Who Sees Better: The Driver or the Automobile?
At the IAE, Professors Lutz Eckstein and Maximilian Schwalm each specialize in different aspects of automated driving aimed at answering this question. Professor Eckstein is the pragmatist. Besides directing the IAE, Eckstein is focused on implementing autonomous driving. Professor Schwalm is a psychologist who heads the Driver Experience & Performance department and studies interactions between driver assistance and information systems.

"Human sensory perception is the product of thousands of years of evolution, and it still is ahead of current technology, but the gap is closing rapidly," noted Eckstein. "The remaining challenge is to fully understand all situations and predict outcomes quickly enough."

"As of today, humans are better able to separate what's important from what's unimportant," added Schwalm. "Automated driving will work only if we combine different sensor principles, meet functional safety requirements and determine what form interaction between people and technologies will take. Only then can we take responsibility for the vehicle away from the driver."

"In experiments, for example, we discovered that drivers can't suddenly take back control of their vehicles; ultimately, drivers always need several seconds before they're ready," explained Eckstein. "If you're in an automated car, the vehicle should bear the responsibility and risk. The driver can't. The driver's and vehicle's responsibilities must be clearly differentiated."
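Eckstein's point about take-over latency is easy to quantify: at highway speed, "several seconds" is a long stretch of road. The numbers below (a 5-second take-over at 120 km/h) are illustrative assumptions, not figures from the IAE experiments.

```python
def takeover_distance_m(speed_kmh: float, takeover_s: float) -> float:
    """Distance covered while the driver re-engages, assuming constant speed.
    km/h -> m/s conversion: divide by 3.6."""
    return speed_kmh / 3.6 * takeover_s

# Illustrative: a 5-second take-over at 120 km/h covers about 167 meters --
# far too long for the vehicle to simply hand back control in an emergency.
print(round(takeover_distance_m(120, 5)))  # -> 167
```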

"Re-engaging the driver is only one of the issues," Schwalm countered. "Even more important is how we win drivers' trust and confidence. We need some sort of guidance display that shows drivers how the vehicle will behave so they don't suddenly interfere if and when they think they've spotted a dangerous situation."

"Standards organizations such as SAE and ISO have already distinguished between multiple levels of automation," advised Eckstein. "At levels 1 and 2, the lowest levels of automated driving, you drive with assistance, but the driver is ultimately responsible. The next stage, level 3, is very tricky: we haven't yet been able to automate every traffic situation, but we're close. The higher levels require much more work to be done. At higher levels, we'll need steer-by-wire, a mechanically decoupled steering system that simply disconnects the driver."

"It's at level 3 that we allow drivers to be inattentive," Schwalm agreed. "And that's where we have to be able to automate every traffic situation. We also have to reckon with trust, and in cognitive terms, the latency time it takes a driver to re-engage. To some extent, the industry has already made very considerable progress in gaining social acceptance of such systems. In aviation, for example, it wouldn't occur to anyone that a 12-hour transatlantic flight under the manual control of a human pilot would be safer than a flight controlled by an autopilot. In fact, with the technology available, autopilot is significantly safer. We're getting there with automated driving."

Sensors, cameras, actuators and electronic control are the four cornerstones to implementing fully autonomous driving.

Vehicles That Think Can Learn to Do Anything
"It's just a question of managing resources," stated Dr. Hans-Gerd Krekels, vice president of ZF's Active Driver Assistance Systems Engineering. "Development engineers at ZF are working to keep tight control of the spiraling volumes of data needed to overcome this challenge."

"Radar and camera sensors are fitted all around our test vehicle to provide 360 degrees of sensor coverage, plus a forward-facing camera for good measure. The prototype is also equipped with three driver-assist systems: an adaptive cruise control system, a lane-keeping assist system and a lane-change assistant. The collected data is fed to the various control units, where a situation analysis is carried out before a signal is finally sent to the actuators. In the case of this vehicle, the actuators include a braking system and a steering system."

"When the car starts closing in on the vehicle ahead, the sensors detect it and a signal is sent to the control unit, which then checks whether the left-hand lane is free for overtaking. The vehicle does everything else on its own: pulling out into a sufficiently large gap, accelerating to an appropriate speed, and pulling back into the right-hand lane at a safe distance from the overtaken vehicle."
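The overtaking sequence just described is, at heart, a small decision function layered on the fused sensor picture. The sketch below is my own simplification with made-up names and thresholds, not ZF's algorithm: keep the lane while the gap ahead is comfortable, overtake when the left lane offers a sufficiently large gap, and otherwise fall back to adapting speed.

```python
from dataclasses import dataclass

@dataclass
class LaneState:
    """Hypothetical snapshot from the fused radar/camera picture."""
    gap_to_lead_m: float      # distance to the vehicle ahead in our lane
    left_lane_clear: bool     # is the overtaking lane free alongside us?
    gap_behind_left_m: float  # space to the next vehicle behind in the left lane

def lane_change_decision(s: LaneState,
                         follow_gap_m: float = 50.0,
                         min_merge_gap_m: float = 30.0) -> str:
    """Decide among: keep lane, overtake, or adapt speed (ACC fallback).
    Thresholds are arbitrary illustrative values."""
    if s.gap_to_lead_m >= follow_gap_m:
        return "keep_lane"        # nothing to overtake yet
    if s.left_lane_clear and s.gap_behind_left_m >= min_merge_gap_m:
        return "overtake_left"    # pull out into a sufficiently large gap
    return "adapt_speed"          # left lane blocked: stay behind, match speed
```

A production system would, of course, weigh speeds, accelerations and prediction horizons rather than fixed distance thresholds.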

"We've begun to touch the future," explained Dr. Karl-Heinz Glander, senior engineering manager for ZF's Automated Driving & Integral Cognitive Safety Program. "ZF has already developed a Safety Domain ECU (electronic control unit) for automated driving that makes it possible to integrate any number of disparate safety-related software packages. This ECU processes the many millions of bits of data generated by the environmental sensors in order to analyze the status of both vehicle and surrounding traffic. Links with the steering, braking and driveline systems make it possible to combine and precisely coordinate a whole raft of vehicle functions."

"Just as our brains analyze complex situations, interpret them, anticipate movements and estimate distances, we have to duplicate these abilities with our software, data and 3-D image processing systems," Glander added. "Fortunately, the know-how, processing chips and software that capture, extract and manage this information have been used in the infotainment and gaming industries for more than a decade. We don't have to reinvent the wheel to get vehicles to think. We just have to work with those companies to adapt their products to the rugged environment vehicles operate within."

Krekels and Glander like to compare the conventional Advanced Driver Assistance Systems (ADAS) installed in today's cars with eyes that tell a hand what to do directly, without any intervention by the brain. These systems are only capable of carrying out one highly specialized task relating to a very specific case, such as staying in one lane, or not running into the vehicle in front.

Fully automated driving, they say, is more analogous to the way humans think, act and react. The vehicle collects and aggregates information in order to describe the surrounding world. Existing ADAS systems don't do this yet. This ability to describe, understand, perceive and interpret the world will comprise the emerging premium ADAS functions that will make fully automated driving a reality. While it will still be some time before we can entrust safety and automated driving to the cloud, we're closing that gap every day. Within the next decade is a reasonable timeline to see it arrive.

[Editor's note: Visit MOTOR for the latest diagnostic and service insights.]

Important Links
MOTOR Current Issue
MOTOR Magazine

MOTOR Information Systems • 1301 W. Long Lake Road, Suite 300 • Troy, MI 48098