How safe is “safe enough”? That’s the nagging question that has lurked in every discussion of fully autonomous vehicles. Many auto companies pushed their AV plans to the back burner, and seemingly pushed off the question too. But now, with advanced driver assistance systems (ADAS) serving as the preferred autonomous technology throughout the auto industry, the question is reasserting itself. The AAA just said the industry is falling so far short of safe operation with ADAS that OEMs should scale back their plans to offer it.
A report by the American Automobile Association (AAA) last week didn’t exactly give a ringing endorsement to active ADAS features, such as lane keeping and automatic emergency braking, both pushed by many automakers in the name of “saving people’s lives.”
After testing a host of ADAS vehicles in both real-world conditions and a closed-course setting, AAA sounded an alarm about the uneven performance of many ADAS vehicles today. It cautioned that ADAS features showed “the lack of consistency” in performance and are “far from 100 percent reliable.” The report cited the potential of a “dangerous scenario,” in which “active driving assistance systems often disengage with little notice — almost instantly handing control back to the driver.”
“AAA recommends,” the report said, “manufacturers increase the scope of testing for active driving assistance systems and limit their rollout until functionality is improved to provide a more consistent and safer driver experience.”
Two big failures singled out by AAA are the poor performance of lane keeping assistance and inadequate ADAS alerts provided to drivers.
According to AAA’s automotive researchers, “On public roadways, nearly 73% of errors involved instances of lane departure or erratic lane position” with ADAS vehicles coming too close to other vehicles or guardrails.
AAA noted that the specific nature of the lane-keeping events varied, but it listed common events as follows:
- Abrupt disengagements (sometimes during critical situations)
- Failure to engage
- Erroneous disengagements due to perceived driver inattention
- “Ping-ponging” within the lane
- Becoming uncomfortably close to other vehicles or guardrails in the lateral direction
- Complete lane departures on curves, at pavement transitions, and upon encountering exit ramps/on-ramps
AAA’s closed-course testing found that the systems performed mostly as expected. However, when those vehicles approached a “soft test vehicle” (a dummy car), “in aggregate, ADAS vehicles collided with it 66% of the time, and the average impact speed was 25 mph.”
AAA also found disengagements by active driving assistance systems dangerous. Cars with the feature tend to transfer the driving task back to a human driver at a moment’s notice — typically when drivers are paying little attention or have become too dependent on the system.
“AAA has repeatedly found that active driving assistance systems do not perform consistently, especially in real-world scenarios,” said Greg Brannon, director of automotive engineering and industry relations, in a statement. “Manufacturers need to work toward more dependable technology, including improving lane keeping assistance and providing more adequate alerts.”
OEMs off the hook
After the AAA issued the report, most car OEMs, unsurprisingly, offered similar retorts. Instead of discussing why their ADAS functioned unreliably, they said ADAS vehicles are “designed to assist the driver, not replace the driver.” If their vehicles encountered hazards on the road and couldn’t handle them, the fault would fall to a driver who didn’t stay engaged in driving.
In short, the line is already drawn in the sand. As Phil Magney, founder of VSI Labs, told EE Times:
There are two kinds of automated vehicles — supervised and unsupervised. Anything less than [Level] 3 is considered supervised automation meaning the driver is fully responsible for the safe operation of that vehicle. This means they have to pay close attention to what is happening and intervene when necessary.
If your vehicle’s ADAS features perform erratically, crashing into a car parked on the side of a road, for example, it’s on you. If you’d been paying attention in the driver’s seat, you could have compensated for your vehicle’s sub-optimal ADAS performance.
Fair or unfair, the car OEM that developed a not-so-“advanced” driver assistance system, one that couldn’t tell a parked car from a garbage bag, is off the hook.
What about benchmarking?
However, it’s possible that the variability of ADAS performance could eventually catch up with car OEMs. While there are many efforts underway to standardize ADAS nomenclature (which is a good thing), ADAS features that don’t meet consistent performance levels can confuse consumers, who don’t know what to expect of ADAS features regardless of what they are called.
Is there a benchmarking system that every car company could use to hold its ADAS features to a high standard?
Magney acknowledged that none exists today. “None of the testing agencies have a benchmark defined for automated driving. They do however have pretty good protocol for testing ADAS features, but AV performance is still outside their scope of research.”
While he said he’s a little surprised that more has not been done, Magney added, “I suppose it is because nobody can agree on what is safe or not.”
Others have different opinions. Colin Barnden, principal analyst at Semicast Research, for example, told EE Times that benchmarking should be “the role of the New Car Assessment Program, better known as NCAP or the 5 Star Crash Ratings program.”
NCAP is “the government’s premier consumer information program for evaluating vehicle safety performance,” as stated by the U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA).
NHTSA is known to have avoided any regulations or controversial discussions that might make them look too hard on the automotive industry. The Center for Auto Safety, a consumer advocacy group based in Washington DC, blames NHTSA for having wasted ten years failing to upgrade NCAP. The group said NHTSA has sidelined its best tool for protecting consumers.
The agency gives 98% of all new cars either four or five stars, as if they are all participants getting orange slices after a kindergarten soccer game no matter the score. Instead of being horrified at NCAP’s lack of useful comparative crash data, NHTSA has apparently decided press releases alone will protect people in car crashes.
“The real crux of the issue,” as Barnden pointed out, is that “the testing protocols aren’t anywhere near rigorous enough.” Automakers can target minimum compliance and still come out with a top rating.
In his opinion, “It is not that consumers should lack confidence in ADAS, it is that they should lack confidence in NHTSA to take road safety seriously.”
While the National Transportation Safety Board (NTSB) has been making recommendations related to automation complacency for the last three years, NHTSA continues to do nothing meaningful to resolve the issues, Barnden explained.
On the other hand, compared to the United States, “Europe is the world leader in the development of suitable testing protocols for ADAS, and it is many years ahead of the U.S.,” noted Barnden. He called Euro NCAP and Thatcham Research “the two bodies at the forefront of this work.”
NCAP could do the same.
“If [US] NCAP really can’t specify anything meaningful, they could simply adopt the Euro NCAP protocols, for instance AEB Vulnerable Road User (VRU) Test Protocol v3.0.3, valid from June 2020, which is the most demanding AEB test protocol to date. That would be a great start,” said Barnden.
So, which vehicles were used in AAA’s tests? The cars selected for testing, using a defined set of criteria, included a 2019 BMW X7 with “Active Driving Assistant Professional,” a 2019 Cadillac CT6 with “Super Cruise,” a 2019 Ford Edge with “Ford Co-Pilot360,” a 2020 Kia Telluride with “Highway Driving Assist” and a 2020 Subaru Outback with “EyeSight.”
Looking at test results, VSI Labs’ Magney said, “The level 2 systems evaluated in this [AAA] study perform poorly in my opinion. The lane keeping failures surprised me.”
At VSI Labs, he said, “We have been testing state-of-the-art lane keeping for years and the lane keeping shown in these tests reminds me of our first lane keeping application where we used OpenCV. Nowadays, state-of-the-art lane keeping is done with AI and the resulting performance is much better. Tesla’s vision-based lane keeping is as good as it gets because they have trained and retrained their lane keeping networks with millions of miles of customer data.”
So, how to explain one of the AAA findings — that ADAS vehicles, when approaching a soft test vehicle, ran into it two out of three times?
This is not surprising, Magney said. “This particular use case is hard for lane keeping systems.” To the lane keeping systems, the lane is either blocked or not, said Magney. “This case of partial lane blockage confuses even state-of-the-art systems like Tesla’s.”
But why is it confusing to ADAS? Magney explained, “The sensors don’t have enough confidence in the lateral position or lateral size (object boundaries) to initiate a ‘positive’ action. If they did they would be braking too often creating hazards.”
Could it be also because the soft test vehicle was parked on the side of the road, not moving?
“The stationary nature of the object is another reason. AEB often filters out radar for the same reason,” he noted. Further, “For AI based systems (and some of these are), you don’t have enough training on these types of edge cases.”
Magney added, “But this is really not an edge case anymore. Hopefully it is a known unsafe condition, per SOTIF (Safety Of The Intended Functionality) practices.”
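The trade-off Magney describes — not enough confidence in an object’s lateral position to act, plus the filtering of stationary returns — can be sketched as a toy decision rule. The structure, field names, and every threshold below are hypothetical assumptions for illustration only, not any OEM’s actual AEB logic:

```python
# Illustrative sketch only: a toy AEB (automatic emergency braking) decision
# rule showing why a partially lane-blocking, stationary dummy car can slip
# through. All names and thresholds here are hypothetical, not a real system.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    lateral_conf: float  # confidence in lateral position / object boundaries (0..1)
    speed_mps: float     # object's own speed; ~0 means stationary
    in_path: bool        # does the system believe the object blocks our lane?

def should_brake(obj: DetectedObject,
                 conf_threshold: float = 0.8,
                 ignore_stationary: bool = True) -> bool:
    """Return True if this toy system would initiate a 'positive' braking action."""
    # Stationary returns are often filtered out (e.g. from radar) to avoid
    # constant false alarms from guardrails, signs and roadside clutter.
    if ignore_stationary and obj.speed_mps < 0.5:
        return False
    # Without enough confidence in the object's lateral extent, the system
    # declines to brake; lowering the threshold would mean frequent,
    # hazardous phantom braking.
    return obj.in_path and obj.lateral_conf >= conf_threshold

# A stationary dummy car only partially blocking the lane: low lateral
# confidence plus the stationary filter means no braking action.
dummy_car = DetectedObject(lateral_conf=0.55, speed_mps=0.0, in_path=True)
print(should_brake(dummy_car))  # False

# A moving, clearly detected lead vehicle triggers braking.
lead_car = DetectedObject(lateral_conf=0.9, speed_mps=10.0, in_path=True)
print(should_brake(lead_car))  # True
```

Under these assumptions, the system fails AAA’s soft-test-vehicle scenario for exactly the two reasons Magney cites: low lateral confidence and the stationary-object filter.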
What about ‘human factors’?
With ADAS, if accidents happen, car OEMs can legally blame human drivers for not having been fully engaged in driving.
But shouldn’t carmakers also acknowledge “human factors”? For the average driver, it is not always crystal clear what an ADAS car intends to do. More important, it is far from certain at which point the ADAS car decides the human driver should take the wheel.
Barnden sees one big problem in the latest AAA report.
“The full AAA report does not even mention the terms DMS, driver monitoring or human factors,” he stressed. “That’s actually where the safety debate is, not so much ADAS.”
Indeed, handover problems from ADAS features to the human driver shouldn’t even be discussed without considering vision-based driver monitoring systems (DMS).
Barnden laid out a practical dilemma that highlights the problem with existing driver-assistance systems.
- ADAS is too active when the driver is alert and fully engaged in the driving task.
- ADAS is too passive in a split-second emergency situation, particularly when the driver is distracted, fatigued or impaired.
“The solution is real-time variance in the responsiveness of the driver-assistance systems, based on permanently monitoring the driver’s engagement level and attention state,” noted Barnden. “By the end of this decade, almost every new light vehicle produced will use DMS as the primary safety system, helping to keep the driver engaged in the driving task and providing warnings for distraction, fatigue and impairment.”
In his opinion, “that leaves driver-assistance as the secondary safety system, correcting minor control errors and providing longitudinal and lateral interventions only when absolutely necessary.”
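Barnden’s proposal — real-time variance in responsiveness based on the driver’s attention state — can be sketched as a simple mapping from a DMS attention estimate to an intervention threshold. The attention scale and the timing values below are hypothetical assumptions for illustration, not any shipping DMS design:

```python
# Illustrative sketch only: how a driver monitoring system (DMS) signal could
# vary ADAS responsiveness in real time. The attention scale and the
# time-to-collision thresholds are hypothetical tuning values.

def intervention_threshold(attention: float) -> float:
    """Map a DMS attention estimate (0 = impaired/distracted, 1 = fully
    engaged) to a time-to-collision threshold in seconds below which
    ADAS intervenes.

    An alert driver gets a late (small) threshold, so ADAS stays passive
    and lets the human act; a distracted driver gets an early (large)
    threshold, so ADAS steps in sooner.
    """
    earliest, latest = 3.0, 1.0  # seconds; hypothetical tuning values
    attention = min(max(attention, 0.0), 1.0)
    return earliest - (earliest - latest) * attention

def adas_should_intervene(time_to_collision: float, attention: float) -> bool:
    return time_to_collision <= intervention_threshold(attention)

# Alert driver, 2 s to collision: ADAS holds back.
print(adas_should_intervene(2.0, attention=1.0))  # False
# Distracted driver, same 2 s: ADAS intervenes early.
print(adas_should_intervene(2.0, attention=0.2))  # True
```

The same time-to-collision thus produces different behavior depending on what the DMS reports, which is the “primary safety system” role Barnden envisions for driver monitoring.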
The distractions and interference identified by the AAA research will almost completely disappear as the sophistication and use of DMS increases, he concluded.
Meanwhile, the buyer of an ADAS-equipped car has ample reason to beware, not just in the showroom but behind the wheel.