As more automakers move toward partially autonomous driving systems, the Insurance Institute for Highway Safety announced Thursday that it will push for adequate safeguards to ensure that the drivers of those vehicles keep watching the road.
IIHS announced that it is creating a new ratings program to evaluate these safeguards, and expects to issue its first set of ratings in 2022. It also warned that no OEM yet meets all of its pending criteria.
“Partial automation systems could make long drives seem like less of a burden, but there's no evidence that they make driving safer,” IIHS President David Harkey said in a statement. “In fact, the opposite could be the case if systems lack adequate safeguards.”
The new ratings program is based on “anecdotal observation of how many users are intentionally misusing these systems, and several high-profile crashes that have been linked to driver inattention,” Joe Young, director of media relations for IIHS, told Repairer Driven News.
The safeguards will be rated good, acceptable, marginal or poor. Generally, to earn a good rating, IIHS said, systems will need to ensure that the driver's eyes are on the road and their hands are either on the wheel or ready to grab it at all times.
Semi-autonomous systems will also have to use an escalating series of alerts, and be ready to take appropriate emergency action, including bringing the vehicle to a stop at the side of the road, if the driver does not respond.
‘Misleading messaging’ on capabilities
The institute noted that, despite what it called “misleading messaging” from some OEMs, truly autonomous vehicles are not yet available to Americans. Instead, the systems on the market offer “partial automation,” able to assist the driver by steering, accelerating and braking on their own, but never fully in control of the vehicle.
To date, most systems, like Tesla's Autopilot, Mercedes-Benz's Driving Assistance Package and General Motors' Super Cruise and upcoming Ultra Cruise, are semi-autonomous, requiring the driver to keep watching the road at all times and be ready to take the wheel or hit the brakes.
GM’s Super Cruise, for example, employs a “Driver Attention Camera” and a display directly in the driver's line of sight that’s meant to help drivers keep their attention on the road.
Advanced Driver Assistance System capabilities, employing radar, cameras and lidar for lane-departure warning and lane keeping, adaptive cruise control and automatic emergency braking, have been presented as both safety features and necessary components for autonomous driving.
Volvo has announced that its new Ride Pilot system, to be included with the OEM’s flagship EV launching in 2022, will at least approach Level 3 capability, in which the car, rather than the human, does the driving. “The name 'Ride Pilot' implies what the driver can expect: when the car is driving on its own, Volvo Cars takes responsibility for the driving, offering the driver comfort and reassurance,” the OEM has said.
Ride Pilot won't be made available for use, Volvo said, until it has received all necessary regulatory approvals and passed the OEM’s safety tests. Volvo said the feature will be launched first in California.
Misuse by drivers
IIHS said it is encouraging the adoption of systems to prevent both the intentional and unintentional misuse of self-driving capabilities. Existing technology can monitor a person's gaze, head posture or hand position to ensure they are consistent with someone who is actively engaged in driving, it said.
Other safety advocates have also begun to recognize the need for driver monitoring, IIHS said. For instance, it noted that Consumer Reports has announced it will begin awarding points to partially automated driving systems, as long as they have adequate driver monitoring, and will factor in IIHS safeguard ratings once they become available.
IIHS said it cannot provide precise timing for its testing program, because ongoing supply chain issues have made it harder to obtain vehicles for testing.
The Institute took some OEMs to task for “over[selling] the capabilities of the systems, prompting drivers to treat the systems as if they can drive the car by themselves.” In some extreme cases, the Institute noted, drivers have been found watching videos, playing games on their cellphones or even taking naps while speeding down the highway.
It cited a high-profile example from 2018, in which a Tesla Model X driver was killed when the Autopilot-engaged vehicle accelerated into a collapsed safety barrier. The National Transportation Safety Board found that the driver was likely distracted by a cellphone game at the time.
Unintentional misuse is also an issue, said IIHS Research Scientist Alexandra Mueller, who is leading the new ratings program.
“The way many of these systems operate gives people the impression that they're capable of doing more than they really are,” Mueller said in a statement. “But even when drivers understand the limitations of partial automation, their minds can still wander. As humans, it's harder for us to remain vigilant when we're watching and waiting for a problem to occur than it is when we're doing all of the driving ourselves.”
In October, Los Angeles County prosecutors filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot that ran a red light and struck another car, killing its two occupants in 2019. It is believed to be the first time a person has been charged with a felony for a fatal crash involving a semi-autonomous driving system.
Drivers taking risks
In a recent interview about the top five trends impacting P&C insurance in 2022, CCC industry analyst Susanna Gotsch told Repairer Driven News that early data has shown that self-driving technology may be encouraging drivers to take on more risk.
“There should be more work around the human-machine interface,” Gotsch said. She said some drivers may not be aware that, even when the self-driving systems are operating, they still need to be ready to take over at a moment’s notice.
“That is the biggest challenge right now,” she said. “The liability always lies with the driver, because the technology is structured in a way that says you, as the driver, are always in charge; you are responsible for the operation of this vehicle. Even if you’re using the system, you have to be able to take over from that technology very quickly, in a matter of seconds.”
Gotsch noted one recent example of the technology’s limitations, when a vehicle with ACC and lane-keeping assist was being tested in Virginia. When the lane markings disappeared, “the vehicle started to follow it off the road,” she said. “The driver in that particular case was able to take over quickly because they were in a test mode, but your average driver might not be able to do so.”
IIHS said its ratings would not address how well individual ADAS systems perform, such as how well their cameras or radar sensors identify obstacles, which it noted are also factors that could contribute to crashes.
IIHS ratings criteria
In its announcement, the Institute sketched out its ratings criteria. To earn a good rating, a system will need to use multiple alerts to remind the driver to return their eyes to the road and their hands to the wheel, once they’ve neglected those duties for “too long.”
“Evidence shows that the more types of alerts a driver receives, the more likely they will notice them and respond. These alerts must begin and escalate quickly,” IIHS said. They might include chimes, vibrations, pulsing the brakes or tugging on the driver’s seat belt.
If the driver doesn’t respond to the alerts, the system should slow the vehicle to a crawl, or even a full stop, and send a signal to a manufacturer “concierge” who can call emergency services.
“Once this escalation occurs, the driver should be locked out of the system for the rest of the drive, until the engine is switched off and started again,” IIHS said.
Requirements for lane-keeping and lane-changing technologies are also included. “All automated lane changes should be initiated or confirmed by the driver, for example. When traffic ahead causes ACC to bring the vehicle to a complete stop, it should not automatically resume if the driver isn't looking at the road or the vehicle has been stopped for too long. And the lane centering feature should encourage the driver to share in the steering rather than switching off automatically whenever the driver adjusts the wheel, which effectively discourages them from participating in the driving,” IIHS said.
Finally, the Institute said, systems shouldn't allow the driver to use partial automation features when their seat belt is unfastened, or when AEB or lane-departure prevention is disabled.
The NTSB, which has investigated several crashes of production and prototype vehicles equipped with automated driving systems, has previously recommended that federal agencies “develop a performance standard for driver monitoring systems and mandate their implementation.”
IIHS had signaled its concerns earlier, when Harkey warned that driver-assist technologies “have the potential to create new risks.”