In-Cabin Monitoring Critical Piece of Autonomous-Vehicle Puzzle
Knowing precisely where occupants are and what they are doing will be as critical to automated driving’s future as whether the vehicle can safely negotiate its way down the highway or around the block.
NOVI, MI – Perfecting the technology that will allow a car to drive itself safely from Point A to Point B clearly is the No. 1 engineering challenge for autonomous-vehicle developers. But a close second is understanding – and controlling – what is going on inside the cabin while it does so.
The still-nascent field of in-vehicle occupant monitoring is expected to accelerate quickly in the next few years as new vehicles add higher-tech advanced-driver-assistance features and make what is expected to be a slow and cautious walk up to Level 3-and-beyond AVs.
Technology such as eye and head tracking, already appearing in some vehicles as ADAS systems advance, along with biometrics to measure the health – and perhaps sobriety – of the driver, will be needed “in a world where you’re not actually driving (all the time),” Richard Vaughan, creative director of interiors supplier CGT, says in a discussion of “The Growing Role of Occupant and Interior Sensing” at the WardsAuto User Experience conference here.
Although the industry is working toward Level 4 AVs that can pilot themselves within geo-fenced areas and, ultimately, go-anywhere Level 5 self-driving vehicles, many insiders say the near-term focus is locked on development of more limited Level 2-plus ADAS features.
Interior-systems supplier Autoliv expects Level 3 systems that allow drivers to take their hands off the wheel and eyes off the road and more sophisticated Level 4 vehicles to account for only about 10% of global vehicle sales by 2030, says Rich Matsu, senior director-engineering. That figure will grow to 30% sometime beyond the next decade, when Level 5 vehicles are seen reaching only about a 5% share of new-vehicle sales annually.
“We see Level 2 as being dominant for the foreseeable future,” Matsu notes. “It’s a big focus of our organization and the industry overall.”
Level 2 technology requires the driver to remain alert enough to quickly take over from the automated system when needed, and that means driver monitoring will only grow in importance as Level 2 systems edge closer to Level 3 capability.
That movement up the technology ladder already is under way, with General Motors, Tesla and others beginning to offer advanced adaptive-cruise-control systems that take much of the highway driving load off the human driver.
As these highway-pilot systems get more sophisticated and allow drivers to focus away from the road for longer periods, it becomes tougher to regain their attention when needed. Matsu estimates at least seven seconds will be needed to alert the driver to retake control of the vehicle. That sounds like a lot of time, he says, but it might not be, based on how far down the road vehicles will be able to “see into the future.”
“Is the driver going to be ready?” he asks.
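To put Matsu’s seven seconds in perspective, the distance a vehicle covers during that alert window scales directly with speed. The short Python sketch below works through the arithmetic; the speeds chosen are illustrative assumptions, not figures from the panel.

```python
# Back-of-the-envelope: distance covered while a driver is re-engaged.
# The 7-second window is Matsu's estimate; the speeds below are
# illustrative assumptions, not figures from the panel.

ALERT_WINDOW_S = 7.0  # time needed to bring the driver back into the loop

def lookahead_consumed_m(speed_kph: float, window_s: float = ALERT_WINDOW_S) -> float:
    """Meters the vehicle travels during the takeover-alert window."""
    return (speed_kph / 3.6) * window_s

for speed_kph in (50, 100, 130):
    print(f"{speed_kph} km/h -> {lookahead_consumed_m(speed_kph):.0f} m")
# 50 km/h  ->  97 m
# 100 km/h -> 194 m: sensors must spot trouble well beyond this range
#             just to hand control back, before any human reaction time
# 130 km/h -> 253 m
```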
To that end, developers are trying to understand what will work best to detect the driver’s state of mind and what signals the car should provide to make sure the driver is alerted in time to resume control.
Both Matsu and fellow panelist Michael Godwin, North American director-visible LED products for Osram, point to the car’s steering wheel as the new human-machine-interface focal point. The best example to date is GM’s Super Cruise, a Level 2-plus ACC system that uses an infrared sensor to monitor the driver’s eyes and a light bar along the steering wheel rim to signal whether all systems are go or the system is about to shut down because the driver’s eyes aren’t on the road.
“We think this is going to be a more prevalent feature in the future,” Matsu says of the interactive steering wheel.
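In software terms, a system like this amounts to mapping a continuous eyes-off-road measurement onto a handful of discrete light-bar cues. The sketch below shows that basic shape; the state names and timing thresholds are hypothetical stand-ins, since GM’s actual tuning is not public.

```python
from enum import Enum

class BarCue(Enum):
    """Discrete steering-wheel light cues, loosely modeled on the
    Super Cruise light bar. States and thresholds are hypothetical
    stand-ins; GM's actual tuning is not public."""
    STEADY_GREEN = "engaged, driver attentive"
    FLASHING = "prompting the driver to look back at the road"
    RED = "shutting down; driver must retake control"

PROMPT_AFTER_S = 3.0    # assumed eyes-off-road time before prompting
ESCALATE_AFTER_S = 6.0  # assumed time before preparing to disengage

def cue_for(eyes_off_road_s: float) -> BarCue:
    """Map a continuous eye-tracking signal to a discrete light-bar cue."""
    if eyes_off_road_s < PROMPT_AFTER_S:
        return BarCue.STEADY_GREEN
    if eyes_off_road_s < ESCALATE_AFTER_S:
        return BarCue.FLASHING
    return BarCue.RED
```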
Next-gen models also will see more vivid head-up displays with wider fields of view that will keep the driver informed and engaged, Godwin says. In-cabin lighting may be employed in new ways to signal drivers if the auto-piloting vehicle needs intervention.
A lot of human-factors study is going on to determine what color lighting is most effective, where it should be placed and how fast it should pulse, as well as what audible signals should accompany those visual cues, Godwin says. But near term, he notes, the steering wheel will remain the most likely HMI focal point for ADAS developers because of its low cost and ease of integration.
“We’re still dealing with all the usual norms of vehicle (development), so for now it’s the steering wheel,” Godwin says.
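Conceptually, the variables under study reduce to a small escalation table: what color, how fast a pulse and what sound at each level of urgency. A minimal sketch of such a table follows; every value in it is a placeholder rather than a finding from the research Godwin describes.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AlertStage:
    """One rung of an escalating takeover cue. Every value below is a
    placeholder, not a result of the human-factors work Godwin describes."""
    color_rgb: tuple       # cabin-lighting color
    pulse_hz: float        # blink rate; 0 means steady
    chime: Optional[str]   # accompanying audible signal, if any

ESCALATION = [
    AlertStage(color_rgb=(0, 180, 0),   pulse_hz=0.0, chime=None),           # all clear
    AlertStage(color_rgb=(255, 160, 0), pulse_hz=1.0, chime="soft_tone"),    # gentle prompt
    AlertStage(color_rgb=(255, 0, 0),   pulse_hz=4.0, chime="urgent_tone"),  # take over now
]
```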
Even so, much work is under way on more advanced biometrics and other forms of sensing to better understand the position of occupants and their mental and physical condition.
As autonomy advances to Level 3 and beyond, occupants may be sleeping in cars or facing rearward, so detection will be even more critical when it comes to handing back control to a human or triggering airbags efficiently and effectively in the event of a crash. But development remains in the early stages and technical and behavioral challenges are huge.
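One way to picture the airbag side of that problem is as occupant-state classification gating the deployment decision. The sketch below is a deliberately simplified illustration of that idea, not any supplier’s actual logic.

```python
from dataclasses import dataclass

@dataclass
class OccupantState:
    """Illustrative occupant-sensing output; real systems fuse cameras,
    radar and seat sensors into far richer state than this sketch."""
    seat_reclined: bool
    facing_rearward: bool
    out_of_position: bool  # e.g. leaning into the airbag's deployment zone

def airbag_mode(o: OccupantState) -> str:
    """Hypothetical position-aware deployment logic: suppress or soften
    the bag when a standard deployment could do more harm than good."""
    if o.facing_rearward or o.out_of_position:
        return "suppress"
    if o.seat_reclined:
        return "low_force"
    return "standard"
```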
“I’m not sure how much we’ll be able to fully control what people do as we take that big step (from Level 2 to Level 3),” Matsu says. “That’s why I’m a fan of pushing Level 2 to Level 2.99, because as soon as you go to Level 3, it’s a question of how safe are you going to be?”