Future HMI Systems Linked to Autonomous Driving Trends
Human-machine interface systems should be designed to enhance, not distract from, the driving experience, panelists here say, acknowledging mistakes have been made and that future trends point to simpler interfaces.
DEARBORN, MI – Autonomous driving, still a ways off but at the top of nearly every OEM’s list of concerns, is the new focus for interior designers, say panelists speaking at the WardsAuto Interiors Conference being held here today.
Human-machine interface systems should be designed to enhance, not distract from, the driving experience, panel members say, acknowledging mistakes have been made in piling technologies into the center stack and console even as future driving trends point to simpler interfaces.
“Just because you can doesn’t mean you should,” Dave Lyon, former chief interior designer at General Motors who now runs his own consultancy, says of adding controls to the interface.
He compares the ’12 Buick LaCrosse, which has more than 50 buttons and three knobs, with the ’14 model that cuts the number of buttons to 30 and knobs to two.
“When people are driving their car, they really don’t want to use the center stack,” Lyon says. Earlier whiz-bang technology that initially sells cars on dealer lots – voice control, for example – wears thin after regular usage.
“I do not like talking to my car very much,” Lyon says, describing a common occurrence where a driver’s favorite song is interrupted by a tedious voice command. “I press the button, and it says ‘Say a command.’ I know I’m about to say a command.”
However, simplifying HMI doesn’t have to mean eliminating driver input. Tejas Desai, head of interiors for Continental, says there needs to be a balance in the flow of information coming in: enough to keep the driver engaged, but not so much as to overwhelm. “What we want to do is have different building blocks that give you information when it’s needed.”
Continental proposes systems that monitor the driver’s face to predict driver reaction and help keep eyes on the road.
In the works, Desai says, is a “comet” light that circles the vehicle cabin at varying levels of intensity. In the event of an impending rear-end collision, for example, the light steps through white, red and flashing red.
But how the light responds to collision avoidance depends on where the driver’s eyes are focused. If the system detects the driver is facing away from the road, perhaps quieting a child in the backseat or reaching for something across the cabin, the light subliminally would guide the driver back to attention. If the driver already is alert and facing forward, the light commands would be more sensitive.
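The gaze-dependent behavior described above can be sketched as a simple decision rule. This is an illustrative sketch only, not Continental’s implementation; the cue names and the two-input logic are assumptions drawn from the description in the text.

```python
# Illustrative sketch (not Continental's system): choosing an ambient
# "comet" light cue from threat state and where the driver is looking.
from enum import Enum


class LightCue(Enum):
    OFF = "off"
    RED = "red"
    FLASHING_RED = "flashing red"


def comet_cue(threat_imminent: bool, eyes_on_road: bool) -> LightCue:
    """Pick a light cue; escalate harder when the driver is looking away."""
    if not threat_imminent:
        return LightCue.OFF
    if eyes_on_road:
        # Driver already alert and facing forward: a gentler cue suffices.
        return LightCue.RED
    # Driver distracted (e.g. turned toward the back seat): use the
    # strongest cue to subliminally pull attention back to the road.
    return LightCue.FLASHING_RED
```

The key design point is that the same hazard produces different cues depending on driver state, rather than a fixed one-size-fits-all warning.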
Such a system also would integrate outside factors, such as oncoming traffic, road signs and other hazards and guides. “The thing that we achieve should not really be an instinct. Those are the types of things that we need to bring into the car,” Desai says.
Lyon says his firm is studying an interface that reads hand and fingertip movement. A driver would simply “wave” or mimic general commands for functions such as volume control or changing a station.
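A gesture interface of this kind amounts to a mapping from recognized hand movements to cabin functions. The sketch below is hypothetical; the gesture names, step sizes and functions are illustrative assumptions, not details of Lyon’s interface.

```python
# Hypothetical mapping of recognized gestures to cabin controls
# (gesture names and actions are illustrative, not from the article).
GESTURE_ACTIONS = {
    "swipe_up": ("volume", +5),
    "swipe_down": ("volume", -5),
    "wave_left": ("station", -1),   # previous preset
    "wave_right": ("station", +1),  # next preset
}


def handle_gesture(gesture: str, state: dict) -> dict:
    """Apply a recognized gesture to cabin state; ignore unknown gestures."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return state  # unrecognized movement: change nothing
    key, delta = action
    new_state = dict(state)
    new_state[key] = state.get(key, 0) + delta
    return new_state
```

Treating unrecognized movements as no-ops matters in a car: an accidental hand motion should never trigger a control change.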
As auto makers create new infrastructure, they must keep up with the changing technologies offered by suppliers and other developers, panelists agree.
Danny Shapiro, director of NVIDIA’s automotive division, points to the company’s development of in-car computers that easily update and adapt to new software over time without having to replace the entire unit. The technology already is used by Volkswagen, Audi and Tesla, he says.
“As we start to move toward this kind of autopilot system, these kinds of systems are going to become critical,” Shapiro says.
All agree that younger drivers are key to the long-term planning of HMI systems. “Everything is based around the phone,” Lyon says. “I don’t know if you’ve ever seen a 25-year-old without their phone on them, but it’s like a child looking for candy.”