Trends in car Human-Machine Interface (HMI) – Part 1
It’s hard to overstate just how different the world’s vehicles could become if the connected, autonomous, shared and electric vision of the future fully materializes.
One major aspect of the experience that will change hugely is the human-machine interface (HMI) – an essential part of the process of ensuring that the ever-advancing technology baked into each new generation of car can be understood and operated by anyone who takes to the road.
It’s an evolving situation, and as a result we've started to record significant trends being explored by carmakers and suppliers that point to the version 2.0 passenger experience. We’ll update this article when significant developments occur.
By definition, HMI is simply the interaction between a human and the hardware and software of a computer. But with increasing sophistication come more opportunities for the human to control more of the car’s functions.
Steer by wire and brake by wire
Shifting to electric control of the accelerator, the power steering and braking system is key to the functioning of advanced driver assistance systems (ADAS), and paves the way for full self-driving. The staged introduction of such technology is important in gradually reducing the driver’s involvement in the driving process.
However, by completely removing the mechanical connections behind the car’s fundamental controls – its ability to go, stop and steer – by-wire systems mean a future car HMI effectively acts as a replacement for the physical controls within the vehicle once they become redundant:
- The steering wheel doesn’t need to remain in a fixed location, and can be a completely different shape, or even a completely different interface – all of which were explored by the Rinspeed Budii concept, shown in 2015
- With autonomy, acceleration, steering and braking go from being the most active, physical elements of driving to being entirely in the hands of the vehicle’s systems. Tesla has shown what its Autopilot system can ‘see’ – which, in HMI terms, appears to be a key step in helping people accept that the car is in control
Voice control
According to consultants Frost & Sullivan, voice control will become the second most-prevalent interface by 2022, with forecasts predicting that it will feature in 80 percent of car HMIs, up from 48 percent in 2016 – a figure that excludes the voice assistants already built into the vast majority of smartphones.
- Mercedes’ MBUX system is a front-runner in this regard, including AI-powered voice control for the infotainment system. However, control of all vehicle functions – including destination and style of driving – is likely to be a key part of future HMIs
- In addition to vehicle control, Mercedes also touts MBUX’s ability to build a relationship with the driver, beginning with learning their personal preferences, and growing to include the ability to respond to and even alter the driver’s mood or behavior – although the ethical implications of the latter are significant, to say the least!
- If voice becomes the de facto interaction method in the future – how very human – it could also be used to deliver personalized messages to each seat of a ride-sharing service. This would build on experiments such as Citymapper’s bus service in London, which used proprietary software to assign each user an emoji that was displayed when that particular rider’s stop was reached
- When the current BMW 7 Series debuted, its HMI featured voice, gesture, touch and rotary inputs – all bases covered. However, this lack of convergence on input methods means today’s HMI is often distractingly complex, with some functions having to be locked out while the car is moving, or being included whether or not self-driving tech is as advanced as it needs to be, as with Tesla’s announcement that Netflix and YouTube will soon be coming to its cars
Touch and gesture
Before the user interface becomes yesterday’s news and a car’s passengers are able to have separate, meaningful relationships with its HMI, much like in the 2013 movie Her, some form of touch-based interaction will still be required. But while touchscreens are commonplace, from phones to fridges, they’re not especially suited to automotive use once you take bumps in the road into account. So how do you get around that?
- Although the touchscreen is the primary input for smartphones, Google’s Project Soli is researching how to use radar for gesture-based inputs, including virtual buttons, dials and sliders, and could thus replace just about any contemporary control in today’s cars. The technology has been confirmed to make its public debut in the Pixel 4 smartphone, and an automotive application is expected. The firm says its tech is the most advanced, so it will be interesting to see how it stacks up against that already offered by BMW, and whether it is as engaging and intuitive as a phone’s touchscreen
- UK-based firm Ultrahaptics aims to make the sensation of touch happen in mid-air. The tech uses ultrasound to create a virtual touch feeling, allowing a screen to be manipulated without touching it. The firm claims it’s working with automotive clients to bring its technology to market
- There’s a time and a place – the touchpad seems to have been deemed a success on steering wheels, as debuted on the Mercedes E-Class. However, both Hyundai and supplier ZF have added touchscreen controls to steering wheels in prototype form, the latter with Level 3 autonomy in mind
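To make the idea above concrete, here is a minimal, purely illustrative sketch of how recognized mid-air gestures (in the spirit of Soli-style virtual buttons, dials and sliders) might be routed to vehicle controls. Every name in it – the event kinds, the dispatcher, the volume handler – is an assumption for illustration, not any vendor’s actual API.

```python
# Hypothetical sketch: dispatching radar-detected gesture events to vehicle
# controls. All names are illustrative assumptions, not a real vendor API.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    kind: str      # e.g. "dial_turn", "button_press", "slider_move"
    value: float   # normalized magnitude, e.g. dial rotation in [-1, 1]


class GestureDispatcher:
    """Routes recognized gesture events to registered control handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[float], None]] = {}

    def register(self, kind: str, handler: Callable[[float], None]) -> None:
        self._handlers[kind] = handler

    def dispatch(self, event: GestureEvent) -> bool:
        handler = self._handlers.get(event.kind)
        if handler is None:
            return False   # unrecognized gesture: ignore rather than guess
        handler(event.value)
        return True


# Example wiring: a mid-air "dial" adjusts cabin audio volume.
state = {"volume": 0.5}


def adjust_volume(delta: float) -> None:
    # Clamp to [0, 1]; each full dial turn nudges volume by 0.1.
    state["volume"] = min(1.0, max(0.0, state["volume"] + 0.1 * delta))


dispatcher = GestureDispatcher()
dispatcher.register("dial_turn", adjust_volume)
dispatcher.dispatch(GestureEvent("dial_turn", 1.0))  # one clockwise turn
```

The design point this sketch tries to capture is that the gesture recognizer and the vehicle functions stay decoupled: the recognizer only emits abstract events, and unmapped gestures are deliberately ignored rather than guessed at – a conservative choice for a moving vehicle.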