Trends in car Human-Machine Interface (HMI) – Part 2
Connected, autonomous, shared, and electric (CASE) vehicles are broad topics that define the future of the car.
However, their success depends on the relationship between human and machine remaining intact despite increasing technological sophistication.
The human-machine interface (HMI) is key to ensuring that the ever-advancing technology can be understood and operated by anyone who takes to the road. Here’s part two of our collection of ideas likely to influence the future of HMI.
An average day, particularly in the western world, involves interacting with huge numbers of screens – phones, tablets, PCs, and, of course, those in your car. The latter form an essential part of your car’s HMI, and increasingly of your relationship with it.
- The central infotainment screen has jumped from a basic head unit to the current largest, the 48-inch pillar-to-pillar display fitted to the forthcoming Byton M-Byte, in a very short time. The M-Byte certainly represents another step towards autonomy, with a large front screen supplemented by separate smaller screens for each passenger. Allowing both communal and personal enjoyment, the layout covers both private and shared ridership
- If a screen display can be made smart, why can’t other surfaces? Supplier Yanfeng is one of many already working on bringing HMI functionality to just about any surface in the car – especially useful once a car no longer needs to be driven and interiors can be reconfigured
- We’re used to the idea of cars having outward-facing cameras, but those pointing at occupants will become common. Teslas already feature the tech, intended for a future in which owners share their cars when not using them. Karma Automotive’s SC1 Vision concept features a camera capable of biometric identification of occupants; it also scans the driver for signs of tiredness, using an AI system to take control of the car should the driver fall asleep. The camera here effectively becomes a contactless interface, requiring no active input from the occupant
- Screens can also be used to effectively remove physical barriers from the car. Door mirrors that use cameras are now a production reality on the Audi E-Tron, while Nissan offers its Intelligent Rear-view Mirror on the X-Trail SUV, so the driver can always see behind regardless of passengers or luggage. Taking this a step further, Jaguar has experimented with screens that wrap the A-pillars, giving a clear field of vision, while Land Rover has a concept augmented-reality hood, effectively allowing the driver to see through the whole front of the car when driving off road
- Aside from infotainment screens, driver information displays are becoming more complex, with reconfigurable digital displays the expected norm in premium cars. Bosch recently announced it is working on a gauge cluster with passive 3D, requiring no special lenses to see the effect. The firm claims it is better at drawing attention to hazards
- Head-up displays offer an interesting take on the hierarchy of information given to the driver – center screens for occasional use, gauge clusters for frequent reference, and the HUD for the essential. However, that safety-critical role limits what the HUD can currently be used for
- HUDs may see better use in level 5 vehicles, however, in combination with AR, making a vehicle’s windows smart. In 2011, Toyota showed its Window to the World idea, developed in partnership with the Copenhagen Institute of Interaction Design. It allowed a car’s rear windows to act like a touchscreen, although its usefulness was curtailed by the need for the passenger to turn through 90 degrees to use it, all while secured by a seatbelt. A tablet makes more sense for now, but give it time
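The drowsiness-monitoring idea above typically works by tracking how often the driver's eyes are closed over a rolling window of camera frames (a PERCLOS-style metric). Here is a minimal sketch of that logic; the class name, window size, and threshold are illustrative assumptions, not any carmaker's real parameters:

```python
from collections import deque

# Illustrative values only: fraction of closed-eye frames that triggers
# an alert, over a window of roughly 3 seconds at 30 fps.
PERCLOS_THRESHOLD = 0.30
WINDOW_FRAMES = 90

class DrowsinessMonitor:
    """Rolling-window eye-closure check (PERCLOS-style sketch)."""

    def __init__(self, window=WINDOW_FRAMES, threshold=PERCLOS_THRESHOLD):
        self.frames = deque(maxlen=window)  # oldest frames drop off automatically
        self.threshold = threshold

    def update(self, eyes_closed: bool) -> bool:
        """Record one frame's eye state; return True if the closed-eye
        ratio over the window exceeds the threshold."""
        self.frames.append(eyes_closed)
        perclos = sum(self.frames) / len(self.frames)
        return perclos > self.threshold

monitor = DrowsinessMonitor()
# Feed in per-frame eye states: a run of closed eyes, then a few open frames
alerts = [monitor.update(closed) for closed in [True] * 40 + [False] * 10]
print(alerts[-1])  # True: sustained eye closure keeps the ratio above threshold
```

A real system would derive `eyes_closed` from a vision model looking at the cabin camera feed; the point here is only that the decision layer on top can be this simple.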
Once the preserve of an elite few, the concierge is being democratized thanks to digitalization. Google Assistant, Amazon Alexa and Apple’s Siri are all here to at least attempt to do your bidding, and are set to become an increasingly integrated part of your automotive life, harvesting your location data as they go. We already covered voice control for HMI, but here we look at virtual and physical companions in the car.
- A quote attributed to Austrian economist Ludwig von Mises states, “The luxury of today is the necessity of tomorrow.” That’s rather fitting of our relationship with the car – once representing freedom for the relative few, something those lucky enough to own loved and even named, it’s now taken for granted in the same way as any other consumer electronic good. However, key to rekindling that relationship appears to be the personal assistant
- BMW ConnectedDrive has included a pay-yearly dial-a-concierge service for years. Simply press a button and you’re connected to an agent who you can ask for directions, restaurant reservations and more. The service is built into the car, so doesn’t rely on your personal phone to connect. This year, BMW added its Intelligent Personal Assistant, which allows the driver to build a relationship with a virtual assistant that learns their preferences, reacts to their mood, and even provides company
- At the Tokyo Motor Show in 2015, Toyota showed Kirobo Mini, a 10cm-high robot companion for the car that the driver could converse with. The firm said the AI-powered robot would be “in tune with the driver’s mood, suggesting places to visit, routes to travel and music to listen to… [it] could help driving become a physically and emotionally transformative experience”
Outside the car industry, the emotional connection between human and machine is an essential part of the success of the algorithms that control our digital lives, and perhaps offers significant food for thought for HMI development.
Netflix, for example, uses AI to calculate subscriber viewing preferences and surface suggestions (including promotional images customized to your tastes) based on how you’re likely to be feeling. Spotify’s playlist model is biased towards ‘happy’ music in order to keep you listening longer. eBay has a mood marketing tool. Facebook has experimented with mood manipulation in the past.
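The 'happy' bias described above can be sketched as a toy recommender: score each track by how close its valence (a happiness rating) sits to a target mood, then nudge the scores slightly towards happier tracks. All names and numbers here are illustrative assumptions, not any platform's real model:

```python
# Toy mood-weighted recommender: tracks carry a hypothetical "valence"
# (0 = sad, 1 = happy); scores combine closeness to the target mood
# with a small bonus for happier tracks.
tracks = [
    {"title": "Rainy Drive", "valence": 0.2},
    {"title": "Motorway Sunrise", "valence": 0.8},
    {"title": "City Cruise", "valence": 0.6},
]

def recommend(tracks, target_valence, happy_bias=0.1):
    def score(track):
        # Negative distance to the target mood, plus the happiness bonus
        return -abs(track["valence"] - target_valence) + happy_bias * track["valence"]
    return sorted(tracks, key=score, reverse=True)

playlist = recommend(tracks, target_valence=0.5)
print([t["title"] for t in playlist])
# → ['City Cruise', 'Motorway Sunrise', 'Rainy Drive']
```

Note the tie-break: "Rainy Drive" and "Motorway Sunrise" sit equally far from the 0.5 target, but the happy bias ranks the cheerier track higher – the same nudge the paragraph above describes.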
All of these platforms are already in your car via your phone, and will soon be embedded even further into your car’s HMI. Imagine using your digital assistant to plan a route, and the car suggesting a playlist designed to pluck just the right emotional heartstring at just the right spot – one that makes you jump online to make a purchase. It hasn’t happened. Yet.