
The future of automotive human-machine interfaces

How passengers and drivers interact with their vehicles has changed significantly in just a few years. The future is here, and we are rapidly moving towards a world of safe, shared and self-driving cars. Today, automotive manufacturers implement a range of Human-Machine Interface (HMI) technologies, including voice commands, touchscreens, interior-facing cameras, head-up displays and multifunctional controllers to create a better automotive experience.

The technology is evolving rapidly, with a focus on speech recognition, augmented reality, safety and interaction without distraction. Dig deeper into why these advancements are becoming more intriguing and sophisticated and what that will mean for the future of cars in this interview with Star’s Automotive & Mobility experts.

Check out the report to learn more about trends and use cases in the HMI market

Download report

Interview with Star’s Design Director about vehicle HMI

During Car HMI Europe, Star’s Automotive & Mobility team gave an interview about which technologies and smart mobility concepts will disrupt the HMI design market over the next few years. 

What are the main factors influencing current automotive HMI development?

The automotive sector is converging ever more strongly with other consumer, prosumer and lifestyle domains. This convergence, in turn, leads to growing pressure from non-automotive companies in terms of the usability and performance of digital services. As a result, some OEMs now rely entirely on synergies with existing UI frameworks, such as Android Auto.

Another macro trend is the shared ownership of cars. That extends the customer journey (through constant on- and offboarding) and requires more transparent asset management (by showing when and where available cars are located).

HMI design

Which new automotive software, technologies and smart mobility concepts do you think will disrupt the HMI market and why?

Visual HMI development seems to have reached a level of maturity, having made the computer personal. The next frontier, as we see it, will be to make the computer a person. Already today, voice assistance is a bustling melting pot for machine learning and audio-driven interfaces. Another main disruptor for the HMI market will be sophisticated telepresence technology, including AR. 

Besides creating more immersive experiences, these technologies might remove the need to travel in the first place. Data and machine learning will play an increasingly important role in providing connectivity, seamlessness, comfort, personalization and efficiency during our time on the go.

Whoever succeeds in bridging the gap between OEMs, mobility service providers, and other moving parts in the mobility ecosystem will disrupt the market.

Regardless of the ownership model, cars will remain relevant as modes of transportation, thanks to unparalleled comfort levels and last-mile navigation. Presumably, even passenger drones will not deliver the same kind of grid resolution as road vehicles any time soon due to legal constraints.

What role do you feel automation will play in developing future human-machine interface design and user experience strategies?

As a first step, automation will require dedicated efforts to prevent and dispel occupants’ fears. Once there is trust, and drivers dare to take their hands off the wheel, occupants should find new ways to bond with and take charge of the car. One key way to address both these points is to build a credible and trustworthy car personality.

What are the main challenges at the moment with regards to human-machine interface and automotive HMI development?

Many automotive players think of their competitors exclusively as other automotive players. This way of thinking can lead to a self-perpetuating mental model that underestimates the innovation, sophistication and pleasure of non-automotive digital service experiences. It also underestimates breakthroughs in the IoT sector. 

As a result, digital aspects are underrepresented in the innovation process. Interactions across mobility touchpoints are often incongruent, or they fail to cover the entire product life cycle.

What would you say is the most iconic automotive HMI technology from the past 50 years?

Firstly, Nomi in the NIO ES8. It is a manifestation of the car’s intelligence that puts a charming and easy-to-read face on an otherwise abstract or overwhelming process. As we see it, a game-changer in the automotive industry!

Secondly, the Tesla Model 3. The no-nonsense UI doesn’t distract the user with ornamental interior details. Instead, the threshold to using digital services is minimal. 

Plus, updating UI hardware post-purchase is easy. This UI promises to age gracefully.

Spotlight on Nomi

Nomi, mentioned earlier, won the 2020 Red Dot Design Award as the lovable robot that connects the car to its user in an emotional way. A co-creation of Star and the NIO UI/UX team, Nomi is today’s benchmark in emotional in-car artificial intelligence.

Learn more about improving automotive human-machine interface design

The in-car user experience continues to face design challenges, including limitations with voice. Find out what the automotive industry can learn from the advancement of the smartphone in this intriguing read on in-vehicle UX, which draws parallels between the two and offers insights on how to leverage new technologies and create truly holistic consumer journeys.