Until just a few years ago, the in-car interface was physical — buttons, knobs, switches. Then came touch and voice. Now, in 2025, it’s evolving into something more intuitive: conversation.
Breakthroughs in large language models (LLMs), contextual sensing and cloud-edge architectures have created this new generation of in-car digital assistants, the heart of a new mobility experience where artificial intelligence meets emotional intelligence.
Drivers no longer interact with static systems; they engage in dialogue. These virtual personal assistants (VPAs) don’t just respond to commands — they anticipate, interpret and adapt. With the global market currently valued at $1 billion, OEMs are realizing that the next competitive edge isn’t horsepower, it’s digital services that resonate.
These days, the car isn’t just a machine that gets us from A to B. It’s becoming a space that moves with us. It learns, adapts and responds in ways that feel increasingly human, thanks to the rise of in-car digital assistants.
The in-car digital assistant market
Thanks to advancements in conversational AI and connected vehicle technologies, the automotive digital assistant market is experiencing rapid growth, driven by increasing consumer usage and widespread OEM adoption. VPAs are opening new business frontiers such as continuous customer engagement and new revenue streams. Some stats to consider:
- The global in-car digital assistant market is projected to reach $5 billion by 2035, at a compound annual growth rate of 17.5%
- The installation rate of automotive voice systems surpassed 83% globally last year
- Over two-thirds of drivers with access to in-car voice assistants use them at least monthly
- OEM and Tier-1 suppliers account for nearly 72% of the market
Because of this, automotive brands are racing to define their voice in this new era:
- Mercedes-Benz recently unveiled its MBUX Virtual Assistant, built on Google’s Automotive AI. The human-like interface is capable of providing natural dialogue and proactive suggestions

- Volkswagen has integrated ChatGPT into its IDA assistant, blending on-device processing for essential tasks with cloud-based generative reasoning for richer conversation

- BMW has embedded the Alexa Custom Assistant, in partnership with Amazon, to combine automotive expertise with Alexa’s established equity and linguistic fluency

- NIO co-designed the groundbreaking NOMI, the world’s first in-car companion, in order to create a lovable bond between driver and vehicle and fulfill consumer expectations for emotional digital services

Design principles for delightful in-car assistants
Trust, reliability and contextual intelligence define the credibility of a digital assistant. Behind this new level of sophistication lie several key enablers:
- Agentic AI & LLMs transform rigid intent-based systems into open-domain conversational agents capable of reasoning, inferring intent and holding context-rich dialogue
- Trust & transparency should align with global regulations (like GDPR) through clear consent, on-device processing options and transparent data handling
- Multimodal understanding combines voice, touch, gestures and visual cues to interpret driver behavior and environment holistically
- Selective personalization uses privacy-conscious frameworks to remember preferences and routines, which creates a genuinely personal companion
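Selective personalization can be made concrete with a consent-first preference store. The sketch below is purely illustrative — the class and field names are hypothetical, not any OEM’s actual API — but it shows the GDPR-style default the enablers above call for: nothing is remembered until the driver opts in.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PreferenceStore:
    """Hypothetical on-device preference store with consent as the default gate."""
    consent_given: bool = False
    prefs: dict = field(default_factory=dict)

    def remember(self, key: str, value: str) -> bool:
        # Without explicit consent, nothing is persisted (privacy-conscious default).
        if not self.consent_given:
            return False
        self.prefs[key] = value
        return True

    def recall(self, key: str, default: Optional[str] = None) -> Optional[str]:
        # Recall only works on data the driver agreed to store.
        return self.prefs.get(key, default)
```

The point of the design is that forgetting is the default and remembering is the opt-in, which keeps the “genuinely personal companion” compatible with transparent data handling.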
Designing these systems requires a mindset shift away from the “one-and-done” approach toward continuous evolution. One of the most important realizations for OEMs is that a digital assistant is never “finished.”
Unlike static UI features, in-car assistants are living systems: their capabilities expand over time and adapt across vehicles and regions, deepening relationships with drivers in the process. A VPA has to function, but it also has to grow technically, emotionally and contextually.
To achieve this, designers and engineers can follow six evolution parameters that define how an assistant matures across its lifecycle:
1. Temporal evolution
Assistants are companions, not static tools. A VPA that evolves over time builds trust and familiarity, much like a relationship that grows. Its capabilities should expand gradually:
- Progressive manifestation: Start minimal and become more expressive as the user gains comfort
- Feature unlocking: Introduce new functions over time to create a sense of growth
- Seasonal and contextual themes: Adapt visuals and sounds to reflect time of day, weather or cultural events
2. Touchpoint-based manifestation
The assistant should manifest differently depending on context and device. Consistency matters, but so does adaptability. Each touchpoint should feel like the same assistant, expressed appropriately for its medium:
- In-vehicle: Screen-based avatars, ambient lighting or voice-first approaches
- Smartphone: Simplified, Siri-like experiences tied to the automotive brand
- Dealership or showroom: Physical kiosks or embodied assistants for onboarding
- Smart home: Voice-only integration within smart speakers
3. Vehicle-specific variation
Not all cars are the same — a high-end SUV feels different from a compact sedan, so their in-car digital assistants need to reflect this. Designing with variation in mind avoids the trap of a “one-size-fits-all” assistant:
- Model tiering: Premium models might offer richer avatars, while entry models rely on simpler, light-based cues
- Vehicle type: A family SUV may suit a friendly companion, while a commercial van requires an efficiency-driven assistant
4. User-centric personalization
Drivers are individuals, not demographics. VPAs should flex accordingly so that the assistant feels truly theirs rather than a generic interface:
- Experience level: Novices will want more guidance, whereas experts prefer a minimal, toned-down assistant
- Preferences: Style, tone and even avatar characters can be personalized
- Usage behavior: The assistant adapts to whether the user speaks often, prefers touch or relies on automation
5. Situational and contextual modulation
Driving is a fluid experience, so VPAs should be equally dynamic. This keeps the assistant safe, relevant and intuitive, no matter the situation:
- Driving vs. stationary: Voice-only, minimal cues while driving; richer, visual engagement when parked
- User mood and stress levels: Adapt tone and behavior in response to biometric or behavioral signals
- Multimodal input prioritization: Context-driven flexibility, such as gestures when hands are free and voice when eyes are on the road
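The modulation rules above can be sketched as a small context-to-channel function. This is a minimal illustration with invented channel names, not a production arbitration policy: parked drivers get full engagement, while driving restricts the assistant to voice plus minimal visuals, adding gestures only when hands are free.

```python
from typing import List

def select_modality(is_driving: bool, hands_free: bool) -> List[str]:
    """Pick interaction channels from vehicle context (illustrative rules only)."""
    if not is_driving:
        # Parked: the assistant can offer its richest, screen-based experience.
        return ["voice", "touch", "rich_visual"]
    # Driving: keep eyes on the road, so voice leads and visuals stay minimal.
    channels = ["voice", "minimal_visual"]
    if hands_free:
        # Gestures become viable only when hands are available.
        channels.append("gesture")
    return channels
```

In a real vehicle these inputs would come from the CAN bus and driver-monitoring sensors; the value of the pattern is that safety constraints are encoded as explicit, testable rules rather than scattered UI logic.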
6. Cultural and regional adaptation
An in-car digital assistant is also a cultural ambassador. A culturally adaptive assistant should resonate as well as communicate:
- Localization: Personas, voice tones and interaction models tuned to local norms
- Brand messaging: Align with cultural expectations around politeness, authority and formality
By designing for growth, adaptability, and cultural resonance, OEMs can ensure their assistants remain relevant and trusted long after launch.
The best in-car digital assistants don’t just execute commands, they converse with empathy and foster a connection. Achieving that means designing around human conversation rather than machine syntax:
- Human-centered conversation design: Use natural phrasing, confirm understanding subtly and adapt tone to situation and driver mood
- Anticipatory UX: VPAs should know when to speak and when not to. Smart assistants suggest a fuel stop or reroute when relevant but remain silent when focus is needed
- Brand personality: Voice and tone become brand signatures, whether it’s calm, confident, playful or professional
- Edge-first reliability: Safety-critical features need to function seamlessly offline too, maintaining user confidence even without connectivity
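The edge-first principle can be expressed as a simple intent router. The sketch below uses hypothetical intent names and handler labels — it mirrors the hybrid pattern described earlier (on-device processing for essential tasks, cloud reasoning for richer conversation), not any specific vendor’s architecture.

```python
# Hypothetical set of intents that must never depend on connectivity.
SAFETY_CRITICAL = {"hazard_lights", "defrost", "emergency_call", "wipers"}

def route_intent(intent: str, cloud_available: bool) -> str:
    """Route an intent to on-device or cloud handling (illustrative sketch)."""
    if intent in SAFETY_CRITICAL:
        # Safety-critical commands always run locally, online or not.
        return "on_device"
    if cloud_available:
        # Open-domain conversation benefits from cloud-based LLM reasoning.
        return "cloud_llm"
    # Connectivity lost: degrade gracefully to a local grammar, never fail silently.
    return "on_device_fallback"
```

The design choice worth noting is that the offline path is the default for anything safety-related; the cloud is an enhancement, not a dependency.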
Given these parameters, what’s the best way to get started? The answer is with a dual mindset:
- Think big: Define a long-term vision of the digital assistant across all six parameters. Imagine how it might evolve over five years, how it lives across ecosystems and how it adapts to cultural shifts
- Start small: Launch with a focused, high-value core experience. This could be a minimal voice-first assistant in-car or a light-based companion with a few critical functions
These design principles transform the in-car assistant from a utility into an emotional interface that, done correctly, becomes the “feel” of the brand itself.

The future of virtual personal assistants
The future of in-car digital assistants goes beyond simple conversation. We’re entering the era of cognitive mobility, where your assistant becomes a proactive copilot, capable of perception, reasoning and judgment. Imagine an AI-powered assistant that says:
“You seem tired. Should I find a rest stop?”
“Traffic ahead. Would you like me to reroute?”
“It’s raining. I’ve adjusted your headlights and wipers.”
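Prompts like these can be sketched as simple context-to-suggestion rules. The function below is a deliberately naive illustration with invented signal names (`driver_fatigue`, `traffic_delay_min`, `raining`); a production copilot would fuse real sensor and ADAS data, but the key behavior — staying silent when no rule fires — is the anticipatory UX the article describes.

```python
from typing import Optional

def proactive_prompt(ctx: dict) -> Optional[str]:
    """Map sensed context to a proactive suggestion, or stay silent (sketch)."""
    if ctx.get("driver_fatigue", 0.0) > 0.7:
        # Driver-monitoring signal exceeds a fatigue threshold.
        return "You seem tired. Should I find a rest stop?"
    if ctx.get("traffic_delay_min", 0) > 10:
        # Navigation reports a significant delay on the current route.
        return "Traffic ahead. Would you like me to reroute?"
    if ctx.get("raining"):
        # Rain sensor triggered; the assistant reports actions already taken.
        return "It's raining. I've adjusted your headlights and wipers."
    return None  # Anticipatory UX: knowing when NOT to speak matters most.
```

Returning `None` is as important as any prompt — it encodes the earlier principle that smart assistants remain silent when the driver’s focus is needed.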
Such assistants will coordinate across navigation, ADAS and connected ecosystems to create seamless, context-aware experiences that balance safety, wellness and entertainment. This vision does come with challenges like ethical AI behavior, safety validation and emotional design.
But at the same time it also represents an extraordinary opportunity to make the machine understand the human, not the other way around.
Think big, start small, evolve continuously
In-car digital assistants are no longer optional add-ons. They’re strategic pillars of the software-defined vehicle — uniting AI, design and emotion into a single, evolving entity.
The most successful OEMs in the coming years will be those who launch focused, high-value experiences that evolve into a long-term, adaptive ecosystem.




