Nuance brings AI and contextual reasoning to Dragon Drive Auto Assistant

23 Oct 2016 IHS Markit Automotive Expert

Voice-recognition specialist Nuance has enhanced its Dragon Drive Auto Assistant with artificial intelligence and contextual reasoning to personalise driver interactions.

IHS Markit Perspective

  • Significance: Nuance has announced further developments for its Dragon Drive Automotive Assistant, adding artificial intelligence and contextual reasoning to create a more personalised, natural experience for the driver.
  • Implications: Nuance's system is sold only to OEMs, with Dragon Drive Auto Assistant first announced in January 2016. The contextual reasoning software enables new Smart Domains, eventually providing assistance for finding parking, fuel, points of interest, restaurants, travel guides, music and messaging.
  • Outlook: Nuance announced an update to its Auto Assistant platform this week, expanding the capability of the system and pointing to its future direction. The system is positioned for all levels of connected cars, which should translate well to an autonomous car environment as well. The Nuance system is intended to be delivered to consumers only through an OEM; the company is not exploring an aftermarket option. It also appears highly customisable from an OEM perspective: because it resides below the OEM user interface, it can adapt to the specific brand experience an automaker wants to deliver.

Nuance Automotive announced the next stage of development for its Dragon Drive Auto Assistant at its annual Auto Forum in Detroit (Michigan, United States) this week, which IHS Automotive attended. Although Nuance's Auto Assistant is not new, the company used the backdrop of the Auto Forum to introduce Dragon Drive's Contextual Reasoning Framework. Nuance's short description is, "The Contextual Reasoning Framework leverages Nuance's advancements in Artificial Intelligence to exploit domain knowledge in a context-sensitive manner to provide a more intelligent experience behind the wheel."

Nuance demonstrated several examples of the application, including how the system can learn a user's preferences for things like parking and fuel, and use that information along with contextual information (car fuel level, distance of trip planned, weather) to deliver options that combine the contextual information with user preferences. Over time, responses will become increasingly tailored to the user. In a press release, Vlad Sejnoha, Nuance's CTO, said, "Today's cars both access and generate a large amount of purposeful data that, when leveraged in the right way, can create a highly personalized and intelligent driving experience. We're applying our latest advances in AI reasoning to automotive data and driver's personal preferences, to give in-car systems the ability to make thoughtful recommendations - just as a collaborative human assistant would."

Arnd Weil, senior vice-president and general manager of Nuance Automotive, said, "Contextual reasoning brings to bear our vision for an automotive assistant that is fully optimized for the in-car experience. We're applying AI to one of the most complex challenges - an intelligent connected car that is conversational, intelligent, proactive and safer to engage. Further, our reasoning engine is fully integrated with other applications and services that matter to drivers - navigation, parking, traffic, in-car diagnostics, music, POI, and more, giving automakers an AI-driven platform that they can fully customize as part of their own branded experience."

Nuance's Contextual Reasoning Framework enables this intelligent behaviour by adapting the system to changing context and fusing data from multiple content sources to improve the user experience, as Michael Kaisser, principal product manager of AI technologies at Nuance, explained to Forum attendees. The AI uses natural language understanding and natural language generation for better interaction between car and human. Natural language understanding lets the system evaluate both expressed and implied intent, and allows drivers to refine recommendations or confirm and trigger a follow-up action - for example, confirming a parking location and asking the system to text an expected arrival time to someone by saying, "Ok, and let John know my arrival time." Natural language generation produces more conversational responses.
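The follow-up-action flow described above can be sketched in a few lines. This is an illustrative assumption, not Nuance's implementation: the regular expression, handler name and default arrival time are all invented for the example.

```python
import re

# Hypothetical sketch: after the driver confirms a parking choice, a command
# like "let John know my arrival time" is parsed into a messaging intent.
# The pattern and handler below are illustrative only.
FOLLOW_UP = re.compile(r"let (?P<contact>\w+) know my arrival time", re.I)

def handle_follow_up(utterance, eta="18:42"):
    """Return a drafted message if the utterance triggers the follow-up."""
    m = FOLLOW_UP.search(utterance)
    if not m:
        return None
    # Trigger the follow-up: draft a text to the named contact with the ETA.
    return f"To {m.group('contact')}: arriving around {eta}"

print(handle_follow_up("Ok, and let John know my arrival time"))
# To John: arriving around 18:42
```

In a real system the arrival time would come from the navigation stack rather than a default argument; the point is that one confirmed recommendation can seed the context for the next intent.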

Nuance has identified four key AI technologies necessary to support the services it is developing:

  • Knowledge takes the raw data gained through embedded and cloud data, saved content, vehicle condition data and sensors, fuses the data from these multiple sources and makes that content accessible when it is relevant.
  • Contextualisation (reasoning) adapts the assistant's behaviour to the context of the driver and their car. One example Nuance discussed was the system recognising that the driver has just been given a navigation command (turning in 500 feet) and modifying what other information it delivers at the same time - much as a passenger would, naturally seeing that the driver's attention is needed for the task at hand and slowing the conversation.
  • Personalisation is developed through machine learning ("essentially counting and statistics", Kaisser said) to deliver highly targeted predictions and recommendations. For example, the system could learn whether a user prefers cheaper or closer parking to provide better parking alternatives in the future, and it can also use that data to sort restaurant recommendations for the same user in the absence of explicit restaurant preferences. The more information the system has about user preferences and context, the better its suggestions become.
  • Smart Interaction uses speech, natural language understanding (NLU) and dialogue technology for high-quality, human-like collaborative dialogues.
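Kaisser's characterisation of personalisation as "essentially counting and statistics" can be made concrete with a minimal sketch. Everything here is an illustrative assumption - the class, attribute labels and ranking scheme are invented, not Nuance's code:

```python
from collections import Counter

class PreferenceModel:
    """Toy preference learner: count which attribute each chosen option
    satisfied, then rank future options by the most frequent attribute."""

    def __init__(self):
        self.choices = Counter()

    def observe(self, attribute):
        # Record which attribute (e.g. 'cheapest', 'closest') the driver's
        # chosen option satisfied.
        self.choices[attribute] += 1

    def preferred(self):
        # The most frequently chosen attribute wins; no data means no opinion.
        if not self.choices:
            return None
        return self.choices.most_common(1)[0][0]

    def rank(self, options, key_funcs):
        # Sort candidates by the learned preference; with no learned signal,
        # keep the original order.
        pref = self.preferred()
        if pref is None or pref not in key_funcs:
            return options
        return sorted(options, key=key_funcs[pref])

model = PreferenceModel()
for _ in range(3):
    model.observe("cheapest")   # driver picked the cheapest garage three times
model.observe("closest")        # ...and the closest one once

lots = [{"name": "Garage A", "price": 12, "distance": 0.2},
        {"name": "Garage B", "price": 6,  "distance": 0.8}]
ranked = model.rank(lots, {"cheapest": lambda o: o["price"],
                           "closest":  lambda o: o["distance"]})
print(ranked[0]["name"])  # the learned 'cheapest' preference surfaces Garage B
```

The same learned signal could then be reused across domains, as the article describes - a price-sensitive parking history nudging restaurant ranking toward cheaper options.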

Nuance's contextual reasoning is built from specialised reasoners, Kaisser said. There are basic reasoners - essentially rules for decision making - with more under development. Core among these are spatial, temporal, contextual and personal reasoners. A restaurant reasoner, for example, would have access to restaurant data and be able to understand that asking for a restaurant with Tikka Masala means the user is most likely looking for an Indian restaurant, and serve up that option.
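A rule of the kind the Tikka Masala example describes reduces to a dish-to-cuisine mapping. The table and function below are an illustrative assumption, not Nuance's API:

```python
# Hypothetical restaurant-reasoner rule: map a named dish to the cuisine
# most likely to serve it. The mapping is illustrative, not exhaustive.
DISH_TO_CUISINE = {
    "tikka masala": "Indian",
    "pad thai": "Thai",
    "carbonara": "Italian",
}

def infer_cuisine(query):
    """Return the cuisine implied by a dish mentioned in the query, if any."""
    q = query.lower()
    for dish, cuisine in DISH_TO_CUISINE.items():
        if dish in q:
            return cuisine
    return None

print(infer_cuisine("Find me a restaurant with Tikka Masala"))  # Indian
```

A production reasoner would draw on a far larger knowledge base and combine this with the spatial and personal reasoners before surfacing a recommendation.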

Nuance's Reasoning Framework (see diagram), which resides within Dragon Drive, exposes the functioning of the reasoning engine to applications through an API called the Reasoning Interface.

The reasoners are part of a "reasoning layer" that applies knowledge to the reasoners and their rules and surfaces the results through the Reasoning Interface. The set of reasoners can be determined by the OEM, effectively setting parameters for the level of features and functionality an OEM wants to offer as part of its connected experience. The Framework is anchored by the Knowledge Repository, which contains rules, world knowledge and augmented data. The Knowledge Interface makes the data sources (including car sensors, social, music, points of interest, parking, fuel and restaurants) and Knowledge Repository information available to the reasoners, translating between data formats.
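The layering described above - data sources normalised by a Knowledge Interface, reasoners applying rules over them, and a Reasoning Interface on top - can be sketched structurally. Every class name here is an illustrative assumption based on the article's description, not Nuance's actual API:

```python
# Minimal structural sketch of the described reasoning layer.

class KnowledgeInterface:
    """Makes heterogeneous data sources available to reasoners uniformly."""
    def __init__(self, sources):
        # sources: domain name -> zero-argument fetch function,
        # e.g. {'fuel': read_fuel_sensor, 'parking': parking_feed}
        self.sources = sources

    def query(self, domain):
        fetch = self.sources.get(domain)
        return fetch() if fetch else None

class Reasoner:
    """Base class: a bundle of decision rules over one domain."""
    def decide(self, knowledge, context):
        raise NotImplementedError

class FuelReasoner(Reasoner):
    def decide(self, knowledge, context):
        remaining_km = knowledge.query("fuel")
        # Rule: suggest refuelling when range falls short of the planned trip.
        if remaining_km is not None and remaining_km < context.get("trip_km", 0):
            return "suggest_fuel_stop"
        return None

class ReasoningInterface:
    """API surface: runs each registered reasoner against current context.
    An OEM could choose which reasoners to register here."""
    def __init__(self, knowledge, reasoners):
        self.knowledge = knowledge
        self.reasoners = reasoners

    def recommendations(self, context):
        results = (r.decide(self.knowledge, context) for r in self.reasoners)
        return [r for r in results if r]

ki = ReasoningInterface(
    KnowledgeInterface({"fuel": lambda: 40}),  # 40 km of range left
    [FuelReasoner()],
)
print(ki.recommendations({"trip_km": 120}))    # ['suggest_fuel_stop']
print(ki.recommendations({"trip_km": 10}))     # []
```

The OEM-facing customisation the article mentions maps naturally onto which reasoners are registered and what the Knowledge Interface is wired to.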

Nuance says the Reasoning Framework itself can also be customised. While Dragon Drive comes with a predefined set of reasoners, the knowledge can be adjusted to OEM needs, either by Nuance or in-house at the OEM.

Nuance has begun making its Smart Domains available to automakers, with Smart Parking and Smart Fuel first. In January 2017, a Smart POI will become available that allows intelligent POI searches, considering car context and personal preferences. In June 2017, a Smart Car Manual will become available, enabling users to ask questions about the car and its functions while driving, leveraging the content of the owner's manual and frequently asked questions. Further out, Nuance has promised Smart Messaging (extracts entities and intent from messages, offers intelligent automatic replies and knows which messages are important to the user), Smart Travel Guide (answers questions about surroundings and destination, recommends sights), Smart Music (universal music search and personalised, contextualised music recommendations) and Smart Restaurant (personalised and contextualised recommendations). All of these domains will learn users' preferences over time. Because the Auto Assistant launched with a voice biometric feature, a car with two primary drivers will load preferences based on the voice that woke the system.

Outlook and implications

Nuance announced its Auto Assistant platform at the Consumer Electronics Show in January 2016, and this week's announcement expands the capability of the system and points to where it will go in the future. The system is positioned for all levels of connected cars, which should translate well to an autonomous car environment as well. The Nuance system is intended to be delivered to consumers only through an OEM; the company is not exploring an aftermarket option. It also appears highly customisable from an OEM perspective: because it resides below the OEM user interface, it can adapt to the specific brand experience an automaker wants to deliver.

As a long-standing automotive supplier, Nuance offers customisation and a system that can be used in support of an OEM's brand experience and objectives - in direct contrast to the approach of Apple or Google, which prefer to control the user experience rather than support an automotive brand's image. This is interesting in that Apple and Google have access to a tremendous amount of information and have shown some ability to adapt to user preferences as well, minus access to car data and vehicle sensors. Both software giants should have the capability to develop systems that provide smart feedback too. For now, however, Nuance is focused on providing support to OEMs, has a history of doing so, and can currently provide a more robust solution.

IHS Markit analyst Mark Boyadjis also notes that it is unclear in the long run which solution (OEM-designed versus smartphone-designed) end-consumers will ultimately prefer. An OEM-designed solution will integrate the vehicle much more holistically, for a better and more complete automotive user experience. Smartphone-designed solutions, on the other hand, provide an experience which users are familiar with and can translate between cars, brands, and nameplates, all while being better suited for a long-term vision of shared mobility platforms.

About this article

The above article is from IHS Automotive Same-Day Analysis of automotive news, events and trends, and is a deliverable of the World Markets Automotive Service. The service averages thirty stories per day and also provides competitor and country intelligence.

