Affectiva, the global leader in Artificial Emotional Intelligence (Emotion AI), and Nuance Communications, Inc., the leader in conversational AI innovations, today announced their work together to further humanize automotive assistants and in-car experiences. Affectiva Automotive AI, the first multi-modal in-cabin AI sensing solution, will be integrated with Nuance’s conversational AI-powered Dragon Drive automotive assistant platform. The integrated solution will deliver the industry’s first interactive automotive assistant that understands drivers’ and passengers’ complex cognitive and emotional states from face and voice, and adapts its behavior accordingly.
The integration of Affectiva’s technology with Dragon Drive will expand the breadth and depth of contextual, emotional and cognitive data that automotive assistants can detect and account for. Affectiva Automotive AI measures facial expressions and emotions such as joy, anger and surprise, as well as vocal expressions of anger, engagement and laughter, in real-time. Affectiva Automotive AI also provides key indicators of drowsiness such as yawning, eye closure and blink rates, as well as physical distraction or mental distraction from cognitive load or anger.
Nuance’s Dragon Drive powers more than 200 million cars on the road today across more than 40 languages, creating highly customized, fully branded experiences for Audi, BMW, Daimler, Fiat, Ford, GM, Hyundai, SAIC, Toyota, and more. Powered by conversational AI, Dragon Drive enables the in-car assistant to interact with passengers based on verbal and non-verbal modalities, including gesture, touch, gaze detection, voice recognition powered by natural language understanding (NLU), and now, through its work with Affectiva, emotion and cognitive state detection.
In the near term, Affectiva and Nuance’s integrated solution will enable the automotive assistant to further learn and understand driver and passenger emotion and behavior, as expressed through speech or facial expressions of emotion. For example, if the automotive assistant detects that a driver is happy based on their tone of voice, it can mirror that emotional state in its responses and recommendations.
In the future, the integrated solution is anticipated to address safety-related use cases, particularly in vehicles across the autonomous driving spectrum. Using Affectiva’s and Nuance’s technologies, automotive assistants could detect unsafe driver states such as drowsiness or distraction and respond accordingly. In semi-autonomous vehicles, the assistant could intervene, for example by taking over control of the vehicle if a driver is exhibiting signs of physical or mental distraction.