Enabling your car to sense your mood
- Author: Ella Cai
- Published: 2018-09-07
Affectiva, the MIT Media Lab spin-off that develops mood-sensing software, has teamed up with Nuance, the speech recognition specialist, to put a combined package of their respective capabilities into cars.
The integration of Affectiva’s technology with Nuance’s Dragon Drive allows real-time measurement of facial expressions of emotions such as joy, anger and surprise, as well as vocal expressions of anger, engagement and laughter.
Affectiva Automotive AI also provides key indicators of drowsiness such as yawning, eye closure and blink rates, as well as physical distraction, and mental distraction arising from cognitive load or anger.
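To make the drowsiness indicators concrete, here is a minimal sketch of how such signals are commonly derived from per-frame eye states. It is illustrative only: the `DrowsinessMonitor` class, the frame rate and the thresholds are assumptions, not Affectiva's actual SDK or metrics. It computes a PERCLOS-style score (fraction of recent frames with eyes closed) plus a blink rate, two of the indicators mentioned above.

```python
from collections import deque

class DrowsinessMonitor:
    """Hypothetical sketch: PERCLOS-style eye-closure score and blink rate."""

    def __init__(self, fps=30, window_s=60, perclos_threshold=0.15):
        self.fps = fps
        self.perclos_threshold = perclos_threshold  # assumed alert threshold
        self.eye_states = deque(maxlen=fps * window_s)    # 1 = eyes closed
        self.blink_onsets = deque(maxlen=fps * window_s)  # 1 = blink began
        self.prev_closed = False

    def update(self, eyes_closed: bool) -> dict:
        """Feed one frame's eye state; return the current metrics."""
        self.eye_states.append(1 if eyes_closed else 0)
        # A blink onset is an open -> closed transition between frames.
        self.blink_onsets.append(int(eyes_closed and not self.prev_closed))
        self.prev_closed = eyes_closed

        perclos = sum(self.eye_states) / len(self.eye_states)
        seconds = len(self.blink_onsets) / self.fps
        blinks_per_minute = sum(self.blink_onsets) * 60 / seconds
        return {
            "perclos": perclos,
            "blinks_per_minute": blinks_per_minute,
            "drowsy": perclos > self.perclos_threshold,
        }
```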
“Leveraging Affectiva’s technology to recognise and analyse the driver’s emotional state will further humanise the automotive assistant experience,” says Nuance’s Stefan Ortmanns.
In the near term, Affectiva and Nuance’s integrated solution will enable the automotive assistant to further learn and understand driver and passenger emotion and behaviour, as expressed through speech and facial expressions of emotion. For example, if an automotive assistant detects that a driver is happy based on their tone of voice, it can mirror that emotional state in its responses and recommendations.
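The sketch below shows one way such mirroring could work: a detected emotion selects a response style. The emotion labels and the `select_response()` helper are hypothetical, not Nuance's or Affectiva's actual API.

```python
# Hypothetical mapping from detected emotion to response style.
RESPONSE_STYLES = {
    "joy":     "upbeat",
    "anger":   "calm",
    "neutral": "neutral",
}

def select_response(detected_emotion: str, base_text: str) -> str:
    style = RESPONSE_STYLES.get(detected_emotion, "neutral")
    if style == "upbeat":
        # Mirror a happy driver with a brighter, chattier reply.
        return base_text + " Glad you're enjoying the drive!"
    if style == "calm":
        # Keep replies short and even-toned for an irritated driver.
        return base_text
    return base_text

print(select_response("joy", "Your next turn is in two miles."))
# -> Your next turn is in two miles. Glad you're enjoying the drive!
```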
Looking further ahead, the companies expect the combined solution to address safety-related use cases in cars across the autonomous vehicle spectrum. Using Affectiva’s and Nuance’s technologies, automotive assistants could detect unsafe driver states such as drowsiness or distraction and respond accordingly.
In semi-autonomous vehicles, the assistant could take over control of the vehicle if the driver exhibits signs of physical or mental distraction.
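A minimal sketch of how that escalation might be staged follows. The time thresholds and state names are assumptions for illustration, not any vendor's actual behaviour: persistent distraction first triggers a warning, then a takeover request.

```python
class SafetyEscalator:
    """Hypothetical escalation from monitoring to warning to takeover."""

    WARN_AFTER_S = 2.0      # assumed: warn after 2 s of distraction
    TAKEOVER_AFTER_S = 5.0  # assumed: request control after 5 s

    def __init__(self):
        self.distracted_since = None

    def update(self, distracted: bool, now_s: float) -> str:
        """Return the action for the current distraction state."""
        if not distracted:
            self.distracted_since = None
            return "normal"
        if self.distracted_since is None:
            self.distracted_since = now_s
        elapsed = now_s - self.distracted_since
        if elapsed >= self.TAKEOVER_AFTER_S:
            return "take_over"    # vehicle assumes control
        if elapsed >= self.WARN_AFTER_S:
            return "warn_driver"  # audible/visual alert
        return "monitor"
```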