Driverless car study in Germany probes driver reaction times
- Author: Ella Cai
- Published: 2017-05-24
A US-based artificial intelligence (AI) firm is working with researchers in Germany to apply conversational and cognitive AI to autonomous systems in self-driving cars.
Nuance Communications and Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI), the German Research Centre for Artificial Intelligence, have joint research initiatives covering the relationship between humans and in-car systems, as well as AI applied to healthcare systems and omni-channel customer care.
Google, Microsoft, Intel and BMW have all invested in DFKI, one of the world’s leading AI research centres.
To better understand the many scenarios in which transfer of control will need to happen, and how, Nuance and DFKI conducted a human-based usability study to identify the most effective ways to grab a passenger's attention: visual, auditory, or both.
Study participants were placed in a simulated autonomous car environment in a variety of situations, such as reading, listening to music and writing an email. The autonomous system alerted them through vibration (haptic), visual and auditory cues to see which of the senses prompted the fastest response to retake the wheel.
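The comparison at the heart of the study, measuring reaction times per alert modality and ranking the cues by average response speed, can be sketched as below. The modality names are taken from the article, but the timing values, sample size and noise model are purely illustrative assumptions, not the study's actual data.

```python
import random
import statistics

# Deterministic seed so the illustrative numbers are reproducible.
random.seed(0)

def simulate_trials(assumed_mean_s: float, n: int = 30) -> list[float]:
    """Draw n noisy reaction times (seconds) around an assumed mean.

    The means and the 0.3 s spread are hypothetical placeholders,
    chosen only to mirror the study's reported ordering (auditory
    fastest, visual slowest); a floor keeps samples physically plausible.
    """
    return [max(0.2, random.gauss(assumed_mean_s, 0.3)) for _ in range(n)]

# One list of trial reaction times per alert modality from the article.
trials = {
    "auditory": simulate_trials(1.8),
    "haptic": simulate_trials(2.4),
    "visual": simulate_trials(2.9),
}

# Rank modalities by mean reaction time, fastest first.
ranking = sorted(trials, key=lambda m: statistics.mean(trials[m]))
for modality in ranking:
    print(f"{modality}: {statistics.mean(trials[modality]):.2f} s mean reaction time")
```

With these assumed parameters the ranking comes out auditory, then haptic, then visual, matching the pattern the article reports; swapping in real measurements would only change the `trials` dictionary.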
Prof. Dr. Wolfgang Wahlster, CEO of DFKI, writes:
“Cognitive and conversational AI are the key technologies driving the second wave of digitalisation, that is based on deep machine understanding of digital data.”
The study found that reaction time was lowest when the driver was engaged in a listening activity, such as listening to an audiobook or music.
Drivers trusted audible and haptic responses from the automotive assistant more than visual cues alone.
The study also showed that, independent of the current driver activity, sound was considered more pleasant and effective than visual cues, leading to faster reactions than vibration or haptic alerts alone.
The DFKI study complements a recent survey conducted by Nuance among 400 drivers in the US and the UK, looking at the activities drivers plan to do as passengers in an autonomous car.
If alone on a longer trip, respondents said their top five in-car activities would be listening to the radio (64%), relaxing (63%), talking on the phone (42%), browsing the Internet (42%) and messaging (36%), all representing a combination of visual, auditory and haptic tasks.
“We will continue to push boundaries on the conversation between humans and smart environments, and ultimately bring to market the next generation of cars, bots, assistants and smart objects that simply make everyday life better and safer,” said Wahlster.