#323 Wearable neurotechnology for inferring the driver's attention for assistive driving


Principal Investigator
Pulkit Grover
Status
Completed
Start Date
July 1, 2020
End Date
Aug. 31, 2021
Research Type
Applied
Grant Type
Research
Grant Program
FAST Act - Mobility National (2016 - 2022)
Grant Cycle
2020 Mobility21 UTC
Visibility
Public

Abstract

Inferring a driver's attentive state and visual focus will be critical in future transport systems. This project will develop wearable technology and AI algorithms to infer the location and degree of a driver's visual attention from Electroencephalography (EEG) electrodes placed on the driver's head while they drive (in real-world and simulated VR environments). This attentional signal can be used, in conjunction with vehicular sensors, for automated decisions at critical moments.
Description
The scarcest resource in the 21st century is our attention. When it comes to traffic, the driver's attention is life critical. Today, this attention is often split across a plethora of attention-seeking devices that are constantly beeping, flashing, and buzzing. The result is reduced attention on the road and an increasing number of accidents (Beanland et al. 2012). Consequently, inferring the level of a driver's attention, and the visual and auditory focus of that attention, is critical in non- and semi-automated transportation systems. The goal of the proposed work is to translate foundational results of a novel technology -- Ultra Resolution Electroencephalography (UR-EEG) -- developed collaboratively by CMU engineers (the PIs) and neuroscientists, into deployable and comfortable wearable systems that provide the neural signals most relevant to inferring and quantifying the driver's attention.

Is decoding the level and location of visual and auditory attention possible through UR-EEG? Our ongoing preliminary work, relying on the widely used Posner attentional paradigm (Posner, Snyder, and Davidson 1980; Posner 1980), shows that the visual attention of the user can be decoded with high reliability using UR-EEG. Complementary results using conventional EEG systems in the literature (see (Astrand, Wardak, and Ben Hamed 2014) for a survey) substantiate the observation that visual and auditory attention can indeed be decoded reliably (we note that, even so, there is a significant boost with UR-EEG). The challenge, now, is technological rather than scientific: can a comfortable, wearable, long-term solution be developed and reliably deployed that can infer visual and auditory attention?

The proposed work will develop wearable technology and associated AI algorithms to infer the location and degree of a driver's visual attention using UR-EEG. UR-EEG has been developed collaboratively by CMU engineers (the PIs) and neuroscientists (Robinson et al. 2017; Haigh et al. 2018; Krishnan, Kumar, et al. 2018; P. Grover and P. Venkatesh 2017; P. Grover 2016). UR-EEG noninvasively places the densest arrays of electrodes at chosen locations on the user's scalp, and uses sophisticated data analytics to arrive at the highest spatiotemporal resolution and the most reliable neural inferences of any noninvasive neural sensing modality. However, placement of the "wet" electrodes used in today's high signal-to-noise ratio (SNR) EEG systems is done by trained technicians and requires 30-60 minutes of installation (even at low densities). "Dry" electrode systems, which require no gel interface with the skin, offer poor SNR. To bridge this gap, our prior work has developed novel materials that enable fast installation of high-SNR EEG systems. The proposed work will incorporate these materials into a compact, comfortable, and wearable UR-EEG module, enabling high-reliability attention inference on drivers. The system will be rigorously tested with drivers immersed in virtual reality driving simulations, in a labspace set up specifically for such a simulation.

The main tasks of the proposed work are:
Task 1: Development of a comfortable, wearable UR-EEG module that can be worn for long-term use, spanning several hours, with no maintenance during use.

The work in this task builds on our team's breakthrough success in improving the wearability of EEG in general, and UR-EEG in particular. It takes the first steps towards addressing the critical difficulties in bringing this science into practice. We have developed a novel conductive sponge electrode (Krishnan et al. 2018) that enables low setup time and long-term recordings. This sponge interfaces with the scalp directly, enabling a comfortable wearable system. Being conductive, it can provide neural signals even when dry. However, it is most conductive, with the lowest-impedance electrode-scalp interface (and hence the highest-SNR signals), when it is wet. To attain these high SNRs, we have also designed a mechanism that automatically delivers a small amount of saline to the sponge when high impedance is detected (Krishnan et al. 2019). This enables quick installation, since the system needs no preparation and no gel needs to be applied. It also enables long-term use, because the system is automatically rehydrated when it runs dry.
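The rehydration mechanism can be pictured as a simple feedback loop: a drying sponge manifests as rising electrode-scalp impedance, which triggers a small saline dose. A minimal sketch follows; the threshold, dose size, and function names are illustrative assumptions, not the team's actual hardware design.

```python
# Hypothetical sketch of an impedance-monitored rehydration loop.
# Assumption: a drying sponge shows up as RISING electrode-scalp impedance,
# so saline is dispensed when impedance exceeds a calibrated threshold.

IMPEDANCE_THRESHOLD_KOHM = 50.0  # assumed cutoff (kOhm) indicating a drying sponge
SALINE_DOSE_ML = 0.1             # assumed per-dose saline volume (mL)

def maintain_electrode(read_impedance_kohm, dispense_saline_ml, samples):
    """Poll electrode-scalp impedance; dispense one saline dose whenever it
    exceeds the threshold. Returns a log of (impedance, dosed) pairs."""
    log = []
    for _ in range(samples):
        z = read_impedance_kohm()          # hypothetical hardware read-out
        dosed = z > IMPEDANCE_THRESHOLD_KOHM
        if dosed:
            dispense_saline_ml(SALINE_DOSE_ML)  # rehydrate the sponge
        log.append((z, dosed))
    return log
```

In a deployed system the threshold would be calibrated per electrode and per user, and the poll rate set slow enough not to interfere with the EEG recording itself.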

In addition, we have obtained modular constructions that enable UR-EEG with a variable density of electrodes to be applied to the scalp in a matter of minutes (Weigle et al. 2019). Lastly, we have designed techniques to obtain small form-factor, low-power, high-resolution recordings (Grover et al. 2015) (protected by provisional patent; application submitted for full patent).

Task 2: Rigorous testing of the module in laboratory conditions for inferring spatial visual and auditory attention, and the degree of attention, while the driver is immersed in a simulated driving VR environment, with low latency.  Key questions that this will answer are: 
a) Is the driver attentive? (quantify the level of attention)
b) How often does the driver lose attention?
c) What visual and auditory signals is the driver paying attention to? (visual/auditory attention localization)
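To make question (a) concrete, a classical proxy for visual attention in the EEG literature is posterior alpha-band (8-12 Hz) power, which rises as attention disengages. The sketch below is an illustrative baseline only, not the project's actual decoder; the sampling rate, window size, and threshold are assumptions for illustration.

```python
# Illustrative alpha-power attention proxy (not the project's decoder).
# Input: a (channels x samples) EEG window; higher alpha power is taken as
# a sign of attentional disengagement.
import numpy as np

FS = 250          # assumed sampling rate (Hz)
ALPHA = (8, 12)   # alpha band (Hz)

def alpha_power(eeg_window):
    """Mean alpha-band power of a (channels x samples) EEG window."""
    freqs = np.fft.rfftfreq(eeg_window.shape[1], d=1.0 / FS)
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2
    band = (freqs >= ALPHA[0]) & (freqs <= ALPHA[1])
    return spectrum[:, band].mean()

def is_inattentive(eeg_window, threshold):
    """Flag a window as inattentive when alpha power exceeds a per-user
    calibrated threshold."""
    return alpha_power(eeg_window) > threshold
```

A one-second window at 250 Hz keeps the end-to-end decision latency comfortably within the low-latency regime targeted here; the threshold would be calibrated per user during a baseline session.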

This simulation will be set up in PI Grover's new BioWaveSensing Labspace in Hamerschlag Hall. The setup will be called the "NeuroDriving Lab", and will be advertised across CMU. An Oculus VR system will be used, with existing driving-simulator apps that model distractions (e.g., the "It Can Wait" 360 VR driving simulator from the US Automobile Association and AT&T, designed for Oculus VR). As an additional benefit, labs from across CMU will be given access to the lab to further boost Mobility21 projects that can benefit from incorporating driver neural-state monitoring.

Why use neural attention signals? Attention is inherently a neural signal. Complementary techniques, such as driver videos/images (for monitoring microexpressions) or eye tracking, measure signals that may be correlated with attention but are not the ground truth, and as such are liable to erroneous and/or biased inferences (depending on the dataset on which the model for these decisions was trained). Directly monitoring neural signals therefore holds promise.

How attentional signals will be used. This attentional signal can be used, in conjunction with vehicular sensors, for automated decisions at critical moments, or simply to warn drivers to redirect their attention to the road. Even when drivers are watching the road, if they are not paying attention to an important event/object (e.g., vehicles in front or behind as they change lanes), a warning system or an automated vehicle response can be engaged.

-----

Please note that an additional document, an image of the overall vision of how the system will be deployed, is included as well (please see “image.png”).

-----
References:

Astrand, Elaine, Claire Wardak, and Suliann Ben Hamed. 2014. “Selective Visual Attention to Drive Cognitive Brain–machine Interfaces: From Concepts to Neurofeedback and Rehabilitation Applications.” Frontiers in Systems Neuroscience 8. https://doi.org/10.3389/fnsys.2014.00144.
Beanland, V., M. Fitzharris, K. L. Young, and M. G. Lenné. 2012. "Using In-Depth Crash Data to Assess the Role of Driver Inattention and Driver Distraction in Crashes." Injury Prevention. https://doi.org/10.1136/injuryprev-2012-040580g.6.
Grover, P. 2016. “Fundamental Limits on Source-Localization Accuracy of EEG-Based Neural Sensing.” In 2016 IEEE International Symposium on Information Theory (ISIT), 1794–98.
Grover, Pulkit, Jeffrey A. Weldon, Shawn K. Kelly, Praveen Venkatesh, and Haewon Jeong. 2015. “A Novel Information-Theoretic Sensing Strategy for Ultra High-Density EEG Sensing.” In 53rd Annual Allerton Conference on Communication, Control, and Computing.
Grover, P., and P. Venkatesh. 2017. “An Information-Theoretic View of EEG Sensing.” Proceedings of the IEEE 105 (2): 367–84.
Haigh, Sarah M., Amanda K. Robinson, Pulkit Grover, and Marlene Behrmann. 2018. “Differentiation of Types of Visual Agnosia Using EEG.” Vision (Basel, Switzerland) 2 (4). https://doi.org/10.3390/vision2040044.
Krishnan, Ashwati, Ritesh Kumar, Praveen Venkatesh, Shawn Kelly, and Pulkit Grover. 2018. “Low-Cost Carbon Fiber-Based Conductive Silicone Sponge EEG Electrodes.” Conference Proceedings: ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Conference 2018 (July): 1287–90.
Krishnan, Ashwati, Kumar Ritesh, Arnelle Etienne, Amanda Robinson, Shawn K. Kelly, Marlene Behrmann, Mike Tarr, and Pulkit Grover. 2018. “Challenges and Opportunities in Instrumentation and Use of High-Density EEG for Underserved Regions.” EAI International Conference on Innovations and Interdisciplinary Solutions for Underserved Areas.
Posner, M. I. 1980. “Orienting of Attention.” The Quarterly Journal of Experimental Psychology 32 (1): 3–25.
Posner, M. I., C. R. Snyder, and B. J. Davidson. 1980. “Attention and the Detection of Signals.” Journal of Experimental Psychology 109 (2): 160–74.
Robinson, Amanda K., Praveen Venkatesh, Matthew J. Boring, Michael J. Tarr, Pulkit Grover, and Marlene Behrmann. 2017. “Very High Density EEG Elucidates Spatiotemporal Aspects of Early Visual Processing.” Nature Scientific Reports. 
Weigle, H., L. Widmayer, T. Abel, S. K. Kelly, P. Grover, and A. Krishnan. 2019. "Novel Scalp EEG Solutions for Rapid Application, Long-Term EEG Recordings with Simultaneous sEEG Electrodes." American Epilepsy Society, Baltimore, December 2019.
Timeline
Overview of timeline for tasks and rationale for the timeline: 

Both tasks will be carried out simultaneously. As components of the wearable module are developed, they will be integrated into the overall system. This will also allow a systematic comparison of commercially available systems with ours, enabling us to identify and fine-tune any component that lowers the signal-to-noise ratio. The team has followed this practice from the start, and it has helped us debug the hardware while advancing our novel solutions.

Concrete milestones:
Month 0-6: Wearable UR-EEG module with conductive sponges developed and tested for EEG signals of attention (Posner paradigm).
Month 0-6: NeuroDriving lab setup, with an immersion environment for driving with distractions. Begin recruiting participants.
Month 6-12: Precision Neuroscopics develops a mechanically robust, small form-factor version of the UR-EEG system developed in the lab.
Month 6-12: UR-EEG studies on inferring visual and auditory attention, and quantifying the level of attention, in the NeuroDriving Lab setup.
Deployment Plan
Pathway to deployment and adoption:
This 3-year deployment plan has been designed in partnership with Precision Neuroscopics. 
Year 1: Precision Neuroscopics (PN) develops mechanically robust versions of the UR-EEG systems developed by the research team. Rigorous testing of the UR-EEG module is performed in simulated environments. PN examines and comes to understand regulatory requirements. PN and the team apply for an NSF STTR Phase 2 grant (Aug 2020) and an NIH STTR Phase 1 grant (Jan 2021) for further development.
Year 2: Rigorous testing of the improved module in actual cars in controlled environments, with dashcam recordings. Development of a wearable module that provides auditory feedback to the user (e.g., beeps) when they need to pay attention in a chosen direction. Data are collected and shared for analysis and refinement of the UR-EEG system. Precision Neuroscopics obtains the necessary regulatory permissions, and reaches out to industry in the region working on traffic (including Uber/Aurora/Argo/Delphi). Submission of an NSF STTR Phase 2 proposal for military applications in pilot monitoring.
Year 3: Real-life testing in actual cars in normal traffic conditions, with dashcam recordings and user videos for establishing ground truth. 
FIRST PILOT TESTING: The PIs work with CMU to do a pilot deployment in the university shuttle buses. Results obtained are pitched to insurance companies for incorporation into their safe-driver discounts, and to the military for aircraft/helicopters. Precision Neuroscopics works with traffic companies in Pittsburgh to test the product and deploy it more widely.

Expected Accomplishments and Metrics
In the one-year time frame of this project, the following goals will be achieved, with quantified metrics:

1) Fast application (10 minutes or less), long-term usable (3 hours or more), low-maintenance UR-EEG system instrumented and tested in lab.
2) Reliable (>90% accuracy), low-latency (<300 msec) inference of attention obtained in simulated driving conditions. 
3) Development of a UR-EEG driving simulation lab where the rest of the CMU community can experiment for neural inferences while driving, and incorporate their designs.     

Individuals Involved

Email Name Affiliation Role Position
behrmann@andrew.cmu.edu Behrmann, Marlene Carnegie Mellon University Co-PI Faculty - Tenured
pulkit@cmu.edu Grover, Pulkit Carnegie Mellon School of Engineering PI Faculty - Untenured, Tenure Track
ashwatik@andrew.cmu.edu Krishnan, Ashwati Carnegie Mellon School of Engineering Co-PI Faculty - Researcher/Post-Doc
smaciel@andrew.cmu.edu Naro Maciel, Sophia Carnegie Mellon School of Engineering Other Student - PhD
hweigle@andrew.cmu.edu Weigle, Harper Carnegie Mellon School of Engineering Other Student - Undergrad

Budget

Amount of UTC Funds Awarded
$100,000.00
Total Project Budget (from all funding sources)
$280,000.00

Documents

Type Name Uploaded
Data Management Plan Wearable_neurotechnology_for_inferring_the_drivers_attention_for_assistive_driving_for_individuals_with_attentional_disorders.pdf Jan. 1, 2020, 9:59 a.m.
Presentation GroverMobility21Presentation.pptx March 7, 2020, 1:52 a.m.
Progress Report 323_Progress_Report_2020-09-30 Sept. 29, 2020, 6:22 a.m.
Presentation Sevo electrodes in the clinic March 30, 2021, 1:31 p.m.
Progress Report 323_Progress_Report_2021-03-31 March 30, 2021, 1:31 p.m.
Final Report Final_Report_-_323.pdf Sept. 28, 2021, 8:30 a.m.

Match Sources

No match sources!

Partners

Name Type
Precision Neuroscopics LLC Deployment Partner
Children's Hospital of Pittsburgh Deployment Partner