Machine Learning and End-to-End Deep Learning for Monitoring Driver Distractions From Physiological and Visual Signals
It is only a matter of time until autonomous vehicles become ubiquitous; however, human driving supervision will remain a necessity for decades. To assess the driver's ability to take control of the vehicle in critical scenarios, driver distractions can be monitored using wearable sensors or sensors embedded in the vehicle, such as video cameras. Which types of driving distraction can be sensed with which sensors is an open research question that this study attempts to answer. This study compared data from physiological sensors (palm electrodermal activity (pEDA), heart rate and breathing rate) and visual sensors (eye tracking, pupil diameter, nasal EDA (nEDA), emotional activation and facial action units (AUs)) for the detection of four types of distraction. The dataset was collected in a previous driving-simulation study. Statistical tests showed that the most informative feature/modality for detecting driver distraction depends on the type of distraction, with emotional activation and AUs being the most promising. An experimental comparison of seven classical machine learning (ML) and seven end-to-end deep learning (DL) methods, evaluated on a separate test set of 10 subjects, showed that when classifying windows as distracted or not distracted, the highest F1-score of 79% was achieved by the extreme gradient boosting (XGB) classifier using 60-second windows of AUs as input. When classifying complete driving sessions, XGB's F1-score was 94%. The best-performing DL model was a spectro-temporal ResNet, which achieved an F1-score of 75% when classifying segments and 87% when classifying complete driving sessions. Finally, this study identified and discussed problems, such as label jitter, scenario overfitting and unsatisfactory generalization performance, that may adversely affect related ML approaches.
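To make the window-level setup from the abstract concrete, the following is a minimal sketch of classifying fixed-length feature windows as distracted or not. All data here is synthetic, scikit-learn's `GradientBoostingClassifier` stands in for XGBoost, and the feature count per window is an assumption, not the paper's actual pipeline.

```python
# Sketch: window-level "distracted vs. not distracted" classification,
# loosely following the abstract's setup (60-second windows of facial
# action unit (AU) features fed to a gradient-boosting classifier).
# Synthetic data; GradientBoostingClassifier is a stand-in for XGBoost.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

n_windows, n_au_features = 400, 17  # assumed: 17 AU intensities per window
X = rng.normal(size=(n_windows, n_au_features))
y = rng.integers(0, 2, size=n_windows)  # 1 = distracted, 0 = not distracted
X[y == 1] += 0.8                        # shift class 1 so it is learnable

# The paper evaluates on held-out subjects; a plain split is used here.
X_train, X_test = X[:300], X[300:]
y_train, y_test = y[:300], y[300:]

clf = GradientBoostingClassifier().fit(X_train, y_train)
f1 = f1_score(y_test, clf.predict(X_test))
print(f"window-level F1: {f1:.2f}")
```

A session-level decision (the 94% / 87% figures) can then be obtained by aggregating the per-window predictions of a driving session, e.g. by majority vote.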
| Field | Value |
|---|---|
| Document Type | Article |
| Language | English |
| Authors | Martin Gjoreski, Matjaž Gams, Mitja Luštrek, Pelin Genc, Jens-U. Garbas, Teena Hassan |
| Parent Title (English) | IEEE Access |
| Volume | 8 |
| Number of Pages | 14 |
| First Page | 70590 |
| Last Page | 70603 |
| ISSN | 2169-3536 |
| DOI | https://doi.org/10.1109/ACCESS.2020.2986810 |
| Publisher | IEEE |
| Date of First Publication | 2020/04/09 |
| Keywords | Machine learning; deep learning; driver distraction; facial expressions; sensors |
| Dewey Decimal Classification (DDC) | 0 Computer science, information & general works / 00 Computer science, knowledge & systems / 006 Special computer methods |
| Entry in this Database | 2023/04/13 |