2018 24th International Conference on Pattern Recognition (ICPR)

Abstract

Driver distraction is one of the major causes of road accidents, which can lead to severe physical injuries and deaths. Statistics indicate the need for a reliable driver distraction detection system that can continuously and ubiquitously monitor the driver's distraction and alert the driver before a roadway disaster can occur. Therefore, early detection of driver distraction can help decrease the cost of roadway disasters. Physiological signals such as electrocardiogram (ECG) and electroencephalogram (EEG) have been extensively used for driver state monitoring at the physiological level. More recently, galvanic skin response (GSR) analysis, which is a minimally intrusive technology, has been investigated to develop monitoring systems that alert drivers early. In this paper, we propose a novel detection system that characterizes the impact of the secondary tasks of calling and texting on the driver, based on spectrogram and Mel cepstrum representations of the GSR signals and convolutional neural network (CNN) modeling. The proposed detection system decomposes the GSR signals into a 2D time-frequency representation to decode spectro-temporal patterns. We further isolate the spectral envelope and then extract Mel frequency filter bank coefficients in time and frequency. Our proposed deep CNN structure is designed to automatically learn reliable discriminative visual patterns in the 2D spectrogram and Mel cepstrum space. As they pass through the layers of the CNN, the low-level features are transformed into high-level features representing the impact of the secondary tasks. This process replaces the traditional ad hoc hand-crafted feature extraction used when working with high-dimensional time-series datasets. The classification accuracy of the proposed prediction algorithm is evaluated on a set of GSR signals recorded from 7 driver subjects during naturalistic driving. The experimental results demonstrate that the proposed algorithm achieves a high accuracy of 93.28% in detecting the state of inattention.
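To make the described pipeline concrete, the sketch below shows one way the two stages could be wired together in Python: a 2D time-frequency decomposition of a 1D GSR window (spectrogram plus Mel-cepstral coefficients) followed by a small CNN classifier. The sampling rate, window lengths, number of Mel coefficients, class count, and layer sizes are illustrative assumptions; the paper's abstract does not report these values, and this is not the authors' implementation.

    # Minimal sketch of the pipeline described in the abstract:
    # (1) 2D spectro-temporal features from a 1D GSR signal,
    # (2) a small CNN that classifies the resulting feature images.
    import numpy as np
    from scipy.signal import spectrogram
    import librosa
    import tensorflow as tf

    SR = 32        # assumed GSR sampling rate (Hz); not stated in the abstract
    N_MELS = 20    # assumed number of Mel filter bank coefficients

    def gsr_features(gsr, sr=SR):
        """Return a log-spectrogram and Mel-cepstral image for one GSR window."""
        # 2D time-frequency decomposition of the raw GSR signal
        _, _, spec = spectrogram(gsr, fs=sr, nperseg=64, noverlap=32)
        log_spec = np.log1p(spec)  # compress dynamic range

        # Mel filter bank / cepstral coefficients over the same window
        mfcc = librosa.feature.mfcc(y=gsr.astype(float), sr=sr, n_mfcc=N_MELS,
                                    n_fft=64, hop_length=32)
        return log_spec, mfcc

    def build_cnn(input_shape, n_classes=3):
        """Small CNN mapping a 2D feature image to distraction classes
        (e.g. normal driving / calling / texting)."""
        return tf.keras.Sequential([
            tf.keras.layers.Input(shape=input_shape),
            tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D(2),
            tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D(2),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(n_classes, activation="softmax"),
        ])

In this reading, each GSR window yields two feature images that the CNN consumes in place of hand-crafted features, with successive convolutional layers learning progressively higher-level representations of the secondary-task impact, as the abstract describes.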