
Analysis of Temporal Relationships between Eye Gaze and Peripheral Vehicle Behavior for Detecting Driver Distraction

DOI: 10.1155/2013/285927


Abstract:

A car driver’s cognitive distraction is a major factor behind car accidents. A driver’s state of mind is subconsciously revealed in how he or she reacts to external stimuli. We regard the visual event that occurs in front of the driver when a peripheral vehicle overtakes the driver’s vehicle as such a stimulus, and we focus on the temporal relationship between the driver’s eye gaze and the peripheral vehicle’s behavior. Our analysis showed that this temporal relationship depends on the driver’s state. In particular, we confirmed that the gaze toward the stimulus occurs later under a distracted state, induced by a music retrieval task using an automatic speech recognition system, than under a neutral state of driving with no secondary cognitive task. This temporal feature can contribute to detecting cognitive distraction automatically: a detector based on a Bayesian framework using this feature achieves better accuracy than one based on the percentage road center method (both approaches are sketched in code below).

1. Introduction

Driver distraction is a diversion of attention away from activities critical for safe driving toward a competing activity [1], and it is a large risk factor for accidents [2]. Note that distraction differs from fatigue [3], which is defined as a state that disables one from continuing the activity [4]. Many researchers have developed driver distraction monitoring systems that maintain safety while driving by considering different types and levels of distraction [3]. The National Highway Traffic Safety Administration (NHTSA) classifies distraction, from the viewpoint of the driver’s functionality, into (1) cognitive distraction, (2) visual distraction, (3) auditory distraction, and (4) biomechanical distraction [2]. Cognitive distraction can be considered an internal state of the driver and is therefore difficult to sense from the outside; the other distractions are external factors that disturb the driving activity and can be observed more easily. We focus on cognitive distraction and seek novel findings for detecting it automatically.

Over the past few decades, a number of methods for detecting distraction have been proposed [3]. They fall into five categories based on the type of measure used: (1) subjective report measures, (2) driver biological measures, (3) driving performance measures, (4) driver physical measures, and (5) hybrid measures. Among these, subjective report measures and driver biological measures are not suitable under real driving conditions. Driving performance measures as indicated by steering, braking behavior,
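For context on the baseline, the percentage road center (PRC) method mentioned in the abstract summarizes how concentrated the driver’s gaze is on the forward roadway: the share of gaze samples that fall inside a fixed road-center region over an analysis window. The sketch below is a minimal illustration under assumed conventions, not the authors’ implementation; the circular region, its 8-degree radius, and the yaw/pitch gaze representation are all assumptions.

    import numpy as np

    def percent_road_center(yaw_deg, pitch_deg, radius_deg=8.0):
        """Percentage of gaze samples inside a circular road-center region.

        yaw_deg, pitch_deg: per-sample gaze angles in degrees, measured
        from the straight-ahead road center (assumed representation).
        radius_deg: assumed region radius.
        """
        yaw = np.asarray(yaw_deg, dtype=float)
        pitch = np.asarray(pitch_deg, dtype=float)
        # Angular distance of each sample from the road center.
        inside = np.hypot(yaw, pitch) <= radius_deg
        return 100.0 * inside.mean()

    # Two of three samples lie within 8 degrees of road center -> ~66.7%.
    print(percent_road_center([1.0, 9.5, 2.2], [0.5, 3.0, -1.0]))

A threshold on PRC over a sliding window is one simple way to turn this statistic into a distraction flag.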
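The paper characterizes its detector only as “a Bayesian framework” over the gaze-timing feature. One plausible minimal form, sketched below, puts Gaussian class-conditional densities on the gaze-reaction latency (the delay from the overtaking event to the driver’s gaze toward it) and makes a maximum a posteriori decision. The model structure and all parameter values here are illustrative assumptions, not the paper’s trained detector.

    import math
    from dataclasses import dataclass

    @dataclass
    class GazeTimingModel:
        """Gaussian model of gaze-reaction latency for one driver state."""
        mean_s: float  # mean latency (seconds) from overtaking event to gaze
        var_s2: float  # latency variance (seconds^2)
        prior: float   # prior probability of this state

        def log_score(self, latency_s: float) -> float:
            # log prior + Gaussian log-likelihood of the observed latency
            return (math.log(self.prior)
                    - 0.5 * math.log(2.0 * math.pi * self.var_s2)
                    - (latency_s - self.mean_s) ** 2 / (2.0 * self.var_s2))

    def is_distracted(latency_s, neutral, distracted):
        """MAP decision: flag distraction when its posterior score is higher."""
        return distracted.log_score(latency_s) > neutral.log_score(latency_s)

    # Invented parameters reflecting the paper's finding that distracted
    # drivers look toward the overtaking vehicle later than neutral drivers.
    neutral = GazeTimingModel(mean_s=0.6, var_s2=0.09, prior=0.5)
    distracted = GazeTimingModel(mean_s=1.2, var_s2=0.16, prior=0.5)
    print(is_distracted(1.1, neutral, distracted))  # True: nearer distracted mean

In practice the means, variances, and priors would be estimated from latencies collected under the neutral condition and under the speech-based music retrieval task.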

References

[1]  M. A. Regan, J. D. Lee, and K. L. Young, “Defining driver distraction,” in Driver Distraction: Theory, Effects, and Mitigation, chapter 4, pp. 42–54, CRC, 2008.
[2]  T. A. Ranney, W. R. Garrott, and M. J. Goodman, NHTSA Driver Distraction Research: Past, Present, and Future, National Highway Traffic Safety Administration, 2001.
[3]  Y. Dong, Z. Hu, K. Uchimura, and N. Murayama, “Driver inattention monitoring system for intelligent vehicles: a review,” IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 2, pp. 596–614, 2011.
[4]  H. Croo, M. Bandmann, G. Mackay, K. Rumar, and P. Vollenhoven, The Role of Driver Fatigue in Commercial Road Transport Crashes, 2001.
[5]  Y. Liang and J. D. Lee, “Combining cognitive and visual distraction: less than the sum of its parts,” Accident Analysis and Prevention, vol. 42, no. 3, pp. 881–890, 2010.
[6]  ISO 15007-1:2002, “Road vehicles—measurement of driver visual behavior with respect to transport information and control systems—part 1: definitions and parameters,” 2002.
[7]  ISO/TS 15007-2:2001, “Road vehicles—measurement of driver visual behavior with respect to transport information and control systems—part 2: equipment and procedures,” 2001.
[8]  E. Johansson, J. C. Engström, C. Cherri et al., Review of Existing Techniques and Metrics for IVIS and ADAS Assessment, Adaptive Integrated Driver-Vehicle Interface, 2004.
[9]  L. Angell, J. Auflick, A. Austria et al., Driver Workload Metrics Project—Task 2 Final Report, U.S. Department of Transportation, National Highway Traffic Safety Administration, 2006.
[10]  J. L. Harbluk and Y. I. Noy, The Impact of Cognitive Distraction on Driver Visual Behavior and Vehicle Control, Ergonomics Division, Road Safety Directorate and Motor Vehicle Regulation Directorate, Ontario, Canada, 2002.
[11]  J. G. May, R. S. Kennedy, M. C. Williams, W. P. Dunlap, and J. R. Brannan, “Eye movement indices of mental workload,” Acta Psychologica, vol. 75, no. 1, pp. 75–89, 1990.
[12]  M. Miyaji, H. Kawanaka, and K. Oguri, “Driver's cognitive distraction detection using physiological features by the AdaBoost,” in Proceedings of the 12th International IEEE Conference on Intelligent Transportation Systems (ITSC '09), pp. 90–95, October 2009.
[13]  K. Kircher, C. Ahlstrom, and A. Kircher, “Comparison of two eye-gaze based real-time driver distraction detection algorithms in a small-scale field operational test,” in Proceedings of the 5th International Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, pp. 16–23, 2009.
[14]  R. Ishii and Y. I. Nakano, “Estimating user’s conversational engagement based on gaze behaviors,” in Intelligent Virtual Agents, Lecture Notes in Computer Science, vol. 5208, pp. 200–207, 2008.
[15]  T. Hirayama, J. Dodane, H. Kawashima, and T. Matsuyama, “Estimates of user interest using timing structures between proactive content-display updates and eye movements,” IEICE Transactions on Information and Systems, vol. E93-D, no. 6, pp. 1470–1478, 2010.
[16]  R. Yonetani, H. Kawashima, T. Hirayama, and T. Matsuyama, “Mental focus analysis using the spatio-temporal correlation between visual saliency and eye movements,” Journal of Information Processing, vol. 20, no. 1, pp. 267–276, 2012.
[17]  T. Hirayama, Y. Sumi, T. Kawahara, and T. Matsuyama, “Info-concierge: proactive multi-modal interaction based on mind probing,” in Proceedings of the Asia Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC '11), 2011.
[18]  N. Merat and A. H. Jamson, “Multisensory signal detection: a tool for assessing driver workload during IVIS management,” in Proceedings of the 4th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, 2007.
[19]  N. Merat, E. Johansson, J. A. Engström, E. Chin, F. Nathan, and T. W. Victor, Specification of a Secondary Task to Be Used in Safety Assessment of IVIS, Adaptive Integrated Driver-Vehicle Interface, 2007.
[20]  G. L. Rupp, Performance Metrics for Assessing Driver Distraction: The Quest for Improved Road Safety, SAE International, 2010.
[21]  L. Hsieh, R. Young, and S. Seaman, “Development of the enhanced peripheral detection task: a surrogate test for driver distraction,” SAE International Journal of Passenger Cars, vol. 5, no. 1, pp. 317–325, 2012.
[22]  M. F. Land and D. N. Lee, “Where we look when we steer,” Nature, vol. 369, no. 6483, pp. 742–744, 1994.
[23]  N. Apostoloff and A. Zelinsky, “Vision in and out of vehicles: integrated driver and road scene monitoring,” International Journal of Robotics Research, vol. 23, no. 4-5, pp. 513–538, 2004.
[24]  L. Fletcher and A. Zelinsky, “Driver inattention detection based on eye gaze-road event correlation,” International Journal of Robotics Research, vol. 28, no. 6, pp. 774–801, 2009.
[25]  M. I. Posner, “Orienting of attention,” The Quarterly Journal of Experimental Psychology, vol. 32, no. 1, pp. 3–25, 1980.
[26]  K. Takeda, J. H. L. Hansen, P. Boyraz, L. Malta, C. Miyajima, and H. Abut, “International large-scale vehicle corpora for research on driver behavior on the road,” IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 3, pp. 1–15, 2011.
[27]  S. Hara, C. Miyajima, K. Itou, and K. Takeda, “An online customizable music retrieval system with a spoken dialogue interface,” Journal of the Acoustical Society of America, vol. 120, no. 5, pp. 3378–3379, 2006.
[28]  Y. Li, C. Miyajima, N. Kitaoka, and K. Takeda, “Driving scene retrieval using integrated vehicle motion feature matching,” in Proceedings of the 5th Biennial Workshop on DSP for In-Vehicle Systems, pp. 1–8, 2011.
[29]  J. M. Wolfe and T. S. Horowitz, “What attributes guide the deployment of visual attention and how do they do it?” Nature Reviews Neuroscience, vol. 5, no. 6, pp. 495–501, 2004.
[30]  L. Malta, C. Miyajima, N. Kitaoka, and K. Takeda, “Analysis of real-world driver's frustration,” IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 1, pp. 109–118, 2011.
[31]  M. I. Posner and Y. Cohen, “Components of visual orienting,” in Attention and Performance X, H. Bouma and D. G. Bouwhuis, Eds., pp. 531–556, Lawrence Erlbaum Associates, Hillsdale, NJ, USA, 1984.
[32]  Alliance of Automobile Manufacturers, Statement of Principles, Criteria and Verification Procedures on Driver Interactions with Advanced In-Vehicle Information and Communication Systems, 2006.
[33]  A. Doshi and M. Trivedi, “Investigating the relationships between gaze patterns, dynamic vehicle surround analysis, and driver intentions,” in Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 887–892, June 2009.
