OALib Journal
ISSN: 2333-9721
Human’s Overtrust in and Overreliance on Advanced Driver Assistance Systems: A Theoretical Framework

DOI: 10.1155/2013/951762


Abstract:

This paper gives a theoretical framework to describe, analyze, and evaluate a driver’s overtrust in and overreliance on ADAS. Although “overtrust” and “overreliance” are often used as if they were synonyms, this paper differentiates the two notions rigorously. To this end, two aspects are introduced: (1) a situation diagnostic aspect and (2) an action selection aspect. The first aspect describes overtrust and has three axes: (1-1) dimension of trust, (1-2) target object, and (1-3) chances of observation. The second aspect describes overreliance on the ADAS and has three further axes: (2-1) type of action selected, (2-2) benefits expected, and (2-3) time allowance for human intervention.

1. Introduction

Driving a car requires a continuous process of perception, cognition, action selection, and action implementation. Various functions are implemented in an advanced driver assistance system (ADAS) to assist a human in driving a car in a dynamic environment. Such functions, sometimes arranged in a multilayered manner, include (a) perception enhancement, which helps the driver perceive the traffic environment around his/her vehicle; (b) attention arousal, which encourages the driver to pay attention to potential risks around his/her vehicle; (c) warning, which encourages the driver to take a specific action to avoid an incident or accident; and (d) automatic safety control, which is activated when the driver takes no action even after being warned or when the driver’s control action seems insufficient [1].

The first two functions, (a) and (b), help the driver understand the situation. Understanding of the current situation determines what action needs to be taken [2]. Once a situation diagnostic decision is made, the action selection decision is usually straightforward, as recognition-primed decision making research has suggested [3]. However, the driver may sometimes have difficulty with the action selection decision. Function (c) is meant to help the driver in such a circumstance.

Note that any ADAS that uses only the three functions (a)–(c) is completely compatible with the human-centered automation principle [4], in which the human is assumed to have the final authority over the automation. Suppose an ADAS contains the fourth function, (d). Then the ADAS may not always be fully compatible with the human-centered automation principle, because the system can implement an action that is not explicitly ordered by the driver. Some automatic safety control functions have already been implemented in the real world. Typical examples
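The layered arrangement of functions (a)–(d) can be sketched as an escalating cascade. In the minimal sketch below, the risk thresholds and the `risk` score itself are invented for illustration (the paper does not define them); the point it captures is that layers (a)–(c) only inform or warn the driver, while layer (d) alone acts without an explicit driver order, which is why only (d) can conflict with the human-centered automation principle.

```python
def assist(risk: float, driver_acted: bool) -> list[str]:
    """Return the ADAS assistance layers engaged at a given (hypothetical) risk level."""
    engaged = []
    if risk > 0.2:
        engaged.append("(a) perception enhancement")  # help perceive the environment
    if risk > 0.4:
        engaged.append("(b) attention arousal")       # draw attention to potential risks
    if risk > 0.6:
        engaged.append("(c) warning")                 # urge a specific avoidance action
    # Layer (d) fires only when the driver does not respond. Unlike (a)-(c),
    # it implements an action the driver never ordered, so the human no
    # longer holds final authority at this layer.
    if risk > 0.8 and not driver_acted:
        engaged.append("(d) automatic safety control")
    return engaged
```

With this sketch, a responsive driver at high risk receives only layers (a)–(c), whereas an unresponsive driver additionally triggers (d).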
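The framework’s two aspects and their six axes can be read as a small taxonomy. The following sketch encodes that structure only; the field names paraphrase the axes listed in the abstract, and the value types (plain strings, a time in seconds) are assumptions for illustration, not definitions from the paper.

```python
from dataclasses import dataclass

@dataclass
class SituationDiagnosis:
    """Aspect (1): the situation diagnostic aspect, used to describe overtrust."""
    trust_dimension: str      # axis (1-1): which dimension of trust is involved
    target_object: str        # axis (1-2): the object the trust is placed in
    observation_chances: str  # axis (1-3): chances to observe the object's behavior

@dataclass
class ActionSelection:
    """Aspect (2): the action selection aspect, used to describe overreliance."""
    action_type: str          # axis (2-1): type of action selected
    expected_benefit: str     # axis (2-2): benefits the driver expects
    time_allowance: float     # axis (2-3): time allowance for human intervention (s)

@dataclass
class DriverAssessment:
    """Overtrust is a property of diagnosis; overreliance, of action selection."""
    diagnosis: SituationDiagnosis
    selection: ActionSelection
```

Separating the two records mirrors the paper’s central claim: overtrust (a diagnostic error) and overreliance (an action selection error) are distinct notions and should be analyzed along distinct axes.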

References

[1]  T. Inagaki, “Smart collaboration between humans and machines based on mutual understanding,” Annual Reviews in Control, vol. 32, no. 2, pp. 253–261, 2008.
[2]  E. Hollnagel and A. Bye, “Principles for modelling function allocation,” International Journal of Human Computer Studies, vol. 52, no. 2, pp. 253–265, 2000.
[3]  G. Klein, “A recognition-primed decision (RPD) model of rapid decision making,” in Decision Making in Action, G. Klein, J. Orasanu, and R. Calderwood, Eds., pp. 138–147, Ablex, 1993.
[4]  C. E. Billings, Aviation Automation—The Search for a Human-Centered Approach, LEA, 1997.
[5]  T. Inagaki, “Design of human-machine interactions in light of domain-dependence of human-centered automation,” Cognition, Technology & Work, vol. 8, no. 3, pp. 161–167, 2006.
[6]  M. R. Endsley and E. O. Kiris, “The out-of-the-loop performance problem and level of control in automation,” Human Factors, vol. 37, no. 2, pp. 381–394, 1995.
[7]  N. B. Sarter and D. D. Woods, “How in the world did we ever get into that mode? Mode error and awareness in supervisory control,” Human Factors, vol. 37, no. 1, pp. 5–19, 1995.
[8]  R. Parasuraman and V. Riley, “Humans and automation: use, misuse, disuse, abuse,” Human Factors, vol. 39, no. 2, pp. 230–253, 1997.
[9]  N. B. Sarter, D. D. Woods, and C. E. Billings, “Automation surprises,” in Handbook of Human Factors and Ergonomics, G. Salvendy, Ed., pp. 1926–1943, John Wiley & Sons, 2nd edition, 1997.
[10]  T. Inagaki and J. Stahre, “Human supervision and control in engineering and music: similarities, dissimilarities and their implications,” Proceedings of the IEEE, vol. 92, no. 4, pp. 589–600, 2004.
[11]  R. Parasuraman, R. Molloy, and I. L. Singh, “Performance consequences of automation-induced ‘complacency’,” International Journal of Aviation Psychology, vol. 3, no. 1, pp. 1–23, 1993.
[12]  K. L. Mosier, L. J. Skitka, S. Heers, and M. Burdick, “Automation bias: decision making and performance in high-tech cockpits,” International Journal of Aviation Psychology, vol. 8, no. 1, pp. 47–63, 1998.
[13]  J. Meyer, “Effects of warning validity and proximity on responses to warnings,” Human Factors, vol. 43, no. 4, pp. 563–572, 2001.
[14]  T. B. Sheridan and R. Parasuraman, “Human-automation interaction,” in Reviews of Human Factors and Ergonomics, R. S. Nickerson, Ed., vol. 1, pp. 89–129, HFES, 2005.
[15]  Complacency, Merriam-Webster Online Dictionary, 2010.
[16]  N. Moray and T. Inagaki, “Attention and complacency,” Theoretical Issues in Ergonomics Science, vol. 1, no. 4, pp. 354–365, 2001.
[17]  J. Lee and N. Moray, “Trust, control strategies and allocation of function in human-machine systems,” Ergonomics, vol. 35, no. 10, pp. 1243–1270, 1992.
[18]  M. Itoh, “Toward overtrust-free advanced driver assistance systems,” Cognition, Technology & Work, vol. 14, no. 1, pp. 51–60, 2012.
[19]  T. Inagaki, “New challenges on vehicle automation: human trust in and reliance on adaptive cruise control systems,” in Proceedings of the IEA (CD-ROM), p. 4, 2003.
[20]  G. J. S. Wilde, Target Risk, PDE Publications, Toronto, Canada, 1994.
[21]  OECD Road Transport Research, Behavioral Adaptations to Changes in the Road Transportation System, OECD, Paris, France, 1990.
[22]  M. Weinberger, H. Winner, and H. Bubb, “Adaptive cruise control field operational test—the learning phase,” JSAE Review, vol. 22, pp. 487–494, 2001.
[23]  B. D. Seppelt and J. D. Lee, “Making adaptive cruise control (ACC) limits visible,” International Journal of Human Computer Studies, vol. 65, no. 3, pp. 192–205, 2007.
[24]  D. A. Dickie and L. N. Boyle, “Drivers' understanding of adaptive cruise control limitations,” in Proceedings of the 53rd Human Factors and Ergonomics Society Annual Meeting (HFES '09), pp. 1806–1810, October 2009.
[25]  K. Tanno, M. Kohno, K. Ono et al., “Fatal cardiovascular injuries to the unbelted occupant associated with airbag deployment: two case-reports,” Legal Medicine, vol. 2, no. 4, pp. 227–231, 2000.
[26]  M. Itoh, Y. Fujiwara, and T. Inagaki, “Driver behavioural changes through interactions with an automatic brake system,” Transactions of the Society of Instrument and Control Engineers, vol. 47, no. 11, pp. 512–519, 2011 (Japanese).
[27]  T. Inagaki, “Adaptive automation: sharing and trading of control,” in Handbook of Cognitive Task Design, E. Hollnagel, Ed., pp. 147–169, LEA, 2003.
[28]  M. W. Scerbo, “Theoretical perspectives on adaptive automation,” in Automation and Human Performance, R. Parasuraman and M. Mouloua, Eds., pp. 37–63, LEA, 1996.
[29]  T. Inagaki and T. B. Sheridan, “Authority and responsibility in human-machine systems: probability theoretic validation of machine-initiated trading of authority,” Cognition, Technology & Work, vol. 14, no. 1, pp. 29–37, 2012.
[30]  T. Inagaki, M. Itoh, and Y. Nagai, “Efficacy and acceptance of driver support under possible mismatches between driver's intent and traffic conditions,” in Proceedings of the 50th Annual Meeting of the Human Factors and Ergonomics Society (HFES '06), pp. 280–283, October 2006.
[31]  T. Inagaki, M. Itoh, and Y. Nagai, “Driver support functions under resource-limited situations,” in Proceedings of the 51st Annual Meeting of the Human Factors and Ergonomics Society (HFES '07), pp. 176–180, October 2007.
[32]  T. Inagaki, M. Itoh, and Y. Nagai, “Support by warning or by action: which is appropriate under mismatches between driver intent and traffic conditions?” IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E90-A, no. 11, pp. 2540–2545, 2007.
