

A Functional Model of Sensemaking in a Neurocognitive Architecture

DOI: 10.1155/2013/921695


Abstract:

Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment.

1. Introduction

We present a computational cognitive model, developed in the ACT-R architecture [1, 2], of several core information-foraging and hypothesis-updating processes involved in a complex sensemaking task. Sensemaking [3–6] is a concept that has been used to define a class of activities and tasks in which there is an active seeking and processing of information to achieve understanding about some state of affairs in the world.
Complex tasks in intelligence analysis and situation awareness have frequently been cited as examples of sensemaking [3–5]. Sensemaking, as in to make sense, implies an active process to construct a meaningful and functional representation of some aspects of the world. A variety of theories and perspectives on sensemaking have been developed in psychology [3, 4], human-computer interaction [6], information and library science [7], and in organizational science [8]. In this paper we present a cognitive model of basic sensemaking processes for an intelligence analysis task. A major concern in the intelligence community is the impact of cognitive biases on the
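One of the biases the abstract reports, probability matching, can be illustrated with a minimal sketch: instead of committing all resources to the most probable hypothesis (the normative strategy for maximizing expected payoff under a fixed per-unit return), a probability-matching allocator spreads resources in proportion to its estimated distribution over categories. The function and variable names below are illustrative only and do not come from the paper's ACT-R model.

```python
def allocate_resources(probs, total=100, matching=True):
    """Allocate a resource budget across hypothesized categories.

    matching=True  -> probability matching: the allocation mirrors
                      the estimated probability distribution (the
                      human-like bias described in the abstract).
    matching=False -> maximizing: all resources go to the single
                      most probable category (the normative choice
                      when each unit pays off only if placed on the
                      correct category).
    """
    if matching:
        # Split the budget in proportion to the estimated probabilities.
        return [round(total * p) for p in probs]
    # Put the entire budget on the most likely category.
    best = probs.index(max(probs))
    return [total if i == best else 0 for i in range(len(probs))]


# Hypothetical probability estimate over four categories.
estimate = [0.6, 0.2, 0.1, 0.1]
print(allocate_resources(estimate, matching=True))   # [60, 20, 10, 10]
print(allocate_resources(estimate, matching=False))  # [100, 0, 0, 0]
```

Under the matching strategy the expected fraction of correctly placed resources is the sum of squared probabilities (here 0.42), versus 0.6 for maximizing, which is why probability matching counts as a bias rather than an optimal policy.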

References

[1]  J. R. Anderson, How Can the Human Mind Occur in the Physical Universe?, Oxford University Press, Oxford, UK, 2007.
[2]  J. R. Anderson, D. Bothell, M. D. Byrne, S. Douglass, C. Lebiere, and Y. Qin, “An integrated theory of the mind,” Psychological Review, vol. 111, no. 4, pp. 1036–1060, 2004.
[3]  G. Klein, B. Moon, and R. R. Hoffman, “Making sense of sensemaking 1: alternative perspectives,” IEEE Intelligent Systems, vol. 21, no. 4, pp. 70–73, 2006.
[4]  G. Klein, B. Moon, and R. R. Hoffman, “Making sense of sensemaking 2: a macrocognitive model,” IEEE Intelligent Systems, vol. 21, no. 5, pp. 88–92, 2006.
[5]  P. Pirolli and S. K. Card, “The sensemaking process and leverage points for analyst technology,” in Proceedings of the International Conference on Intelligence Analysis, McLean, Va, USA, May 2005.
[6]  D. M. Russell, M. J. Stefik, P. Pirolli, and S. K. Card, “The cost structure of sensemaking,” in Proceedings of the INTERACT and CHI Conference on Human Factors in Computing Systems, pp. 269–276, April 1993.
[7]  B. Dervin, “From the mind's eye of the user: the sense-making of qualitative-quantitative methodology,” in Sense-Making Methodology Reader: Selected Writings of Brenda Dervin, B. Dervin, L. Foreman-Wenet, and E. Lauterbach, Eds., pp. 269–292, Hampton Press, Cresskill, NJ, USA, 2003.
[8]  K. Weick, Sensemaking in Organizations, Sage, Thousand Oaks, Calif, USA, 1995.
[9]  R. J. Heuer, Psychology of Intelligence Analysis, Center for the Study of Intelligence, Washington, DC, USA, 1999.
[10]  S. G. Hutchins, P. Pirolli, and S. K. Card, “What makes intelligence analysis difficult? A cognitive task analysis of intelligence analysts,” in Expertise Out of Context, R. R. Hoffman, Ed., pp. 281–316, Lawrence Erlbaum Associates, Mahwah, NJ, USA, 2007.
[11]  L. G. Militello and R. J. B. Hutton, “Applied cognitive task analysis (ACTA): a practitioner's toolkit for understanding cognitive task demands,” Ergonomics, vol. 41, no. 11, pp. 1618–1641, 1998.
[12]  J. M. Schraagen, S. F. Chipman, and V. L. Shalin, Eds., Cognitive Task Analysis, Lawrence Erlbaum Associates, Mahwah, NJ, USA, 2000.
[13]  M. I. Posner, R. Goldsmith, and K. E. Welton Jr., “Perceived distance and the classification of distorted patterns,” Journal of Experimental Psychology, vol. 73, no. 1, pp. 28–38, 1967.
[14]  C. Lefebvre, R. Dell'acqua, P. R. Roelfsema, and P. Jolicœur, “Surfing the attentional waves during visual curve tracing: evidence from the sustained posterior contralateral negativity,” Psychophysiology, vol. 48, no. 11, pp. 1510–1516, 2011.
[15]  F. G. Ashby and W. T. Maddox, “Complex decision rules in categorization: contrasting novice and experienced performance,” Journal of Experimental Psychology, vol. 18, no. 1, pp. 50–71, 1992.
[16]  A. Stocco, C. Lebiere, R. C. O'Reilly, and J. R. Anderson, “The role of the anterior prefrontal-basal ganglia circuit as a biological instruction interpreter,” in Biologically Inspired Cognitive Architectures 2010, A. V. Samsonovich, K. R. Jóhannsdóttir, A. Chella, and B. Goertzel, Eds., vol. 221 of Frontiers in Artificial Intelligence and Applications, pp. 153–162, 2010.
[17]  G. Ryle, The Concept of Mind, Hutchinson, London, UK, 1949.
[18]  G. A. Miller, “The magical number seven, plus or minus two: some limits on our capacity for processing information,” Psychological Review, vol. 63, no. 2, pp. 81–97, 1956.
[19]  H. A. Simon, “How big is a chunk?” Science, vol. 183, no. 4124, pp. 482–488, 1974.
[20]  C. Lebiere, “The dynamics of cognition: an ACT-R model of cognitive arithmetic,” Kognitionswissenschaft, vol. 8, no. 1, pp. 5–19, 1999.
[21]  N. Taatgen, C. Lebiere, and J. R. Anderson, “Modeling paradigms in ACT-R,” in Cognition and Multi-Agent Interaction: From Cognitive Modeling to Social Simulation, R. Sun, Ed., Cambridge University Press, New York, NY, USA, 2006.
[22]  S. K. Card, T. P. Moran, and A. Newell, The Psychology of Human-Computer Interaction, Lawrence Erlbaum Associates, Hillsdale, NJ, USA, 1983.
[23]  A. Newell, Unified Theories of Cognition, Harvard University Press, Cambridge, Mass, USA, 1990.
[24]  J. R. Anderson, “Acquisition of cognitive skill,” Psychological Review, vol. 89, no. 4, pp. 369–406, 1982.
[25]  J. R. Anderson, Rules of the Mind, Lawrence Erlbaum Associates, Hillsdale, NJ, USA, 1993.
[26]  C. Gonzalez, J. F. Lerch, and C. Lebiere, “Instance-based learning in dynamic decision making,” Cognitive Science, vol. 27, no. 4, pp. 591–635, 2003.
[27]  C. Lebiere, C. Gonzalez, and W. Warwick, “Metacognition and multiple strategies in a cognitive model of online control,” Journal of Artificial General Intelligence, vol. 2, no. 2, pp. 20–37, 2010.
[28]  R. C. O'Reilly, T. E. Hazy, and S. A. Herd, “The Leabra cognitive architecture: how to play 20 principles with nature and win!,” in Oxford Handbook of Cognitive Science, S. Chipman, Ed., Oxford University Press, Oxford, UK.
[29]  M. Ziegler, M. Howard, A. Zaldivar et al., “Simulation of anchoring bias in a spatial estimation task due to cholinergic neuromodulation,” submitted.
[30]  Y. Sun and H. Wang, “The parietal cortex in sensemaking: spatio-attentional aspects,” in press.
[31]  R. Thomson and C. Lebiere, “Constraining Bayesian inference with cognitive architectures: an updated associative learning mechanism in ACT-R,” in Proceedings of the 35th Annual Meeting of the Cognitive Science Society (CogSci '13), Berlin, Germany, July-August 2013.
[32]  C. Lebiere and J. R. Anderson, “A connectionist implementation of the ACT-R production system,” in Proceedings of the 15th Annual Meeting of the Cognitive Science Society (CogSci '93), pp. 635–640, Lawrence Erlbaum Associates, June 1993.
[33]  D. J. Jilk, C. Lebiere, R. C. O'Reilly, and J. R. Anderson, “SAL: an explicitly pluralistic cognitive architecture,” Journal of Experimental and Theoretical Artificial Intelligence, vol. 20, no. 3, pp. 197–218, 2008.
[34]  D. Marr, “Simple memory: a theory for archicortex,” Philosophical Transactions of the Royal Society of London B, vol. 262, no. 841, pp. 23–81, 1971.
[35]  D. E. Rumelhart, J. L. McClelland, and PDP Research Group, Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations, MIT Press, Cambridge, Mass, USA, 1986.
[36]  C. Lebiere, J. R. Anderson, and L. M. Reder, “Error modeling in the ACT-R production system,” in Proceedings of the 16th Annual Meeting of the Cognitive Science Society, pp. 555–559, Lawrence Erlbaum Associates, 1994.
[37]  J. R. Anderson, The Adaptive Character of Thought, Lawrence Erlbaum Associates, 1990.
[38]  A. Klippel, H. Tappe, and C. Habel, “Pictorial representations of routes: chunking route segments during comprehension,” in Spatial Cognition III, C. Freksa, W. Brauer, C. Habel, and K. F. Wender, Eds., vol. 2685 of Lecture Notes in Computer Science, pp. 11–33, 2003.
[39]  S. M. Kosslyn, Image and Brain: The Resolution of the Imagery Debate, MIT Press, Cambridge, Mass, USA, 1994.
[40]  W. C. Gogel and J. A. da Silva, “A two-process theory of the response to size and distance,” Perception & Psychophysics, vol. 41, no. 3, pp. 220–238, 1987.
[41]  W. M. Wiest and B. Bell, “Stevens's exponent for psychophysical scaling of perceived, remembered, and inferred distance,” Psychological Bulletin, vol. 98, no. 3, pp. 457–470, 1985.
[42]  J. A. da Silva, “Scales for perceived egocentric distance in a large open field: comparison of three psychophysical methods,” The American Journal of Psychology, vol. 98, no. 1, pp. 119–144, 1985.
[43]  J. A. Aznar-Casanova, E. H. Matsushima, N. P. Ribeiro-Filho, and J. A. da Silva, “One-dimensional and multi-dimensional studies of the exocentric distance estimates in frontoparallel plane, virtual space, and outdoor open field,” The Spanish Journal of Psychology, vol. 9, no. 2, pp. 273–284, 2006.
[44]  C. A. Levin and R. N. Haber, “Visual angle as a determinant of perceived interobject distance,” Perception & Psychophysics, vol. 54, no. 2, pp. 250–259, 1993.
[45]  R. Thomson, The role of object size on judgments of lateral separation [Ph.D. dissertation].
[46]  S. Dehaene and L. Cohen, “Language and elementary arithmetic: dissociations between operations,” Brain and Language, vol. 69, no. 3, pp. 492–494, 1999.
[47]  C. Lebiere, C. Gonzalez, and M. Martin, “Instance-based decision making model of repeated binary choice,” in Proceedings of the 8th International Conference on Cognitive Modeling (ICCM '07), Ann Arbor, Mich, USA, July 2007.
[48]  I. Erev, E. Ert, A. E. Roth et al., “A choice prediction competition: choices from experience and from description,” Journal of Behavioral Decision Making, vol. 23, no. 1, pp. 15–47, 2010.
[49]  D. Wallach and C. Lebiere, “Conscious and unconscious knowledge: mapping to the symbolic and subsymbolic levels of a hybrid architecture,” in Attention and Implicit Learning, L. Jimenez, Ed., John Benjamins Publishing, Amsterdam, Netherlands, 2003.
[50]  C. Lebiere, C. Gonzalez, and W. Warwick, “A comparative approach to understanding general intelligence: predicting cognitive performance in an open-ended dynamic task,” in Proceedings of the 2nd Conference on Artificial General Intelligence (AGI '09), pp. 103–107, Arlington, Va, USA, March 2009.
[51]  J. Klayman, “Varieties of confirmation bias,” in Decision Making from a Cognitive Perspective, J. Busemeyer, R. Hastie, and D. L. Medin, Eds., vol. 32 of Psychology of Learning and Motivation, pp. 365–418, Academic Press, New York, NY, USA, 1995.
[52]  J. Klayman and Y.-W. Ha, “Confirmation, disconfirmation, and information in hypothesis testing,” Psychological Review, vol. 94, no. 2, pp. 211–228, 1987.
[53]  R. S. Nickerson, “Confirmation bias: a ubiquitous phenomenon in many guises,” Review of General Psychology, vol. 2, no. 2, pp. 175–220, 1998.
[54]  A. Tversky and D. Kahneman, “Judgment under uncertainty: heuristics and biases,” Science, vol. 185, no. 4157, pp. 1124–1131, 1974.
[55]  P. Wason, “On the failure to eliminate hypotheses in a conceptual task,” The Quarterly Journal of Experimental Psychology, vol. 12, no. 3, pp. 129–140, 1960.
[56]  C. D. Wickens and J. G. Hollands, Engineering Psychology and Human Performance, Prentice Hall, Upper Saddle River, NJ, USA, 3rd edition, 2000.
[57]  B. A. Cheikes, M. J. Brown, P. E. Lehner, and L. Adelman, “Confirmation bias in complex analyses,” Tech. Rep., MITRE Center for Integrated Intelligence Systems, Bedford, Mass, USA, 2004.
[58]  G. Convertino, D. Billman, P. Pirolli, J. P. Massar, and J. Shrager, “Collaborative intelligence analysis with CACHE: bias reduction and information coverage,” Tech. Rep., Palo Alto Research Center, Palo Alto, Calif, USA, 2006.
[59]  M. A. Tolcott, F. F. Marvin, and P. E. Lehner, “Expert decisionmaking in evolving situations,” IEEE Transactions on Systems, Man and Cybernetics, vol. 19, no. 3, pp. 606–615, 1989.
[60]  C. Grabo and J. Goldman, Anticipating Surprise, Rowman & Littlefield, 2004.
[61]  J. R. Anderson and C. Lebiere, The Atomic Components of Thought, Lawrence Erlbaum Associates, Mahwah, NJ, USA, 1998.
[62]  K. Burns, “Mental models and normal errors,” in How Professionals Make Decisions, H. Montgomery, R. Lipshitz, and B. Brehmer, Eds., pp. 15–28, Lawrence Erlbaum Associates, Mahwah, NJ, USA, 2004.
[63]  MITRE Technical Report, “IARPA's ICArUS program: phase 1 challenge problem design and test specification,” in progress.
[64]  MITRE Technical Report, “A computational basis for ICArUS challenge problem design,” in progress.
[65]  D. Kahneman and S. Frederick, “A model of heuristic judgment,” in The Cambridge Handbook of Thinking and Reasoning, K. J. Holyoak and R. G. Morrison, Eds., pp. 267–293, Cambridge University Press, New York, NY, USA, 2005.

