Data provided by sensors is always subject to some level of uncertainty and inconsistency. Multisensor data fusion algorithms reduce this uncertainty by combining data from several sources. However, if these sources provide inconsistent data, catastrophic fusion may occur, where the performance of multisensor data fusion is significantly lower than that of each individual sensor. This paper presents an approach to multisensor data fusion that decreases data uncertainty while being able to identify and handle inconsistency. The proposed approach combines a modified Bayesian fusion algorithm with Kalman filtering. Three variants, namely, prefiltering, postfiltering, and pre-postfiltering, are described depending on whether filtering is applied to the sensor data, to the fused data, or to both. A case study that finds the position of a mobile robot by estimating its x and y coordinates using four sensors is presented. The simulations show that combining fusion with filtering helps handle the problems of uncertainty and inconsistency in the data.

1. Introduction

Multisensor data fusion is a multidisciplinary research area borrowing ideas from many diverse fields such as signal processing, information theory, statistical estimation and inference, and artificial intelligence. This is indeed reflected in the variety of techniques reported in the literature. Several definitions of data fusion exist. Klein defines it by stating that data can be provided either by a single source or by multiple sources. The Joint Directors of Laboratories (JDL) define data fusion as a “multilevel, multifaceted process handling the automatic detection, association, correlation, estimation, and combination of data and information from several sources.” Both definitions are general and can be applied in different fields, including remote sensing. Other authors present a review and discussion of many data fusion definitions.
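The combination of fusion and filtering described in the abstract can be sketched with a toy example. The snippet below is an illustrative sketch, not the paper's modified Bayesian algorithm: it assumes simple inverse-variance (Bayesian) fusion of four simulated sensor readings of a robot's x coordinate and a constant-state one-dimensional Kalman filter, and contrasts prefiltering (filter each sensor stream, then fuse) with postfiltering (fuse the raw readings, then filter the fused stream). All sensor variances and noise parameters are made up for illustration.

```python
import random

def kalman_1d(measurements, process_var=1e-4, meas_var=0.25):
    """Minimal 1-D Kalman filter with a constant-state model."""
    x, p = measurements[0], 1.0          # initial state estimate and variance
    out = []
    for z in measurements:
        p += process_var                 # predict: state assumed constant, variance grows
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # update with measurement z
        p *= (1.0 - k)
        out.append(x)
    return out

def bayesian_fuse(values, variances):
    """Inverse-variance (Bayesian) fusion of simultaneous sensor readings."""
    weights = [1.0 / v for v in variances]
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Simulated x coordinate of a mobile robot observed by four noisy sensors.
random.seed(0)
true_x = 5.0
variances = [0.2, 0.3, 0.25, 0.4]        # assumed (known) sensor noise variances
readings = [[true_x + random.gauss(0, v ** 0.5) for v in variances]
            for _ in range(50)]          # 50 time steps x 4 sensors

# Prefiltering: filter each sensor stream first, then fuse per time step.
filtered_streams = [kalman_1d([r[i] for r in readings]) for i in range(4)]
pre = [bayesian_fuse([s[t] for s in filtered_streams], variances)
       for t in range(50)]

# Postfiltering: fuse the raw readings per time step, then filter the fused stream.
fused = [bayesian_fuse(r, variances) for r in readings]
post = kalman_1d(fused)

print("prefiltering estimate:", pre[-1], "postfiltering estimate:", post[-1])
```

Both pipelines converge toward the true coordinate; which performs better in practice depends on the noise characteristics and on how inconsistency among the sensors is detected and handled, which is the subject of the approach developed in this paper.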
Based on the identified strengths and weaknesses of previous work, a principled definition of data fusion is proposed as the study of efficient methods for automatically or semiautomatically transforming data from different sources and different points in time into a representation that provides effective support for human or automated decision making. Data fusion is applied in many areas of autonomous systems. Autonomous systems must be able to perceive the physical world and physically interact with it through computer-controlled mechanical devices. A critical problem of autonomous
J. Llinas, C. Bowman, G. Rogova, A. Steinberg, E. Waltz, and F. White, “Revisiting the JDL data fusion model II,” in Proceedings of the 7th International Conference on Information Fusion (FUSION '04), pp. 1218–1230, July 2004.
B. Khaleghi, A. Khamis, and F. Karray, “Random finite set theoretic based soft/hard data fusion with application for target tracking,” in Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI '10), pp. 50–55, September 2010.
D. P. Garg, M. Kumar, and R. A. Zachery, “A generalized approach for inconsistency detection in data fusion from multiple sensors,” in Proceedings of the American Control Conference, pp. 2078–2083, June 2006.