Modeling a Sensor to Improve Its Efficacy

DOI: 10.1155/2013/481054


Abstract:

Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be extremely expensive and cost-prohibitive, so many robotic systems must make do with lower-quality sensors. Here we demonstrate via a case study how modeling a sensor can improve its efficacy when it is employed within a Bayesian inferential framework. As a test bed we employ a robotic arm that is designed to autonomously take its own measurements using an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a spatially distributed region within its field of view, weighted by its spatial sensitivity function (SSF). We demonstrate that by incorporating an accurate model of the light sensor's SSF into the likelihood function of a Bayesian inference engine, an autonomous system can make improved inferences about its surroundings. The method presented here is data-based, fairly general, and designed with plug-and-play in mind so that it can be applied to similar problems.

1. Introduction

Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be cost-prohibitive, and often one must make do with lower-quality sensors. In this paper we present a case study demonstrating how employing an accurate model of a sensor within a Bayesian inferential framework can improve the quality of inferences made from the data produced by that sensor. In fact, the quality of the sensor can be quite poor, but if it is known precisely how it is poor, this information can be used to improve the results of inferences made from the sensor data. To accomplish this we rely on a Bayesian inferential framework in which a machine learning system considers a set of hypotheses about its surroundings and identifies the more probable hypotheses given incoming sensor data. Such inferences rely on a likelihood function, which quantifies the probability that a hypothesized situation could have given rise to the data. The likelihood is often considered to represent the noise model, and this inherently includes a model of how the sensor is expected to behave when presented with a given stimulus. By incorporating an accurate model of the sensor, the inferences made by the system are improved. As a test bed we employ an autonomous robotic arm developed in the Knuth Cyberphysics Laboratory at the University at Albany (SUNY). The robot is designed to perform studies in autonomous experimental design [1, 2]. In particular, it performs autonomous measurements to locate and characterize a white circle on a black field, as described above.
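To make the role of the SSF concrete, the following Python sketch shows one way a forward model of the sensor can enter the likelihood. It is an illustration only, not the authors' implementation: the Gaussian SSF shape, the grid spacing, the noise level, and all function names are assumptions made for this example.

import numpy as np

# Minimal sketch (assumed names and values, not the paper's code) of how an
# SSF-based forward model of a light sensor enters a Bayesian likelihood.

def ssf_weights(dx, dy, sigma=4.0):
    # Hypothetical spatial sensitivity function: a 2-D Gaussian falloff
    # around the sensor axis, evaluated at offsets (dx, dy) in mm and
    # normalized to sum to one.
    w = np.exp(-(dx**2 + dy**2) / (2.0 * sigma**2))
    return w / w.sum()

def predicted_reading(sensor_xy, circle_xy, radius, half_width=10.0, n=41):
    # Predict the sensor output for a hypothesized white circle on a black
    # field: integrate the scene intensity over the sensor's field of view,
    # weighted by the SSF.
    offsets = np.linspace(-half_width, half_width, n)
    dx, dy = np.meshgrid(offsets, offsets)
    px = sensor_xy[0] + dx
    py = sensor_xy[1] + dy
    # Scene model: intensity 1 inside the white circle, 0 on the black field.
    inside = (px - circle_xy[0])**2 + (py - circle_xy[1])**2 <= radius**2
    return np.sum(inside * ssf_weights(dx, dy))

def log_likelihood(data, sensor_positions, circle_xy, radius, noise_sd=0.05):
    # Gaussian log-likelihood of the measurements given a hypothesized
    # circle (center and radius), using the SSF-based forward model above.
    pred = np.array([predicted_reading(p, circle_xy, radius)
                     for p in sensor_positions])
    resid = np.asarray(data) - pred
    return (-0.5 * np.sum((resid / noise_sd)**2)
            - resid.size * np.log(noise_sd * np.sqrt(2.0 * np.pi)))

In the plug-and-play spirit described above, the Gaussian ssf_weights stand-in would be replaced by the empirically measured SSF of the actual sensor (for example, a table interpolated from calibration data), leaving the rest of the inference machinery unchanged.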

References

[1]  K. H. Knuth, P. M. Erner, and S. Frasso, “Designing intelligent instruments,” in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203–211, AIP, July 2007.
[2]  K. H. Knuth and J. L. Center Jr., “Autonomous science platforms and question-asking machines,” in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221–226, June 2010.
[3]  K. H. Knuth and J. L. Center, “Autonomous sensor placement,” in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94–99, November 2008.
[4]  K. H. Knuth, “Intelligent machines in the twenty-first century: foundations of inference and inquiry,” Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859–2873, 2003.
[5]  N. K. Malakar and K. H. Knuth, “Entropy-based search algorithm for experimental design,” in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessière, Eds., vol. 1305, pp. 157–164, AIP, July 2010.
[6]  N. K. Malakar, K. H. Knuth, and D. J. Lary, “Maximum joint entropy and information-based collaboration of automated learning machines,” in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230–237, AIP, 2012.
[7]  D. V. Lindley, “On a measure of the information provided by an experiment,” The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986–1005, 1956.
[8]  V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.
[9]  K. Chaloner and I. Verdinelli, “Bayesian experimental design: a review,” Statistical Science, vol. 10, pp. 273–304, 1995.
[10]  P. Sebastiani and H. P. Wynn, “Bayesian experimental design and Shannon information,” in 1997 Proceedings of the Section on Bayesian Statistical Science, pp. 176–181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.
[11]  P. Sebastiani and H. P. Wynn, “Maximum entropy sampling and optimal Bayesian experimental design,” Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145–157, 2000.
[12]  T. J. Loredo and D. F. Chernoff, “Bayesian adaptive exploration,” in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330–346, AIP, August 2003.
[13]  R. Fischer, “Bayesian experimental design—studies for fusion diagnostics,” in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76–83, November 2004.
[14]  C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, Ill, USA, 1949.
[15]  J. Skilling, “Nested sampling,” in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395–405, 2004.
[16]  D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.
[17]  N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, “Equation of state calculations by fast computing machines,” The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[18]  W. K. Hastings, “Monte Carlo sampling methods using Markov chains and their applications,” Biometrika, vol. 57, no. 1, pp. 97–109, 1970.
[19]  N. K. Malakar, A. J. Mesiti, and K. H. Knuth, “The spatial sensitivity function of a light sensor,” in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C. Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352–359, American Institute of Physics, Oxford, Miss, USA, July 2009.
