Search Results: 1 - 10 of 100 matches for " "
 Steven A. Frank Quantitative Biology, 2012, DOI: 10.1111/jeb.12010 Abstract: The equations of evolutionary change by natural selection are commonly expressed in statistical terms. Fisher's fundamental theorem emphasizes the variance in fitness. Quantitative genetics expresses selection with covariances and regressions. Population genetic equations depend on genetic variances. How can we read those statistical expressions with respect to the meaning of natural selection? One possibility is to relate the statistical expressions to the amount of information that populations accumulate by selection. However, the connection between selection and information theory has never been compelling. Here, I show the correct relations between statistical expressions for selection and information theory expressions for selection. Those relations link selection to the fundamental concepts of entropy and information in the theories of physics, statistics, and communication. We can now read the equations of selection in terms of their natural meaning. Selection causes populations to accumulate information about the environment.
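The information-accumulation idea in the abstract above can be illustrated with a minimal sketch: one round of selection reweights type frequencies by relative fitness, and the Kullback-Leibler divergence between post- and pre-selection frequencies gives one (illustrative, not the paper's exact) measure of the information gained. The frequencies and fitnesses below are made-up numbers.

```python
import numpy as np

def select(q, w):
    """One round of selection: reweight type frequencies q by fitnesses w."""
    p = q * w
    return p / p.sum()

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return float(np.sum(p * np.log(p / q)))

q = np.array([0.5, 0.3, 0.2])   # hypothetical pre-selection frequencies
w = np.array([1.0, 1.5, 0.5])   # hypothetical relative fitnesses
q_new = select(q, w)
info_gain = kl(q_new, q)        # information accumulated by this round; > 0
```

By construction the gain is non-negative and is zero only when all fitnesses are equal, which matches the intuition that selection with no fitness differences accumulates no information.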
 International Journal of Rotating Machinery, 2008, DOI: 10.1155/2008/784749 Abstract: Engine health monitoring has been an area of intensive research for many years. Numerous methods have been developed with the goal of determining a faithful picture of the engine condition. On the other hand, the issue of selecting sensors that allow an efficient diagnosis has received less attention from the community. The present contribution revisits the problem of sensor selection for engine performance monitoring within the scope of information theory. To this end, a metric that integrates the essential elements of the sensor selection problem is defined from the Fisher information matrix. An example application consisting of a commercial turbofan engine illustrates the enhancement that can be expected from a wise selection of the sensor set.
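A sensor-selection metric built from the Fisher information matrix, as described above, can be sketched as follows for a linear-Gaussian sensing model. The sensitivity matrix, noise variances, and the D-optimality (log-determinant) scoring rule are illustrative assumptions, not the paper's specific metric.

```python
import itertools
import numpy as np

def fim(J, sigma2):
    """Fisher information matrix for independent linear-Gaussian sensors:
    rows of J are sensor sensitivities, sigma2 their noise variances."""
    return J.T @ np.diag(1.0 / sigma2) @ J

def best_subset(J, sigma2, k):
    """Rank k-sensor subsets by a D-optimality score, log det of the FIM."""
    n = J.shape[0]
    def score(idx):
        idx = list(idx)
        sign, logdet = np.linalg.slogdet(fim(J[idx], sigma2[idx]))
        return logdet if sign > 0 else -np.inf  # singular FIM: parameters unidentifiable
    return max(itertools.combinations(range(n), k), key=score)

# hypothetical 4-sensor, 2-parameter example
J = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -0.1]])
sigma2 = np.array([0.1, 0.1, 1.0, 1.0])
best = best_subset(J, sigma2, 2)  # the two low-noise, complementary sensors win
```

Exhaustive enumeration is only feasible for small sensor sets; greedy or convex-relaxation selection is the usual workaround at scale.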
 Computer Science, 2014, Abstract: This paper considers the problem of state tracking with observation control for a particular class of dynamical systems. The system state evolution is described by a discrete-time, finite-state Markov chain, while the measurement process is characterized by a controlled multi-variate Gaussian observation model. The computational complexity of the optimal control strategy proposed in our prior work proves to be prohibitive. A suboptimal, lower complexity algorithm based on the Fisher information measure is proposed. Toward this end, the preceding measure is generalized to account for multi-valued discrete parameters and control inputs. A closed-form formula for our system model is also derived. Numerical simulations are provided for a physical activity tracking application showing the near-optimal performance of the proposed algorithm.
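The flavor of a lower-complexity, Fisher-information-based observation control can be sketched as a myopic rule: at each step, choose the sensing mode whose linear-Gaussian observation model carries the most Fisher information, instead of solving the full control problem. The scalar trace score and the two sensing modes below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def obs_fisher_score(H, R):
    """Scalar Fisher-information score tr(H^T R^{-1} H) for a
    linear-Gaussian observation y = H s + v, v ~ N(0, R)."""
    return float(np.trace(H.T @ np.linalg.inv(R) @ H))

def pick_control(controls):
    """Myopic observation control: pick the (H, R) sensing mode with the
    largest Fisher score, a cheap proxy for the optimal control strategy."""
    return max(range(len(controls)), key=lambda i: obs_fisher_score(*controls[i]))

# two hypothetical sensing modes: accurate-but-partial vs. noisy-but-full
controls = [
    (np.array([[1.0, 0.0]]), np.array([[0.01]])),  # mode 0: one precise channel
    (np.eye(2),              np.eye(2) * 0.5),     # mode 1: two noisy channels
]
chosen = pick_control(controls)  # mode 0's precision dominates here
```

The myopic rule costs one score evaluation per candidate control per step, which is what makes it attractive when the optimal strategy is computationally prohibitive.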
 Physics, 1998, Abstract: We study the Fisher model describing natural selection in a population with a diploid genome structure by differential-geometric methods. For the selection dynamics we introduce an affine connection, which is shown to be projectively Euclidean and equiaffine. The selection dynamics is reformulated as the motion of an effective particle moving along geodesic lines in an 'effective external field' of tensor type. An exact solution of the Fisher equations is found for the special case of a fitness matrix associated with the effect of chromosomal imprinting in mammals. The biological sense of the differential-geometric constructions is discussed. The affine curvature is considered a direct consequence of allele coupling in the system. This curving of the selection-dynamics geometry is related to an inhomogeneity of the time flow in the course of selection.
 Mathematics, 1998, DOI: 10.1088/0305-4470/33/24/306 Abstract: Braunstein and Caves (1994) proposed to use Helstrom's quantum information number to define, meaningfully, a metric on the set of all possible states of a given quantum system. They showed that the quantum information is nothing other than the maximal Fisher information in a measurement of the quantum system, maximized over all possible measurements. Combining this fact with classical statistical results, they argued that the quantum information determines the asymptotically optimal rate at which neighbouring states on some smooth curve can be distinguished, based on arbitrary measurements on $n$ identical copies of the given quantum system. We show that the measurement which maximizes the Fisher information typically depends on the true, unknown, state of the quantum system. We close the resulting loophole in the argument by showing that one can still achieve the same, optimal, rate of distinguishability by a two-stage adaptive measurement procedure. When we consider states lying not on a smooth curve but on a manifold of higher dimension, the situation becomes much more complex. We show that the notion of distinguishability of close-by states depends strongly on the measurement resources one allows oneself, and on a further specification of the task at hand. The quantum information matrix no longer seems to play a central role.
 Steven A. Frank Quantitative Biology, 2013, DOI: 10.1111/jeb.12066 Abstract: Three steps aid in the analysis of selection. First, describe phenotypes by their component causes. Components include genes, maternal effects, symbionts, and any other predictors of phenotype that are of interest. Second, describe fitness by its component causes, such as an individual's phenotype, its neighbors' phenotypes, resource availability, and so on. Third, put the predictors of phenotype and fitness into an exact equation for evolutionary change, providing a complete expression of selection and other evolutionary processes. The complete expression separates the distinct causal roles of the various hypothesized components of phenotypes and fitness. Traditionally, those components are given by the covariance, variance, and regression terms of evolutionary models. I show how to interpret those statistical expressions with respect to information theory. The resulting interpretation allows one to read the fundamental equations of selection and evolution as sentences that express how various causes lead to the accumulation of information by selection and the decay of information by other evolutionary processes. The interpretation in terms of information leads to a deeper understanding of selection and heritability, and a clearer sense of how to formulate causal hypotheses about evolutionary process. Kin selection appears as a particular type of causal analysis that partitions social effects into meaningful components.
 Quantitative Biology, 2014, Abstract: Eukaryotic cell development has been optimized by natural selection to obey maximal intracellular flux of messenger proteins. This, in turn, implies maximum Fisher information on angular position about a target nuclear pore complex (NPC). The cell is simply modeled as spherical, with a cell membrane (CM) diameter of 10 microns and a concentric nuclear membrane (NM) diameter of 6 microns. The NM contains about 3000 nuclear pore complexes (NPCs). Development requires messenger ligands to travel from the CM, through an NPC, to DNA target binding sites. Ligands acquire negative charge by phosphorylation, passing through the cytoplasm over Newtonian trajectories toward positively charged NPCs (utilizing positive nuclear localization sequences). The CM-NPC channel obeys maximized mean protein flux F and Fisher information I at the NPC, with first-order delta I = 0 and approximate second-order delta I = 0 stability to environmental perturbations. Many of its predictions are confirmed, including the dominance of protein pathways of 1-4 proteins, a 4 nm size for the EGFR protein, and an approximate flux value F = 10^16 proteins/(m^2 s). After entering the nucleus, each protein ultimately delivers its ligand information to a DNA target site with maximum probability, i.e. maximum Kullback-Leibler entropy H_KL. In a smoothness limit H_KL approaches I_DNA/2, so that the total CM-NPC-DNA channel obeys maximum Fisher I. Thus maximum information approaches non-equilibrium, one condition for life.
 Robert Carroll Physics, 2008, Abstract: Some situations are discussed where subquantum oscillations in momentum arise in connection with Fisher information and the quantum potential.
 Statistics, 2010, DOI: 10.1016/j.spl.2010.08.015 Abstract: Motivated by the information bound for the asymptotic variance of M-estimates for scale, we define Fisher information of scale of any distribution function F on the real line as a suitable supremum. In addition, we enforce equivariance by a scale factor. Fisher information of scale is weakly lower semicontinuous and convex. It is finite iff the usual assumptions on densities hold, under which Fisher information of scale is classically defined, and then both the classical and our notions agree. Finiteness of Fisher information of scale is also equivalent to L_2-differentiability and local asymptotic normality, respectively, of the scale model induced by F.
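When the usual density assumptions hold, the classical Fisher information of scale mentioned above is I_s(F) = ∫ (1 + x f'(x)/f(x))^2 f(x) dx, and it can be approximated numerically. The sketch below checks the textbook value I_s = 2 for the standard normal, where x f'(x)/f(x) = -x^2, so I_s = E[(1 - X^2)^2] = 1 - 2 + 3 = 2; the grid and differencing scheme are implementation choices, not from the paper.

```python
import numpy as np

def scale_info(logf, x):
    """Numerical approximation of classical Fisher information of scale,
    I_s(F) = integral of (1 + x * f'(x)/f(x))^2 f(x) dx, on a fine grid x."""
    lf = logf(x)
    f = np.exp(lf)
    dlogf = np.gradient(lf, x)               # central-difference d/dx log f(x)
    integrand = (1.0 + x * dlogf) ** 2 * f
    return float(np.sum(integrand) * (x[1] - x[0]))

# standard normal log-density; its scale information should come out near 2
x = np.linspace(-10.0, 10.0, 200001)
log_phi = lambda t: -0.5 * t ** 2 - 0.5 * np.log(2.0 * np.pi)
I_hat = scale_info(log_phi, x)               # close to 2.0
```

For distributions without a smooth density this classical integral breaks down, which is exactly the gap the paper's supremum-based definition is built to cover.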
 Physics, 2014, Abstract: In this paper we review the co-adjoint orbit formulation of finite-dimensional quantum mechanics, and in this framework we interpret the notion of the quantum Fisher information index (and metric). Following [15, 16], where the definition of the Fisher information tensor is introduced, we show how its antisymmetric part is the pullback of the natural Kostant-Kirillov-Souriau symplectic form along a natural diffeomorphism. In order to do this we need to understand the symmetric logarithmic derivative (SLD) as a proper 1-form, settling the issues about its very definition and explicit computation. Moreover, the fibration of co-adjoint orbits, seen as spaces of mixed states, is also discussed.