
Search Results: 1 - 10 of 127753 matches for "Joseph T. Lizier"
All listed articles are free for downloading (OA Articles)
JIDT: An information-theoretic toolkit for studying the dynamics of complex systems
Joseph T. Lizier
Physics, 2014, DOI: 10.3389/frobt.2014.00011
Abstract: Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a standalone, open-source (GNU GPL v3 licensed) Google Code project for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics; that is, on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimators for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger) which can be swapped at run-time thanks to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave, Python and other environments. We present the principles behind the code design, and provide several examples to guide users.
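To give a flavour of the workflow JIDT enables, here is a minimal sketch of calling its Kraskov (KSG) transfer entropy estimator from Python via JPype, modelled on the toolkit's demo scripts; the jar path and the synthetic data are placeholders, and exact class and property names should be checked against the JIDT documentation.

```python
import numpy as np
from jpype import startJVM, getDefaultJVMPath, JPackage, JArray, JDouble, shutdownJVM

# Start the JVM with the JIDT jar on the classpath (path is a placeholder)
startJVM(getDefaultJVMPath(), "-ea", "-Djava.class.path=infodynamics.jar")

# Synthetic coupled pair: dest follows source with a one-step lag plus noise
rng = np.random.default_rng(0)
source = rng.normal(size=1000)
dest = 0.8 * np.roll(source, 1) + 0.2 * rng.normal(size=1000)

# Instantiate the KSG continuous-valued transfer entropy estimator
CalcClass = JPackage("infodynamics.measures.continuous.kraskov") \
    .TransferEntropyCalculatorKraskov
calc = CalcClass()
calc.setProperty("k", "4")   # nearest-neighbour count for the KSG estimator
calc.initialise(1)           # destination history length k = 1
calc.setObservations(JArray(JDouble, 1)(source.tolist()),
                     JArray(JDouble, 1)(dest.tolist()))
print("TE(source -> dest) = %.4f nats" % calc.computeAverageLocalOfObservations())

shutdownJVM()
```

Because the estimators share a common interface, swapping in, say, the Gaussian estimator is a one-line change to the package and class name.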
Differentiating information transfer and causal effect
Joseph T. Lizier, Mikhail Prokopenko
Physics, 2008, DOI: 10.1140/epjb/e2010-00034-5
Abstract: The concepts of information transfer and causal effect have received much recent attention, yet often the two are not appropriately distinguished and certain measures have been suggested to be suitable for both. We discuss two existing measures, transfer entropy and information flow, which can be used separately to quantify information transfer and causal information flow respectively. We apply these measures to cellular automata on a local scale in space and time, in order to explicitly contrast them and emphasize the differences between information transfer and causality. We also describe the manner in which the measures are complementary, including the circumstances under which the transfer entropy is the best available choice to infer a causal effect. We show that causal information flow is a primary tool to describe the causal structure of a system, while information transfer can then be used to describe the emergent computation in the system.
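For reference, the transfer entropy in question is Schreiber's measure: with a destination history of length $k$, it is the average information the source $Y$ provides about the destination $X$'s next state beyond what $X$'s own past already provides,

$$ T_{Y \to X} = \sum_{x_{n+1},\, x_n^{(k)},\, y_n} p(x_{n+1}, x_n^{(k)}, y_n) \, \log_2 \frac{p(x_{n+1} \mid x_n^{(k)}, y_n)}{p(x_{n+1} \mid x_n^{(k)})}. $$

Information flow, by contrast, is Ay and Polani's interventional measure, computed under imposed interventions rather than from observational probabilities alone, which is what makes it the more direct tool for causal structure.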
Identifying influential spreaders and efficiently estimating infection numbers in epidemic models: a walk counting approach
Frank Bauer, Joseph T. Lizier
Computer Science, 2012, DOI: 10.1209/0295-5075/99/68007
Abstract: We introduce a new method to efficiently approximate the number of infections resulting from a given initially-infected node in a network of susceptible individuals. Our approach is based on counting the number of possible infection walks of various lengths to each other node in the network. We analytically study the properties of our method, in particular demonstrating different forms for SIS and SIR disease spreading (e.g. under the SIR model our method counts self-avoiding walks). In comparison to existing methods to infer the spreading efficiency of different nodes in the network (based on degree, k-shell decomposition analysis and different centrality measures), our method directly considers the spreading process and, as such, is unique in providing estimation of actual numbers of infections. Crucially, in simulating infections on various real-world networks with the SIR model, we show that our walks-based method improves the inference of effectiveness of nodes over a wide range of infection rates compared to existing methods. We also analyse the trade-off between estimate accuracy and computational cost, showing that the better accuracy here can still be obtained at a comparable computational cost to other methods.
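As an illustrative toy rendering of the walk-counting idea (our own sketch, not the authors' exact estimator), one can weight the number of walks of each length from the seed node by powers of the per-link infection probability; note the paper shows the appropriate walk type differs between SIS and SIR (e.g. self-avoiding walks for SIR), which this naive version ignores.

```python
import numpy as np

def walk_score(A, seed, beta, max_len=5):
    """Crude spreading-power estimate for `seed`: a walk of length l
    reaching node j contributes beta**l towards j's infection chance.
    Illustrative sketch only; see the paper for the proper estimator."""
    v = np.zeros(A.shape[0])
    v[seed] = 1.0
    reach = np.zeros_like(v)
    for l in range(1, max_len + 1):
        v = A.T @ v               # walks of length l from seed to each node
        reach += (beta ** l) * v  # discount longer walks
    return reach.sum()

# Toy example: a 4-node path graph 0-1-2-3; the central node should outscore an endpoint
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(walk_score(A, seed=1, beta=0.3), walk_score(A, seed=0, beta=0.3))
```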
Moving Frames of Reference, Relativity and Invariance in Transfer Entropy and Information Dynamics
Joseph T. Lizier, John R. Mahoney
Entropy, 2013, DOI: 10.3390/e15010177
Abstract: We present a new interpretation of a local framework for information dynamics, including the transfer entropy, by defining a moving frame of reference for the observer of dynamics in lattice systems. This formulation is inspired by the idea of investigating "relativistic" effects on observing the dynamics of information - in particular, we investigate a Galilean transformation of the lattice system data. In applying this interpretation to elementary cellular automata, we demonstrate that using a moving frame of reference certainly alters the observed spatiotemporal measurements of information dynamics, yet still returns meaningful results in this context. We find that, as expected, an observer will report coherent spatiotemporal structures that are moving in their frame as information transfer, and structures that are stationary in their frame as information storage. Crucially, the extent to which the shifted frame of reference alters the results depends on whether the shift of frame retains, adds or removes relevant information regarding the source-destination interaction.
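One way to realise such a frame shift on raw lattice data, sketched here for a CA space-time array with rows indexed by time and columns by cell (function name and conventions are ours, not the authors'):

```python
import numpy as np

def to_moving_frame(spacetime, velocity):
    """Re-index a CA space-time array into a frame moving at `velocity`
    cells per time step (periodic boundaries). A structure propagating
    at `velocity` in the raw data appears stationary in the new frame,
    so a local measure applied afterwards reads it as storage rather
    than transfer."""
    shifted = np.empty_like(spacetime)
    for t in range(spacetime.shape[0]):
        shifted[t] = np.roll(spacetime[t], -velocity * t)
    return shifted
```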
Bits from Biology for Computational Intelligence
Michael Wibral, Joseph T. Lizier, Viola Priesemann
Quantitative Biology, 2014
Abstract: Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the necessary introduction to information theory and the estimation of information theoretic quantities from neural data. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, or redundantly or synergistically together with others. Last, we introduce the framework of local information dynamics, where information processing is decomposed into component processes of information storage, transfer, and modification -- locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
On Thermodynamic Interpretation of Transfer Entropy
Mikhail Prokopenko, Joseph T. Lizier, Don C. Price
Entropy, 2013, DOI: 10.3390/e15020524
Abstract: We propose a thermodynamic interpretation of transfer entropy near equilibrium, using a specialised Boltzmann's principle. The approach relates conditional probabilities to the probabilities of the corresponding state transitions. This in turn characterises transfer entropy as a difference of two entropy rates: the rate for a resultant transition and another rate for a possibly irreversible transition within the system affected by an additional source. We then show that this difference, the local transfer entropy, is proportional to the external entropy production, possibly due to irreversibility. Near equilibrium, transfer entropy is also interpreted as the difference in equilibrium stabilities with respect to two scenarios: a default case and the case with an additional source. Finally, we demonstrate that such a thermodynamic treatment is not applicable to information flow, a measure of causal effect.
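The "difference of two entropy rates" reading can be stated compactly: transfer entropy equals the entropy rate of the destination conditioned on its own past, minus the rate obtained when the source is also accounted for,

$$ T_{Y \to X} = H(X_{n+1} \mid X_n^{(k)}) - H(X_{n+1} \mid X_n^{(k)}, Y_n). $$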
Local active information storage as a tool to understand distributed neural information processing
Michael Wibral, Joseph T. Lizier, Viola Priesemann, Ralf Galuske
Frontiers in Neuroinformatics, 2014, DOI: 10.3389/fninf.2014.00001
Abstract: Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing has been hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding.
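The quantity measured here, local active information storage, is the pointwise mutual information between a variable's length-$k$ past and its next state,

$$ a_X(n+1) = \log_2 \frac{p(x_{n+1} \mid x_n^{(k)})}{p(x_{n+1})}, $$

which is positive where the past successfully predicts the next value and negative at surprising transitions; this sign structure is what lets the measure register surprise upon unexpected stimulus change.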
Local information transfer as a spatiotemporal filter for complex systems
Joseph T. Lizier, Mikhail Prokopenko, Albert Y. Zomaya
Physics, 2008, DOI: 10.1103/PhysRevE.77.026110
Abstract: We present a measure of local information transfer, derived from an existing averaged information-theoretical measure, namely transfer entropy. Local transfer entropy is used to produce profiles of the information transfer into each spatiotemporal point in a complex system. These spatiotemporal profiles are useful not only as an analytical tool, but also allow explicit investigation of different parameter settings and forms of the transfer entropy metric itself. As an example, local transfer entropy is applied to cellular automata, where it is demonstrated to be a novel method of filtering for coherent structure. More importantly, local transfer entropy provides the first quantitative evidence for the long-held conjecture that the emergent traveling coherent structures known as particles (both gliders and domain walls, which have analogues in many physical processes) are the dominant information transfer agents in cellular automata.
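The local (pointwise) transfer entropy underlying these profiles attributes to each space-time point the log-ratio

$$ t_{Y \to X}(n+1) = \log_2 \frac{p(x_{n+1} \mid x_n^{(k)}, y_n)}{p(x_{n+1} \mid x_n^{(k)})}, $$

whose average over all points recovers the usual (averaged) transfer entropy; large positive local values concentrated along gliders are what make the measure act as a filter for coherent structure.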
A framework for the local information dynamics of distributed computation in complex systems
Joseph T. Lizier, Mikhail Prokopenko, Albert Y. Zomaya
Physics, 2008, DOI: 10.1007/978-3-642-53734-9_5
Abstract: The nature of distributed computation has often been described in terms of the component operations of universal computation: information storage, transfer and modification. We review the first complete framework that quantifies each of these individual information dynamics on a local scale within a system, and describes the manner in which they interact to create non-trivial computation where "the whole is greater than the sum of the parts". We describe the application of the framework to cellular automata, a simple yet powerful model of distributed computation. This is an important application, because the framework is the first to provide quantitative evidence for several important conjectures about distributed computation in cellular automata: that blinkers embody information storage, particles are information transfer agents, and particle collisions are information modification events. The framework is also shown to contrast the computations conducted by several well-known cellular automata, highlighting the importance of information coherence in complex computation. The results reviewed here provide important quantitative insights into the fundamental nature of distributed computation and the dynamics of complex systems, as well as impetus for the framework to be applied to the analysis and design of other systems.
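As a toy illustration of producing such local profiles from CA data, here is a simplified plug-in estimate of local active information storage (JIDT implements this properly; the naming and the pooling of statistics over all cells are our own simplifications):

```python
import numpy as np
from collections import Counter
from math import log2

def local_ais_profile(spacetime, k=3):
    """Plug-in estimate of the local active information storage profile
    for a discrete CA space-time array (rows = time, cols = cells).
    Illustrative sketch: probabilities are naive frequency counts."""
    T, W = spacetime.shape
    joint, past, nxt = Counter(), Counter(), Counter()
    samples = []
    for t in range(k, T):
        for x in range(W):
            p = tuple(spacetime[t - k:t, x])  # length-k past of cell x
            n = int(spacetime[t, x])          # its next value
            samples.append((t, x, p, n))
            joint[(p, n)] += 1; past[p] += 1; nxt[n] += 1
    N = len(samples)
    profile = np.zeros((T, W))
    for t, x, p, n in samples:
        # local AIS: log2 [ p(next | past) / p(next) ]
        profile[t, x] = log2((joint[(p, n)] / past[p]) / (nxt[n] / N))
    return profile
```

High positive values in the returned profile would mark blinker-like (stored) dynamics, in line with the conjectures the framework was built to test.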
Towards a Synergy-based Approach to Measuring Information Modification
Joseph T. Lizier, Benjamin Flecker, Paul L. Williams
Physics, 2013, DOI: 10.1109/ALIFE.2013.6602430
Abstract: Distributed computation in artificial life and complex systems is often described in terms of component operations on information: information storage, transfer and modification. Information modification remains poorly described, however, with the popularly-understood examples of glider and particle collisions in cellular automata having been quantitatively identified to date only via a heuristic (separable information) rather than a proper information-theoretic measure. We outline how a recently-introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information. Using this framework, we propose a new measure of information modification that captures the intuitive understanding of information modification events as those involving interactions between two or more information sources. We also consider how the local dynamics of information modification in space and time could be measured, and suggest a new axiom that redundancy measures would need to meet in order to make such local measurements. Finally, we evaluate the potential for existing redundancy measures to meet this localizability axiom.
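The partial information decomposition referred to here (due to Williams and Beer) splits the joint mutual information two sources hold about a target into redundant, unique and synergistic parts,

$$ I(S_1, S_2; T) = R(S_1, S_2; T) + U(S_1; T) + U(S_2; T) + S(S_1, S_2; T), $$

and the proposed modification measure builds on the synergistic component, matching the intuition that modification events involve genuine interaction between two or more sources.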