oalib
Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Quantum Algorithmic Entropy  [PDF]
Peter Gacs
Mathematics, 2000, DOI: 10.1088/0305-4470/34/35/312
Abstract: We extend algorithmic information theory to quantum mechanics, taking a universal semicomputable density matrix ("universal probability") as a starting point, and define complexity (an operator) as its negative logarithm. A number of properties of Kolmogorov complexity extend naturally to the new domain. Approximately, a quantum state is simple if it is within a small distance from a low-dimensional subspace of low Kolmogorov complexity. The von Neumann entropy of a computable density matrix is within an additive constant from the average complexity. Some of the theory of randomness translates to the new domain. We explore the relations of the new quantity to the quantum Kolmogorov complexity defined by Vitanyi (we show that the latter is sometimes as large as $2n - 2\log n$) and to the qubit complexity defined by Berthiaume, van Dam and Laplante. The "cloning" properties of our complexity measure are similar to those of qubit complexity.
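In the notation we adopt here (the operator symbols are ours, not necessarily the paper's), the abstract's starting point and its entropy bound can be written compactly as

$$ \underline{H} = -\log \hat{\mu}, \qquad \big| S(\rho) - \operatorname{Tr}(\rho\,\underline{H}) \big| = O(1) \ \text{for computable } \rho, $$

where $\hat{\mu}$ is the universal semicomputable density matrix, $\underline{H}$ the complexity operator, and $S$ the von Neumann entropy.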
Entropy Measures vs. Algorithmic Information  [PDF]
Andreia Teixeira, Andre Souto, Armando Matos, Luis Antunes
Mathematics, 2009
Abstract: Algorithmic entropy and Shannon entropy are two conceptually different information measures, as the former is based on the size of programs and the latter on probability distributions. However, it is known that, for any recursive probability distribution, the expected value of algorithmic entropy equals its Shannon entropy, up to a constant that depends only on the distribution. We study whether a similar relationship holds for R\'{e}nyi and Tsallis entropies of order $\alpha$, showing that it only holds for R\'{e}nyi and Tsallis entropies of order 1 (i.e., for Shannon entropy). Regarding a time-bounded analogue of this relationship, we show that, for distributions whose cumulative probability distribution is computable in time $t(n)$, the expected value of time-bounded algorithmic entropy (where the allotted time is $nt(n)\log(nt(n))$) is in the same range as the unbounded version. So, for these distributions, Shannon entropy captures the notion of computationally accessible information. We prove that, for the universal time-bounded distribution $\mathbf{m}^t(x)$, Tsallis and R\'{e}nyi entropies converge if and only if $\alpha$ is greater than 1.
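For reference, the classical relationship being generalized and the order-$\alpha$ entropies under study are (standard definitions; the constant $c_P$ depends only on the recursive distribution $P$):

$$ \Big| \sum_x P(x)\,K(x) - H(P) \Big| \le c_P, \qquad H_\alpha(P) = \frac{1}{1-\alpha}\log\sum_x P(x)^\alpha, \qquad S_\alpha(P) = \frac{1}{\alpha-1}\Big(1-\sum_x P(x)^\alpha\Big), $$

where $K$ is (prefix) Kolmogorov complexity and both the R\'{e}nyi entropy $H_\alpha$ and the Tsallis entropy $S_\alpha$ recover Shannon entropy in the limit $\alpha \to 1$.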
Algorithmic entropy, thermodynamics, and game interpretation  [PDF]
Lev Sakhnovich
Mathematics, 2011
Abstract: Basic relations for the mean length and algorithmic entropy are obtained by solving a new extremal problem, which yields them in a simple and general way. The length and entropy are treated as the two players of a new type of game, in which we follow the scheme of our previous work on thermodynamic characteristics in quantum and classical approaches.
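The abstract does not state its extremal problem explicitly; the classical prototype for this kind of length-entropy trade-off, given here only as background rather than as the paper's actual problem, is the Kraft-constrained minimization

$$ \min_{\ell_1, \ell_2, \dots} \sum_i p_i \ell_i \quad \text{subject to} \quad \sum_i 2^{-\ell_i} \le 1, $$

whose solution $\ell_i^* = -\log_2 p_i$ makes the minimal mean length equal to the Shannon entropy $H(p)$.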
Quantum Dynamical Entropies and Gács Algorithmic Entropy  [PDF]
Fabio Benatti
Entropy, 2012, DOI: 10.3390/e14071259
Abstract: Several quantum dynamical entropies have been proposed that extend the classical Kolmogorov–Sinai (dynamical) entropy. The same scenario appears in relation to the extension of algorithmic complexity theory to the quantum realm. A theorem of Brudno establishes that the complexity per unit time step along typical trajectories of a classical ergodic system equals the KS-entropy. In the following, we establish a similar relation between the Connes–Narnhofer–Thirring quantum dynamical entropy for the shift on quantum spin chains and the Gács algorithmic entropy. We further provide, for the same system, a weaker linkage between the latter algorithmic complexity and a different quantum dynamical entropy proposed by Alicki and Fannes.
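Brudno's theorem, which the paper extends to the quantum shift, can be stated (in standard notation) as: for an ergodic system $(X, T, \mu)$ and $\mu$-almost every trajectory,

$$ \lim_{n\to\infty} \frac{K(x_1 x_2 \cdots x_n)}{n} = h_{KS}(T, \mu), $$

where $x_1 x_2 \cdots x_n$ is the symbolic encoding of the first $n$ time steps and $K$ is Kolmogorov complexity.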
The Manneville map: topological, metric and algorithmic entropy  [PDF]
Claudio Bonanno
Mathematics, 2001
Abstract: We study the Manneville map $f(x) = x + x^z \pmod 1$, with $z > 1$, from a computational point of view, analysing the behaviour of its Algorithmic Information Content. In particular, we consider a family of piecewise linear maps that provides examples of algorithmic behaviour ranging from the fully to the mildly chaotic, and show that the Manneville map is a member of this family.
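A quick numerical illustration of the z-dependence: iterate the map and use compression as a crude stand-in for the Algorithmic Information Content of a symbolic orbit. The binary partition at 1/2 and the zlib proxy are our simplifying assumptions, not the paper's method.

import zlib

def manneville_orbit(x0, z, n):
    """Iterate f(x) = x + x**z (mod 1) and record a binary symbolic orbit."""
    x, symbols = x0, []
    for _ in range(n):
        x = (x + x ** z) % 1.0
        symbols.append('1' if x > 0.5 else '0')  # partition at 1/2 (our choice)
    return ''.join(symbols)

def compressed_size(s):
    """zlib output length: a rough upper bound on the information content."""
    return len(zlib.compress(s.encode()))

for z in (1.2, 2.0, 3.0):
    print(z, compressed_size(manneville_orbit(0.3, z, 10_000)))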
Small Data Matters, Algorithmic Data Analytics and Correlation versus Causation  [PDF]
Hector Zenil
Computer Science, 2013
Abstract: This is a review of aspects of the theory of algorithmic information that may contribute to a framework for formulating questions related to complex, highly unpredictable systems. We start by contrasting Shannon entropy and Kolmogorov-Chaitin complexity, which epitomize the difference between correlation and causation, and then move on to survey classical results from algorithmic complexity and algorithmic probability, highlighting their deep connection to the study of automata frequency distributions. We end by showing how long-range algorithmic prediction models for economic and biological systems may require infinite computation, while locally approximated short-range estimations remain possible, so that small data can deliver important insights into key features of complex "Big Data".
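A toy experiment makes the correlation-versus-causation contrast concrete: a periodic string and a random string can have identical symbol statistics (hence identical Shannon entropy) yet very different compressibility, compression serving here as a rough, assumption-laden stand-in for Kolmogorov-Chaitin complexity.

import math, random, zlib

def shannon_entropy(s):
    """Empirical entropy (bits/symbol) of the character distribution."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * math.log2(p) for p in probs)

def compressed_ratio(s):
    """Compressed bytes per symbol: a crude algorithmic-complexity proxy."""
    return len(zlib.compress(s.encode())) / len(s)

periodic = '01' * 5000        # generated by a tiny program, maximal-looking statistics
rng = random.Random(0)
noise = ''.join(rng.choice('01') for _ in range(10_000))

for name, s in (('periodic', periodic), ('random', noise)):
    print(name, round(shannon_entropy(s), 3), round(compressed_ratio(s), 4))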
Algorithmic Thermodynamics  [PDF]
John C. Baez, Mike Stay
Mathematics, 2010
Abstract: Algorithmic entropy can be seen as a special case of entropy as studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodynamics to the subject of algorithmic information theory. In particular, suppose we fix a universal prefix-free Turing machine and let X be the set of programs that halt for this machine. Then we can regard X as a set of 'microstates', and treat any function on X as an 'observable'. For any collection of observables, we can study the Gibbs ensemble that maximizes entropy subject to constraints on the expected values of these observables. We illustrate this by taking the log runtime, length, and output of a program as observables analogous to the energy E, volume V and number of molecules N in a container of gas. The conjugate variables of these observables allow us to define quantities which we call the 'algorithmic temperature' T, 'algorithmic pressure' P and 'algorithmic potential' mu, since they are analogous to the temperature, pressure and chemical potential. We derive an analogue of the fundamental thermodynamic relation dE = T dS - P dV + mu dN, and use it to study thermodynamic cycles analogous to those for heat engines. We also investigate the values of T, P and mu for which the partition function converges. At some points on the boundary of this domain of convergence, the partition function becomes uncomputable. Indeed, at these points the partition function itself has nontrivial algorithmic entropy.
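Under the analogy in the abstract (E = log runtime, V = length, N = output), the Gibbs ensemble takes the familiar grand-canonical form; the sign conventions below are the standard thermodynamic ones and are our assumption about the paper's notation:

$$ Z(T, P, \mu) = \sum_{p \in X} \exp\!\left( -\frac{E(p) + P\,V(p) - \mu\,N(p)}{T} \right), $$

with each halting program $p$ weighted by $e^{-(E(p) + P V(p) - \mu N(p))/T} / Z$; the convergence and computability questions in the abstract concern this sum over all of X.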
Optimal Algorithmic Cooling of Spins  [PDF]
Yuval Elias, José M. Fernandez, Tal Mor, Yossi Weinstein
Physics, 2007, DOI: 10.1007/978-3-540-73554-0
Abstract: Algorithmic Cooling (AC) of Spins is potentially the first near-future application of quantum computing devices. Straightforward quantum algorithms combined with novel entropy manipulations can result in a method to improve the identification of molecules. We introduce here several new exhaustive cooling algorithms, such as the Tribonacci and k-bonacci algorithms. In particular, we present the "all-bonacci" algorithm, which appears to reach the maximal degree of cooling obtainable by the optimal AC approach.
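The elementary step behind such algorithms is the 3-spin compression, whose bias boost on the target spin equals that of a majority vote over three equally biased spins, a standard AC identity. A minimal sketch evaluating that closed form (not code from the paper):

def basic_compression_bias(eps):
    """Target-spin bias after one reversible 3-spin compression step.
    Matches the majority-vote bias of three spins of bias eps:
    eps' = (3*eps - eps**3) / 2, i.e. a ~1.5x gain for small eps."""
    return (3 * eps - eps ** 3) / 2

for eps in (0.01, 0.05, 0.1, 0.3):
    boosted = basic_compression_bias(eps)
    print(f"eps={eps}: eps'={boosted:.5f}  gain={boosted / eps:.3f}")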
Algorithmic Complexity in Minority Game  [PDF]
Ricardo Mansilla Corona
Physics, 1999
Abstract: In this paper we introduce a new approach to the study of the complex behaviour of the Minority Game using the tools of algorithmic complexity, physical entropy and information theory. We show that the physical complexity and the mutual information function depend strongly on the memory size of the agents and yield more information about the complex features of the stream of binary outcomes of the game than the volatility itself.
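A minimal textbook Minority Game reproduces the memory-size dependence the abstract refers to; the parameters and the plain volatility measure sigma^2/N below are illustrative choices, not the paper's setup.

import random
import statistics

def minority_game(n_agents=301, memory=3, n_strategies=2, rounds=2000, seed=0):
    """Minimal Minority Game; returns the volatility sigma^2 / N."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # Each strategy maps every possible history (an int < n_hist) to +/-1.
    strategies = [[[rng.choice((-1, 1)) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)
    attendance = []
    for _ in range(rounds):
        actions = []
        for a in range(n_agents):           # each agent plays its best strategy
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            actions.append(strategies[a][best][history])
        A = sum(actions)                    # aggregate action (odd N, so A != 0)
        attendance.append(A)
        winner = -1 if A > 0 else 1         # the minority side wins
        for a in range(n_agents):           # reward strategies that predicted it
            for s in range(n_strategies):
                if strategies[a][s][history] == winner:
                    scores[a][s] += 1
        history = ((history << 1) | (1 if winner == 1 else 0)) % n_hist
    return statistics.pvariance(attendance) / n_agents

for m in (2, 4, 6, 8):
    print(m, round(minority_game(memory=m), 2))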
The Algorithmic Information Content for randomly perturbed systems  [PDF]
Claudio Bonanno
Mathematics, 2003
Abstract: In this paper we prove estimates on the behaviour of the Kolmogorov-Sinai entropy relative to a partition for randomly perturbed dynamical systems. Our estimates use the entropy of the unperturbed system and are obtained using the notion of Algorithmic Information Content. The main result extends known results to the study of time series obtained by the observation of real systems.
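For reference, the quantity being estimated is the standard Kolmogorov-Sinai entropy of a map $T$ relative to a finite measurable partition $\alpha$:

$$ h_\mu(T, \alpha) = \lim_{n\to\infty} \frac{1}{n}\, H_\mu\!\left( \bigvee_{k=0}^{n-1} T^{-k}\alpha \right), $$

where $H_\mu$ is the Shannon entropy of a partition and $\bigvee$ denotes the common refinement; the randomly perturbed case replaces the deterministic orbit with a noisy one, which is where the AIC-based estimates enter.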