
Search Results: 1 - 10 of 34681 matches for "Thomas Schürmann"
All listed articles are free for downloading (OA Articles)
On the measurement probability of quantum phases
Thomas Schürmann
Physics, 2006.
Abstract: We consider the probability with which quantum phase measurements of a given precision can be performed successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.
Bias Analysis in Entropy Estimation
Thomas Schürmann
Physics, 2004, DOI: 10.1088/0305-4470/37/27/L02
Abstract: We consider the problem of finite-sample corrections for entropy estimation. New estimates of the Shannon entropy are proposed and their systematic error (the bias) is computed analytically. We find that our results cover correction formulas of current entropy estimates recently discussed in the literature. The trade-off between bias reduction and the increase of the corresponding statistical error is analyzed.
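The finite-sample bias the abstract refers to can be illustrated with the classical Miller-Madow correction, which adds (K-1)/(2N) to the plug-in estimate to cancel the leading bias term. This is a minimal sketch for illustration only, not the estimators derived in the paper:

```python
import math
from collections import Counter

def entropy_naive(samples):
    """Maximum-likelihood (plug-in) Shannon entropy in nats."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def entropy_miller_madow(samples):
    """Miller-Madow correction: add (K - 1) / (2N) for K observed symbols,
    which cancels the leading 1/N term of the plug-in estimator's bias."""
    n = len(samples)
    k = len(set(samples))
    return entropy_naive(samples) + (k - 1) / (2 * n)
```

The plug-in estimator systematically underestimates entropy on small samples; the correction reduces this bias at the cost of a larger statistical error, which is exactly the trade-off the abstract analyzes.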
A single particle uncertainty relation
Thomas Schürmann
Physics, 2003.
Abstract: We consider successive measurements of position and momentum of a single particle. Let P be the conditional probability to measure the momentum k with precision dk, given a previously successful position measurement q with precision dq. Several upper bounds for the probability P are determined. For arbitrary but given precisions dq, dk, these bounds refer to the variation of q, k and the state vector of the particle. A weak bound is given by the inequality P <= dkdq/h, where h is Planck's quantum of action. It is non-trivial for all measurements with dkdq < h. A sharper bound is obtained by applying the Hilbert-Schmidt norm. As our main result, the least upper bound of P is determined. All bounds are independent of the order in which position and momentum are measured.
A note on entropic uncertainty relations of position and momentum
Thomas Schürmann
Physics, 2010, DOI: 10.1007/s10946-012-9258-y
Abstract: We consider two entropic uncertainty relations of position and momentum recently discussed in the literature. By a suitable rescaling of one of them, we obtain a smooth interpolation of both for high-resolution and low-resolution measurements, respectively. Because our interpolation has not been mentioned in the literature before, we propose it as a candidate for an improved entropic uncertainty relation of position and momentum. Up to now, the author has been able neither to falsify nor to prove the new inequality. In our opinion it is a challenge to do either one.
Scaling behaviour of entropy estimates
Thomas Schürmann
Mathematics, 2002, DOI: 10.1088/0305-4470/35/7/308
Abstract: Entropy estimation of information sources is highly nontrivial for symbol sequences with strong long-range correlations. The rabbit sequence, related to the symbolic dynamics of the nonlinear circle map at the critical point, as well as the logistic map at the Feigenbaum point, have been argued to exhibit long memory tails. For both dynamical systems, the scaling behavior of the block entropy of order n has been shown to increase like log(n). In contrast to probabilistic concepts, we investigate the scaling behavior of certain non-probabilistic entropy estimation schemes suggested by Lempel and Ziv in the context of algorithmic complexity and data compression. These are applied in a sequential manner, with the scaling variable being the length N of the sequence. We determine the scaling law for the Lempel-Ziv entropy estimate applied to the critical circle map and the logistic map at the Feigenbaum point in a binary partition.
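As an illustration of a sequential, non-probabilistic scheme of the Lempel-Ziv type, the following sketch parses a binary sequence into LZ78-style phrases and turns the phrase count into an entropy-rate estimate in bits per symbol. The rabbit-word generator and the textbook estimator c(N) log2(N) / N are standard forms, not necessarily the exact scheme analyzed in the paper:

```python
import math

def rabbit_sequence(n):
    """First n symbols of the rabbit (Fibonacci) word: 1 -> 10, 0 -> 1."""
    s = "1"
    while len(s) < n:
        s = "".join("10" if ch == "1" else "1" for ch in s)
    return s[:n]

def lz78_phrases(s):
    """Number of phrases in an LZ78-style incremental parse of s."""
    seen = set()
    w = ""
    count = 0
    for ch in s:
        w += ch
        if w not in seen:     # phrase is new: record it and start a new one
            seen.add(w)
            count += 1
            w = ""
    if w:                     # unfinished phrase at the end of the string
        count += 1
    return count

def lz_entropy_rate(s):
    """Sequential entropy estimate c(N) * log2(N) / N in bits per symbol."""
    n = len(s)
    return lz78_phrases(s) * math.log2(n) / n
```

For a random binary source the estimate converges to the true entropy rate as N grows; for low-complexity sequences such as the rabbit word the scaling of the phrase count with N carries the signature of the long-range correlations.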
A note on entropy estimation
Thomas Schürmann
Statistics, 2015, DOI: 10.1162/NECO_a_00775
Abstract: We compare an entropy estimator $H_z$ recently discussed in [10] with two estimators $H_1$ and $H_2$ introduced in [6][7]. We prove the identity $H_z \equiv H_1$, which has not been taken into account in [10]. Then, we prove that the statistical bias of $H_1$ is less than the bias of the ordinary likelihood estimator of entropy. Finally, by numerical simulation we verify that for the most interesting regime of small sample estimation and large event spaces, the estimator $H_2$ has a significantly smaller statistical error than $H_z$.
A note on the best invariant estimation of continuous probability distributions under mean square loss
Thomas Schürmann
Statistics, 2015.
Abstract: We consider the nonparametric estimation problem of continuous probability distribution functions. For the integrated mean square error we provide the statistic corresponding to the best invariant estimator proposed by Aggarwal (1955) and Ferguson (1967). The table of critical values is computed, and a numerical power comparison of the statistic with the traditional Cram\'{e}r-von Mises statistic is performed for several representative distributions.
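For reference, the classical Cramér-von Mises statistic against a fully specified continuous CDF has the closed form W² = 1/(12n) + Σᵢ (F(x₍ᵢ₎) − (2i−1)/(2n))². A minimal sketch of that baseline statistic (the paper's best-invariant statistic differs from it):

```python
def cramer_von_mises(samples, cdf):
    """Cramer-von Mises statistic W^2 for a fully specified continuous CDF.

    W^2 = 1/(12n) + sum_i (F(x_(i)) - (2i - 1)/(2n))^2
    over the order statistics x_(1) <= ... <= x_(n).
    """
    n = len(samples)
    u = sorted(cdf(x) for x in samples)          # probability transforms
    return 1.0 / (12 * n) + sum(
        (u[i] - (2 * i + 1) / (2 * n)) ** 2 for i in range(n)
    )
```

The statistic attains its minimum 1/(12n) when the probability transforms of the order statistics fall exactly on the grid (2i-1)/(2n), i.e. when the sample is perfectly compatible with the hypothesized distribution.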
The entropy of ``strange'' billiards inside n-simplexes
Thomas Schürmann, Ingo Hoffmann
Physics, 1995.
Abstract: In the present work we investigate a new type of billiard defined inside $n$--simplex regions. We determine an invariant ergodic (SRB) measure of the dynamics for any dimension. Using symbolic dynamics, the (KS or metric) entropy is computed, and we find that the system is chaotic for all $n>2$.
A closer look at the uncertainty relation of position and momentum
Thomas Schürmann, Ingo Hoffmann
Physics, 2008, DOI: 10.1007/s10701-009-9310-0
Abstract: We consider particles prepared by the von Neumann-L\"uders projection. For those particles the standard deviation of the momentum is discussed. We show that infinite standard deviations are not exceptions but rather typical. A necessary and sufficient condition for finite standard deviations is given. Finally, a new uncertainty relation is derived and it is shown that the latter cannot be improved.
Enlarged scaling ranges for the KS-entropy and the information dimension
Holger Kantz, Thomas Schürmann
Physics, 2002, DOI: 10.1063/1.166161
Abstract: Numerical estimates of the Kolmogorov-Sinai entropy based on a finite amount of data decay towards zero in the relevant limits. Rewriting differences of block entropies as averages over decay rates, and ignoring all parts of the sample where these rates are uncomputable because of the lack of neighbours, yields improved entropy estimates. In the same way, the scaling range for estimates of the information dimension can be extended considerably. The improvement is demonstrated for experimental data.
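The block-entropy differences mentioned here can be sketched as follows: estimate H_n from overlapping length-n blocks and form the decay rates h_n = H_{n+1} − H_n, which approach the KS entropy until undersampling sets in. This is a naive plug-in sketch, without the neighbour-based rewriting that gives the paper its improved estimates:

```python
import math
from collections import Counter

def block_entropy(seq, n):
    """Plug-in Shannon entropy (nats) of overlapping length-n blocks."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    total = len(blocks)
    counts = Counter(blocks)
    return -sum((c / total) * math.log(c / total) for c in counts.values())

def entropy_rate_estimates(seq, n_max):
    """Decay rates h_n = H_{n+1} - H_n for n = 1 .. n_max-1; these approach
    the KS entropy but are biased low once length-(n+1) blocks become
    undersampled, which is the decay toward zero the abstract describes."""
    return [block_entropy(seq, n + 1) - block_entropy(seq, n)
            for n in range(1, n_max)]
```

For a finite sample the h_n eventually decay toward zero regardless of the true entropy, because long blocks are seen at most once; restricting the averages to parts of the sample with enough neighbours is what extends the usable scaling range.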

Copyright © 2008-2017 Open Access Library. All rights reserved.