Abstract:
We consider the probability with which quantum phase measurements of a given precision can be carried out successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.

Abstract:
We consider the problem of finite-sample corrections for entropy estimation. New estimates of the Shannon entropy are proposed and their systematic error (the bias) is computed analytically. We find that our results cover correction formulas for current entropy estimates recently discussed in the literature. The trade-off between bias reduction and the increase of the corresponding statistical error is analyzed.
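The flavor of such finite-sample corrections can be illustrated with the classical Miller-Madow correction, which adds (M-1)/(2N) to the naive plug-in estimate, where M is the number of observed symbols and N the sample size. This is a standard textbook correction, shown here only as a minimal sketch; it is not necessarily one of the estimators proposed in the abstract above:

```python
from collections import Counter
import math

def entropy_estimates(samples):
    """Plug-in (maximum-likelihood) entropy estimate and its
    Miller-Madow bias-corrected version, in nats."""
    n = len(samples)
    counts = Counter(samples)
    # Naive plug-in estimate: -sum p log p with empirical frequencies.
    h_naive = -sum((c / n) * math.log(c / n) for c in counts.values())
    # Miller-Madow correction: add (M - 1) / (2N), M = observed symbols.
    h_mm = h_naive + (len(counts) - 1) / (2 * n)
    return h_naive, h_mm
```

The plug-in estimate systematically underestimates the true entropy for small N; the additive term compensates for the leading order of this bias.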

Abstract:
We consider successive measurements of position and momentum of a single particle. Let P be the conditional probability to measure the momentum k with precision dk, given a previously successful position measurement of q with precision dq. Several upper bounds for the probability P are determined. For arbitrary but given precisions dq and dk, these bounds refer to the variation of q, k, and the state vector of the particle. A weak bound is given by the inequality P <= dk dq / h, where h is Planck's quantum of action. It is non-trivial for all measurements with dk dq < h. A sharper bound is obtained by applying the Hilbert-Schmidt norm. As our main result, the least upper bound of P is determined. All bounds are independent of the order in which position and momentum are measured.

Abstract:
We consider two entropic uncertainty relations for position and momentum recently discussed in the literature. By a suitable rescaling of one of them, we obtain a smooth interpolation between the two for high-resolution and low-resolution measurements, respectively. Because this interpolation has not been mentioned in the literature before, we propose it as a candidate for an improved entropic uncertainty relation for position and momentum. Up to now, the author has been able neither to prove nor to falsify the new inequality. In our opinion it is a challenge to do either.

Abstract:
Entropy estimation of information sources is highly nontrivial for symbol sequences with strong long-range correlations. The rabbit sequence, related to the symbolic dynamics of the nonlinear circle map at the critical point, as well as the logistic map at the Feigenbaum point, have been argued to exhibit long memory tails. For both dynamical systems the block entropy of order n has been shown to increase like log(n). In contrast to these probabilistic concepts, we investigate the scaling behavior of certain non-probabilistic entropy estimation schemes suggested by Lempel and Ziv in the context of algorithmic complexity and data compression. These are applied in a sequential manner, with the scaling variable being the length N of the sequence. We determine the scaling law for the Lempel-Ziv entropy estimate applied to the critical circle map and to the logistic map at the Feigenbaum point in a binary partition.
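A minimal sketch of the kind of sequential Lempel-Ziv scheme meant here, using the standard LZ78 incremental parsing and the estimate c log2(c)/N, where c is the number of distinct phrases and N the sequence length; the schemes analyzed in the abstract above may differ in detail:

```python
import math

def lz78_entropy_rate(s):
    """Estimate the entropy rate (bits/symbol) of a symbol string via
    LZ78 incremental parsing: split s into phrases, each being the
    shortest prefix of the remaining text not yet in the dictionary,
    then return c * log2(c) / N."""
    phrases = set()
    current = ""
    c = 0  # number of distinct phrases in the parsing
    for symbol in s:
        current += symbol
        if current not in phrases:
            phrases.add(current)
            c += 1
            current = ""
    if current:  # count an incomplete final phrase
        c += 1
    return c * math.log2(c) / len(s)
```

For stationary ergodic sources this estimate converges to the entropy rate as N grows; applied sequentially to binary symbolic sequences such as the rabbit sequence, its decay with N reveals the scaling law discussed above.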

Abstract:
We compare an entropy estimator $H_z$ recently discussed in [10] with two estimators, $H_1$ and $H_2$, introduced in [6] and [7]. We prove the identity $H_z \equiv H_1$, which has not been taken into account in [10]. Then we prove that the statistical bias of $H_1$ is less than that of the ordinary likelihood estimator of entropy. Finally, by numerical simulation we verify that in the most interesting regime of small samples and large event spaces, the estimator $H_2$ has a significantly smaller statistical error than $H_z$.

Abstract:
We consider the problem of nonparametric estimation of continuous probability distribution functions. For the integrated mean square error we provide the statistic corresponding to the best invariant estimator proposed by Aggarwal (1955) and Ferguson (1967). A table of critical values is computed, and a numerical power comparison of the statistic with the traditional Cram\'{e}r-von Mises statistic is performed for several representative distributions.
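For orientation, the traditional Cramér-von Mises statistic used as the baseline in such power comparisons has the well-known computational form W^2 = 1/(12n) + sum_i (F(x_(i)) - (2i-1)/(2n))^2. The sketch below implements only this classical baseline, not the best-invariant statistic of the abstract above:

```python
def cramer_von_mises(samples, cdf):
    """Classical Cramér-von Mises statistic W^2 for testing whether
    `samples` are drawn from the continuous distribution with CDF `cdf`."""
    n = len(samples)
    x = sorted(samples)
    w2 = 1.0 / (12 * n)
    for i, xi in enumerate(x, start=1):
        # Compare the hypothesized CDF at the i-th order statistic
        # with the midpoint (2i - 1) / (2n) of the empirical CDF step.
        w2 += (cdf(xi) - (2 * i - 1) / (2 * n)) ** 2
    return w2
```

For example, testing uniformity on [0, 1] amounts to calling `cramer_von_mises(samples, lambda u: u)`; large values of W^2 relative to the tabulated critical values indicate a poor fit.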

Abstract:
In the present work we investigate a new type of billiard defined inside $n$-simplex regions. We determine an invariant ergodic (SRB) measure of the dynamics for any dimension. Using symbolic dynamics, the (KS or metric) entropy is computed, and we find that the system is chaotic for all cases $n>2$.

Abstract:
We consider particles prepared by the von Neumann-L\"uders projection. For such particles the standard deviation of the momentum is discussed. We show that infinite standard deviations are not exceptional but rather typical. A necessary and sufficient condition for a finite standard deviation is given. Finally, a new uncertainty relation is derived, and it is shown that it cannot be improved.

Abstract:
Numerical estimates of the Kolmogorov-Sinai entropy based on a finite amount of data decay towards zero in the relevant limits. Rewriting differences of block entropies as averages over decay rates, and ignoring all parts of the sample where these rates are uncomputable because of the lack of neighbours, yields improved entropy estimates. In the same way, the scaling range for estimates of the information dimension can be extended considerably. The improvement is demonstrated for experimental data.
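The naive block-entropy differences h_n = H_{n+1} - H_n whose finite-sample decay is addressed above can be sketched as follows; this is only the plug-in baseline, not the improved estimator of the abstract:

```python
from collections import Counter
import math

def block_entropy(seq, n):
    """Shannon entropy H_n (in nats) of the empirical distribution of
    length-n blocks (n-grams) of a symbol sequence."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    total = len(blocks)
    counts = Counter(blocks)
    return -sum((c / total) * math.log(c / total) for c in counts.values())

def entropy_rate_estimates(seq, n_max):
    """Naive KS-entropy estimates h_n = H_{n+1} - H_n for n = 1..n_max-1.
    For a finite sample these decay toward zero once length-n blocks
    become undersampled, which is the artifact discussed above."""
    return [block_entropy(seq, n + 1) - block_entropy(seq, n)
            for n in range(1, n_max)]
```

For a strictly periodic sequence the differences h_n are close to zero already at small n, whereas for finite samples of a chaotic source they spuriously decay toward zero only because long blocks are seen at most once.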