Abstract:
The experimental problem of converting a measured binomial quantity, the fraction of events in a sample that pass a cut, into a physical binomial quantity, the fraction of events originating from a signal source, is described as a system of linear equations. This linear system illustrates several familiar aspects of experimental data analysis. Bayesian probability theory is used to find a solution to this binomial measurement problem that allows for the straightforward construction of confidence intervals. This solution is also shown to provide an unbiased formalism for evaluating the behavior of data sets under different choices of cuts, including a cut designed to increase the significance of a possible, albeit previously unseen, signal. Several examples are used to illustrate the features of this method, including the discovery of the top quark and searches for new particles produced in association with $W^{\pm}$ bosons. It is also demonstrated how to use this method to make projections for the potential discovery of a Standard Model Higgs boson at a Tevatron Run 2 experiment, as well as the utility of measuring the integrated luminosity through inclusive $p\bar{p} \to W^{\pm}$ production.
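The core task described here, converting an observed pass count into an interval on an underlying binomial fraction, can be illustrated with a flat-prior Bayesian posterior. This is only a minimal sketch of the credible-interval step, not the paper's full linear-system construction; the function name and the grid-based quadrature are my own choices.

```python
import math

def binomial_credible_interval(k, n, cl=0.68, npts=20001):
    """Central credible interval for a binomial fraction eps, given k passes
    out of n trials, with a flat prior: p(eps | k, n) is a Beta(k+1, n-k+1)
    density. Evaluated on a uniform grid of midpoints in (0, 1)."""
    xs = [(i + 0.5) / npts for i in range(npts)]
    # Work with log-weights to avoid underflow for large n.
    logw = [k * math.log(x) + (n - k) * math.log(1.0 - x) for x in xs]
    m = max(logw)
    w = [math.exp(l - m) for l in logw]
    total = sum(w)
    cdf, acc = [], 0.0
    for wi in w:
        acc += wi
        cdf.append(acc / total)
    lo_q, hi_q = (1.0 - cl) / 2.0, 1.0 - (1.0 - cl) / 2.0
    lo = next(x for x, c in zip(xs, cdf) if c >= lo_q)
    hi = next(x for x, c in zip(xs, cdf) if c >= hi_q)
    return lo, hi

# Example: 7 of 20 events pass the cut.
lo, hi = binomial_credible_interval(k=7, n=20)
```

The posterior mean here is (k+1)/(n+2) rather than k/n, a standard feature of the flat prior; other priors change the interval at small n.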

Abstract:
An interpretation of the nonclassical total probability formula arising in some quantum experiments is provided, based on stochastic models described by a sequence of random vectors that change in the course of the measurement procedure.

Abstract:
We consider the probability with which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.

Abstract:
Fermilab operates the world's most intense source of antiprotons. Recently various experiments have been proposed that can use those antiprotons either parasitically during Tevatron Collider running or after the Tevatron Collider finishes in about 2010. We discuss the physics goals and prospects of the proposed experiments.

Abstract:
We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes, grading into logarithmic at large magnitudes, leading to observations that often follow Student's probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution. A gamma distribution has a power-law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way in which to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information, and probability.
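The claimed two-regime shape of the gamma distribution can be checked numerically: its log-density has slope k - 1 on a log-x scale at small magnitudes (power law) and slope -1/theta in x at large magnitudes (exponential tail). A small sketch, with parameter values chosen arbitrarily for illustration:

```python
import math

def gamma_logpdf(x, k=2.5, theta=1.0):
    """Unnormalized log-density of a gamma distribution:
    log f(x) = (k - 1) log x - x / theta + const."""
    return (k - 1) * math.log(x) - x / theta

k, theta = 2.5, 1.0

# Small-magnitude regime: slope of log f versus log x approaches k - 1.
x1, x2 = 1e-6, 2e-6
powerlaw_slope = (gamma_logpdf(x2, k, theta) - gamma_logpdf(x1, k, theta)) \
    / (math.log(x2) - math.log(x1))

# Large-magnitude regime: slope of log f versus x approaches -1/theta.
x3, x4 = 50.0, 51.0
exp_slope = (gamma_logpdf(x4, k, theta) - gamma_logpdf(x3, k, theta)) / (x4 - x3)
```

The two finite differences converge to k - 1 = 1.5 and -1/theta = -1 as the evaluation points move further into their respective regimes.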

Abstract:
Different criteria (Shannon's entropy, Bayes' average cost, Dürr's normalized rms spread) have been introduced to measure the "which-way" information present in interference experiments where, due to the non-orthogonality of the detector states, the path determination is incomplete. For each of these criteria, we determine the optimal measurement to be carried out on the detectors in order to read out the maximum which-way information. We show that, while in two-beam experiments the optimal measurement is always provided by an observable involving the detector only, in multibeam experiments with equally populated beams and two-state detectors this is the case only for the Dürr criterion, as the other two require the introduction of an ancillary quantum system as part of the read-out apparatus.

Abstract:
One more derivation of the quantum probability rule is presented in order to shed more light on the versatile aspects of this fundamental law. It is shown that the change of state in minimal quantum non-demolition measurement, also known as ideal measurement, implies the probability law in a simple way. Namely, the very requirement of minimal change of state, put in proper mathematical form, gives the well-known Lüders formula, which contains the probability rule.
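The Lüders rule invoked here, the selective state change rho -> P rho P / tr(P rho) for a projector P, is easy to exercise numerically. A minimal single-qubit sketch (the matrix helpers are hand-rolled to keep it dependency-free; the example states are my own choice):

```python
def matmul(a, b):
    """Product of two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(a):
    return sum(a[i][i] for i in range(len(a)))

def lueders_state(rho, p):
    """Selective Lüders rule: rho -> P rho P / tr(P rho).
    Returns the post-measurement state and the outcome probability tr(P rho)."""
    num = matmul(matmul(p, rho), p)
    prob = trace(matmul(p, rho))
    return [[x / prob for x in row] for row in num], prob

# Qubit example: rho = |+><+| (equal superposition), event P = |0><0|.
rho = [[0.5, 0.5], [0.5, 0.5]]
P = [[1.0, 0.0], [0.0, 0.0]]
post, prob = lueders_state(rho, P)
```

As expected for this atomic (rank-1) event, the outcome occurs with probability 1/2 and the post-measurement state is the projector P itself.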

Abstract:
We propose a method to estimate the probability of new physics discovery in future high energy physics experiments. Physics simulation gives both the average numbers $\langle N_b \rangle$ of background and $\langle N_s \rangle$ of signal events. We find that the proper definition of the significance is $S_{12} = \sqrt{\langle N_s \rangle + \langle N_b \rangle} - \sqrt{\langle N_b \rangle}$, in comparison with the often used significances $S_1 = \displaystyle \frac{\langle N_s \rangle}{\sqrt{\langle N_b \rangle}}$ and $S_2 = \displaystyle \frac{\langle N_s \rangle}{\sqrt{\langle N_s \rangle + \langle N_b \rangle}}$. We propose a method for taking into account systematic uncertainties related to inexact knowledge of background and signal cross sections. Accounting for such systematics is essential in the search for supersymmetry at the LHC. We also propose a method for estimating exclusion limits on new physics in future experiments, and we estimate the probability of new physics discovery in future experiments taking systematic errors into account.
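Reading the expected signal and background counts as ⟨N_s⟩ and ⟨N_b⟩, the three significance definitions compared in this abstract can be sketched side by side (the variable names `ns` and `nb` are my own):

```python
import math

def significances(ns, nb):
    """Three common signal-significance estimates for expected signal
    count ns and expected background count nb (nb > 0 assumed):
      S1  = ns / sqrt(nb)
      S2  = ns / sqrt(ns + nb)
      S12 = sqrt(ns + nb) - sqrt(nb)
    """
    s1 = ns / math.sqrt(nb)
    s2 = ns / math.sqrt(ns + nb)
    s12 = math.sqrt(ns + nb) - math.sqrt(nb)
    return s1, s2, s12

# Example: 10 expected signal events over 100 expected background events.
s1, s2, s12 = significances(ns=10.0, nb=100.0)
```

Rationalizing the square-root difference shows $S_{12} = N_s / (\sqrt{N_s + N_b} + \sqrt{N_b})$, i.e. $S_{12} = S_1 S_2 / (S_1 + S_2)$, so for any positive background $S_{12} < S_2 < S_1$: the proposed definition is the most conservative of the three.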

Abstract:
Ideal occurrence of an event (projector) $E$ leads to the known change of a state (density operator) $\rho$ into $E\rho E/\mathrm{tr}(E\rho)$ (the Lüders state). It is shown that two events $E$ and $F$ give the same Lüders state if and only if the equivalence relation $E\rho = F\rho$ is valid. This relation determines equivalence classes. The set of them, and each class, are studied in detail. It is proved that the range projector of the Lüders state can be evaluated as $E - E \wedge Q^{0}$, where $\wedge$ denotes the greatest lower bound and $Q^{0}$ is the null projector of $\rho$. A state-dependent implication is introduced that extends the absolute implication (which, in turn, determines the entire structure of quantum logic); the two are investigated in a closely related way, to mutual benefit. Inherent in the state-dependent preorder is a state-dependent equivalence, defining equivalence classes in a given Boolean subalgebra. The quotient set, in which the classes are the elements, has itself a partially ordered structure, and so has each class. In a complete Boolean subalgebra, both structures are complete lattices. Physical meanings are discussed.

1. Introduction

The basic object of this study is the concept of a quantum-mechanical state (density operator) $\rho$ and its change when an event (projector) $E$ with positive probability occurs (the result is obtained) in ideal measurement. The terms "state" and "density operator," as well as "event" and "projector," will be used interchangeably. As is well known, ideal measurement is the simplest special case of measurement of the first kind (synonyms: predictive, repeatable, and nondemolition measurement). It causes a change of state according to the Lüders selective (or definite-result) formula
$$\rho \mapsto \rho' = \frac{E\rho E}{\mathrm{tr}(E\rho)}. \tag{1.1}$$
See [1–3]. We call $\rho'$ the Lüders state, and $E$ a state-determining projector (all of this is meant with respect to the initially given state $\rho$). Usually one treats the special case of a pure state $|\psi\rangle$. Then, as is easily seen, (1.1) takes the simple form
$$|\psi\rangle \mapsto \frac{E|\psi\rangle}{\|E|\psi\rangle\|},$$
which is sometimes called the von Neumann–Lüders projection (it is actually a normalized projection).
Von Neumann treated the even more special case when the event $E$ is elementary (an atom) [4]. The change of state then is $\rho \mapsto E$. The Lüders state was postulated by Lüders. It was derived by several authors, including the present one [5, 6] (and the first derivation was repeated in a different context in [7]; see also references in these articles). For a different approach, see references [8–10]. In Khrennikov's terminology, one deals with the postulates of Lüders and von Neumann (and he carefully examines their effect on some foundational issues). As was mentioned, in my view, ideal measurement is the simplest kind of measurement, and of

Abstract:
An interesting link between two very different physical aspects of quantum mechanics is revealed: the absence of third-order interference and Tsirelson's bound for nonlocal correlations. Considering multiple-slit experiments - not only the traditional configuration with two slits, but also configurations with three and more slits - Sorkin found that third-order (and higher-order) interference is not possible in quantum mechanics. The EPR experiments show that quantum mechanics involves nonlocal correlations, which are demonstrated by a violation of the Bell or CHSH inequality but are still limited by a bound discovered by Tsirelson. It now turns out that Tsirelson's bound holds in a broad class of probabilistic theories provided that they rule out third-order interference. A major characteristic of this class is the existence of a reasonable calculus of conditional probability or, phrased more physically, of a reasonable model for the quantum measurement process.
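Tsirelson's bound of $2\sqrt{2}$ mentioned here can be reproduced from the singlet-state correlation $E(a,b) = -\cos(a-b)$ evaluated at the standard optimal CHSH angles. A quick numerical sketch:

```python
import math

def corr(a, b):
    """Quantum correlation E(a, b) = -cos(a - b) for spin measurements
    along directions at angles a and b on the two-qubit singlet state."""
    return -math.cos(a - b)

# Standard optimal CHSH measurement angles: (a, a') for one side,
# (b, b') for the other.
a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, -math.pi / 4

# CHSH combination S = |E(a,b) + E(a,b') + E(a',b) - E(a',b')|.
S = abs(corr(a, b) + corr(a, bp) + corr(ap, b) - corr(ap, bp))
```

The result saturates Tsirelson's bound, $S = 2\sqrt{2} \approx 2.828$, exceeding the local-hidden-variable limit of 2 but falling short of the algebraic maximum of 4 that more general no-signaling theories would allow.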