
Search Results: 1 - 10 of 189853 matches for " G. Morvai "
All listed articles are free for downloading (OA Articles)
Page 1 /189853
Queueing for ergodic arrivals and services
L. Gyorfi,G. Morvai
Mathematics , 2007,
Abstract: In this paper we revisit the results of Loynes (1962) on the stability of queues with ergodic arrivals and services, and give examples in which the arrivals are bounded and ergodic, the service rate is constant, and, under stability, the limit distribution has a heavier-than-exponential tail.
Prediction for discrete time series
G. Morvai,B. Weiss
Mathematics , 2007,
Abstract: Let $\{X_n\}$ be a stationary and ergodic time series taking values from a finite or countably infinite set ${\cal X}$, and assume that the distribution of the process is otherwise unknown. We propose a sequence of stopping times $\lambda_n$ along which we can estimate the conditional probability $P(X_{\lambda_n+1}=x|X_0,...,X_{\lambda_n})$ from the data segment $(X_0,...,X_{\lambda_n})$ in a pointwise consistent way, for a restricted class of stationary and ergodic finite or countably infinite alphabet time series which includes, among others, all stationary and ergodic finitarily Markovian processes. If the process turns out to be finitarily Markovian (this class includes, among others, all stationary and ergodic Markov chains), then $\lim_{n\to \infty} {n\over \lambda_n}>0$ almost surely. If the process turns out to have finite entropy rate, then $\lambda_n$ is upper bounded by a polynomial, eventually almost surely.
Intermittent estimation of stationary time series
G. Morvai,B. Weiss
Mathematics , 2007,
Abstract: Let $\{X_n\}_{n=0}^{\infty}$ be a stationary real-valued time series with unknown distribution. Our goal is to estimate the conditional expectation of $X_{n+1}$ based on the observations $X_i$, $0\le i\le n$, in a strongly consistent way. Bailey and Ryabko proved that this is not possible, even for ergodic binary time series, if one estimates at all values of $n$. We propose a very simple algorithm which makes predictions infinitely often, at carefully selected stopping times chosen by our rule. We show that under certain conditions our procedure is strongly (pointwise) consistent, and $L_2$ consistent without any condition. An upper bound on the growth of the stopping times is also presented.
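The "predict only at selected stopping times" idea can be illustrated with a toy sketch. This is not the paper's algorithm (which treats real-valued series; the sketch below assumes a finite-valued one), and the context-length schedule, the recurrence threshold, and the function name are all illustrative assumptions:

```python
import math

def intermittent_predict(xs, min_occurrences=2):
    """Predict the next value only at selected times.

    Hedged sketch of intermittent estimation for a finite-valued series:
    at time n, look at the current context of length k (k grows slowly
    with n) and emit an estimate only when that context has recurred at
    least `min_occurrences` times earlier, averaging the values that
    followed those recurrences.  The schedule k = max(1, floor(log2 n))
    and the threshold are illustrative, not the paper's stopping rule.
    """
    predictions = []  # list of (time n, estimate of X_n given X_0..X_{n-1})
    for n in range(1, len(xs)):
        k = max(1, int(math.log2(n)))
        if n < k + 1:
            continue
        ctx = tuple(xs[n - k:n])
        followers = [xs[i + k] for i in range(n - k)
                     if tuple(xs[i:i + k]) == ctx]
        if len(followers) >= min_occurrences:
            predictions.append((n, sum(followers) / len(followers)))
    return predictions
```

On a well-behaved sample the selected times keep occurring, so predictions are made infinitely often even though no estimate is produced at every $n$.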
Order estimation of Markov chains
G. Morvai,B. Weiss
Mathematics , 2007,
Abstract: We describe estimators $\chi_n(X_0,X_1,...,X_n)$ which, when applied to an unknown stationary process taking values from a countable alphabet ${\cal X}$, converge almost surely to $k$ if the process is a $k$-th order Markov chain, and to infinity otherwise.
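The estimator $\chi_n$ itself is not spelled out in the abstract. As a rough illustration of consistent Markov order estimation, here is a BIC-style penalized-likelihood sketch; the function name, the penalty choice, and the `max_order` cap are all assumptions (a cap the paper's estimator does not require):

```python
import math
from collections import Counter

def estimate_markov_order(xs, max_order=5):
    """Pick the order k maximizing a BIC-style penalized log-likelihood.

    Simplified sketch, not the paper's estimator chi_n: penalized
    maximum likelihood is one standard route to consistent Markov
    order estimation over a finite alphabet.
    """
    n = len(xs)
    alphabet = sorted(set(xs))
    best_k, best_score = 0, -math.inf
    for k in range(max_order + 1):
        counts = Counter()       # (context, next symbol) pair counts
        ctx_counts = Counter()   # counts of contexts of length k
        for i in range(k, n):
            ctx = tuple(xs[i - k:i])
            counts[(ctx, xs[i])] += 1
            ctx_counts[ctx] += 1
        loglik = sum(c * math.log(c / ctx_counts[ctx])
                     for (ctx, _), c in counts.items())
        # BIC penalty: roughly |A|^k * (|A| - 1) free parameters
        penalty = 0.5 * math.log(n) * (len(alphabet) ** k) * (len(alphabet) - 1)
        score = loglik - penalty
        if score > best_score:
            best_k, best_score = k, score
    return best_k
```

For a period-2 sequence (a deterministic first-order chain) the sketch picks order 1, since higher orders gain no likelihood but pay a larger penalty.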
On Sequential Estimation and Prediction for Discrete Time Series
G. Morvai,B. Weiss
Mathematics , 2008,
Abstract: The problem of extracting as much information as possible from a sequence of observations of a stationary stochastic process $X_0,X_1,...,X_n$ has been considered by many authors from different points of view. It has long been known, through the work of D. Bailey, that no universal estimator of $\textbf{P}(X_{n+1}|X_0,X_1,...,X_n)$ exists which converges to the true conditional probability almost surely. Despite this result, universal estimators can be found for restricted classes of processes, or for sequences of estimators evaluated along stopping times. We present here a survey of some of the recent work that has been done along these lines.
A simple randomized algorithm for sequential prediction of ergodic time series
L. Györfi,G. Lugosi,G. Morvai
Mathematics , 2008,
Abstract: We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor. The desirable finite-sample properties of the predictor are illustrated by its performance for Markov processes. In such cases the predictor exhibits near optimal behavior even without knowing the order of the Markov process. Prediction with side information is also considered.
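A minimal sketch in the spirit of this abstract: "experts" of Markov order $0,\dots,$`max_order` each predict via smoothed empirical frequencies of their context, the expert predictions are mixed by exponential weighting of past losses, and the announced bit is drawn at random from the mixture. The learning-rate schedule, the Laplace smoothing, and all names below are illustrative assumptions, not the exact rule analysed in the paper:

```python
import math
import random

def predict_sequence(bits, max_order=3, seed=0):
    """Count mistakes of a randomized on-line predictor for a 0/1 sequence.

    Experts of order 0..max_order predict P(next bit = 1) from their
    context's empirical counts; an exponentially weighted mixture of
    the experts gives p1, and the guess is randomized with that
    probability.  Illustrative sketch only.
    """
    rng = random.Random(seed)
    orders = range(max_order + 1)
    losses = {k: 0.0 for k in orders}
    counts = {k: {} for k in orders}     # context -> [count of 0, count of 1]
    mistakes = 0
    for t, bit in enumerate(bits):
        eta = math.sqrt(8.0 * math.log(max_order + 1) / (t + 1))
        ctxs = {k: tuple(bits[max(0, t - k):t]) for k in orders}
        preds = {}
        for k in orders:
            c0, c1 = counts[k].get(ctxs[k], [0, 0])
            preds[k] = (c1 + 1) / (c0 + c1 + 2)      # smoothed P(next bit = 1)
        weights = {k: math.exp(-eta * losses[k]) for k in orders}
        total = sum(weights.values())
        p1 = sum(weights[k] * preds[k] for k in orders) / total
        guess = 1 if rng.random() < p1 else 0        # randomized announcement
        mistakes += int(guess != bit)
        for k in orders:                             # update the experts
            c = counts[k].setdefault(ctxs[k], [0, 0])
            c[bit] += 1
            losses[k] += abs(preds[k] - bit)
    return mistakes
```

On a sample from a low-order Markov source, the weights concentrate on the well-matched experts without the order having to be known in advance, mirroring the near-optimal behavior the abstract describes.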
Nonparametric inference for ergodic, stationary time series
G. Morvai,S. Yakowitz,L. Gyorfi
Mathematics , 2007,
Abstract: The setting is a stationary, ergodic time series. The challenge is to construct a sequence of functions, each based on only finite segments of the past, which together provide a strongly consistent estimator for the conditional probability of the next observation, given the infinite past. Ornstein gave such a construction for the case that the values are from a finite set, and recently Algoet extended the scheme to time series with coordinates in a Polish space. The present study relates a different solution to the challenge. The algorithm is simple and its verification is fairly transparent. Some extensions to regression, pattern recognition, and on-line forecasting are mentioned.
Limits to consistent on-line forecasting for ergodic time series
L. Gyorfi,G. Morvai,S. Yakowitz
Mathematics , 2007,
Abstract: This study concerns problems of time-series forecasting under the weakest of assumptions. Related results are surveyed and are points of departure for the developments here, some of which are new and others are new derivations of previous findings. The contributions in this study are all negative, showing that various plausible prediction problems are unsolvable, or in other cases, are not solvable by predictors which are known to be consistent when mixing conditions hold.
Weakly Convergent Nonparametric Forecasting of Stationary Time Series
G. Morvai,S. Yakowitz,P. Algoet
Mathematics , 2008,
Abstract: The conditional distribution of the next outcome given the infinite past of a stationary process can be inferred from finite but growing segments of the past. Several schemes are known for constructing pointwise consistent estimates, but they all demand prohibitive amounts of input data. In this paper we consider real-valued time series and construct conditional distribution estimates that make much more efficient use of the input data. The estimates are consistent in a weak sense, and the question whether they are pointwise consistent is still open. For finite-alphabet processes one may rely on a universal data compression scheme like the Lempel-Ziv algorithm to construct conditional probability mass function estimates that are consistent in expected information divergence. Consistency in this strong sense cannot be attained in a universal sense for all stationary processes with values in an infinite alphabet, but weak consistency can. Some applications of the estimates to on-line forecasting, regression and classification are discussed.
Strongly consistent nonparametric forecasting and regression for stationary ergodic sequences
S. Yakowitz,L. Gyorfi,J. Kieffer,G. Morvai
Mathematics , 2007,
Abstract: Let $\{(X_i,Y_i)\}$ be a stationary ergodic time series with $(X,Y)$ values in the product space $\mathbb{R}^d \times \mathbb{R}$. This study offers what is believed to be the first strongly consistent (with respect to pointwise, least-squares, and uniform distance) algorithm for inferring $m(x)=E[Y_0|X_0=x]$ under the presumption that $m(x)$ is uniformly Lipschitz continuous. Auto-regression, or forecasting, is an important special case, and as such our work extends the literature of nonparametric, nonlinear forecasting by circumventing customary mixing assumptions. The work is motivated by a time series model in stochastic finance and by perspectives of its contribution to the issues of universal time series estimation.
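The local-averaging idea behind such regression estimates can be sketched in one dimension: average the $Y_i$ whose $X_i$ fall in a window around $x$. This is a hedged toy version, not the paper's algorithm (whose consistency for ergodic inputs is far more delicate); the function name and the fixed window half-width `h` are illustrative:

```python
def window_regress(pairs, x, h):
    """Local-averaging estimate of m(x) = E[Y | X = x].

    Average the Y values of all samples whose X lies within h of x.
    If m is Lipschitz with constant L, the bias of this estimate is at
    most L * h; the window width is a free illustrative parameter.
    """
    ys = [y for xi, y in pairs if abs(xi - x) <= h]
    if not ys:
        raise ValueError("no samples in the window; widen h")
    return sum(ys) / len(ys)
```

With noiseless samples of a Lipschitz function such as $m(x) = 2x$, the window average lands within $L h$ of the true value.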


Copyright © 2008-2017 Open Access Library. All rights reserved.