Search Results: 1 - 10 of 100 matches for " "
 Statistics , 2014, Abstract: We propose a two-sample extended empirical likelihood for inference on the difference between two p-dimensional parameters defined by estimating equations. The standard two-sample empirical likelihood for the difference is Bartlett correctable but its domain is a bounded subset of the parameter space. We expand its domain through a composite similarity transformation to derive the two-sample extended empirical likelihood which is defined on the full parameter space. The extended empirical likelihood has the same asymptotic distribution as the standard one and can also achieve the second order accuracy of the Bartlett correction. We include two applications to illustrate the use of two-sample empirical likelihood methods and to demonstrate the superior coverage accuracy of the extended empirical likelihood confidence regions.
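Several of the entries in this listing rest on the same computational core: profiling the empirical likelihood for a parameter defined by an estimating equation. As a minimal illustrative sketch (not the implementation of any paper above), the scalar-mean case reduces to one-dimensional root finding for the Lagrange multiplier; the function name `el_log_ratio` and the bracketing strategy are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for a hypothesized mean mu.

    Weights are w_i = 1 / (n * (1 + lam * (x_i - mu))), where the
    multiplier lam solves sum((x_i - mu) / (1 + lam * (x_i - mu))) = 0.
    The statistic 2 * sum(log(1 + lam * (x_i - mu))) is asymptotically
    chi-square with 1 degree of freedom (Owen).
    """
    z = np.asarray(x, float) - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf  # mu outside the convex hull of the data: EL undefined
    # All weights must stay positive: 1 + lam * z_i > 0 for every i,
    # which brackets lam inside (-1/max(z), -1/min(z)).
    lo = -1.0 / z.max() + 1e-10
    hi = -1.0 / z.min() - 1e-10
    lam = brentq(lambda t: np.sum(z / (1.0 + t * z)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))
```

The statistic is zero at the sample mean and grows as `mu` moves away, which is what makes inverting it into a confidence region possible; the bounded-domain issue that the extended empirical likelihood above addresses shows up here as the `np.inf` branch outside the convex hull.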
 Mathematics , 2009, DOI: 10.1214/07-AOS585 Abstract: We consider an empirical likelihood inference for parameters defined by general estimating equations when some components of the random observations are subject to missingness. As the nature of the estimating equations is wide-ranging, we propose a nonparametric imputation of the missing values from a kernel estimator of the conditional distribution of the missing variable given the always observable variable. The empirical likelihood is used to construct a profile likelihood for the parameter of interest. We demonstrate that the proposed nonparametric imputation can remove the selection bias in the missingness and the empirical likelihood leads to more efficient parameter estimation. The proposed method is further evaluated by simulation and an empirical study on a genetic dataset on recombinant inbred mice.
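The nonparametric imputation described in this abstract can be sketched as follows. This is a deliberate simplification under stated assumptions (a Gaussian kernel, a fixed illustrative bandwidth `h`, and averaging of resampled draws), not the authors' procedure, and `kernel_impute` is a hypothetical name.

```python
import numpy as np

def kernel_impute(x, y, missing, h=0.1, n_draws=20, rng=None):
    """Fill in missing y_i by resampling from a kernel estimate of the
    conditional distribution of Y given X = x_i, built from complete
    cases only.  Bandwidth h and the number of draws are illustrative
    choices, not a tuned procedure.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y, float).copy()
    obs = ~missing
    x_obs, y_obs = x[obs], y[obs]
    for i in np.where(missing)[0]:
        # Gaussian kernel weights over the complete cases
        w = np.exp(-0.5 * ((x[i] - x_obs) / h) ** 2)
        w /= w.sum()
        # draw from the estimated conditional distribution and average
        y[i] = rng.choice(y_obs, size=n_draws, p=w).mean()
    return y
```

Because the imputed value is drawn from cases whose `x` is close to the incomplete case's `x`, the procedure does not assume any parametric form for the conditional distribution, which is what lets the estimating equations remain "wide-ranging" as the abstract puts it.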
Hiroaki Ogata, Advances in Decision Sciences, 2012, DOI: 10.1155/2012/704693 Abstract: An application of the empirical likelihood method to non-Gaussian locally stationary processes is presented. Based on the central limit theorem for locally stationary processes, we give the asymptotic distributions of the maximum empirical likelihood estimator and the empirical likelihood ratio statistic, respectively. It is shown that the empirical likelihood method enables us to make inferences on various important indices in time series analysis. Furthermore, we give a numerical study and investigate a finite-sample property. 1. Introduction Empirical likelihood is a nonparametric method for statistical inference proposed by Owen [1, 2]. It is used for constructing confidence regions for a mean, for a class of M-estimates that includes quantiles, and for differentiable statistical functionals. The empirical likelihood method has been applied to a wide range of problems because it combines the generality of nonparametric methods with the effectiveness of likelihood methods. Applications include general estimating equations [3], regression models [4–6], biased sampling models [7], and so forth. Applications have also been extended to dependent observations. Kitamura [8] developed the blockwise empirical likelihood for estimating equations and for smooth functions of means. Monti [9] applied the empirical likelihood method to linear processes, essentially under the circular Gaussian assumption, using a spectral method. For short- and long-range dependence, Nordman and Lahiri [10] gave the asymptotic properties of the frequency-domain empirical likelihood. Some applications to time series analysis can thus be found, but they are mainly for stationary processes.
Although stationarity is the most fundamental assumption in time series analysis, it is also known that real time series data are generally nonstationary (e.g., in economic analysis). We therefore need nonstationary models to describe the real world. Recently, Dahlhaus [11–13] proposed an important class of nonstationary processes, called locally stationary processes. They have so-called time-varying spectral densities whose spectral structures change smoothly in time. In this paper we extend the empirical likelihood method to non-Gaussian locally stationary processes with time-varying spectra. First, we derive the asymptotic normality of the maximum empirical likelihood estimator based on the central limit theorem for locally stationary processes, which is stated in Dahlhaus [13, Theorem
 Statistics , 2013, Abstract: Jing (1995) and Liu et al. (2008) studied the two-sample empirical likelihood and showed it is Bartlett correctable for the univariate and multivariate cases, respectively. We expand its domain to the full parameter space and obtain a two-sample extended empirical likelihood which is more accurate and can also achieve the second-order accuracy of the Bartlett correction.
 Min Tsao Mathematics , 2004, DOI: 10.1214/009053604000000337 Abstract: This paper studies the least upper bounds on coverage probabilities of the empirical likelihood ratio confidence regions based on estimating equations. The implications of the bounds on empirical likelihood inference are also discussed.
 Statistics , 2010, DOI: 10.1214/09-AOS750 Abstract: Empirical likelihood is a popular nonparametric or semi-parametric statistical method with many nice statistical properties. Yet when the sample size is small, or the dimension of the accompanying estimating function is high, the application of the empirical likelihood method can be hindered by low precision of the chi-square approximation and by nonexistence of solutions to the estimating equations. In this paper, we show that the adjusted empirical likelihood is effective at addressing both problems. With a specific level of adjustment, the adjusted empirical likelihood achieves the high-order precision of the Bartlett correction, in addition to the advantage of a guaranteed solution to the estimating equations. Simulation results indicate that the confidence regions constructed by the adjusted empirical likelihood have coverage probabilities comparable to or substantially more accurate than the original empirical likelihood enhanced by the Bartlett correction.
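The adjustment discussed in this abstract appends a pseudo-observation to the estimating-function values so that the multiplier equation always has a solution. A hedged sketch for the scalar-mean case follows; `a = max(1, log(n)/2)` is one commonly cited default level of adjustment, whereas the Bartlett-matching level the abstract refers to is a different, parameter-dependent choice.

```python
import numpy as np
from scipy.optimize import brentq

def ael_log_ratio(x, mu, a=None):
    """Adjusted empirical likelihood for a scalar mean: append the
    pseudo-observation -a * mean(g) to the estimating-function values
    g_i = x_i - mu, so that zero always lies inside their convex hull
    and a solution for the multiplier is guaranteed.
    """
    g = np.asarray(x, float) - mu
    n = len(g)
    if a is None:
        a = max(1.0, 0.5 * np.log(n))     # illustrative default level
    g = np.append(g, -a * g.mean())       # the adjustment
    # Solve sum(g_i / (1 + lam * g_i)) = 0 exactly as in ordinary EL;
    # positivity of the weights brackets lam in (-1/max(g), -1/min(g)).
    lo = -1.0 / g.max() + 1e-10
    hi = -1.0 / g.min() - 1e-10
    lam = brentq(lambda t: np.sum(g / (1.0 + t * g)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * g))
```

Where ordinary empirical likelihood is undefined outside the convex hull of the data, this adjusted version returns a finite (if large) statistic, which is the "guaranteed solution to the estimating equations" property the abstract emphasizes.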
 Mathematics , 2009, DOI: 10.1214/07-AOS555 Abstract: This article extends the scope of empirical likelihood methodology in three directions: to allow for plug-in estimates of nuisance parameters in estimating equations, slower than $\sqrt{n}$-rates of convergence, and settings in which there are a relatively large number of estimating equations compared to the sample size. Calibrating empirical likelihood confidence regions with plug-in is sometimes intractable due to the complexity of the asymptotics, so we introduce a bootstrap approximation that can be used in such situations. We provide a range of examples from survival analysis and nonparametric statistics to illustrate the main results.
 Statistics, 2014, DOI: 10.3150/14-BEJ636 Abstract: The empirical likelihood approach is a nonparametric statistical method used for hypothesis testing and for constructing confidence regions for unknown quantities of interest. It has been applied to independent, identically distributed random variables and to second-order stationary processes. In recent years, heavy-tailed data have been observed in many fields. To model such data suitably, we consider symmetric scalar and multivariate $\alpha$-stable linear processes generated by an infinite-variance innovation sequence. We use a Whittle-likelihood-type estimating function in the empirical likelihood ratio function and derive the asymptotic distribution of the empirical likelihood ratio statistic for $\alpha$-stable linear processes. With this empirical likelihood approach, the theory of estimation and testing for second-order stationary processes extends, though not straightforwardly, to heavy-tailed data analysis and applies to many problems in financial statistics.
 Statistics, 2009, DOI: 10.1214/08-AOS638 Abstract: We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of a mean vector based on observations with i.i.d. normal errors. We prove that under mild moment conditions on the unknown means, the average mean squared error (MSE) of the GMLEB is within an infinitesimal fraction of the minimum average MSE among all separable estimators which use a single deterministic estimating function on individual observations, provided that the risk is of greater order than $(\log n)^5/n$. We also prove that the GMLEB is uniformly approximately minimax in regular and weak $\ell_p$ balls when the order of the length-normalized norm of the unknown means is between $(\log n)^{\kappa_1}/n^{1/(p\wedge2)}$ and $n/(\log n)^{\kappa_2}$. Simulation experiments demonstrate that the GMLEB outperforms the James–Stein and several state-of-the-art threshold estimators in a wide range of settings without much downside.
 Open Journal of Statistics (OJS), 2012, DOI: 10.4236/ojs.2012.25070 Abstract: The authors propose a robust semi-parametric empirical likelihood method to integrate all available information from multiple samples with a common center of measurements. Two different sets of estimating equations are used to improve the classical likelihood inference on the measurement center. The proposed method does not require knowledge of the functional forms of the probability density functions of the related populations. The advantages of the proposed method are demonstrated through extensive simulation studies comparing the mean squared errors, coverage probabilities and average lengths of confidence intervals with those from the classical likelihood method. Simulation results suggest that our approach provides more informative and efficient inference than the conventional maximum likelihood estimator if a certain structural relationship exists among the parameters of the relevant samples.