Abstract:
We derive an extended empirical likelihood for parameters defined by estimating equations which generalizes the original empirical likelihood for such parameters to the full parameter space. Under mild conditions, the extended empirical likelihood has all the asymptotic properties of the original empirical likelihood, and its contours retain the data-driven shape of the latter. It can also attain second-order accuracy. The first-order extended empirical likelihood is easy to use, yet it is substantially more accurate than other empirical likelihoods, including second-order ones. We recommend it for practical applications of the empirical likelihood method.

Abstract:
Jing (1995) and Liu et al. (2008) studied the two-sample empirical likelihood and showed that it is Bartlett correctable in the univariate and multivariate cases, respectively. We expand its domain to the full parameter space and obtain a two-sample extended empirical likelihood which is more accurate and can also achieve the second-order accuracy of the Bartlett correction.

Abstract:
We consider empirical likelihood inference for parameters defined by general estimating equations when some components of the random observations are subject to missingness. Because the estimating equations are wide-ranging in nature, we propose a nonparametric imputation of the missing values from a kernel estimator of the conditional distribution of the missing variable given the always-observable variable. The empirical likelihood is used to construct a profile likelihood for the parameter of interest. We demonstrate that the proposed nonparametric imputation can remove the selection bias in the missingness and that the empirical likelihood leads to more efficient parameter estimation. The proposed method is further evaluated by simulation and by an empirical study on a genetic dataset from recombinant inbred mice.

Abstract:
An application of the empirical likelihood method to non-Gaussian locally stationary processes is presented. Based on the central limit theorem for locally stationary processes, we give the asymptotic distributions of the maximum empirical likelihood estimator and the empirical likelihood ratio statistic, respectively. It is shown that the empirical likelihood method enables us to make inferences on various important indices in time series analysis. Furthermore, we give a numerical study and investigate a finite-sample property.

1. Introduction
The empirical likelihood is a nonparametric method for statistical inference proposed by Owen [1, 2]. It is used for constructing confidence regions for a mean, for a class of M-estimates that includes quantiles, and for differentiable statistical functionals. The empirical likelihood method has been applied to various problems because of its good properties: the generality of a nonparametric method and the effectiveness of the likelihood method. For example, there are applications to general estimating equations [3], regression models [4–6], biased sample models [7], and so forth. Applications have also been extended to dependent observations. Kitamura [8] developed the blockwise empirical likelihood for estimating equations and for smooth functions of means. Monti [9] applied the empirical likelihood method to linear processes, essentially under the circular Gaussian assumption, using a spectral method. For short- and long-range dependence, Nordman and Lahiri [10] gave the asymptotic properties of the frequency domain empirical likelihood. As noted above, some applications to time series analysis can be found, but they have mainly been for stationary processes. Although stationarity is the most fundamental assumption in time series analysis, it is also known that real time series data are generally nonstationary (e.g., in economic analysis).
Therefore we need nonstationary models in order to describe the real world. Recently, Dahlhaus [11–13] proposed an important class of nonstationary processes, called locally stationary processes. They have so-called time-varying spectral densities whose spectral structures change smoothly in time. In this paper we extend the empirical likelihood method to non-Gaussian locally stationary processes with time-varying spectra. First, we derive the asymptotic normality of the maximum empirical likelihood estimator based on the central limit theorem for locally stationary processes, which is stated in Dahlhaus [13, Theorem
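The empirical likelihood construction that the abstracts above build on (Owen's nonparametric likelihood, here profiled for a scalar mean) can be sketched as follows. This is an illustrative implementation, not code from any of the cited papers; the function name and the Newton-with-step-halving solver for the Lagrange multiplier are our own choices.

```python
import numpy as np

def el_log_ratio(x, mu, tol=1e-10, max_iter=200):
    """-2 log R(mu): empirical likelihood ratio statistic for a scalar mean.

    Maximizes prod(n * p_i) subject to sum(p_i) = 1 and
    sum(p_i * (x_i - mu)) = 0. The optimal weights are
    p_i = 1 / (n * (1 + lam * (x_i - mu))), where the Lagrange multiplier
    lam solves sum((x_i - mu) / (1 + lam * (x_i - mu))) = 0.
    Under standard conditions the statistic is asymptotically chi-square
    with one degree of freedom at the true mean.
    """
    z = np.asarray(x, dtype=float) - mu
    if not (z.min() < 0.0 < z.max()):
        return float("inf")  # mu outside the convex hull of the data: R(mu) = 0
    lam = 0.0
    for _ in range(max_iter):
        denom = 1.0 + lam * z
        f = np.sum(z / denom)           # profile equation in lam
        if abs(f) < tol:
            break
        fp = -np.sum((z / denom) ** 2)  # derivative of f, strictly negative
        lam_new = lam - f / fp          # Newton step
        while np.any(1.0 + lam_new * z <= 0.0):
            lam_new = 0.5 * (lam + lam_new)  # halve the step to keep weights positive
        lam = lam_new
    return 2.0 * np.sum(np.log(1.0 + lam * z))
```

The statistic is zero at the sample mean, grows as the hypothesized mean moves away from it, and is infinite outside the convex hull of the data, which is precisely the domain limitation that the extended and adjusted empirical likelihoods in the other abstracts address.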

Abstract:
This paper studies the least upper bounds on coverage probabilities of the empirical likelihood ratio confidence regions based on estimating equations. The implications of the bounds on empirical likelihood inference are also discussed.

Abstract:
Empirical likelihood is a popular nonparametric or semiparametric statistical method with many nice statistical properties. Yet when the sample size is small, or the dimension of the accompanying estimating function is high, application of the empirical likelihood method can be hindered by the low precision of the chi-square approximation and by the nonexistence of solutions to the estimating equations. In this paper, we show that the adjusted empirical likelihood is effective at addressing both problems. With a specific level of adjustment, the adjusted empirical likelihood achieves the high-order precision of the Bartlett correction, in addition to the advantage of a guaranteed solution to the estimating equations. Simulation results indicate that the confidence regions constructed by the adjusted empirical likelihood have coverage probabilities comparable to, or substantially more accurate than, those of the original empirical likelihood enhanced by the Bartlett correction.
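The adjustment described above can be illustrated in the simplest setting of a scalar mean: a single pseudo-observation is appended so that zero always lies inside the convex hull of the estimating-function values, guaranteeing that the inner optimization has a solution. This is a hedged sketch under our own choices: the default level a = log(n)/2 is only illustrative, and the specific level that attains the Bartlett-correction accuracy mentioned in the abstract depends on the problem's Bartlett constant.

```python
import numpy as np

def ael_log_ratio(x, mu, a=None, tol=1e-10, max_iter=200):
    """Adjusted empirical likelihood ratio statistic for a scalar mean.

    With g_i = x_i - mu, appends one pseudo-observation
    g_{n+1} = -(a / n) * sum(g_i), so that zero lies strictly inside the
    convex hull of the augmented values and the Lagrange-multiplier
    equation always has a solution. The level a = log(n)/2 used here as
    a default is illustrative only.
    """
    z = np.asarray(x, dtype=float) - mu
    n = len(z)
    if a is None:
        a = 0.5 * np.log(n)
    z = np.append(z, -(a / n) * z.sum())  # augmentation guarantees a solution
    lam = 0.0
    for _ in range(max_iter):
        denom = 1.0 + lam * z
        f = np.sum(z / denom)           # profile equation in lam
        if abs(f) < tol:
            break
        fp = -np.sum((z / denom) ** 2)  # derivative of f, strictly negative
        lam_new = lam - f / fp          # Newton step
        while np.any(1.0 + lam_new * z <= 0.0):
            lam_new = 0.5 * (lam + lam_new)  # halve the step to keep weights positive
        lam = lam_new
    return 2.0 * np.sum(np.log(1.0 + lam * z))
```

Unlike the unadjusted statistic, this version remains finite even for hypothesized values outside the convex hull of the data, which is the "guaranteed solution" property the abstract emphasizes.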

Abstract:
This article extends the scope of empirical likelihood methodology in three directions: to allow for plug-in estimates of nuisance parameters in estimating equations, slower-than-$\sqrt{n}$ rates of convergence, and settings in which there are relatively many estimating equations compared to the sample size. Calibrating empirical likelihood confidence regions with plug-in estimates is sometimes intractable due to the complexity of the asymptotics, so we introduce a bootstrap approximation that can be used in such situations. We provide a range of examples from survival analysis and nonparametric statistics to illustrate the main results.

Abstract:
The empirical likelihood approach is a non-parametric statistical method applied to hypothesis testing and to the construction of confidence regions for pivotal unknown quantities. It has been applied to the case of independent identically distributed random variables and to second-order stationary processes. In recent years, heavy-tailed data have been observed in many fields. To model such data suitably, we consider symmetric scalar and multivariate $\alpha$-stable linear processes generated by an infinite-variance innovation sequence. We use a Whittle-likelihood-type estimating function in the empirical likelihood ratio function and derive the asymptotic distribution of the empirical likelihood ratio statistic for $\alpha$-stable linear processes. With the empirical likelihood approach, the theory of estimation and testing for second-order stationary processes, which does not extend straightforwardly, is nicely extended to heavy-tailed data analysis and is applicable to many financial statistical analyses.

Abstract:
A non-parametric method based on the empirical likelihood is proposed for detecting a change in the coefficients of a high-dimensional linear model where the number of model variables may increase as the sample size increases. This amounts to testing the null hypothesis of no change against the alternative of one change in the regression coefficients. Based on the theoretical asymptotic behaviour of the empirical likelihood ratio statistic, we propose, for a fixed design, a simpler test statistic that is easier to use in practice. The asymptotic normality of the proposed test statistic under the null hypothesis is proved, a result which differs from the $\chi^2$ law obtained for a model with a fixed number of variables. Under the alternative hypothesis, the test statistic diverges. We can then find the asymptotic confidence region for the difference between the parameters of the two phases. Monte Carlo simulations study the behaviour of the proposed test statistic.

Abstract:
The authors propose a robust semi-parametric empirical likelihood method to integrate all available information from multiple samples with a common center of measurements. Two different sets of estimating equations are used to improve the classical likelihood inference on the measurement center. The proposed method does not require knowledge of the functional forms of the probability density functions of the related populations. The advantages of the proposed method are demonstrated through extensive simulation studies by comparing the mean squared errors, coverage probabilities, and average lengths of confidence intervals with those from the classical likelihood method. Simulation results suggest that our approach provides more informative and efficient inference than the conventional maximum likelihood estimator if a certain structural relationship exists among the parameters of the relevant samples.