oalib
Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Page 1 /100
On Bayesian Nonparametric Continuous Time Series Models  [PDF]
George Karabatsos,Stephen G. Walker
Statistics , 2013,
Abstract: This paper is a note on the use of Bayesian nonparametric mixture models for continuous time series. We identify a key requirement for such models, and then establish that there is a single type of model which meets this requirement. As it turns out, the model is well known in multiple change-point problems.
Bayesian nonparametric models for ranked data  [PDF]
Francois Caron,Yee Whye Teh
Computer Science , 2012,
Abstract: We develop a Bayesian nonparametric extension of the popular Plackett-Luce choice model that can handle an infinite number of choice items. Our framework is based on the theory of random atomic measures, with the prior specified by a gamma process. We derive a posterior characterization and a simple and effective Gibbs sampler for posterior simulation. We develop a time-varying extension of our model, and apply it to the New York Times lists of weekly bestselling books.
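To make the finite Plackett-Luce choice model that this paper extends concrete, here is a minimal sampler: items are picked for successive ranks with probability proportional to their weights, without replacement. This is an illustrative sketch, not the paper's gamma-process construction; the item names and weights are invented for the example.

```python
import random

def sample_plackett_luce(weights, rng=random):
    """Draw one ranking from a finite Plackett-Luce model.

    Items are chosen for successive ranks with probability
    proportional to their positive weights, without replacement.
    """
    items = list(weights.keys())
    ranking = []
    while items:
        total = sum(weights[i] for i in items)
        r = rng.uniform(0, total)
        acc = 0.0
        for i in items:
            acc += weights[i]
            if r <= acc:
                ranking.append(i)
                items.remove(i)
                break
    return ranking

random.seed(0)
ranks = sample_plackett_luce({"A": 5.0, "B": 1.0, "C": 1.0})
print(ranks)
```

The nonparametric extension in the paper replaces the fixed, finite weight vector with a random atomic measure, so the set of choice items need not be bounded in advance.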
A Bayesian Splitotic Theory For Nonparametric Models  [PDF]
Zuofeng Shang,Guang Cheng
Statistics , 2015,
Abstract: We develop a set of scalable Bayesian inference procedures for a general class of nonparametric regression models based on embarrassingly parallel MCMC. Specifically, we first perform independent nonparametric Bayesian inference on each subset split from a massive dataset, and then aggregate those results into global counterparts. By partitioning the dataset carefully, we show that our aggregated inference results obtain the oracle rule in the sense that they are equivalent to those obtained directly from the massive data (which are computationally prohibitive in practice, though). For example, the aggregated credible sets achieve desirable credibility level and frequentist coverage possessed by the oracle counterparts (with similar radius). The oracle matching phenomenon occurs due to the nice geometric structures of the infinite-dimensional parameter space. A technical by-product is a new version of uniformly consistent test that applies to a general regression model under Sobolev norm.
Bayesian Nonparametrics in Topic Modeling: A Brief Tutorial  [PDF]
Alexander Spangher
Statistics , 2015,
Abstract: Nonparametric methods have been increasingly explored in Bayesian hierarchical modeling as a way to increase model flexibility. Although the field shows a lot of promise, inference in many models, including the Hierarchical Dirichlet Process (HDP), remains prohibitively slow. One promising path forward is to exploit the submodularity inherent in the Indian Buffet Process (IBP) to derive near-optimal solutions in polynomial time. In this work, I present a brief tutorial on Bayesian nonparametric methods, especially as they are applied to topic modeling. I show a comparison between different nonparametric models and the current state-of-the-art parametric model, Latent Dirichlet Allocation (LDA).
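The core nonparametric idea the tutorial builds on can be sketched with the Chinese restaurant process, where the number of clusters (topics, tables) is not fixed in advance, in contrast with a parametric model such as LDA. The concentration parameter and sample size below are illustrative, not taken from the paper.

```python
import random

def chinese_restaurant_process(n_customers, alpha, rng=random):
    """Sample a random partition from the Chinese restaurant process.

    Customer i joins an existing table with probability proportional
    to its occupancy, or opens a new table with probability
    proportional to alpha, so the number of clusters grows with the
    data rather than being fixed in advance.
    """
    tables = []       # occupancy count per table
    assignments = []  # table index per customer
    for i in range(n_customers):
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for t, count in enumerate(tables):
            acc += count
            if r < acc:
                tables[t] += 1
                assignments.append(t)
                break
        else:
            # open a new table
            tables.append(1)
            assignments.append(len(tables) - 1)
    return assignments

random.seed(3)
z = chinese_restaurant_process(50, alpha=1.0)
print(max(z) + 1)  # number of occupied tables
```

In a parametric topic model the analogue of `max(z) + 1` would be fixed ahead of time; here it is random and typically grows logarithmically with the number of customers.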
Bayesian Nonparametric Hidden Semi-Markov Models  [PDF]
Matthew J. Johnson,Alan S. Willsky
Statistics , 2012,
Abstract: There is much interest in the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) as a natural Bayesian nonparametric extension of the ubiquitous Hidden Markov Model for learning from sequential and time-series data. However, in many settings the HDP-HMM's strict Markovian constraints are undesirable, particularly if we wish to learn or encode non-geometric state durations. We can extend the HDP-HMM to capture such structure by drawing upon explicit-duration semi-Markovianity, which has been developed mainly in the parametric frequentist setting, to allow construction of highly interpretable models that admit natural prior information on state durations. In this paper we introduce the explicit-duration Hierarchical Dirichlet Process Hidden semi-Markov Model (HDP-HSMM) and develop sampling algorithms for efficient posterior inference. The methods we introduce also yield new approaches to sampling inference in the finite Bayesian HSMM. Our modular Gibbs sampling methods can be embedded in samplers for larger hierarchical Bayesian models, adding semi-Markov chain modeling as another tool in the Bayesian inference toolbox. We demonstrate the utility of the HDP-HSMM and our inference methods on both synthetic and real experiments.
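The explicit-duration idea is easy to see generatively: each visited state persists for a duration drawn from an arbitrary (here Poisson, hence non-geometric) distribution before the chain jumps. The sketch below illustrates only that generative structure; the transition matrix and duration means are invented, and this is not the paper's inference algorithm.

```python
import math
import random

def sample_poisson(lam, rng=random):
    """Knuth's Poisson sampler: multiply uniforms until below exp(-lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < threshold:
            return k
        k += 1

def sample_hsmm(n_steps, trans, dur_means, rng=random):
    """Sample a state sequence from an explicit-duration semi-Markov chain.

    Each visited state k persists for 1 + Poisson(dur_means[k]) steps,
    then the chain jumps according to the transition row trans[k]
    (self-transitions are assumed zero, as in the HSMM construction).
    """
    states = []
    state = 0
    while len(states) < n_steps:
        duration = 1 + sample_poisson(dur_means[state], rng)
        states.extend([state] * duration)
        # choose the next state from the transition distribution
        r, acc = rng.random(), 0.0
        for nxt, p in enumerate(trans[state]):
            acc += p
            if r < acc:
                state = nxt
                break
    return states[:n_steps]

random.seed(2)
seq = sample_hsmm(30, trans=[[0.0, 1.0], [1.0, 0.0]], dur_means=[3.0, 1.0])
print(seq)
```

Under a plain HMM, run lengths in `seq` would necessarily be geometric; the Poisson durations here produce the kind of duration structure the HDP-HSMM is designed to model.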
An adaptive truncation method for inference in Bayesian nonparametric models  [PDF]
Jim E. Griffin
Statistics , 2013,
Abstract: Many exact Markov chain Monte Carlo algorithms have been developed for posterior inference in Bayesian nonparametric models which involve infinite-dimensional priors. However, these methods are not generic and special methodology must be developed for different classes of prior or different models. Alternatively, the infinite-dimensional prior can be truncated and standard Markov chain Monte Carlo methods used for inference. However, the error in approximating the infinite-dimensional posterior can be hard to control for many models. This paper describes an adaptive truncation method which allows the level of the truncation to be decided by the algorithm and so can avoid large errors in approximating the posterior. A sequence of truncated priors is constructed which are sampled using Markov chain Monte Carlo methods embedded in a sequential Monte Carlo algorithm. Implementational details for infinite mixture models with stick-breaking priors and normalized random measures with independent increments priors are discussed. The methodology is illustrated on infinite mixture models, a semiparametric linear mixed model and a nonparametric time series model.
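The truncation being adapted can be illustrated with the simplest case: a stick-breaking construction of Dirichlet process weights cut off at a finite level, with the leftover stick mass absorbed into the last weight. The concentration parameter and truncation level below are illustrative; the paper's contribution is choosing the level adaptively within a sequential Monte Carlo scheme, which this sketch does not attempt.

```python
import random

def stick_breaking_weights(alpha, truncation, rng=random):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Draw v_k ~ Beta(1, alpha) for the first truncation - 1 sticks and
    set w_k = v_k * prod_{j<k} (1 - v_j); the final weight takes the
    remaining stick mass so the weights sum to one exactly.
    """
    weights = []
    remaining = 1.0
    for _ in range(truncation - 1):
        v = rng.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    weights.append(remaining)  # absorb the leftover mass
    return weights

random.seed(1)
w = stick_breaking_weights(alpha=2.0, truncation=20)
print(round(sum(w), 10))
```

The approximation error of a fixed truncation is governed by how much mass the discarded tail carries, which is exactly what an adaptive scheme can monitor and control.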
Streaming Variational Inference for Bayesian Nonparametric Mixture Models  [PDF]
Alex Tank,Nicholas J. Foti,Emily B. Fox
Statistics , 2014,
Abstract: In theory, Bayesian nonparametric (BNP) models are well suited to streaming data scenarios due to their ability to adapt model complexity with the observed data. Unfortunately, such benefits have not been fully realized in practice; existing inference algorithms are either not applicable to streaming applications or not extensible to BNP models. For the special case of Dirichlet processes, streaming inference has been considered. However, there is growing interest in more flexible BNP models building on the class of normalized random measures (NRMs). We work within this general framework and present a streaming variational inference algorithm for NRM mixture models. Our algorithm is based on assumed density filtering (ADF), leading straightforwardly to expectation propagation (EP) for large-scale batch inference as well. We demonstrate the efficacy of the algorithm on clustering documents in large, streaming text corpora.
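The streaming regime the paper targets, processing each observation once while carrying only the current posterior forward, can be sketched with a much simpler conjugate model than an NRM mixture: a one-pass update of a Normal mean with known observation variance. All parameter values are invented for the example, and this is not the paper's ADF algorithm.

```python
def streaming_normal_update(mu0, tau0, obs_var, stream):
    """One-pass posterior update of a Normal mean with known variance.

    Observations are processed one at a time; only the current
    posterior mean mu and variance tau are kept, never the raw data,
    which is the defining constraint of streaming inference.
    """
    mu, tau = mu0, tau0
    for x in stream:
        # precision-weighted combination of the running posterior
        # and the new observation's likelihood
        tau_new = 1.0 / (1.0 / tau + 1.0 / obs_var)
        mu = tau_new * (mu / tau + x / obs_var)
        tau = tau_new
    return mu, tau

mu, tau = streaming_normal_update(mu0=0.0, tau0=100.0, obs_var=1.0,
                                  stream=[5.0] * 100)
print(round(mu, 3))
```

In a conjugate model this one-pass update is exact; assumed density filtering generalizes the idea by projecting each intractable updated posterior back onto a tractable family.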
Nonparametric Bayesian methods for one-dimensional diffusion models  [PDF]
Harry van Zanten
Statistics , 2012,
Abstract: In this paper we review recently developed methods for nonparametric Bayesian inference for one-dimensional diffusion models. We discuss different possible prior distributions, computational issues, and asymptotic results.
Bayesian nonparametric estimation and consistency of mixed multinomial logit choice models  [PDF]
Pierpaolo De Blasi,Lancelot F. James,John W. Lau
Statistics , 2011, DOI: 10.3150/09-BEJ233
Abstract: This paper develops nonparametric estimation for discrete choice models based on the mixed multinomial logit (MMNL) model. It has been shown that MMNL models encompass all discrete choice models derived under the assumption of random utility maximization, subject to the identification of an unknown distribution $G$. Noting the mixture model description of the MMNL, we employ a Bayesian nonparametric approach, using nonparametric priors on the unknown mixing distribution $G$, to estimate choice probabilities. We provide an important theoretical support for the use of the proposed methodology by investigating consistency of the posterior distribution for a general nonparametric prior on the mixing distribution. Consistency is defined according to an $L_1$-type distance on the space of choice probabilities and is achieved by extending to a regression model framework a recent approach to strong consistency based on the summability of square roots of prior probabilities. Moving to estimation, slightly different techniques for non-panel and panel data models are discussed. For practical implementation, we describe efficient and relatively easy-to-use blocked Gibbs sampling procedures. These procedures are based on approximations of the random probability measure by classes of finite stick-breaking processes. A simulation study is also performed to investigate the performance of the proposed methods.
Nonparametric Bayesian models of hierarchical structure in complex networks  [PDF]
Mikkel N. Schmidt,Tue Herlau,Morten Mørup
Statistics , 2013,
Abstract: Analyzing and understanding the structure of complex relational data is important in many applications including analysis of the connectivity in the human brain. Such networks can have prominent patterns on different scales, calling for a hierarchically structured model. We propose two non-parametric Bayesian hierarchical network models based on Gibbs fragmentation tree priors, and demonstrate their ability to capture nested patterns in simulated networks. On real networks we demonstrate detection of hierarchical structure and show predictive performance on par with the state of the art. We envision that our methods can be employed in exploratory analysis of large scale complex networks for example to model human brain connectivity.
Copyright © 2008-2017 Open Access Library. All rights reserved.