oalib
Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Grouped Variable Selection via Nested Spike and Slab Priors  [PDF]
Tso-Jung Yen, Yu-Min Yen
Statistics, 2011
Abstract: In this paper we study grouped variable selection problems by proposing a specified prior, called the nested spike and slab prior, to model collective behavior of regression coefficients. At the group level, the nested spike and slab prior puts positive mass on the event that the l2-norm of the grouped coefficients is equal to zero. At the individual level, each coefficient is assumed to follow a spike and slab prior. We carry out maximum a posteriori estimation for the model by applying blockwise coordinate descent algorithms to solve an optimization problem involving an approximate objective modified by majorization-minimization techniques. Simulation studies show that the proposed estimator performs relatively well in situations in which the true and redundant covariates are both covered by the same group. Asymptotic analysis under a frequentist framework further shows that the l2 estimation error of the proposed estimator can have a better upper bound if the group that covers the true covariates does not cover too many redundant covariates. In addition, given that some regularity conditions hold, the proposed estimator is asymptotically invariant to group structures, and its model selection consistency can be established without imposing irrepresentable-type conditions.
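Read literally, the two-level construction described above can be sketched as follows; the symbols (group inclusion probability theta_g, per-coefficient indicators gamma_gj, slab variance tau^2) are illustrative and not necessarily the paper's notation:

    \pi(\beta_g) \;=\; (1-\theta_g)\,\delta_0\!\big(\lVert\beta_g\rVert_2\big)
      \;+\; \theta_g \prod_{j\in g}\Big[(1-\gamma_{gj})\,\delta_0(\beta_{gj})
      + \gamma_{gj}\,\mathcal{N}\big(\beta_{gj};\,0,\,\tau^2\big)\Big],

i.e. a point mass at zero for the entire coefficient block at the group level, and an ordinary spike-and-slab mixture for each coefficient inside an active group. MAP estimation under such a prior yields a non-smooth, blockwise objective, which is what motivates the coordinate descent plus majorization-minimization treatment mentioned in the abstract.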
Convergent Expectation Propagation in Linear Models with Spike-and-slab Priors  [PDF]
José Miguel Hernández-Lobato, Daniel Hernández-Lobato
Statistics, 2011
Abstract: Exact inference in the linear regression model with spike and slab priors is often intractable. Expectation propagation (EP) can be used for approximate inference. However, the regular sequential form of EP (R-EP) may fail to converge in this model when the size of the training set is very small. As an alternative, we propose a provably convergent EP algorithm (PC-EP). PC-EP is proved to minimize an energy function which, under some constraints, is bounded from below and whose stationary points coincide with the solution of R-EP. Experiments with synthetic data indicate that when R-EP does not converge, the approximation generated by PC-EP is often better. By contrast, when R-EP converges, both methods perform similarly.
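For context, the model class in question is the Bayesian linear model with independent spike-and-slab priors on the weights; in generic notation (not taken from the paper),

    y = Xw + \varepsilon,\qquad \varepsilon \sim \mathcal{N}(0,\sigma^2 I),\qquad
    p(w_j) = (1-\rho)\,\delta_0(w_j) + \rho\,\mathcal{N}(w_j;\,0,\,v).

The exact posterior over w mixes all 2^d support configurations, which is why inference is intractable and EP is used to fit a tractable approximation by iteratively refining per-factor site terms.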
Bayesian inference for spatio-temporal spike and slab priors  [PDF]
Michael Riis Andersen, Aki Vehtari, Ole Winther, Lars Kai Hansen
Statistics, 2015
Abstract: In this work we address the problem of solving a series of underdetermined linear inverse problems subject to a sparsity constraint. We generalize the spike and slab prior distribution to encode a priori correlation of the support of the solution in both space and time by imposing a transformed Gaussian process on the spike and slab probabilities. An expectation propagation (EP) algorithm for posterior inference under the proposed model is derived. For large scale problems, the standard EP algorithm can be prohibitively slow. We therefore introduce three different approximation schemes to reduce the computational complexity. Finally, we demonstrate the proposed model using numerical experiments based on both synthetic and real data sets.
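One way to picture the construction (again with illustrative notation, not the paper's): a latent Gaussian process, squashed through a probit-style link, supplies spatially and temporally smooth spike probabilities,

    \gamma \sim \mathcal{GP}(m, k),\qquad
    p(s_{it} = 1) = \Phi\big(\gamma(x_i, t)\big),\qquad
    p(w_{it} \mid s_{it}) = (1-s_{it})\,\delta_0(w_{it}) + s_{it}\,\mathcal{N}(w_{it};\,0,\,v),

so that nearby locations and adjacent time points tend to share the same support, while each active coefficient still receives a Gaussian slab.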
Spatio-temporal Spike and Slab Priors for Multiple Measurement Vector Problems  [PDF]
Michael Riis Andersen, Ole Winther, Lars Kai Hansen
Statistics, 2015
Abstract: We are interested in solving the multiple measurement vector (MMV) problem for instances where the underlying sparsity pattern exhibits spatio-temporal structure, motivated by the electroencephalogram (EEG) source localization problem. We propose a probabilistic model that takes this structure into account by generalizing the structured spike and slab prior and the associated Expectation Propagation inference scheme. Based on numerical experiments, we demonstrate the viability of the model and the approximate inference scheme.
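The MMV setting referred to above stacks the measurement vectors as columns,

    Y = AX + E,\qquad Y \in \mathbb{R}^{M\times T},\quad A \in \mathbb{R}^{M\times N},\quad X \in \mathbb{R}^{N\times T},

where the columns of X share (or slowly vary in) their sparsity pattern; in EEG source localization the rows of X index candidate sources and the columns index time samples.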
ICR: Iterative Convex Refinement for Sparse Signal Recovery Using Spike and Slab Priors  [PDF]
Hojjat S. Mousavi, Vishal Monga, Trac D. Tran
Mathematics, 2015, DOI: 10.1109/LSP.2015.2438255
Abstract: In this letter, we address sparse signal recovery using spike and slab priors. In particular, we focus on a Bayesian framework where sparsity is enforced on reconstruction coefficients via probabilistic priors. The optimization resulting from spike and slab prior maximization is known to be a hard non-convex problem, and existing solutions involve simplifying assumptions and/or relaxations. We propose an approach called Iterative Convex Refinement (ICR) that aims to solve the aforementioned optimization problem directly, allowing for greater generality in the sparse structure. Essentially, ICR solves a sequence of convex optimization problems such that the sequence of solutions converges to a sub-optimal solution of the original hard optimization problem. We propose two versions of our algorithm: (a) an unconstrained version, and (b) a version with a non-negativity constraint on the sparse coefficients, which may be required in some real-world problems. Experimental validation is performed on both synthetic data and a real-world image recovery problem, and illustrates the merits of ICR over state-of-the-art alternatives.
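The "sequence of convex problems" idea can be illustrated with a generic reweighted scheme in which each subproblem is a weighted l1-regularized least-squares problem solved by ISTA. This is only a sketch of the general strategy, not the ICR update derived in the paper; the function names, the weight rule lam/(|x|+eps), and all constants are illustrative.

    import numpy as np

    def soft_threshold(z, t):
        # Elementwise soft-thresholding: proximal operator of t * ||.||_1
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def solve_weighted_l1(A, y, w, n_iter=200):
        # ISTA for the convex subproblem  min_x 0.5*||y - A x||^2 + sum_i w_i |x_i|
        L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)
            x = soft_threshold(x - grad / L, w / L)
        return x

    def convex_refinement(A, y, lam=0.1, eps=1e-2, n_outer=10):
        # Outer loop: solve a sequence of convex (weighted l1) problems whose
        # weights depend on the previous solution, so each subproblem is convex
        # even though the overall sparse-MAP objective is not.
        x = np.zeros(A.shape[1])
        for _ in range(n_outer):
            w = lam / (np.abs(x) + eps)          # heavier penalty on small coefficients
            x = solve_weighted_l1(A, y, w)
        return x

    # Tiny synthetic check (illustrative only)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100); x_true[[3, 17, 42]] = [1.5, -2.0, 0.8]
    y = A @ x_true + 0.01 * rng.standard_normal(50)
    print(np.flatnonzero(np.abs(convex_refinement(A, y)) > 0.1))

Each inner solve is convex; the weights computed from the previous iterate are what carry the non-convexity of the original sparse-MAP objective across outer iterations.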
Multi-task Image Classification via Collaborative, Hierarchical Spike-and-Slab Priors  [PDF]
Hojjat Seyed Mousavi, Umamahesh Srinivas, Vishal Monga, Yuanming Suo, Minh Dao, Trac D. Tran
Computer Science, 2015
Abstract: Promising results have been achieved in image classification problems by exploiting the discriminative power of sparse representations for classification (SRC). Recently, it has been shown that the use of class-specific spike-and-slab priors in conjunction with the class-specific dictionaries from SRC is particularly effective in scenarios with limited training data. As a logical extension, we build on this framework for multi-task scenarios, wherein multiple representations of the same physical phenomena are available. We experimentally demonstrate the benefits of mining joint information from different camera views for multi-view face recognition.
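For readers unfamiliar with SRC, the underlying decision rule codes a test sample over each class-specific dictionary and assigns the class with the smallest reconstruction residual. The sketch below uses a plain l1 (ISTA) coder as a stand-in for the spike-and-slab machinery the paper develops; the function name and the parameters lam and n_iter are illustrative.

    import numpy as np

    def src_classify(x, dictionaries, lam=0.1, n_iter=200):
        # Sparse-representation classification: code x over each class-specific
        # dictionary D_c and pick the class with the smallest residual.
        residuals = []
        for D in dictionaries:
            L = np.linalg.norm(D, 2) ** 2          # step size for ISTA
            a = np.zeros(D.shape[1])
            for _ in range(n_iter):
                z = a - D.T @ (D @ a - x) / L      # gradient step on the data fit
                a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # l1 prox
            residuals.append(np.linalg.norm(x - D @ a))
        return int(np.argmin(residuals))

    # Usage (hypothetical): predicted = src_classify(test_sample, [D_class0, D_class1])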
Spike-and-Slab Priors for Function Selection in Structured Additive Regression Models  [PDF]
Fabian Scheipl, Ludwig Fahrmeir, Thomas Kneib
Statistics, 2011, DOI: 10.1080/01621459.2012.737742
Abstract: Structured additive regression provides a general framework for complex Gaussian and non-Gaussian regression models, with predictors comprising arbitrary combinations of nonlinear functions and surfaces, spatial effects, varying coefficients, random effects and further regression terms. The large flexibility of structured additive regression makes function selection a challenging and important task, aiming at (1) selecting the relevant covariates, (2) choosing an appropriate and parsimonious representation of the impact of covariates on the predictor and (3) determining the required interactions. We propose a spike-and-slab prior structure for function selection that allows us to include or exclude single coefficients as well as blocks of coefficients representing specific model terms. A novel multiplicative parameter expansion is required to obtain good mixing and convergence properties in a Markov chain Monte Carlo simulation approach and is shown to induce desirable shrinkage properties. In simulation studies and with (real) benchmark classification data, we investigate sensitivity to hyperparameter settings and compare performance to competitors. The flexibility and applicability of our approach are demonstrated in an additive piecewise exponential model with time-varying effects for right-censored survival times of intensive care patients with sepsis. Geoadditive and additive mixed logit model applications are discussed in an extensive appendix.
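The block-level selection with a multiplicative parameter expansion can be sketched as follows (the notation is illustrative; see the paper for the exact hierarchy). A model term f_j with basis matrix B_j is written so that a single scalar carries the spike-and-slab:

    f_j = B_j\beta_j,\qquad \beta_j = \alpha_j\,\xi_j,\qquad
    \alpha_j \mid \gamma_j \sim \mathcal{N}(0,\gamma_j),\qquad
    \gamma_j \sim (1-w)\,\delta_{v_0} + w\,\delta_{v_1},

so that excluding or including the whole term amounts to shrinking or releasing the scalar alpha_j, while xi_j determines the shape of the effect; this redundant parameterization is what improves mixing of the MCMC sampler.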
Spike-and-Slab Dirichlet Process Mixture Models  [PDF]
Kai Cui, Wenshan Cui
Open Journal of Statistics (OJS), 2012, DOI: 10.4236/ojs.2012.25066
Abstract: In this paper, Spike-and-Slab Dirichlet Process (SS-DP) priors are introduced and discussed for non-parametric Bayesian modeling and inference, especially in the mixture models context. Specifying a spike-and-slab base measure for DP priors combines the merits of Dirichlet process and spike-and-slab priors and serves as a flexible approach in Bayesian model selection and averaging. Computationally, Bayesian Expectation-Maximization (BEM) is utilized to obtain MAP estimates. Two simulated examples in mixture modeling and time series analysis contexts demonstrate the models and computational methodology.
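The key construction is the base measure: a Dirichlet process whose base distribution is itself a spike-and-slab mixture, for example (illustrative notation)

    G \sim \mathrm{DP}(\alpha, G_0),\qquad G_0 = (1-\rho)\,\delta_0 + \rho\,H,

so that parameters drawn for each mixture component can sit exactly at zero with positive probability, which is what lets the nonparametric mixture double as a device for Bayesian model selection and averaging.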
Generalized Majorization-Minimization  [PDF]
Sobhan Naderi Parizi, Kun He, Stan Sclaroff, Pedro Felzenszwalb
Computer Science, 2015
Abstract: Non-convex optimization is ubiquitous in machine learning. The Majorization-Minimization (MM) procedure systematically optimizes non-convex functions through an iterative construction and optimization of upper bounds on the objective function. The bound at each iteration is required to "touch" the objective function at the optimizer of the previous bound. We show that this touching constraint is unnecessary and overly restrictive. We generalize MM by relaxing this constraint, and propose a new framework for designing optimization algorithms, named Generalized Majorization-Minimization (G-MM). Compared to MM, G-MM is much more flexible. For instance, it can incorporate application-specific biases into the optimization procedure without changing the objective function. We derive G-MM algorithms for several latent variable models and show that they consistently outperform their MM counterparts in optimizing non-convex objectives. In particular, G-MM algorithms appear to be less sensitive to initialization.
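The MM recipe being generalized here is easy to state concretely. A standard instance uses a quadratic upper bound that touches the objective at the current iterate (valid when the gradient is L-Lipschitz); minimizing the bound recovers a gradient step. The sketch below is a minimal illustration of classical MM under that assumption, not of the G-MM algorithm proposed in the paper.

    import numpy as np

    def mm_minimize(f, grad_f, L, x0, n_iter=100):
        # MM with the quadratic majorizer
        #   g_k(x) = f(x_k) + grad_f(x_k)^T (x - x_k) + (L/2) ||x - x_k||^2,
        # which upper-bounds f when grad_f is L-Lipschitz and touches f at x_k.
        # Minimizing g_k in closed form gives the next iterate.
        x = np.asarray(x0, dtype=float)
        for _ in range(n_iter):
            x = x - grad_f(x) / L          # argmin of the quadratic upper bound
        return x

    # Toy example: a smooth non-convex objective
    f = lambda x: np.sum(np.log1p(x ** 2))
    grad_f = lambda x: 2 * x / (1 + x ** 2)
    # The gradient's derivative is bounded by 2, so L = 2 gives a valid majorizer.
    x_star = mm_minimize(f, grad_f, L=2.0, x0=np.array([3.0, -1.5]))
    print(x_star, f(x_star))

Each surrogate touches the objective at the previous iterate, which is exactly the constraint the paper relaxes.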
Spike and slab variable selection: Frequentist and Bayesian strategies  [PDF]
Hemant Ishwaran, J. Sunil Rao
Mathematics, 2005, DOI: 10.1214/009053604000001147
Abstract: Variable selection in the linear regression model takes many forms from both frequentist and Bayesian standpoints. In this paper we introduce a variable selection method referred to as a rescaled spike and slab model. We study the importance of prior hierarchical specifications and draw connections to frequentist generalized ridge regression estimation. Specifically, we study the usefulness of continuous bimodal priors to model hypervariance parameters, and the effect scaling has on the posterior mean through its relationship to penalization. Several model selection strategies, some frequentist and some Bayesian in nature, are developed and studied theoretically. We demonstrate the importance of selective shrinkage for effective variable selection in terms of risk misclassification, and show this is achieved using the posterior from a rescaled spike and slab model. We also show how to verify a procedure's ability to reduce model uncertainty in finite samples using a specialized forward selection strategy. Using this tool, we illustrate the effectiveness of rescaled spike and slab models in reducing model uncertainty.
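A common way to write the continuous-bimodal-hypervariance idea mentioned above, in the spirit of a normal mixture of inverse gammas (the distributions and constants here are illustrative, not necessarily the paper's):

    \beta_j \mid \gamma_j, \tau_j^2 \sim \mathcal{N}\big(0,\,\gamma_j\tau_j^2\big),\qquad
    \gamma_j \sim (1-w)\,\delta_{v_0} + w\,\delta_1,\qquad
    \tau_j^2 \sim \mathrm{Inv\text{-}Gamma}(a, b),

so the effective prior variance gamma_j * tau_j^2 has a continuous bimodal distribution: concentrated near zero (the spike) when gamma_j = v_0 with v_0 close to zero, and diffuse (the slab) when gamma_j = 1. Rescaling the response then controls how this hypervariance translates into shrinkage of the posterior mean, which is the penalization connection the abstract refers to.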