oalib
Thresholded Basis Pursuit: An LP Algorithm for Achieving Optimal Support Recovery for Sparse and Approximately Sparse Signals from Noisy Random Measurements  [PDF]
V. Saligrama,M. Zhao
Mathematics , 2008,
Abstract: In this paper we present a linear programming solution for sign pattern recovery of a sparse signal from noisy random projections of the signal. We consider two types of noise models: input noise, where noise enters before the random projection, and output noise, where noise enters after the random projection. Sign pattern recovery involves estimating the sign pattern of a sparse signal. Our idea is to pretend that no noise exists, solve the noiseless $\ell_1$ problem $\min \|\beta\|_1 \ \mathrm{s.t.} \ y = G\beta$, and quantize the resulting solution. We show that the quantized solution perfectly reconstructs the sign pattern of a sufficiently sparse signal. Specifically, we show that the sign pattern of an arbitrary $k$-sparse, $n$-dimensional signal $x$ can be recovered with $SNR=\Omega(\log n)$ and measurements scaling as $m= \Omega(k \log{n/k})$ for all sparsity levels $k$ satisfying $0< k \leq \alpha n$, where $\alpha$ is a sufficiently small positive constant. Surprisingly, this bound matches the optimal \emph{Max-Likelihood} performance bounds in terms of $SNR$, required number of measurements, and admissible sparsity level in an order-wise sense. In contrast to our results, previous results based on LASSO and Max-Correlation techniques either assume significantly larger $SNR$, sublinear sparsity levels, or restrictive assumptions on signal sets. Our proof technique is based on a noisy perturbation of the noiseless $\ell_1$ problem, in that we estimate the maximum admissible noise level before sign pattern recovery fails.
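To make the pretend-noiseless-then-quantize recipe concrete, here is a minimal Python sketch: the noiseless $\ell_1$ problem is cast as a linear program via the standard positive/negative split, and the solution is then thresholded to read off the sign pattern. The function name and the threshold `tau` are illustrative placeholders, not the authors' implementation; the paper's analysis determines how large the noise (and hence the quantization level) may be.

```python
import numpy as np
from scipy.optimize import linprog

def thresholded_basis_pursuit(G, y, tau):
    """Solve min ||beta||_1 s.t. G beta = y as an LP, then quantize.

    Split beta = u - v with u, v >= 0, so ||beta||_1 = sum(u + v)
    at the optimum.  tau is a hypothetical quantization threshold.
    """
    m, n = G.shape
    c = np.ones(2 * n)                 # minimize sum(u) + sum(v)
    A_eq = np.hstack([G, -G])          # G (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    beta = res.x[:n] - res.x[n:]
    # Keep the sign only where the magnitude clears the threshold.
    return np.where(np.abs(beta) > tau, np.sign(beta), 0.0)
```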
Sparse approximation property and stable recovery of sparse signals from noisy measurements  [PDF]
Qiyu Sun
Mathematics , 2011, DOI: 10.1109/TSP.2011.2161470
Abstract: In this paper, we introduce a sparse approximation property of order $s$ for a measurement matrix ${\bf A}$: $$\|{\bf x}_s\|_2\le D \|{\bf A}{\bf x}\|_2+ \beta \frac{\sigma_s({\bf x})}{\sqrt{s}} \quad {\rm for\ all} \ {\bf x},$$ where ${\bf x}_s$ is the best $s$-sparse approximation of the vector ${\bf x}$ in $\ell^2$, $\sigma_s({\bf x})$ is the $s$-sparse approximation error of the vector ${\bf x}$ in $\ell^1$, and $D$ and $\beta$ are positive constants. The sparse approximation property for a measurement matrix can be thought of as a weaker version of its restricted isometry property and a stronger version of its null space property. In this paper, we show that the sparse approximation property is an appropriate condition on a measurement matrix for the stable recovery of any compressible signal from its noisy measurements. In particular, we show that any compressible signal can be stably recovered from its noisy measurements via solving an $\ell^1$-minimization problem if the measurement matrix has the sparse approximation property with $\beta\in (0,1)$, and conversely that the measurement matrix has the sparse approximation property with $\beta\in (0,\infty)$ if any compressible signal can be stably recovered from its noisy measurements via solving an $\ell^1$-minimization problem.
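The three quantities appearing in the displayed inequality are easy to compute for any given vector, since the best $s$-sparse approximation simply keeps the $s$ largest-magnitude entries. A small numpy sketch (the function name is mine):

```python
import numpy as np

def sparse_approximation_terms(A, x, s):
    """Return (||x_s||_2, ||A x||_2, sigma_s(x)) for the inequality
    ||x_s||_2 <= D ||A x||_2 + beta * sigma_s(x) / sqrt(s)."""
    idx = np.argsort(np.abs(x))[::-1]
    x_s = np.zeros_like(x)
    x_s[idx[:s]] = x[idx[:s]]              # best s-sparse approximation
    sigma_s = np.linalg.norm(x - x_s, 1)   # l^1 tail of x
    return np.linalg.norm(x_s, 2), np.linalg.norm(A @ x, 2), sigma_s
```

The property itself is a statement about all ${\bf x}$ at once, so this only checks the inequality for particular vectors; it cannot certify a matrix.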
On the Fundamental Limits of Recovering Tree Sparse Vectors from Noisy Linear Measurements  [PDF]
Akshay Soni,Jarvis Haupt
Statistics , 2013,
Abstract: Recent breakthrough results in compressive sensing (CS) have established that many high dimensional signals can be accurately recovered from a relatively small number of non-adaptive linear observations, provided that the signals possess a sparse representation in some basis. Subsequent efforts have shown that the performance of CS can be improved by exploiting additional structure in the locations of the nonzero signal coefficients during inference, or by utilizing some form of data-dependent adaptive measurement focusing during the sensing process. To our knowledge, our own previous work was the first to establish the potential benefits that can be achieved when fusing the notions of adaptive sensing and structured sparsity -- that work examined the task of support recovery from noisy linear measurements, and established that an adaptive sensing strategy specifically tailored to signals that are tree-sparse can significantly outperform adaptive and non-adaptive sensing strategies that are agnostic to the underlying structure. In this work we establish fundamental performance limits for the task of support recovery of tree-sparse signals from noisy measurements, in settings where measurements may be obtained either non-adaptively (using a randomized Gaussian measurement strategy motivated by initial CS investigations) or by any adaptive sensing strategy. Our main results here imply that the adaptive tree sensing procedure analyzed in our previous work is nearly optimal, in the sense that no other sensing and estimation strategy can perform fundamentally better for identifying the support of tree-sparse signals.
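For intuition about the structured-sparsity model in question: a support is tree-sparse when it forms a rooted, connected subtree of the coefficient tree (as for wavelet coefficients of piecewise-smooth signals). A minimal check, assuming heap-style indexing in which node $i$ has parent $\lfloor (i-1)/2 \rfloor$ (this indexing is my assumption, not the paper's notation):

```python
def is_tree_sparse(support, root=0):
    """True if `support` forms a rooted connected subtree of a binary
    tree indexed as a heap: node i has parent (i - 1) // 2."""
    support = set(support)
    return all(i == root or (i - 1) // 2 in support for i in support)
```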
Sparse Support Recovery with Phase-Only Measurements  [PDF]
Yipeng Liu,Qun Wan,Fei Wen,Jia Xu,Yingning Peng
Mathematics , 2010,
Abstract: Sparse support recovery (SSR) is an important part of compressive sensing (CS). Most current SSR methods assume full-information measurements, but in practice the amplitude component of the measurements may be severely corrupted. Such corrupted measurements violate the assumptions of current SSR algorithms, leading to serious performance degradation. This paper considers the problem of SSR from phase information only. In the proposed method, minimizing the $\ell_1$ norm of the estimated sparse signal enforces a sparse solution, while a nonzero constraint is placed on the amplitudes of the random measurements of the reconstructed sparse signal. Because the constraint involves only the phase components of the measurements, the method avoids the performance deterioration caused by corrupted amplitude components. Simulations demonstrate that the proposed phase-only SSR is superior in support reconstruction accuracy when the amplitude components of the measurements are contaminated.
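The program constrains the reconstruction to be consistent with the measurement phases alone. In the real-valued case the phases reduce to signs, and one plausible convex formulation (my own rendering, with a normalization that rules out the trivial zero solution; not necessarily the authors' exact program) can be written with cvxpy:

```python
import cvxpy as cp
import numpy as np

def phase_only_ssr(A, signs):
    """Sparse recovery from the signs (real-valued 'phases') of the
    measurements: min ||x||_1 subject to sign-consistency of A x,
    with sum_i signs_i * (A x)_i = m as a normalization."""
    m, n = A.shape
    x = cp.Variable(n)
    consistent = cp.multiply(signs, A @ x)   # should be nonnegative
    prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                      [consistent >= 0, cp.sum(consistent) == m])
    prob.solve()
    return x.value
```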
Sparse Signals Recovery from Noisy Measurements by Orthogonal Matching Pursuit  [PDF]
Yi Shen,Song Li
Mathematics , 2011,
Abstract: Recently, many practical algorithms have been proposed to recover sparse signals from a small number of measurements. Orthogonal matching pursuit (OMP) is one of the most effective algorithms. In this paper, we use the restricted isometry property to analyze the algorithm. We show that, under certain conditions based on the restricted isometry property and the signal, OMP recovers the support of the sparse signal when the measurements are corrupted by additive noise.
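For reference, the algorithm under analysis is short enough to state in full. A standard numpy implementation of OMP, run for a fixed k iterations (stopping rules under noise vary, and are part of what the paper's conditions address):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily add the column most
    correlated with the residual, then re-fit by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        cols = A[:, support]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
        residual = y - cols @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x, sorted(support)
```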
Regularized Modified BPDN for Noisy Sparse Reconstruction with Partial Erroneous Support and Signal Value Knowledge  [PDF]
Wei Lu,Namrata Vaswani
Mathematics , 2010,
Abstract: We study the problem of sparse reconstruction from noisy undersampled measurements when the following two things are available. (1) We are given partial, and partly erroneous, knowledge of the signal's support, denoted by $T$. (2) We are also given an erroneous estimate of the signal values on $T$, denoted by $(\hat{\mu})_T$. In practice, both may be obtained from prior knowledge. Alternatively, in recursive reconstruction applications, like real-time dynamic MRI, one can use the support estimate and the signal value estimate from the previous time instant as $T$ and $(\hat{\mu})_T$. In this work, we introduce regularized modified-BPDN (reg-mod-BPDN) and obtain computable bounds on its reconstruction error. Reg-mod-BPDN tries to find the signal that is sparsest outside the set $T$, while being "close enough" to $(\hat{\mu})_T$ on $T$ and while satisfying the data constraint. Corresponding results for modified-BPDN and BPDN follow as direct corollaries. A second key contribution is an approach to obtain computable error bounds that hold without any sufficient conditions. This makes it easy to compare the bounds for the various approaches. Empirical reconstruction error comparisons with many existing approaches are also provided.
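A direct way to express the reg-mod-BPDN objective described above: sparsity is enforced only outside $T$, closeness to $(\hat{\mu})_T$ is enforced on $T$, and the data constraint bounds the residual. The weighting parameter `gamma` and noise level `eps` are free parameters in this sketch, and the exact weighting may differ from the paper's:

```python
import cvxpy as cp

def reg_mod_bpdn(A, y, T, mu_T, gamma, eps):
    """Sketch of reg-mod-BPDN: sparsest outside T, close to mu_T on T,
    subject to the data constraint ||y - A x||_2 <= eps."""
    n = A.shape[1]
    Tc = [i for i in range(n) if i not in set(T)]
    x = cp.Variable(n)
    objective = cp.norm1(x[Tc]) + gamma * cp.sum_squares(x[T] - mu_T)
    prob = cp.Problem(cp.Minimize(objective),
                      [cp.norm(A @ x - y, 2) <= eps])
    prob.solve()
    return x.value
```

Setting `gamma = 0` gives modified-BPDN, and additionally taking $T$ empty reduces the program to plain BPDN, consistent with the corollaries mentioned in the abstract.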
Phase transition in the detection of modules in sparse networks  [PDF]
Aurelien Decelle,Florent Krzakala,Cristopher Moore,Lenka Zdeborová
Computer Science , 2011, DOI: 10.1103/PhysRevLett.107.065701
Abstract: We present an asymptotically exact analysis of the problem of detecting communities in sparse random networks. Our results are also applicable to detection of functional modules, partitions, and colorings in noisy planted models. Using a cavity method analysis, we unveil a phase transition from a region where the original group assignment is undetectable to one where detection is possible. In some cases, the detectable region splits into an algorithmically hard region and an easy one. Our approach naturally translates into a practical algorithm for detecting modules in sparse networks, and learning the parameters of the underlying model.
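For groups of equal size, the transition identified in this line of work occurs at a simple closed-form threshold. A small helper encoding that condition (stated here from memory of the paper's result, so treat the formula as an assumption; $c_{in}$ and $c_{out}$ are the average numbers of within-group and between-group neighbors):

```python
import math

def detectable(c_in, c_out, q=2):
    """Detectability condition for q equal-size groups: the planted
    assignment is detectable when |c_in - c_out| > q * sqrt(c),
    where c is the average degree."""
    c = (c_in + (q - 1) * c_out) / q
    return abs(c_in - c_out) > q * math.sqrt(c)
```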
Bayesian Hypothesis Test using Nonparametric Belief Propagation for Noisy Sparse Recovery  [PDF]
Jaewook Kang,Heung-No Lee,Kiseon Kim
Mathematics , 2015, DOI: 10.1109/TSP.2014.2385659
Abstract: This paper proposes a computationally efficient Bayesian algorithm for noisy sparse recovery (NSR), called BHT-BP. In this framework, we consider LDPC-like measurement matrices, which have a tree-structured property, and additive white Gaussian noise. BHT-BP has a joint detection-and-estimation structure consisting of a sparse support detector and a nonzero estimator. The support detector is designed under the criterion of minimum detection error probability, using nonparametric belief propagation (nBP) and composite binary hypothesis tests. The nonzeros are then estimated in the linear MMSE sense, utilizing the support detection result. The strength of BHT-BP lies in noise-robust support detection, which effectively removes the quantization errors caused by uniform sampling-based nBP. For NSR problems, BHT-BP therefore has advantages over CS-BP, an existing nBP algorithm, and is comparable to other recent CS solvers in several respects. In addition, we examine the impact of the minimum nonzero value $x_{\min}$ of sparse signals on BHT-BP, building on results from the recent literature. Our empirical results show that variation of $x_{\min}$ is reflected in recovery performance in the form of an SNR shift.
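The estimation stage is the more generic half of the pipeline: once a support estimate is in hand, the nonzeros follow from a standard linear MMSE formula. A sketch under i.i.d. Gaussian prior and noise assumptions (the nBP-based detector itself is omitted; names and priors here are illustrative, not the paper's exact model):

```python
import numpy as np

def lmmse_on_support(A, y, support, sig_x, sig_n):
    """LMMSE estimate of the nonzeros given a detected support,
    assuming x_S ~ N(0, sig_x^2 I) and noise ~ N(0, sig_n^2 I):
    x_S = sig_x^2 A_S^T (sig_x^2 A_S A_S^T + sig_n^2 I)^{-1} y."""
    S = list(support)
    A_S = A[:, S]
    C = sig_x**2 * (A_S @ A_S.T) + sig_n**2 * np.eye(A.shape[0])
    x = np.zeros(A.shape[1])
    x[S] = sig_x**2 * (A_S.T @ np.linalg.solve(C, y))
    return x
```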
Sharp Support Recovery from Noisy Random Measurements by L1 minimization  [PDF]
Charles Dossal,Marie-Line Chabanol,Gabriel Peyré,Jalal Fadili
Mathematics , 2011,
Abstract: In this paper, we investigate the theoretical guarantees of penalized $\ell_1$ minimization (also called Basis Pursuit Denoising or Lasso) in terms of sparsity pattern recovery (support and sign consistency) from noisy measurements with not-necessarily-random noise, when the sensing operator belongs to the Gaussian ensemble (i.e. a random design matrix with i.i.d. Gaussian entries). More precisely, we derive sharp non-asymptotic bounds on the sparsity level and (minimal) signal-to-noise ratio that ensure support identification for most signals and most Gaussian sensing matrices by solving the Lasso problem with an appropriately chosen regularization parameter. Our first purpose is to establish conditions allowing exact sparsity pattern recovery when the signal is strictly sparse. These conditions are then extended to cover the compressible or nearly sparse case. In these two results, the role of the minimal signal-to-noise ratio is crucial. Our third main result gets rid of this assumption in the strictly sparse case, but in that case the Lasso allows only partial recovery of the support. We also provide a sharp $\ell_2$-consistency result on the coefficient vector in this case. The results of the present work have several distinctive features compared to previous ones. One of them is that the leading constants involved in all the bounds are sharp and explicit. This is illustrated by numerical experiments, which show that the sharp sparsity level threshold identified by our theoretical results, below which sparsistency of the Lasso is guaranteed, matches the threshold observed empirically.
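In practice, the "appropriately chosen regularization parameter" is the crux. A sketch using scikit-learn's Lasso with the classical $\lambda \sim \sigma\sqrt{2\log n}$ calibration (the paper derives sharp explicit constants refining this choice; the scaling below is the textbook one, not the paper's exact constant):

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_support(A, y, sigma):
    """Support and sign estimate from the Lasso with a
    noise-calibrated regularization parameter."""
    m, n = A.shape
    lam = sigma * np.sqrt(2 * np.log(n))
    # sklearn minimizes ||y - A x||^2 / (2 m) + alpha ||x||_1,
    # so a penalty lambda on ||x||_1 maps to alpha = lambda / m.
    model = Lasso(alpha=lam / m, fit_intercept=False).fit(A, y)
    return np.flatnonzero(model.coef_), np.sign(model.coef_)
```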
Near-Oracle Performance of Greedy Block-Sparse Estimation Techniques from Noisy Measurements  [PDF]
Zvika Ben-Haim,Yonina C. Eldar
Mathematics , 2010, DOI: 10.1109/JSTSP.2011.2160250
Abstract: This paper examines the ability of greedy algorithms to estimate a block sparse parameter vector from noisy measurements. In particular, block sparse versions of the orthogonal matching pursuit and thresholding algorithms are analyzed under both adversarial and Gaussian noise models. In the adversarial setting, it is shown that estimation accuracy comes within a constant factor of the noise power. Under Gaussian noise, the Cramer-Rao bound is derived, and it is shown that the greedy techniques come close to this bound at high SNR. The guarantees are numerically compared with the actual performance of block and non-block algorithms, highlighting the advantages of block sparse techniques.
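A sketch of the block variant of OMP analyzed here: at each step, the block whose columns are jointly most correlated with the residual is selected, followed by a least-squares re-fit over all selected blocks. A known, fixed block partition is assumed, as in the paper's setting; the implementation details are mine:

```python
import numpy as np

def block_omp(A, y, blocks, k):
    """Block-sparse OMP.  `blocks` is a list of column-index arrays
    partitioning the columns of A; k blocks are selected."""
    residual, chosen = y.copy(), []
    for _ in range(k):
        scores = [np.linalg.norm(A[:, b].T @ residual) for b in blocks]
        chosen.append(int(np.argmax(scores)))
        cols = np.concatenate([np.asarray(blocks[j]) for j in chosen])
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        residual = y - A[:, cols] @ coef
    x = np.zeros(A.shape[1])
    x[cols] = coef
    return x
```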