Abstract:
Orthogonal Matching Pursuit (OMP) has long been regarded as a powerful heuristic for attacking compressive sensing problems; its theoretical development, however, has lagged behind its empirical success. This paper presents an improved Restricted Isometry Property (RIP) based performance guarantee for T-sparse signal reconstruction that asymptotically approaches the conjectured lower bound of Davenport et al. We further extend the state of the art by deriving reconstruction error bounds for general non-sparse signals subjected to measurement noise, and we then generalize our results to K-fold Orthogonal Matching Pursuit (KOMP). We conclude with an empirical analysis suggesting that OMP and KOMP outperform other compressive sensing algorithms in average-case scenarios. This is quite surprising, since RIP analysis (i.e., the worst-case scenario) suggests that these matching pursuits should perform roughly T^0.5 times worse than convex optimization, CoSaMP, and iterative thresholding.

Abstract:
This paper presents a new analysis of the orthogonal matching pursuit (OMP) algorithm. It is shown that if the restricted isometry property (RIP) is satisfied at sparsity level $O(\bar{k})$, then OMP can recover a $\bar{k}$-sparse signal in the 2-norm. For compressed sensing applications, this result implies that in order to uniformly recover a $\bar{k}$-sparse signal in $\mathbb{R}^d$, only $O(\bar{k} \ln d)$ random projections are needed. This analysis improves on earlier results for OMP that depend on stronger conditions, such as mutual incoherence, which can only be satisfied with $\Omega(\bar{k}^2 \ln d)$ random projections.
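
For concreteness, the OMP algorithm analyzed in these abstracts can be sketched in NumPy as follows. This is an illustrative implementation, not code from any of the cited papers; the function and variable names are our own.

```python
import numpy as np

def omp(Phi, y, k, tol=1e-10):
    """Recover a k-sparse x from y = Phi @ x by Orthogonal Matching Pursuit.

    Each iteration adds the column of Phi most correlated with the current
    residual, then re-fits all selected coefficients by least squares
    (the 'orthogonal' step).
    """
    n, d = Phi.shape
    support = []
    coeffs = np.zeros(0)
    residual = y.copy()
    x_hat = np.zeros(d)
    for _ in range(k):
        # Atom selection: column index with the largest |<phi_j, residual>|.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Orthogonal projection of y onto the span of the selected atoms.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs
        if np.linalg.norm(residual) < tol:
            break
    x_hat[support] = coeffs
    return x_hat
```

With a random Gaussian $\Phi$ of size roughly $O(\bar{k}\ln d)\times d$, as in the result above, this routine typically recovers the sparse vector exactly in $k$ iterations.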

Abstract:
Generalized Orthogonal Matching Pursuit (gOMP) is a natural extension of the OMP algorithm that, unlike OMP, may select $N\ (\geq 1)$ atoms in each iteration. In this paper, we demonstrate that gOMP can successfully reconstruct a $K$-sparse signal from a compressed measurement ${\bf y}={\bf \Phi x}$ within $K$ iterations if the sensing matrix ${\bf \Phi}$ satisfies the restricted isometry property (RIP) of order $NK$ with $\delta_{NK} < \frac{\sqrt{N}}{\sqrt{K}+2\sqrt{N}}$. Our bound improves on the very recent result shown in \cite{wang_2012b}. Moreover, we present another bound for gOMP, of order $NK+1$, with $\delta_{NK+1} < \frac{\sqrt{N}}{\sqrt{K}+\sqrt{N}}$, which reduces exactly to the near-optimal bound $\delta_{K+1} < \frac{1}{\sqrt{K}+1}$ for OMP ($N=1$) shown in \cite{wang_2012a}.

Abstract:
Orthogonal multi-matching pursuit (OMMP) is a natural extension of orthogonal matching pursuit (OMP). We denote OMMP with parameter $M$ as OMMP($M$), where $M\geq 1$ is an integer. The main difference between OMP and OMMP($M$) is that OMMP($M$) selects $M$ atoms per iteration, whereas OMP adds only one atom to the selected atom set. In this paper, we study the performance of OMMP under the RIP. In particular, we show that when the measurement matrix $A$ satisfies the $(9s, 1/10)$-RIP, there exists an absolute constant $M_0\leq 8$ such that OMMP($M_0$) can recover an $s$-sparse signal within $s$ iterations. We furthermore prove that, for slowly decaying $s$-sparse signals, OMMP($M$) can recover the signal within $O(\frac{s}{M})$ iterations for a large class of $M$. In particular, for $M=s^a$ with $a\in [0,1/2]$, OMMP($M$) can recover a slowly decaying $s$-sparse signal within $O(s^{1-a})$ iterations. These results imply that OMMP can substantially reduce computational complexity.
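
The only algorithmic difference between OMP and OMMP($M$) (or, equivalently, gOMP with $N=M$) is the atom-selection step, which can be sketched as follows. This is an illustrative snippet under our own naming; the rest of the iteration (least-squares re-fit, residual update) is unchanged from OMP.

```python
import numpy as np

def select_atoms(Phi, residual, M):
    """Selection step of OMMP(M)/gOMP: return the indices of the M columns
    of Phi most correlated with the current residual. M = 1 reduces to the
    single-argmax selection of plain OMP."""
    corr = np.abs(Phi.T @ residual)
    # argpartition finds the M largest correlations without a full sort.
    return np.argpartition(corr, -M)[-M:]
```

Because $M$ atoms enter the support per iteration, the number of iterations (and hence the number of least-squares solves) drops by roughly a factor of $M$, which is the source of the $O(s/M)$ iteration counts quoted above.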

Abstract:
We show that if a matrix $\Phi$ satisfies the RIP of order $\lceil CK^{1.2}\rceil$ with isometry constant $\delta = c K^{-0.2}$ and has coherence less than $1/(20 K^{0.8})$, then Orthogonal Matching Pursuit (OMP) will recover a $K$-sparse signal $x$ from $y=\Phi x$ in at most $\lceil CK^{1.2}\rceil$ iterations. This result implies that a $K$-sparse signal can be recovered via OMP from $M=O(K^{1.6}\log N)$ measurements.

Abstract:
In this paper, we present coherence-based performance guarantees of Orthogonal Matching Pursuit (OMP) for both support recovery and signal reconstruction of sparse signals when the measurements are corrupted by noise. In particular, two variants of OMP, one with known sparsity level and one with a stopping rule, are analyzed. It is shown that if the measurement matrix $X\in\mathbb{C}^{n\times p}$ satisfies the strong coherence property, then with $n\gtrsim k\log p$, OMP will recover a $k$-sparse signal with high probability. In particular, the performance guarantees obtained here separate the properties required of the measurement matrix from the properties required of the signal, which depend critically on the minimum signal-to-noise ratio rather than the power profile of the signal. We also provide performance guarantees for partial support recovery. Comparisons are given with other performance guarantees for OMP based on worst-case analysis and with the sorted one-step thresholding algorithm.

Abstract:
The performance guarantees of generalized orthogonal matching pursuit (gOMP) are considered in the framework of mutual coherence. The gOMP algorithm is an extension of the well-known OMP greedy algorithm for compressed sensing; it identifies N indices per iteration to reconstruct sparse signals. We show that gOMP with N≥2 can perfectly reconstruct any K-sparse signal from the measurement y=Φx if K < (1/N)((1/μ)−1)+1, where μ is the mutual coherence of the measurement matrix Φ. Furthermore, the performance of gOMP in the case y=Φx+e with bounded noise ‖e‖2≤ε is analyzed, and a sufficient condition ensuring identification of the correct indices of sparse signals via gOMP is derived, namely K < (1/N)((1/μ)−1)+1−(2ε)/(Nμxmin), where xmin denotes the minimum magnitude of the nonzero elements of x. A similar sufficient condition is also given for the case of Gaussian noise.
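
The coherence-based condition above is easy to check numerically. The sketch below, with our own function names, computes the mutual coherence μ of a matrix and evaluates the noiseless gOMP sparsity bound K < (1/N)((1/μ)−1)+1 stated in the abstract.

```python
import numpy as np

def mutual_coherence(Phi):
    """Largest absolute inner product between distinct unit-normalized
    columns of Phi (the coherence parameter mu)."""
    Psi = Phi / np.linalg.norm(Phi, axis=0)
    G = np.abs(Psi.T @ Psi)
    np.fill_diagonal(G, 0.0)  # ignore the trivial self inner products
    return G.max()

def gomp_sparsity_bound(mu, N):
    """Noiseless gOMP guarantee from the abstract: recovery of any
    K-sparse signal is ensured when K < (1/N) * (1/mu - 1) + 1."""
    return (1.0 / N) * (1.0 / mu - 1.0) + 1.0
```

Note that the bound shrinks as μ grows, which is why highly coherent dictionaries (such as the closely spaced steering vectors in the DOA abstract below) fall outside these guarantees.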

Abstract:
A sufficient condition reported very recently for perfect recovery of a $K$-sparse vector via orthogonal matching pursuit (OMP) in $K$ iterations is that the restricted isometry constant of the sensing matrix satisfies $\delta_{K+1} < \frac{1}{\sqrt{K}+1}$. By exploiting an approximate orthogonality condition, characterized via the achievable angles between two orthogonal sparse vectors upon compression, this paper shows that the upper bound on $\delta_{K+1}$ can be further relaxed to $\delta_{K+1} < \frac{\sqrt{4K+1}-1}{2K}$. This result thus narrows the gap between the best bound known so far and the ultimate performance guarantee $\delta_{K+1} < \frac{1}{\sqrt{K}}$ conjectured by Dai and Milenkovic in 2009. The proposed approximate orthogonality condition is also exploited to derive less restrictive sufficient conditions for signal reconstruction in several compressive sensing problems, including signal recovery via OMP in a noisy environment, compressive-domain interference cancellation, and support identification via the subspace pursuit algorithm.

Abstract:
DOA (Direction of Arrival) estimation is a major problem in array signal processing applications. Recently, compressive sensing algorithms, including convex relaxation algorithms and greedy algorithms, have been recognized as a novel class of DOA estimation methods. However, the success of these algorithms is limited by the RIP (Restricted Isometry Property) condition or the mutual coherence of the measurement matrix. In the DOA estimation problem, the columns of the measurement matrix are steering vectors corresponding to different DOAs, so adjacent columns are highly correlated and the mutual coherence condition is violated. The situation gets worse when two sources arrive from adjacent DOAs. In this paper, an algorithm based on OMP (Orthogonal Matching Pursuit), called ILS-OMP (Iterative Local Searching Orthogonal Matching Pursuit), is proposed to improve DOA resolution via iterative local searching. First, the conventional OMP algorithm is used to obtain initial DOA estimates. Then, in each iteration, a local search around every estimated DOA is used to find a new DOA in a given DOA set that further decreases the residual, and the estimated DOAs are updated by substituting the initial DOA with the new one. Simulation results demonstrate the advantages of the proposed algorithm.
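
The local-searching refinement described in this abstract can be sketched as follows. This is only a plausible reading of the procedure (the paper's exact DOA grid, search set, and update schedule are not given in the abstract); the function names, search radius, and round count are our own assumptions.

```python
import numpy as np

def residual_norm(Phi, y, support):
    """Norm of the least-squares residual of y on the selected columns."""
    coeffs, *_ = np.linalg.lstsq(Phi[:, list(support)], y, rcond=None)
    return np.linalg.norm(y - Phi[:, list(support)] @ coeffs)

def ils_refine(Phi, y, support, radius=2, n_rounds=3):
    """Iterative local searching: for each estimated index (e.g. an OMP
    output), try nearby grid indices and keep any substitution that
    strictly decreases the residual, as in the ILS-OMP description."""
    support = list(support)
    d = Phi.shape[1]
    for _ in range(n_rounds):
        improved = False
        for i in range(len(support)):
            j = support[i]  # search stays centered on this pass's estimate
            best = residual_norm(Phi, y, support)
            for cand in range(max(0, j - radius), min(d, j + radius + 1)):
                if cand in support:
                    continue
                trial = support.copy()
                trial[i] = cand
                r = residual_norm(Phi, y, trial)
                if r < best:  # accept only strictly improving substitutions
                    support, best, improved = trial, r, True
        if not improved:
            break
    return support
```

Because only residual-decreasing substitutions are accepted, the refined support never fits the measurements worse than the initial OMP estimate, which is the intended correction for off-by-one-grid-point DOA errors.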

Abstract:
As a greedy algorithm for recovering sparse signals from compressed measurements, the orthogonal matching pursuit (OMP) algorithm has received much attention in recent years. In this paper, we introduce an extension of OMP aimed at more efficient reconstruction of sparse signals. Our approach, henceforth referred to as generalized OMP (gOMP), is literally a generalization of OMP in the sense that $N$ indices are identified per iteration. Owing to the selection of multiple ''correct'' indices, the gOMP algorithm finishes in a much smaller number of iterations than OMP. We show that gOMP can perfectly reconstruct any $K$-sparse signal ($K > 1$), provided that the sensing matrix satisfies the RIP with $\delta_{NK} < \frac{\sqrt{N}}{\sqrt{K} + 3 \sqrt{N}}$. We also demonstrate by empirical simulations that gOMP has excellent recovery performance, comparable to the $\ell_1$-minimization technique, with fast processing speed and competitive computational complexity.