Journal Article

Scaling law for recovering the sparsest element in a subspace

Laurent Demanet and Paul Hand

in Information and Inference: A Journal of the IMA

Published on behalf of The Institute of Mathematics and its Applications

Volume 3, issue 4, pages 295-309
Published in print December 2014 | ISSN: 2049-8764
Published online July 2014 | e-ISSN: 2049-8772 | DOI: https://dx.doi.org/10.1093/imaiai/iau007

Abstract

We address the problem of recovering a sparse n-vector within a given subspace. This problem is a subtask of some approaches to dictionary learning and sparse principal component analysis. Hence, if we can prove scaling laws for recovery of sparse vectors, it will be easier to derive and prove recovery results in these applications. In this paper, we present a scaling law for recovering the sparse vector from a subspace that is spanned by the sparse vector and k random vectors. We prove that the sparse vector will be the output of one of n linear programs with high probability if its support size s satisfies s ≲ n/√k. The scaling law still holds when the desired vector is approximately sparse. To get a single estimate for the sparse vector from the n linear programs, we must select which output is the sparsest. This selection process can be based on any proxy for sparsity, and the specific proxy has the potential to improve or worsen the scaling law. If sparsity is interpreted in an ℓ1/ℓ∞ sense, then the scaling law cannot be better than s ≲ n/√k. Computer simulations show that selecting the sparsest output in the ℓ1/ℓ2 or thresholded-ℓ0 senses can lead to a larger parameter range for successful recovery than that given by the ℓ1/ℓ∞ sense.
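The recovery procedure described in the abstract lends itself to a short numerical sketch: given a basis of the subspace, solve one ℓ1-minimization linear program per coordinate (pinning that coordinate to 1), then keep the output that is sparsest under some proxy. The Python below is a minimal illustration of that idea under assumed conventions, not the authors' implementation: the basis matrix Q, the constraint (Qa)_i = 1, the use of scipy.optimize.linprog, the ℓ1/ℓ2 selection proxy, and the test parameters (n = 200, k = 5, s = 10) are all choices made here for demonstration.

```python
# Minimal sketch (not the authors' code): recover the sparsest element of
# span(Q) by solving n linear programs, one per coordinate, and keeping the
# output with the smallest l1/l2 ratio.  The constraint (Qa)_i = 1 and the
# l1/l2 selection proxy are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog


def sparsest_element(Q):
    n, d = Q.shape
    best = None
    for i in range(n):
        # LP variables z = [a, t] with a in R^d, t in R^n:
        # minimize sum(t)  subject to  -t <= Q a <= t  and  (Q a)_i = 1,
        # i.e. minimize ||Q a||_1 with the i-th entry pinned to 1.
        c = np.concatenate([np.zeros(d), np.ones(n)])
        A_ub = np.block([[Q, -np.eye(n)], [-Q, -np.eye(n)]])
        b_ub = np.zeros(2 * n)
        A_eq = np.concatenate([Q[i], np.zeros(n)])[None, :]
        b_eq = np.array([1.0])
        bounds = [(None, None)] * d + [(0.0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=bounds, method="highs")
        if not res.success:
            continue
        x = Q @ res.x[:d]
        score = np.linalg.norm(x, 1) / np.linalg.norm(x, 2)  # sparsity proxy
        if best is None or score < best[0]:
            best = (score, x)
    return best[1]


if __name__ == "__main__":
    # Plant an s-sparse vector inside a (k+1)-dimensional random subspace.
    rng = np.random.default_rng(0)
    n, k, s = 200, 5, 10
    x0 = np.zeros(n)
    x0[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
    Q = np.column_stack([x0, rng.standard_normal((n, k))])
    x_hat = sparsest_element(Q)
    # Align sign and scale before comparing with the planted vector.
    x_hat *= np.sign(x_hat @ x0) * np.linalg.norm(x0) / np.linalg.norm(x_hat)
    print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```

With these parameters the support size is well below n/√k, so the planted vector should be returned (up to scale) by at least one of the n programs and then selected by the ℓ1/ℓ2 proxy.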

Keywords: sparsity; linear programming; signal recovery; sparse principal component analysis; dictionary learning


Subjects: Science and Mathematics ; Mathematics ; Applied Mathematics ; Computer Science
