TUHH Open Research
Simultaneous Structures in Convex Signal Recovery—Revisiting the Convex Combination of Norms

Publication Type
Journal Article
Date Issued
2019-05-28
Language
English
Author(s)
Kliesch, Martin  
Szarek, Stanislaw J.  
Jung, Peter  
TORE-URI
http://hdl.handle.net/11420/14085
Journal
Frontiers in Applied Mathematics and Statistics
Volume
5
Article Number
23
Citation
Frontiers in Applied Mathematics and Statistics 5: 23 (2019-05-28)
Publisher DOI
10.3389/fams.2019.00023
Scopus ID
2-s2.0-85077554757
Abstract
In compressed sensing, one uses known structures of otherwise unknown signals to recover them from as few linear observations as possible. The structure comes in the form of some compressibility, including different notions of sparsity and low-rankness. In many cases, convex relaxations allow the inverse problems to be solved efficiently with standard convex solvers at almost-optimal sampling rates. A standard practice to account for multiple simultaneous structures in convex optimization is to add further regularizers or constraints. From the compressed sensing perspective, the hope is that this also improves the sampling rate. Unfortunately, for simple combinations of regularizers this does not seem to happen automatically, as has been shown for several examples in recent works. Here, we give an overview of ideas for combining multiple structures in convex programs by taking weighted sums and weighted maximums. We explicitly discuss cases where optimal weights are used, reflecting an optimal tuning of the reconstruction. In particular, we extend known lower bounds on the number of required measurements to the optimally weighted maximum by using geometric arguments. As examples, we discuss simultaneously low-rank and sparse matrices and matrix norms (in the "square deal" sense) as regularizers for tensor products. We state an SDP formulation for numerically estimating the statistical dimensions and find a tensor case where the lower bound is roughly met up to a factor of two.
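The article itself states an SDP formulation for numerically estimating statistical dimensions; that formulation is not reproduced here. As a rough illustration of the quantity involved, the following sketch estimates by Monte Carlo the statistical dimension of the descent cone of the l1 norm at an s-sparse vector, using the standard bound via the expected squared distance of a Gaussian vector to the scaled subdifferential. The function name, parameters, and the restriction to the plain l1/sparse case are illustrative assumptions and not taken from the paper.

import numpy as np

def l1_statistical_dimension(d, s, n_trials=2000, n_tau=200, seed=0):
    """Monte Carlo estimate of the statistical dimension of the descent cone
    of the l1 norm at an s-sparse x0 in R^d (support entries assumed positive),
    via  delta ~= min_{tau >= 0} E[ dist^2(g, tau * subdiff ||x0||_1) ]
    with g a standard Gaussian vector. Illustrative sketch, not the SDP
    formulation from the article."""
    rng = np.random.default_rng(seed)
    taus = np.linspace(0.0, 3.0 * np.sqrt(2.0 * np.log(d)), n_tau)
    acc = np.zeros(n_tau)
    for _ in range(n_trials):
        g = rng.standard_normal(d)
        g_on, g_off = g[:s], g[s:]  # take the support to be the first s coordinates
        # on the support, the subgradient coordinate is fixed to sign(x0_i) = +1
        dist_on = np.sum((g_on[:, None] - taus[None, :]) ** 2, axis=0)
        # off the support, the nearest subgradient coordinate lies in [-tau, tau]
        dist_off = np.sum(np.maximum(np.abs(g_off)[:, None] - taus[None, :], 0.0) ** 2, axis=0)
        acc += dist_on + dist_off
    acc /= n_trials
    k = int(np.argmin(acc))
    return acc[k], taus[k]

if __name__ == "__main__":
    delta, tau_star = l1_statistical_dimension(d=200, s=10)
    print(f"estimated statistical dimension: {delta:.1f} (optimal tau ~ {tau_star:.2f})")

The returned value roughly predicts the number of Gaussian measurements at which l1 minimization transitions to exact recovery of an s-sparse signal; the same statistical-dimension quantity is what the paper estimates for combined (weighted-sum and weighted-maximum) regularizers.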
Subjects
compressed sensing
convex relaxation
low rank
matrix
reconstruction
sparse
statistical dimension
tensor
TUHH