Abstract
In this thesis, we studied sparsity-constrained optimization problems and proposed a number of greedy algorithms as approximate solvers for these problems. Unlike existing convex programming methods, the proposed greedy methods do not require the objective to be convex everywhere, and they produce solutions that are exactly sparse. We showed that if the objective function has well-behaved second-order variations, namely if it obeys the SRH or SRL conditions, then the proposed algorithms provide accurate solutions. Several of these algorithms were also examined through simulations on the 1-bit compressed sensing (CS) problem and on sparse logistic regression. Our work also addressed the minimization of functions subject to structured sparsity: assuming the objective function obeys a variant of the SRH condition tailored to model-based sparsity, we showed that a non-convex projected gradient descent (PGD) method can produce an accurate estimate of the underlying parameter.
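To illustrate the kind of method the abstract refers to, the following is a minimal, generic sketch of sparsity-constrained projected gradient descent, where the (non-convex) projection onto the set of s-sparse vectors is hard thresholding. This is an illustrative toy on sparse least squares, not the thesis's specific algorithms or their SRH/SRL analysis; all names, dimensions, and step-size choices here are assumptions for the demo.

```python
import numpy as np

def hard_threshold(x, s):
    """Project x onto the (non-convex) set of s-sparse vectors
    by keeping only the s largest-magnitude entries."""
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    z[keep] = x[keep]
    return z

def sparse_pgd(grad, x0, s, step=1.0, iters=200):
    """Non-convex PGD: gradient step followed by hard thresholding."""
    x = hard_threshold(x0, s)
    for _ in range(iters):
        x = hard_threshold(x - step * grad(x), s)
    return x

# Toy demo (assumed setup): recover a 3-sparse vector from
# noiseless random linear measurements y = A @ x_true.
rng = np.random.default_rng(0)
n, p, s = 80, 100, 3
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[[5, 17, 42]] = [1.0, -2.0, 1.5]
y = A @ x_true
grad = lambda x: A.T @ (A @ x - y)  # gradient of 0.5 * ||A x - y||^2
x_hat = sparse_pgd(grad, np.zeros(p), s)
```

With a well-conditioned random `A` of this size, the iterates typically converge to the exactly sparse true parameter; with a general smooth objective (e.g. a logistic loss) one would swap in its gradient, which is the setting where conditions such as SRH come into play.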
© 2014 Springer International Publishing Switzerland
Cite this chapter
Bahmani, S. (2014). Conclusion and Future Work. In: Algorithms for Sparsity-Constrained Optimization. Springer Theses, vol 261. Springer, Cham. https://doi.org/10.1007/978-3-319-01881-2_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-01880-5
Online ISBN: 978-3-319-01881-2
eBook Packages: Engineering (R0)