Abstract
This chapter considers the sparse underdetermined (ill-posed) multivariate multiple linear regression model, known in the signal processing literature as the multiple measurement vector (MMV) model. The objective is to accurately recover jointly sparse unknown signal vectors from multiple measurement vectors, each of which is a different linear combination of the same known elementary vectors. The MMV model is an extension of compressed sensing (CS), an emerging field that has attracted considerable research interest over the past few years. Recently, many popular greedy pursuit algorithms have been extended to the MMV setting. These methods, such as simultaneous normalized iterative hard thresholding (SNIHT), are not resistant to outliers or heavy-tailed errors. In this chapter, we develop a robust SNIHT method that computes the estimates of the sparse signal matrix and the scale of the error distribution simultaneously. The method is based on Huber’s criterion and is hence referred to as the HUB-SNIHT algorithm. The method can be tuned to have a negligible performance loss compared to SNIHT under Gaussian noise, while obtaining superior joint sparse recovery under heavy-tailed non-Gaussian noise conditions.
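The idea behind the method can be illustrated with a simplified sketch: an SNIHT-style iteration in which the residuals are passed through Huber's influence (clipping) function before the gradient step, so that gross outliers have bounded influence, and the signal estimate is hard-thresholded row-wise to enforce joint sparsity. This is not the exact HUB-SNIHT algorithm of the chapter: the joint Huber scale estimate is replaced here by a crude normalized-MAD stand-in, and NIHT's step-size acceptance check is omitted. All function and variable names below are illustrative.

```python
import numpy as np

def hub_sniht_sketch(Phi, Y, k, c=1.345, n_iter=300):
    """Illustrative Huber-weighted simultaneous NIHT (simplified sketch).

    Phi : (n, p) known measurement matrix
    Y   : (n, q) multiple measurement vectors
    k   : assumed number of nonzero (joint) rows of the signal matrix
    c   : Huber tuning constant (1.345 gives ~95% efficiency at the Gaussian)
    """
    n, p = Phi.shape
    q = Y.shape[1]
    X = np.zeros((p, q))
    support = np.array([], dtype=int)
    for _ in range(n_iter):
        R = Y - Phi @ X
        # crude scale estimate via the normalized MAD (a stand-in for the
        # simultaneous Huber scale estimate used by the actual algorithm)
        sigma = 1.4826 * np.median(np.abs(R)) + 1e-12
        # Huber psi: residuals beyond c*sigma are clipped, bounding the
        # influence of outliers on the gradient
        R_psi = np.clip(R, -c * sigma, c * sigma)
        G = Phi.T @ R_psi
        # normalized (adaptive) step size computed on the current support,
        # as in NIHT/SNIHT
        if support.size:
            Gs = np.zeros_like(G)
            Gs[support] = G[support]
            denom = np.linalg.norm(Phi @ Gs) ** 2
            mu = np.linalg.norm(Gs) ** 2 / denom if denom > 0 else 1.0
        else:
            mu = 1.0
        X = X + mu * G
        # joint (row-wise) hard thresholding: keep the k rows with the
        # largest Euclidean norms, zero out the rest
        row_norms = np.linalg.norm(X, axis=1)
        support = np.argsort(row_norms)[-k:]
        mask = np.zeros(p, dtype=bool)
        mask[support] = True
        X[~mask] = 0.0
    return X, np.sort(support)
```

With `c` set very large the clipping never activates and the iteration reduces to plain SNIHT, which is the sense in which the robust method can be tuned to lose little under Gaussian noise.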
Acknowledgements
The author wishes to thank the Academy of Finland for supporting this research.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this chapter
Ollila, E. (2015). Robust Simultaneous Sparse Approximation. In: Nordhausen, K., Taskinen, S. (eds) Modern Nonparametric, Robust and Multivariate Methods. Springer, Cham. https://doi.org/10.1007/978-3-319-22404-6_26
DOI: https://doi.org/10.1007/978-3-319-22404-6_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-22403-9
Online ISBN: 978-3-319-22404-6
eBook Packages: Mathematics and Statistics (R0)