Abstract
Approximate message passing (AMP) is a well-known compressive sensing (CS) algorithm, owing to its computational efficiency, high performance, and deterministic state evolution (SE) trajectory. Turbo generalized AMP (Turbo-GAMP) was proposed based on AMP and later extended to multitask CS with multiple measurement vectors (MMVs). The resulting Turbo-GAMP-MMV can jointly reconstruct multiple structured-sparse signals when they are strongly correlated. This paper considers the case where the CS tasks belong to several groups and signals from different groups may be only weakly correlated. We exploit the SE property to enhance Turbo-GAMP-MMV for weakly correlated signals. The developed methods first perform task classification by dividing the CS tasks into groups and then jointly reconstruct the original signals within each group. Experiments on synthetic signals and grey-scale images show that the new algorithms outperform several state-of-the-art benchmark techniques.
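The two-stage pipeline summarized above (classify tasks into correlated groups, then reconstruct each group jointly) can be sketched for the first stage as follows. This is a minimal illustrative sketch, not the paper's algorithm: it assumes a shared sensing matrix `A`, uses simple back-projection estimates for correlation scoring, and greedily groups tasks by a hypothetical correlation threshold; the paper instead builds its classification on the SE property of Turbo-GAMP-MMV.

```python
import numpy as np

def classify_tasks(Y, A, threshold=0.5):
    """Group CS tasks whose coarse signal estimates are strongly correlated.

    Y : (M, T) matrix of measurement vectors, one column per task.
    A : (M, N) shared sensing matrix.
    Returns a list of groups, each a list of task indices.
    """
    # Coarse back-projection estimates, used only to score task similarity.
    X0 = A.T @ Y                       # (N, T)
    C = np.corrcoef(X0.T)              # (T, T) pairwise correlation matrix
    T = Y.shape[1]
    groups, assigned = [], set()
    for t in range(T):
        if t in assigned:
            continue
        # Greedily attach every unassigned task correlated with task t.
        group = [t] + [s for s in range(t + 1, T)
                       if s not in assigned and abs(C[t, s]) >= threshold]
        assigned.update(group)
        groups.append(group)
    return groups
```

Each resulting group would then be passed to a joint MMV reconstruction stage (Turbo-GAMP-MMV in the paper), so that only strongly correlated signals share statistical structure during recovery.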






Additional information
This work was supported by the Natural Science Foundation of China (Nos. 61304264 and 61305017).
Cite this article
Wang, YG., Yang, L., Tang, ZY. et al. Multitask classification and reconstruction using extended Turbo approximate message passing. SIViP 11, 219–226 (2017). https://doi.org/10.1007/s11760-016-0922-5