Abstract
In recent years, tensor ring (TR) decomposition has drawn a great deal of attention and has been successfully applied to the tensor completion problem, owing to its compact representation ability. As is well known, both global and local structural information matter for tensor completion. Although existing TR-based completion algorithms achieve impressive performance in visual-data inpainting by exploiting low-rank global structure, most of them do not account for the local smoothness that visual data often exhibit. To further improve inpainting performance, our model incorporates both low-rank and piecewise-smooth structures. Instead of directly applying a local smoothness constraint to the data itself, we impose smoothness on its latent TR space, which greatly reduces the computational cost, especially for large-scale data. Extensive experiments on real-world visual data show that our model not only achieves state-of-the-art performance but is also rather stable with respect to the TR ranks, owing to the local smoothness constraint.
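To make the latent-space smoothness idea concrete, below is a minimal, hypothetical sketch in JAX. It is not the authors' parallel matrix factorization algorithm: it simply fits TR cores to the observed entries by gradient descent while penalizing squared first-order differences along each core's mode axis, so that smoothness is enforced on the latent TR space rather than on the full data. The helper names (tr_to_tensor, tv_penalty), the toy data, the TR ranks, and the regularization weight lam are all illustrative assumptions.

```python
# Illustrative sketch only: TR completion by gradient descent, with a
# total-variation-style smoothness penalty on the latent TR cores.
import jax
import jax.numpy as jnp

def tr_to_tensor(cores):
    """Contract TR cores G_k of shape (R_k, I_k, R_{k+1}) into a full tensor."""
    full = cores[0]                                   # (R_1, I_1, R_2)
    for G in cores[1:]:
        full = jnp.einsum('a...b,bic->a...ic', full, G)
    return jnp.einsum('a...a->...', full)             # close the ring (trace)

def tv_penalty(cores):
    """Sum of squared first-order differences along each core's mode axis."""
    return sum(jnp.sum((G[:, 1:, :] - G[:, :-1, :]) ** 2) for G in cores)

def loss(cores, data, mask, lam):
    recon = tr_to_tensor(cores)
    fit = jnp.sum(mask * (recon - data) ** 2)         # fidelity on observed entries
    return fit + lam * tv_penalty(cores)              # + smoothness on latent space

@jax.jit
def step(cores, data, mask, lam, lr=1e-2):
    grads = jax.grad(loss)(cores, data, mask, lam)
    return [G - lr * g for G, g in zip(cores, grads)]

# Toy usage: complete a smooth 20x20x3 tensor with ~70% of entries missing.
key = jax.random.PRNGKey(0)
t = jnp.linspace(0.0, 1.0, 20)
data = jnp.stack([c * jnp.outer(jnp.sin(3 * t), jnp.cos(2 * t))
                  for c in (1.0, 0.5, 0.2)], axis=-1)
mask = (jax.random.uniform(key, data.shape) < 0.3).astype(data.dtype)

ranks = (3, 3, 3)                                     # assumed TR ranks, with R_4 = R_1
cores = [0.1 * jax.random.normal(jax.random.fold_in(key, k),
                                 (ranks[k], data.shape[k], ranks[(k + 1) % 3]))
         for k in range(3)]

for _ in range(2000):
    cores = step(cores, data, mask, lam=0.1)
```

Penalizing differences on the cores rather than on the reconstructed tensor is what keeps the cost low: each core has only R_k × I_k × R_{k+1} entries, far fewer than the full tensor when the TR ranks are small.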
Data availability
The data that support the findings of this study are available from the corresponding author or the first author upon reasonable request.
Acknowledgements
This work has been supported in part by the National Natural Science Foundation of China (Nos. 62203128, 52171331), in part by the Guangdong Province Key Field R&D Program, China (No. 2020B0101050001), and in part by the Science and Technology Planning Project of Guangzhou City under Grant 202102010411.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Yu, J., Zou, T. & Zhou, G. Low tensor-ring rank completion: parallel matrix factorization with smoothness on latent space. Neural Comput & Applic 35, 7003–7016 (2023). https://doi.org/10.1007/s00521-022-08023-5