Abstract
The low-rank matrix completion problem is a fundamental machine learning and data mining problem with many important applications. Standard low-rank matrix completion methods relax rank minimization to trace norm minimization. However, this relaxation may cause the solution to deviate substantially from that of the original rank minimization problem. Moreover, most completion methods minimize the squared prediction errors on the observed entries, which is sensitive to outliers. In this paper, we propose a new robust matrix completion method to address these two problems. The joint Schatten \(p\)-norm and \(\ell _p\)-norm are used to better approximate the rank minimization problem and to enhance robustness to outliers. Extensive experiments are performed on both synthetic data and real-world applications in collaborative filtering prediction and social network link recovery. All empirical results show that our new method outperforms the standard matrix completion methods.
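To make the two quantities named above concrete, the sketch below evaluates the Schatten \(p\)-norm of a candidate completion together with the \(\ell _p\) loss on the observed entries. It is a minimal numpy sketch under our own assumptions (the variable names, the boolean observation mask, and the weight gamma are illustrative), not the optimization algorithm developed in the paper.

```python
import numpy as np

def schatten_p(X, p):
    """Schatten p-(quasi)norm of X: (sum of singular values ** p) ** (1/p)."""
    sv = np.linalg.svd(X, compute_uv=False)
    return (sv ** p).sum() ** (1.0 / p)

def joint_objective(X, M, observed, p, gamma):
    """Illustrative joint objective: ell_p loss on the observed entries of M
    plus a weighted Schatten p-norm term (both raised to the p-th power)."""
    residual = (X - M)[observed]          # prediction errors on observed entries only
    lp_term = (np.abs(residual) ** p).sum()
    return lp_term + gamma * schatten_p(X, p) ** p

# Tiny usage example with a random, partially observed matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 4))
observed = rng.random((6, 4)) < 0.5       # boolean mask of observed entries
X = np.zeros_like(M)                      # a (poor) candidate completion
print(joint_objective(X, M, observed, p=0.5, gamma=0.1))
```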



Notes
When \(p \ge 1\), \(\left\| v\right\| _p = (\sum _{i=1}^n|v_i|^p)^{\frac{1}{p}}\) defines a norm that satisfies the three norm axioms, whereas for \(0 < p < 1\) it defines only a quasinorm. The quasinorm generalizes the standard norm in that it replaces the triangle inequality by \(\left\| \mathrm{x}+\mathrm{y}\right\| _p \le K (\left\| \mathrm{x}\right\| _p + \left\| \mathrm{y}\right\| _p)\) for some constant \(K > 1\). Because the mathematical formulations and derivations in this paper apply equally to both the norm and the quasinorm, we do not differentiate the two concepts, for notational brevity.
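The failure of the usual triangle inequality for \(0 < p < 1\), and the relaxed constant \(K\), can be checked numerically. A minimal numpy sketch (the particular vectors and the choice \(p = 0.5\) are our own illustrative assumptions):

```python
import numpy as np

def lp_norm(v, p):
    """ell_p (quasi)norm: (sum |v_i|^p)^(1/p). A true norm only when p >= 1."""
    return (np.abs(v) ** p).sum() ** (1.0 / p)

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
p = 0.5

lhs = lp_norm(x + y, p)                 # ||x + y||_p = 4.0
rhs = lp_norm(x, p) + lp_norm(y, p)     # ||x||_p + ||y||_p = 2.0
print(lhs > rhs)                        # True: the triangle inequality fails for p < 1
# The relaxed inequality ||x + y||_p <= K (||x||_p + ||y||_p) still holds,
# e.g. with K = 2 ** (1/p - 1) = 2 here: 4.0 <= 2 * 2.0.
```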
Acknowledgments
This research was partially supported by NSF DMS-0915228, IIS-1117965, IIS-1302675, and IIS-1344152.
Cite this article
Nie, F., Wang, H., Huang, H. et al. Joint Schatten \(p\)-norm and \(\ell _p\)-norm robust matrix completion for missing value recovery. Knowl Inf Syst 42, 525–544 (2015). https://doi.org/10.1007/s10115-013-0713-z