
Low-Rank and Sparse Matrix Completion for Recommendation

  • Conference paper
  • Neural Information Processing (ICONIP 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10638)


Abstract

Recently, recommendation algorithms have been widely used to increase business revenue and user satisfaction on many online platforms. However, most existing algorithms produce intermediate output when predicting ratings, and errors in this intermediate output propagate to the final results. Moreover, because most algorithms predict ratings for all unrated items, some of the predictions are unreliable and useless, which lowers both the efficiency and the effectiveness of recommendation. To this end, we propose a Low-rank and Sparse Matrix Completion (LSMC) method, which recovers the rating matrix directly to improve the quality of rating prediction. Following the common methodology, we assume the predicted rating matrix is low-rank, since ratings depend on only a few latent factors of users and items. Unlike existing methods, however, we additionally assume the matrix is sparse, so that unreliable predictions are removed and important results are retained. Besides, a slack variable is introduced to prevent overfitting and to weaken the influence of noisy data. Extensive experiments on four real-world datasets verify that the proposed method outperforms state-of-the-art recommendation algorithms.
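The abstract combines two structural assumptions: the recovered rating matrix should be low-rank (ratings depend on a few latent factors) and sparse off the observed entries (low-confidence predictions are discarded). The paper's actual objective, including the slack variable, is not reproduced here; as a rough, hypothetical illustration of those two assumptions only, the sketch below alternates singular value thresholding (the proximal operator of the nuclear norm, promoting low rank) with elementwise soft thresholding on unobserved entries (the proximal operator of the l1 norm, zeroing out weak predictions). The function names and the parameters `tau`, `lam`, and `n_iters` are illustrative choices, not values from the paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: shrink each singular value by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft_threshold(M, lam):
    """Elementwise soft thresholding: shrink each entry toward zero by lam."""
    return np.sign(M) * np.maximum(np.abs(M) - lam, 0.0)

def lsmc_sketch(R, mask, tau=1.0, lam=0.05, n_iters=200):
    """Toy low-rank + sparse completion.

    R    : observed rating matrix (arbitrary values where mask is False)
    mask : boolean matrix, True where a rating was actually observed
    """
    X = np.where(mask, R, 0.0)
    for _ in range(n_iters):
        # Low-rank step: refit observed entries, then shrink singular values.
        X = svt(np.where(mask, R, X), tau)
        # Sparsity step: zero out weak (unreliable) predictions
        # on the unobserved entries only; observed entries are kept.
        X = np.where(mask, X, soft_threshold(X, lam))
    return X
```

In this toy form, predictions on unrated entries that stay below `lam` after the low-rank fit are set exactly to zero, mirroring the abstract's idea of removing unreliable predictions rather than returning a score for every unrated item.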


Notes

  1. http://openresearch.baidu.com.
  2. http://snap.stanford.edu/data/web-FineFoods.html.
  3. http://eigentaste.berkeley.edu/dataset.
  4. http://grouplens.org/datasets/movielens.


Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (16lgzd15) and Tip-top Scientific and Technical Innovative Youth Talents of Guangdong special support program (No. 2016TQ03X542).

Author information


Corresponding author

Correspondence to Chang-Dong Wang.



Copyright information

© 2017 Springer International Publishing AG

About this paper


Cite this paper

Zhao, Z.L., Huang, L., Wang, C.D., Lai, J.H., Yu, P.S. (2017). Low-Rank and Sparse Matrix Completion for Recommendation. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol 10638. Springer, Cham. https://doi.org/10.1007/978-3-319-70139-4_1


  • DOI: https://doi.org/10.1007/978-3-319-70139-4_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-70138-7

  • Online ISBN: 978-3-319-70139-4

