An Efficient Deep Learning Model for Recommender Systems

  • Kourosh Modarresi
  • Jamie Diner
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10861)

Abstract

Recommending optimal content to users is an essential part of digital-space activities and online user interactions. For example, we want to know which items should be sent to a user, which promotion is best for a user, which web design fits a specific user, which ad a user would be most receptive to, or which Creative Cloud package is most suitable for a specific user.

In this work, we use deep learning (autoencoders) to build a new model for this purpose. Prior art applies autoencoders to numerical features only; we extend their application to non-numerical features.
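The paper does not spell out its encoding of non-numerical features, but a standard way to make a categorical column usable as autoencoder input is one-hot encoding. The sketch below, with a made-up feature column, shows the idea; the resulting rows can be concatenated with numerical features before being fed to the network.

```python
import numpy as np

# Hypothetical categorical user feature (not from the paper); one-hot
# encoding turns each value into a binary indicator vector.
colors = np.array(["red", "blue", "red", "green"])
categories = sorted(set(colors))               # ['blue', 'green', 'red']

# Compare each value against every category: shape (n_samples, n_categories).
one_hot = (colors[:, None] == np.array(categories)).astype(float)

# Each row has exactly one 1.0, marking the sample's category; these rows
# can now be stacked alongside numerical columns as autoencoder input.
```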

Our approach to producing recommendations uses matrix completion, which is the most efficient and direct way of computing and evaluating content recommendations.
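As a concrete illustration of autoencoder-based matrix completion, the sketch below trains a tiny single-hidden-layer autoencoder (plain NumPy, synthetic data) to reconstruct a user-item rating matrix, computing the loss only on observed entries so that the decoder's output fills in the missing ones. This is a minimal sketch of the general technique, not the paper's exact architecture; all sizes and hyperparameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic toy data: 8 users x 5 items, ratings 1..5, ~30% missing (NaN).
R = rng.integers(1, 6, size=(8, 5)).astype(float)
R[rng.random(R.shape) < 0.3] = np.nan

observed = ~np.isnan(R)            # loss is computed on observed cells only
X = np.where(observed, R, 0.0)     # zero-fill missing entries for the input

n_items, n_hidden, lr = R.shape[1], 3, 0.01
W1 = rng.normal(0, 0.1, (n_items, n_hidden))   # encoder weights
W2 = rng.normal(0, 0.1, (n_hidden, n_items))   # decoder weights

def forward(X):
    H = np.tanh(X @ W1)            # hidden code per user
    return H, H @ W2               # linear reconstruction of all items

losses = []
for _ in range(500):
    H, X_hat = forward(X)
    err = (X_hat - X) * observed               # mask out unobserved cells
    loss = (err ** 2).sum() / observed.sum()   # masked mean squared error
    losses.append(loss)

    # Backpropagation through the masked loss.
    d_out = 2 * err / observed.sum()
    dW2 = H.T @ d_out
    dH = d_out @ W2.T
    dW1 = X.T @ (dH * (1 - H ** 2))            # tanh derivative
    W1 -= lr * dW1
    W2 -= lr * dW2

# The trained decoder's output provides predictions for the missing cells.
_, completed = forward(X)
```

The key design choice is the mask: by excluding unobserved entries from the loss, the model is never penalized for its predictions there, which is exactly what lets the reconstruction serve as a completion of the matrix.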

Keywords

Recommender systems · Artificial intelligence · Deep learning


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Adobe Inc., San Jose, USA
