
Distributed Logistic Regression for Separated Massive Data

  • Peishen Shi
  • Puyu Wang
  • Hai Zhang
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1120)

Abstract

In this paper, we study distributed logistic regression for processing separated large-scale data stored on different linked computers. Based on the Alternating Direction Method of Multipliers (ADMM) algorithm, we reformulate the logistic regression problem as a multistep iterative process and propose a distributed logistic regression algorithm with controllable communication cost. Specifically, in each iteration of the distributed algorithm, every computer updates its local estimator and simultaneously exchanges it with its neighbors. We then prove the convergence of the distributed algorithm. Owing to the decentralized structure of the computer network, the proposed algorithm is robust. The classification results of our distributed method are the same as those of the non-distributed approach. Numerical studies show that our approach is both effective and efficient, performing well in distributed massive data analysis.
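
To make the iteration scheme concrete, the Python sketch below illustrates a standard consensus-ADMM loop for logistic regression over row-partitioned data. It is a minimal illustration under stated assumptions, not the authors' exact algorithm: the consensus step here is a simple global average standing in for the neighbor-wise exchange described in the abstract, and all names (local_update, distributed_logistic, rho, iters) are illustrative rather than taken from the paper.

    # Illustrative consensus-ADMM sketch for distributed logistic regression.
    # Assumptions: data split by rows across M "machines", local subproblems
    # solved with scipy's L-BFGS, centralized averaging instead of the paper's
    # neighbor-wise exchange. Names are hypothetical, not from the paper.
    import numpy as np
    from scipy.optimize import minimize

    def local_update(X, y, z, u, rho):
        """ADMM x-step on one machine: local logistic loss + proximal term."""
        def obj(w):
            margins = X @ w
            loss = np.sum(np.log1p(np.exp(-y * margins)))   # logistic loss, y in {-1, +1}
            prox = 0.5 * rho * np.sum((w - z + u) ** 2)
            return loss + prox

        def grad(w):
            margins = X @ w
            s = -y / (1.0 + np.exp(y * margins))            # derivative of loss w.r.t. margins
            return X.T @ s + rho * (w - z + u)

        res = minimize(obj, x0=z - u, jac=grad, method="L-BFGS-B")
        return res.x

    def distributed_logistic(parts, rho=1.0, iters=50):
        """Consensus-ADMM loop: parts is a list of (X_m, y_m) blocks, one per machine."""
        d = parts[0][0].shape[1]
        M = len(parts)
        z = np.zeros(d)                    # shared (consensus) estimator
        W = np.zeros((M, d))               # local estimators
        U = np.zeros((M, d))               # scaled dual variables
        for _ in range(iters):
            for m, (X, y) in enumerate(parts):
                W[m] = local_update(X, y, z, U[m], rho)     # local solve on machine m
            z = (W + U).mean(axis=0)                        # consensus (averaging) step
            U += W - z                                      # dual update
        return z

    # Toy usage: split one simulated data set across 3 "machines".
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 5))
    w_true = rng.normal(size=5)
    y = np.where(X @ w_true + 0.1 * rng.normal(size=300) > 0, 1.0, -1.0)
    parts = [(X[i::3], y[i::3]) for i in range(3)]
    w_hat = distributed_logistic(parts, rho=1.0, iters=50)
    print("estimated coefficients:", np.round(w_hat, 2))

In this scaled form, each machine solves a local logistic subproblem augmented with a proximal term, the shared estimator is updated by averaging, and the dual variables accumulate the disagreement between local and shared estimates; the communication cost per iteration is governed by how the averaging (or, in the decentralized setting, the neighbor exchange) is carried out.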

Keywords

Distributed logistic regression · ADMM algorithm


Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. School of Mathematics, Northwest University, Xi'an, China
  2. Faculty of Information Technology and State Key Laboratory of Quality Research in Chinese Medicines, Macau University of Science and Technology, Macau, China
