Peer-to-Peer Multi-class Boosting

  • István Hegedűs
  • Róbert Busa-Fekete
  • Róbert Ormándi
  • Márk Jelasity
  • Balázs Kégl
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7484)


We focus on the problem of data mining over large-scale fully distributed databases, where each node stores only one data record. We assume that a data record is never allowed to leave the node where it is stored. Possible motivations for this assumption include privacy concerns or the lack of a centralized infrastructure. To tackle this problem, we earlier proposed the generic gossip learning framework (GoLF), but so far we have studied only basic linear algorithms. In this paper we implement the well-known boosting technique in GoLF. Boosting techniques have attracted growing attention in machine learning due to their outstanding performance in many practical applications. Here, we present an implementation of a boosting algorithm based on FilterBoost. Our main algorithmic contribution is the derivation of a purely online multi-class version of FilterBoost, so that it can be employed in GoLF. We also propose improvements to GoLF that aim to maximize the diversity of the evolving models gossiped in the network, a feature that we show to be important. We evaluate the robustness and the convergence speed of the algorithm empirically over three benchmark databases. We compare the algorithm with the sequential AdaBoost algorithm, and we test its performance in a failure scenario involving message drop, message delay, and node churn.
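The gossip learning setting described in the abstract can be illustrated with a minimal, self-contained sketch. All names here are our own illustrative assumptions, and a simple perceptron-style update stands in for the paper's actual online multi-class FilterBoost: each node stores exactly one record, models perform random walks over the network, and a node that receives a model updates it on its local record before forwarding it to a random peer.

```python
import random

def local_update(model, x, y, lr=0.1):
    """Online update of a linear model on the node's single record.

    A mistake-driven (perceptron-style) step is used here purely as a
    stand-in for the online boosting update described in the paper.
    """
    score = sum(w * xi for w, xi in zip(model, x))
    if y * score <= 0:  # misclassified (or on the boundary): take a step
        model = [w + lr * y * xi for w, xi in zip(model, x)]
    return model

def gossip_round(models, data, rng):
    """One synchronized round standing in for the asynchronous protocol:
    every node updates the model it currently holds on its own record,
    then forwards the result to a uniformly random peer."""
    n = len(models)
    new_models = list(models)
    for node in range(n):
        x, y = data[node]                  # the single record stored at this node
        m = local_update(models[node], x, y)
        peer = rng.randrange(n)            # random-walk step of the model
        new_models[peer] = m
    return new_models

def simulate(data, dim, rounds=50, seed=0):
    """Run the gossip protocol; data[i] is the one (x, y) record of node i."""
    rng = random.Random(seed)
    models = [[0.0] * dim for _ in data]   # one evolving model per node
    for _ in range(rounds):
        models = gossip_round(models, data, rng)
    return models
```

This sketch captures only the data-locality and random-walk aspects of gossip learning: no record ever leaves its node; only models travel. It deliberately omits the paper's actual contributions (the online multi-class FilterBoost update, model merging, and the diversity-preserving improvements to GoLF).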


Keywords: P2P, gossip, multi-class classification, boosting, FilterBoost




References

  1. Filelist (2005),
  2. Asuncion, A.U., Smyth, P., Welling, M.: Asynchronous distributed estimation of topic models for document analysis. Statistical Methodology 8(1), 3–17 (2011)
  3. Babenko, B., Yang, M., Belongie, S.: A family of online boosting algorithms. In: Computer Vision Workshops (ICCV Workshops), pp. 1346–1353 (2009)
  4. Bottou, L.: Large-scale machine learning with stochastic gradient descent. In: Intl. Conf. on Computational Statistics, vol. 19, pp. 177–187 (2010)
  5. Bradley, J., Schapire, R.: FilterBoost: Regression and classification on large datasets. In: Advances in Neural Information Processing Systems, vol. 20. The MIT Press (2008)
  6. Collins, M., Schapire, R., Singer, Y.: Logistic regression, AdaBoost and Bregman distances. Machine Learning 48, 253–285 (2002)
  7. Datta, S., Bhaduri, K., Giannella, C., Wolff, R., Kargupta, H.: Distributed data mining in peer-to-peer networks. IEEE Internet Comp. 10(4), 18–26 (2006)
  8. Fan, W., Stolfo, S.J., Zhang, J.: The application of AdaBoost for distributed, scalable and on-line learning. In: Proc. 5th ACM SIGKDD Intl. Conf. on Knowledge Discovery and Data Mining, pp. 362–366 (1999)
  9. Frank, A., Asuncion, A.: UCI machine learning repository (2010)
  10. Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: Machine Learning: Proc. Thirteenth Intl. Conf., pp. 148–156 (1996)
  11. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. J. of Comp. and Syst. Sci. 55, 119–139 (1997)
  12. Friedman, J.: Stochastic gradient boosting. Computational Statistics and Data Analysis 38(4), 367–378 (2002)
  13. Jelasity, M., Canright, G., Engø-Monsen, K.: Asynchronous distributed power iteration with gossip-based normalization. In: Kermarrec, A.-M., Bougé, L., Priol, T. (eds.) Euro-Par 2007. LNCS, vol. 4641, pp. 514–525. Springer, Heidelberg (2007)
  14. Jelasity, M., Montresor, A., Babaoglu, O.: Gossip-based aggregation in large dynamic networks. ACM Trans. on Computer Systems 23(3), 219–252 (2005)
  15. Kégl, B., Busa-Fekete, R.: Boosting products of base classifiers. In: Intl. Conf. on Machine Learning, Montreal, Canada, vol. 26, pp. 497–504 (2009)
  16. Kempe, D., Dobra, A., Gehrke, J.: Gossip-based computation of aggregate information. In: Proc. 44th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2003), pp. 482–491. IEEE Computer Society (2003)
  17. Kowalczyk, W., Vlassis, N.: Newscast EM. In: Advances in Neural Information Processing Systems (NIPS), vol. 17, pp. 713–720. MIT Press, Cambridge (2005)
  18. Luo, P., Xiong, H., Lü, K., Shi, Z.: Distributed classification in peer-to-peer networks. In: Proc. 13th ACM SIGKDD Intl. Conf. on Knowledge Discovery and Data Mining (KDD 2007), pp. 968–976. ACM, New York (2007)
  19. Ormándi, R., Hegedűs, I., Jelasity, M.: Asynchronous peer-to-peer data mining with stochastic gradient descent. In: Jeannot, E., Namyst, R., Roman, J. (eds.) Euro-Par 2011, Part I. LNCS, vol. 6852, pp. 528–540. Springer, Heidelberg (2011)
  20. Ormándi, R., Hegedűs, I., Jelasity, M.: Efficient P2P ensemble learning with linear models on fully distributed data. CoRR abs/1109.1396 (2011)
  21. Oza, N., Russell, S.: Online bagging and boosting. In: Proc. Eighth Intl. Workshop on Artificial Intelligence and Statistics (2001)
  22. Park, B.H., Kargupta, H.: Distributed data mining: Algorithms, systems, and applications. In: Ye, N. (ed.) The Handbook of Data Mining. CRC Press (2003)
  23.
  24. Schapire, R.E., Singer, Y.: Improved boosting algorithms using confidence-rated predictions. Machine Learning 37(3), 297–336 (1999)
  25. Siersdorfer, S., Sizov, S.: Automatic document organization in a P2P environment. In: Lalmas, M., MacFarlane, A., Rüger, S.M., Tombros, A., Tsikrika, T., Yavlinsky, A. (eds.) ECIR 2006. LNCS, vol. 3936, pp. 265–276. Springer, Heidelberg (2006)
  26. Tölgyesi, N., Jelasity, M.: Adaptive peer sampling with Newscast. In: Sips, H., Epema, D., Lin, H.-X. (eds.) Euro-Par 2009. LNCS, vol. 5704, pp. 523–534. Springer, Heidelberg (2009)
  27. Widrow, B., Hoff, M.E.: Adaptive switching circuits. In: 1960 IRE WESCON Convention Record, vol. 4, pp. 96–104 (1960)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • István Hegedűs (1)
  • Róbert Busa-Fekete (1, 2)
  • Róbert Ormándi (1)
  • Márk Jelasity (1)
  • Balázs Kégl (2, 3)
  1. Research Group on AI, Hungarian Acad. Sci. and Univ. of Szeged, Hungary
  2. Linear Accelerator Laboratory (LAL), University of Paris-Sud, CNRS, Orsay, France
  3. Computer Science Laboratory (LRI), University of Paris-Sud, CNRS and INRIA-Saclay, Orsay, France