Application of Alternating Decision Trees in Selecting Sparse Linear Solvers

  • Sanjukta Bhowmick
  • Victor Eijkhout
  • Yoav Freund
  • Erika Fuentes
  • David Keyes
Chapter

Abstract

The solution of sparse linear systems, a fundamental and resource-intensive task in scientific computing, can be approached through multiple algorithms. Using an algorithm well adapted to the characteristics of the task can significantly enhance performance, for example by reducing the time required for the operation, without compromising the quality of the result. However, the “best” solution method can vary even across linear systems generated in the course of the same PDE-based simulation, making solver selection a very challenging problem. In this paper, we use a machine learning technique, Alternating Decision Trees (ADT), to select efficient solvers based on the properties of sparse linear systems and runtime-dependent features, such as the stage of the simulation. We demonstrate the effectiveness of this method through empirical results on linear systems drawn from computational fluid dynamics and magnetohydrodynamics applications. The results also demonstrate that using ADT can mitigate the problem of “overfitting”, which occurs when only a limited amount of data is available.
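The core idea behind alternating decision trees is boosting: combining many weighted, simple rules into a strong classifier. A minimal sketch of that idea is an AdaBoost ensemble of decision stumps trained to predict whether a solver is suitable for a given matrix. The feature names, thresholds, and labels below are invented for illustration; this is not the paper's MLJava/ADT implementation, which operates on features computed from real linear systems.

```python
import math

def stump_predict(stump, x):
    # A stump is a one-split rule: (feature index, threshold, sign).
    feat, thresh, sign = stump
    return sign if x[feat] > thresh else -sign

def train_adaboost(X, y, rounds=5):
    """AdaBoost over decision stumps; labels y are +1/-1."""
    n = len(X)
    w = [1.0 / n] * n          # example weights, initially uniform
    ensemble = []
    for _ in range(rounds):
        # Exhaustively pick the stump with lowest weighted error.
        best = None
        for feat in range(len(X[0])):
            for thresh in sorted({x[feat] for x in X}):
                for sign in (1, -1):
                    stump = (feat, thresh, sign)
                    err = sum(wi for wi, x, yi in zip(w, X, y)
                              if stump_predict(stump, x) != yi)
                    if best is None or err < best[0]:
                        best = (err, stump)
        err, stump = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # stump weight
        ensemble.append((alpha, stump))
        # Reweight: emphasize the examples this stump got wrong.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, x))
             for wi, x, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    # Sign of the weighted vote of all stumps.
    score = sum(alpha * stump_predict(stump, x) for alpha, stump in ensemble)
    return 1 if score >= 0 else -1

# Toy data: each vector is [symmetry flag, diagonal-dominance ratio];
# label +1 means "iterative solver A performed well" in a hypothetical run.
X = [[1, 0.9], [1, 0.8], [0, 0.2], [0, 0.3], [1, 0.1], [0, 0.7]]
y = [1, 1, -1, -1, -1, 1]
model = train_adaboost(X, y)
```

An ADT generalizes this picture by arranging the weighted rules in a tree whose prediction is the sum of scores along all matched paths, which also yields a margin usable as a confidence measure.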

Keywords

Linear System · Receiver Operating Characteristic Curve · Linear Solver · Sparse Linear System · Suitable Solver
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Notes

Acknowledgments

We would like to thank Jin Chen of the Princeton Plasma Physics Lab for providing us with the M3D matrices. We are also grateful to Raphael Pelossof of Columbia University for his package to render ROC curves from the MLJava output files.


Copyright information

© Springer New York 2011

Authors and Affiliations

  • Sanjukta Bhowmick (1)
  • Victor Eijkhout
  • Yoav Freund
  • Erika Fuentes
  • David Keyes
  1. Department of Computer Science, University of Nebraska, Omaha, USA