Application of lightweight threading techniques to computational chemistry

  • John Thornley
  • Richard P. Muller
  • Daniel T. Mainz
  • Tahir Çağin
  • William A. Goddard III


The recent advent of inexpensive commodity multiprocessor computers with standardized operating system support for lightweight threads provides computational chemists and other scientists with an exciting opportunity to develop sophisticated new approaches to materials simulation. We contrast the flexible performance characteristics of lightweight threading with the restrictions of traditional scientific supercomputing, based on our experiences with multithreaded molecular dynamics simulation. Motivated by the results of our molecular dynamics experiments, we propose an approach to multi-scale materials simulation using highly dynamic thread creation and synchronization within and between concurrent simulations at many different scales. This approach will enable extremely realistic simulations, with computing resources dynamically directed to areas where they are needed. Multi-scale simulations of this kind require large amounts of processing power, but are too sophisticated to be expressed using traditional supercomputing programming models. As a result, we have developed a high-level programming system called Sthreads that allows highly dynamic, nested multithreaded algorithms to be expressed. Program development is simplified through the use of innovative synchronization operations that allow multithreaded programs to be tested and debugged using standard sequential methods and tools. For this reason, Sthreads is very well suited to the complex multi-scale simulation applications that we are developing.

Keywords

Molecular dynamics · Lightweight multithreading · Parallel programming





Copyright information

© Kluwer Academic Publishers 2001

Authors and Affiliations

  • John Thornley (1)
  • Richard P. Muller (2)
  • Daniel T. Mainz (2)
  • Tahir Çağin (2)
  • William A. Goddard III (2)

  1. Department of Computer Science, University of Virginia, U.S.A.
  2. Materials and Process Simulation Center, California Institute of Technology, U.S.A.
