Massively parallel symbolic computing

  • David L. Waltz
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 748)


Advances in hardware and computer architecture continue to change the economics of AI computing paradigms, as they do for all other kinds of computing. The new generation of massively parallel machines extends the potential for applications at the high end of the computing spectrum, offering higher computing and I/O performance, much larger memories, and MIMD as well as SIMD capabilities. Computing costs for a given level of performance are substantially lower, and will continue to drop steeply for the foreseeable future.

All this has clear consequences for AI: for example, larger knowledge bases can be stored; hand coding will become steadily less cost-effective relative to learning and simple-to-program brute-force methods; and just about any parallel AI paradigm should be able to execute efficiently.

A brief overview is given of recent successful Connection Machine projects: automatic keyword assignment for news articles using Memory-Based Reasoning (MBR) nearest-neighbor methods; automatic classification of Census Bureau returns; protein structure prediction using MBR together with backpropagation networks and statistical methods; work on “database mining”; and Karl Sims' generation of graphics using genetically-inspired operations on s-expressions.
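To make the MBR approach concrete, the following is a minimal sketch of the nearest-neighbor classification idea behind it: a query record is compared against every stored case and labeled by a vote among its closest matches. The weighted feature-overlap distance, the toy records, and all field names are illustrative assumptions, not the metrics or data used in the projects above; on a massively parallel machine the distance computation over all stored cases would run in parallel, one case (or a block of cases) per processor.

    from collections import Counter

    def distance(query, record, weights):
        # Weighted feature-overlap distance (illustrative assumption):
        # every mismatched field adds its weight to the distance.
        return sum(w for field, w in weights.items()
                   if query.get(field) != record.get(field))

    def mbr_classify(query, memory, weights, k=3):
        # Label the query by majority vote among its k nearest stored cases.
        nearest = sorted(memory, key=lambda rec: distance(query, rec, weights))[:k]
        votes = Counter(rec["label"] for rec in nearest)
        return votes.most_common(1)[0][0]

    # Toy case memory for a news-like classification task (hypothetical data).
    memory = [
        {"word_bank": 1, "word_loan": 1, "word_game": 0, "label": "finance"},
        {"word_bank": 1, "word_loan": 0, "word_game": 0, "label": "finance"},
        {"word_bank": 0, "word_loan": 0, "word_game": 1, "label": "sports"},
    ]
    weights = {"word_bank": 2.0, "word_loan": 1.0, "word_game": 1.0}
    print(mbr_classify({"word_bank": 1, "word_loan": 1, "word_game": 0},
                       memory, weights))   # -> finance

The appeal of this style of brute-force method, as the abstract notes, is that almost no hand coding is required: improving the classifier is largely a matter of storing more cases and letting the parallel hardware absorb the extra distance computations.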


References

1. Appel, M. & E. Hellerman. "Census Bureau Experiments with Automated Industry and Occupation Coding." Proc. Amer. Statistical Assoc., 1983, 32–40.
2. Creecy, R., B. Masand, S. Smith, & D. L. Waltz. "Trading MIPS and Memory for Knowledge Engineering." CACM 35, 8, August 1992, 48–64.
3. Goldberg, D. E. Genetic Algorithms in Search, Optimization & Machine Learning. Reading, MA: Addison-Wesley, 1989.
4. Kabsch, W. & C. Sander. "Dictionary of protein secondary structures: pattern recognition of hydrogen-bonded and geometrical features." Biopolymers 22, 1983, 2577–2637.
5. Koza, J. Genetic Programming: On the Programming of Computers by Means of Natural Selection. Cambridge: MIT Press, 1992.
6. Masand, B. "Effects of query and database sizes on classification of news stories using memory based reasoning." AAAI Spring Symposium on CBR, Stanford, March 1993.
7. Masand, B. "Optimizing Confidence of Text Classification by Evolution of Symbolic Expressions." Unpublished paper, Thinking Machines Corp., Cambridge, MA, 1993.
8. Masand, B., G. Linoff, & D. L. Waltz. "Classifying news stories using memory-based reasoning." Proc. SIGIR Conf., Copenhagen, July 1992.
9. Qian, N. & T. J. Sejnowski. "Predicting the secondary structure of globular proteins using neural network models." J. Molecular Biology 202, 1988, 865–884.
10. Quinlan, R. "Learning efficient classification procedures and their application to chess end games." In R. S. Michalski, J. Carbonell, & T. Mitchell (eds.), Machine Learning: An Artificial Intelligence Approach, Los Angeles: Tioga Publishing, 1988, 463–482.
11. Rumelhart, D., J. McClelland, et al. Parallel Distributed Processing. Cambridge: MIT Press, 1986.
12. Sims, K. "Artificial Evolution for Computer Graphics." Computer Graphics 25, 4, July 1991, 319–328.
13. Stanfill, C. & D. L. Waltz. "Toward Memory-Based Reasoning." CACM 29, December 1986, 1213–1228.
14. Waltz, D. L. "Memory-Based Reasoning." In M. Arbib & A. Robinson (eds.), Natural and Artificial Parallel Computation, Cambridge: MIT Press, 1989, 251–276.
15. Waltz, D. L. "Massively Parallel AI." Proc. AAAI-90, Boston, 1990, 1117–1122.
16. Waltz, D. & J. Feldman. Connectionist Models and Their Implications. Hillsdale, NJ: Ablex Publishing, 1988.
17. Zhang, X. & J. Hutchinson. "Practical Issues in Nonlinear Time Series Prediction." In A. Weigend & N. Gershenfeld (eds.), Predicting the Future and Understanding the Past: Proceedings of the 1992 Santa Fe Institute Time Series Competition, Addison-Wesley, 1993, to appear.
18. Zhang, X., J. Mesirov, & D. L. Waltz. "A Hybrid System for Protein Secondary Structure Prediction." J. Molecular Biology 225, 1992, 1049–1063.

Copyright information

© Springer-Verlag Berlin Heidelberg 1993

Authors and Affiliations

  • David L. Waltz
  1. Thinking Machines Corporation, Cambridge
  2. Department of Computer Science and Center for Complex Systems, Brandeis University, Waltham
