
The von Neumann Syndrome and the CS Education Dilemma

  • Reiner Hartenstein
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4943)

Abstract

Computing in the von Neumann style is tremendously inefficient because multiple layers of massive overhead phenomena often lead to code sizes of astronomic dimensions, thus requiring large-capacity, slow off-chip memory. The dominance of von-Neumann-based computing will become unaffordable during the next decade because of its very high and growing energy consumption and the increasing cost of energy. For most application domains, von-Neumann-based parallelization does not scale well, fueling the escalating many-core programming crisis: it requires complete remapping and re-implementation, which often yields only disappointing results. A sufficiently large population of many-core-qualified programmers is far from available. Efficient solutions to the many-core crisis are hardly possible with fully instruction-stream-based approaches. Several HPC celebrities call for a radical redesign of the entire computing discipline. The solution is a dual-paradigm approach, which includes fundamental concepts long known from Reconfigurable Computing. Whistle-blowing is overdue, since these qualifications, essential for our many-core future and for low-energy computing, are obstinately ignored by CE, CS, and IT curriculum task forces. This talk also sketches a road map.

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Reiner Hartenstein
  1. Technical University of Kaiserslautern, Germany
