
DRAM-Based Processor for Deep Neural Networks Without SRAM Cache

  • Conference paper
Intelligent Computing

Abstract

Modern computing architectures use cache memory as a buffer between high-speed computing units and high-latency main memory. Larger caches are thought to be critical for deep neural network processors, which handle large amounts of data. However, as cache capacity grows, it occupies die area that could otherwise be devoted to computing units. This is the inherent trade-off between memory capacity and compute performance. In this work, we present a deep neural network processing chip with a near-memory computing architecture. We eliminate the SRAM cache and use only DRAM as on-chip memory, delivering both high performance and high memory capacity.



Author information


Correspondence to Eugene Tam.



Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Tam, E. et al. (2021). DRAM-Based Processor for Deep Neural Networks Without SRAM Cache. In: Arai, K. (eds) Intelligent Computing. Lecture Notes in Networks and Systems, vol 284. Springer, Cham. https://doi.org/10.1007/978-3-030-80126-7_52
