A Novel Cache Organization for Tiled Chip Multiprocessor


Abstract

Increased device density and growing working-set sizes are driving a rise in cache capacity, which comes at the cost of high access latency. Exploiting the observation that shared data are accessed frequently yet occupy little capacity, this paper proposes a novel two-level directory organization to minimize cache access time. In this scheme, a small Fast Directory provides fast hits for a large fraction of memory accesses. Detailed simulation results show that on a 16-core tiled chip multiprocessor, this approach reduces average access latency by 17.9% compared to a conventional cache organization and improves overall performance by 13.3% on average.
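The abstract does not spell out the lookup mechanism, so the sketch below is only a rough illustration of how a two-level directory access might be structured, with a small first-level "Fast Directory" consulted before the complete directory. The structure names, sizes, and direct-mapped indexing here are assumptions for illustration, not the authors' actual design.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stddef.h>

/* Hypothetical sizes: a small, fast first-level directory backed by a
 * larger complete directory. All names and numbers are illustrative. */
#define FAST_DIR_ENTRIES   256
#define FULL_DIR_ENTRIES  8192

typedef struct {
    uint64_t tag;        /* block address tag                    */
    uint16_t sharers;    /* bit vector of sharing cores (16-way) */
    bool     valid;
} dir_entry_t;

static dir_entry_t fast_dir[FAST_DIR_ENTRIES]; /* small, low latency */
static dir_entry_t full_dir[FULL_DIR_ENTRIES]; /* complete, slower   */

/* Look up a block's sharer information. Frequently accessed shared
 * blocks are expected to hit in the small Fast Directory; only misses
 * pay the latency of the larger second-level directory. */
dir_entry_t *dir_lookup(uint64_t block_addr)
{
    dir_entry_t *e = &fast_dir[block_addr % FAST_DIR_ENTRIES];
    if (e->valid && e->tag == block_addr)
        return e;                      /* fast first-level hit */

    e = &full_dir[block_addr % FULL_DIR_ENTRIES];
    if (e->valid && e->tag == block_addr)
        return e;                      /* slower second-level hit */

    return NULL;                       /* directory miss */
}
```

The intended benefit, as the abstract states, is that the common case (hot shared data) is served from the small, fast structure, so the average directory access latency drops even though the full directory is unchanged.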