Abstract

Loop caches provide an effective method for decreasing memory-hierarchy energy consumption by storing frequently executed code (critical regions) in a structure that is more energy-efficient than the level-one cache. However, due to code-structure restrictions or costly design-time pre-analysis efforts, previous loop cache designs are not suitable for all applications and system scenarios. We present an adaptive loop cache that is amenable to a wider range of system scenarios and provides an additional 20% average instruction-cache energy savings (with individual benchmark energy savings as high as 69%) compared to the next best loop cache, the preloaded loop cache.
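The energy argument behind a loop cache can be illustrated with a minimal back-of-the-envelope model. The per-access energies and hit rate below are hypothetical placeholders, not figures from the paper; the point is only that routing a fraction of instruction fetches to a cheaper structure lowers total fetch energy.

```python
# Minimal sketch of the loop-cache energy argument, using ASSUMED
# relative per-access energies (hypothetical values, not from the paper).
E_L1 = 1.0    # relative energy per level-one instruction-cache access
E_LOOP = 0.3  # relative energy per loop-cache access (smaller, tagless structure)

def icache_energy(total_fetches: int, loop_hit_rate: float) -> float:
    """Total fetch energy when a fraction of fetches hit the loop cache."""
    loop_fetches = total_fetches * loop_hit_rate
    l1_fetches = total_fetches - loop_fetches
    return loop_fetches * E_LOOP + l1_fetches * E_L1

baseline = icache_energy(1_000_000, 0.0)  # no loop cache: every fetch hits L1
with_lc = icache_energy(1_000_000, 0.6)   # assume 60% of fetches hit the loop cache
savings = 1 - with_lc / baseline
print(f"energy savings: {savings:.0%}")   # → energy savings: 42%
```

Under these assumed numbers, serving 60% of fetches from the cheaper structure cuts fetch energy by 42%; the actual savings reported in the paper depend on real access energies and on how much of each benchmark's execution the loop cache captures.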

  • Published: March 2013