We study the problem of reducing power during data retention in a standby static random access memory (SRAM). For successful data retention, the supply voltage of an SRAM cell must exceed a critical data-retention voltage (DRV). Due to circuit parameter variations, the DRV varies from cell to cell on the same chip, following a distribution with a diminishing tail. Existing low-power designs guarantee reliable retention with a worst-case technique: the standby supply voltage is set above the highest DRV among all cells in the SRAM. Instead, our approach reduces the voltage aggressively and counters the resulting unreliability with a fault-tolerant memory architecture. The main results of this work are as follows: (i) using techniques from information theory, we establish fundamental bounds on the achievable power reduction in terms of the DRV distribution. For the DRV distribution of the test chip in (Qin et al., 2006), we show that a 49% power reduction with respect to (w.r.t.) the worst case is the fundamental limit, while a 40% power reduction w.r.t. the worst case is achievable with a practical combinatorial scheme. (ii) Since most applications using SRAM are latency constrained, we study power reduction as a function of the block length for low-latency codes. We propose a reliable memory architecture based on the Hamming code for the next test-chip implementation, with a predicted power reduction of 33% after accounting for coding overheads.
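The proposed architecture can be illustrated with a small simulation. The sketch below is not the paper's implementation; it uses a Hamming (7,4) single-error-correcting code and an assumed, purely illustrative DRV distribution (a Gaussian body with an exponential-like upper tail, not the measured test-chip data). Each cell whose DRV exceeds the chosen standby voltage is modeled as losing its bit; the code then recovers any block with at most one such failing cell.

```python
# Sketch: aggressive standby-voltage reduction plus a Hamming (7,4) code.
# All distribution parameters below are illustrative assumptions, not the
# measured DRV data from the test chip in (Qin et al., 2006).
import random


def hamming74_encode(d):
    """Encode 4 data bits d = [d1, d2, d3, d4] into a 7-bit codeword."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # Standard layout: positions 1..7 = p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]


def hamming74_decode(c):
    """Correct up to one flipped bit in codeword c, return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-indexed position of the error, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]


def simulate_retention(v_standby, n_blocks=2000, seed=0):
    """Fraction of 4-bit words retained at v_standby (volts) with the code.

    Each cell's DRV is drawn from an assumed distribution; a cell whose DRV
    exceeds v_standby flips its stored bit during standby.
    """
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_blocks):
        data = [rng.randint(0, 1) for _ in range(4)]
        stored = hamming74_encode(data)
        for i in range(7):
            drv = rng.gauss(0.10, 0.02) + rng.expovariate(1 / 0.01)  # assumed
            if drv > v_standby:
                stored[i] ^= 1  # retention failure for this cell
        if hamming74_decode(stored) == data:
            ok += 1
    return ok / n_blocks
```

For example, `simulate_retention(0.30)` operates far above the tail of the assumed distribution and retains essentially every word, while lowering `v_standby` toward the distribution's body trades reliability for standby power; sweeping it exposes the same reliability-versus-power trade-off the abstract quantifies.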