Lowering the supply voltage of static random access memories (SRAMs) during standby modes is an effective technique for reducing their leakage power consumption. To maximize leakage reductions, it is desirable to reduce the supply voltage as much as possible. SRAM cells can retain their data down to a certain voltage, called the data-retention voltage (DRV). Due to intra-die variations in process parameters, the DRVs of cells differ within a single memory die. Hence, the minimum standby voltage applicable to a memory die (VDDLmin) is determined by the maximum DRV among its constituent cells. On the other hand, inter-die variations result in a die-to-die variation of VDDLmin. Applying an identical standby voltage to all dies, regardless of their corresponding VDDLmin, can result in the failure of some dies due to data-retention failures (DRFs), entailing yield losses. In this work, we first show that the yield losses can be significant if the standby voltage of SRAMs is reduced aggressively. Then, we propose a post-silicon standby voltage tuning scheme that avoids the yield losses due to DRFs while reducing the leakage currents effectively. Simulation results in a 45-nm predictive technology show that tuning the standby voltage of SRAMs can enhance data-retention yield by 10%-50%.
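The relationship the abstract describes (per-die VDDLmin as the worst-case DRV over cells, and yield loss when one fixed standby voltage is applied to all dies) can be sketched with a toy Monte Carlo model. All numbers below (die count, cells per die, DRV means and spreads, the fixed voltage, the tuning margin) are illustrative assumptions, not values from the paper.

```python
import random

random.seed(0)

N_DIES = 1000         # hypothetical number of memory dies
CELLS_PER_DIE = 4096  # hypothetical number of cells sampled per die

def die_vddlmin():
    """VDDLmin of one die: the maximum DRV among its cells.

    Inter-die variation shifts the die's mean DRV; intra-die variation
    spreads individual cell DRVs around that mean. Distributions and
    sigma values are purely illustrative.
    """
    die_mean = random.gauss(0.30, 0.02)  # inter-die variation (volts)
    drvs = (random.gauss(die_mean, 0.03) for _ in range(CELLS_PER_DIE))
    return max(drvs)  # worst cell sets the die's minimum standby voltage

vddlmins = [die_vddlmin() for _ in range(N_DIES)]

# One aggressive standby voltage for all dies: any die whose VDDLmin
# exceeds it suffers data-retention failures (DRFs) -> yield loss.
V_FIXED = 0.40
yield_fixed = sum(v <= V_FIXED for v in vddlmins) / N_DIES

# Post-silicon tuning (sketch): each die receives a standby voltage just
# above its own VDDLmin, so every die retains data while leakage is kept
# close to the minimum each die can tolerate.
MARGIN = 0.01
tuned = [v + MARGIN for v in vddlmins]
yield_tuned = sum(vt >= v for vt, v in zip(tuned, vddlmins)) / N_DIES

print(f"fixed-voltage retention yield: {yield_fixed:.1%}")
print(f"tuned-voltage retention yield: {yield_tuned:.1%}")
```

Under this toy model, the fixed-voltage yield depends on how aggressive V_FIXED is relative to the VDDLmin distribution, while the tuned scheme retains data on every die by construction; the paper's actual 10%-50% yield-improvement figures come from its 45-nm simulations, not this sketch.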