SRAM leakage power dominates the total power of low duty-cycle applications such as sensor nodes. Accordingly, leakage power during data retention in SRAM standby is commonly reduced by lowering the supply voltage. Each SRAM cell has a minimum supply voltage, the data-retention voltage ($DRV$), above which the stored bit is retained reliably. The $DRV$ exhibits significant intra-chip variation in deep sub-micron technologies. As the supply voltage is lowered, leakage power decreases, but a larger fraction of SRAM cells becomes prone to retention failures. The use of appropriate error-correction coding to compensate for these cell failures is proposed. With this approach, the standby supply voltage is selected to minimize the leakage power per useful bit. The fundamental limits on the leakage power per useful bit, accounting for the $DRV$ distribution, are established. Minimizing power per bit results in a supply voltage at which a small fraction of cells fails to retain data. For experimental $DRV$ distributions, a $[31, 26, 3]$ Hamming-code-based implementation achieves a significant portion of the leakage power reduction allowed by the fundamental limit. These analytical results are verified using measurements from twenty-four experimental chips manufactured in an industrial 90nm CMOS process.
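The voltage selection can be sketched as follows: for a candidate standby voltage, the $DRV$ distribution gives the per-cell failure probability, the code determines how many data bits survive per codeword, and the ratio of codeword leakage to surviving data bits is minimized over voltage. The minimal sketch below is illustrative only; the Gaussian $DRV$ model, its mean and standard deviation, and the linear leakage-versus-voltage model are assumptions, not parameters or results from this work.

```python
# Hypothetical sketch: choose the standby supply voltage that minimizes
# leakage power per useful bit, assuming a Gaussian intra-chip DRV
# distribution and a [31, 26, 3] Hamming code (corrects one failed cell
# per 31-cell codeword). All numeric parameters are illustrative.
import math

N, K = 31, 26                       # [n, k] of the Hamming code
DRV_MEAN, DRV_SIGMA = 0.18, 0.02    # assumed DRV mean / std-dev in volts
P_LEAK_COEFF = 1.0                  # assumed leakage ~ proportional to V (arbitrary units)

def cell_fail_prob(v):
    """P(DRV > v): probability that a cell cannot retain its bit at supply v."""
    return 0.5 * math.erfc((v - DRV_MEAN) / (DRV_SIGMA * math.sqrt(2.0)))

def word_success_prob(p):
    """A codeword is recovered if at most one of its 31 cells fails."""
    return (1.0 - p) ** N + N * p * (1.0 - p) ** (N - 1)

def power_per_useful_bit(v):
    """Leakage of 31 cells divided by the expected number of retained data bits."""
    p = cell_fail_prob(v)
    useful_bits = K * word_success_prob(p)   # data bits recovered per codeword
    return (N * P_LEAK_COEFF * v) / useful_bits

# Sweep the standby voltage and pick the minimum of power per useful bit.
candidates = [0.10 + 0.001 * i for i in range(400)]   # 0.10 V .. 0.50 V
v_opt = min(candidates, key=power_per_useful_bit)
print(f"optimal standby voltage = {v_opt:.3f} V, "
      f"cell failure rate = {cell_fail_prob(v_opt):.2e}")
```

As the abstract notes, the resulting optimum typically sits at a voltage where a small but nonzero fraction of cells fails, with the error-correcting code recovering the affected data bits.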