Sphere decoding (SD) allows the high-dimensional MIMO maximum likelihood detection problem to be solved with significantly lower complexity than other methods. The SD algorithm has, however, mostly been analyzed with DSP implementations in mind. We show that VLSI implementations call for new performance metrics, analyze the resulting implementation tradeoffs for the decoding of complex signal constellations, and develop design guidelines and a generic architecture. When the ℓ∞-norm is used for the sphere constraint instead of the ℓ2-norm, significant reductions in circuit complexity and improvements in tree-pruning efficiency are possible at a minimal performance penalty. As a proof of concept, a high-performance ASIC implementation is presented.
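To make the ℓ∞- vs ℓ2-norm tradeoff concrete, the following is a minimal sketch of a depth-first sphere decoder over a small real-valued constellation, with the partial metric selectable between the two norms. All names (`sphere_decode`, the `norm` parameter) are illustrative and not from the paper; the real design operates on complex constellations in hardware, whereas this sketch only mirrors the tree search and pruning logic in software.

```python
def sphere_decode(R, y, symbols, norm="l2"):
    """Search for s minimizing the norm of (y - R s), with R upper triangular
    (e.g. obtained from a QR decomposition of the channel matrix).

    norm="l2"  accumulates squared branch distances (exact ML metric).
    norm="inf" tracks the maximum branch distance (the l-infinity constraint),
    which needs no squaring/accumulation in hardware and prunes more aggressively.

    Returns (best_metric, best_symbol_vector, nodes_visited).
    """
    n = len(R)
    best = {"metric": float("inf"), "s": None, "visits": 0}
    s = [0.0] * n

    def dfs(level, partial):
        best["visits"] += 1  # count tree nodes to compare pruning efficiency
        if level < 0:        # reached a leaf: a complete candidate vector
            if partial < best["metric"]:
                best["metric"], best["s"] = partial, s[:]
            return
        for sym in symbols:
            s[level] = sym
            # branch distance at this tree level
            b = y[level] - sum(R[level][j] * s[j] for j in range(level, n))
            if norm == "l2":
                m = partial + b * b          # squared Euclidean partial distance
            else:
                m = max(partial, abs(b))     # l-infinity partial metric
            if m < best["metric"]:           # prune subtrees outside the sphere
                dfs(level - 1, m)

    dfs(n - 1, 0.0)
    return best["metric"], best["s"], best["visits"]
```

With the ℓ∞ metric, each node needs only a comparison instead of a multiply-accumulate, which is the source of the circuit-complexity savings; the pruning condition also tightens earlier in the tree, at the cost of a slightly suboptimal detection metric.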