A near-optimal signal detection algorithm with O(K log K) complexity is proposed for K-input, K-output linear Gaussian vector channels. The algorithm searches for a maximum-likelihood monotone sequence under a ranking of the sufficient statistics. It is proved that the algorithm achieves the optimal detection result when all cross-correlation values in the linear Gaussian vector channel are identical. Simulation results are also provided for the case of unequal cross-correlation values; they show that the algorithm's performance degrades as the cross-correlation values diverge. Finally, a method of modifying the correlation matrix is suggested by way of an example: a transformation is derived that reduces the divergence of the cross-correlation values. A simulation result shows that the transformation further improves the performance of the proposed algorithm.