
Linear Regression With Gaussian Model Uncertainty: Algorithms and Bounds

Authors: Wiesel, A. (Dept. of Electr. Eng. & Comput. Sci., Univ. of Michigan, Ann Arbor, MI); Eldar, Y.C.; Yeredor, A.

In this paper, we consider the problem of estimating an unknown deterministic parameter vector in a linear regression model with random Gaussian uncertainty in the mixing matrix. We prove that the maximum-likelihood (ML) estimator is a (de)regularized least squares estimator and develop three alternative approaches for finding the regularization parameter that maximizes the likelihood. We analyze the performance using the Cramér-Rao bound (CRB) on the mean squared error, and show that the degradation in performance due to the uncertainty is not as severe as might be expected. Next, we address the problem again, this time assuming that the variances of the noise and of the elements in the model matrix are unknown, and derive the associated CRB and ML estimator. We compare our methods with known results on linear regression in the errors-in-variables (EIV) model. We discuss the similarity between these two competing approaches and provide a thorough comparison that sheds light on their theoretical and practical differences.
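
The estimator structure described in the abstract lends itself to a small numerical illustration. The sketch below is not the authors' implementation; it assumes the model y = (H + E)x + w, with E having i.i.d. zero-mean Gaussian entries of known variance sigma_e^2 and noise variance sigma_w^2, so that for a candidate x the negative log-likelihood is proportional to n*log(sigma_w^2 + sigma_e^2*||x||^2) + ||y - Hx||^2 / (sigma_w^2 + sigma_e^2*||x||^2), and for a fixed effective variance the minimizer is a regularized least squares solution. A plain 1-D grid search over a non-negative regularization parameter stands in for the paper's three approaches (the paper's "(de)regularization" also admits negative values).

import numpy as np

def ml_estimate(H, y, sigma_w2, sigma_e2, lambdas=np.logspace(-6, 2, 200)):
    """Illustrative sketch: scan a 1-D grid of regularization parameters,
    solve the regularized least squares problem for each, and keep the
    candidate with the smallest negative log-likelihood under the assumed
    Gaussian model-uncertainty model."""
    n = len(y)
    HtH, Hty = H.T @ H, H.T @ y
    best_x, best_nll = None, np.inf
    for lam in lambdas:
        x = np.linalg.solve(HtH + lam * np.eye(H.shape[1]), Hty)
        s2 = sigma_w2 + sigma_e2 * (x @ x)          # effective noise variance for this x
        nll = n * np.log(s2) + np.sum((y - H @ x) ** 2) / s2
        if nll < best_nll:
            best_x, best_nll = x, nll
    return best_x

# Usage example on synthetic data, comparing with plain least squares.
rng = np.random.default_rng(0)
n, m = 100, 5
H = rng.standard_normal((n, m))
x_true = rng.standard_normal(m)
sigma_e2, sigma_w2 = 0.1, 0.5
A = H + np.sqrt(sigma_e2) * rng.standard_normal((n, m))   # perturbed mixing matrix
y = A @ x_true + np.sqrt(sigma_w2) * rng.standard_normal(n)
x_ls = np.linalg.lstsq(H, y, rcond=None)[0]
x_ml = ml_estimate(H, y, sigma_w2, sigma_e2)
print("LS error :", np.linalg.norm(x_ls - x_true))
print("ML error :", np.linalg.norm(x_ml - x_true))

The grid search here is only a stand-in; the paper develops more principled ways to select the regularization parameter that maximizes the likelihood.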

Published in:

IEEE Transactions on Signal Processing (Volume: 56, Issue: 6)