Robust least-squares estimation with a relative entropy constraint


Authors: B. C. Levy (Dept. of Electrical & Computer Engineering, University of California, Davis, CA, USA) and R. Nikoukhah

Abstract: Given a nominal statistical model, we consider the minimax estimation problem of finding the best least-squares estimator for the least favorable statistical model within a neighborhood of the nominal model. The neighborhood is formed by placing a bound on the Kullback-Leibler (KL) divergence between the actual and nominal models. For a Gaussian nominal model and a finite observation interval, or for a stationary Gaussian process over an infinite interval, the usual noncausal Wiener filter remains optimal; however, the worst-case performance of the filter depends on the size of the neighborhood representing the model uncertainty. In contrast, standard causal least-squares estimators are not optimal, and a characterization is provided for the causal estimator and the corresponding least favorable model. The causal estimator takes the form of a risk-sensitive estimator with an appropriately selected risk-sensitivity coefficient.
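As a worked sketch of the setup the abstract describes, the displays below state the KL-constrained minimax criterion and the exponential-of-quadratic (risk-sensitive) criterion that, per the abstract, characterizes the causal solution. The notation (nominal density f_0, divergence bound c, risk parameter theta) is ours for illustration and is not taken from the paper itself:

\min_{\hat{x}(\cdot)} \; \max_{f:\, D(f \,\|\, f_0) \le c} \; \mathbb{E}_f\!\left[ \| x - \hat{x}(y) \|^2 \right], \qquad D(f \,\|\, f_0) = \int f \,\ln\frac{f}{f_0}\, d\mu ,

where the inner maximization ranges over all models f in the KL ball of radius c around the nominal model f_0. A risk-sensitive estimator, in the standard exponential-of-quadratic sense, minimizes

\hat{x}_{\mathrm{RS}} = \arg\min_{\hat{x}} \; \frac{2}{\theta}\,\ln \mathbb{E}_{f_0}\!\left[ \exp\!\left( \frac{\theta}{2}\, \| x - \hat{x}(y) \|^2 \right) \right],

with the risk-sensitivity coefficient theta > 0 tied, in the paper's result, to the size c of the uncertainty neighborhood.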

Published in: IEEE Transactions on Information Theory (Volume: 50, Issue: 1)