
Robust Nonlinear Distributed Estimation Using Maximum Correntropy


Abstract:

With the development of information-theoretic learning, the maximum correntropy criterion (MCC) has shown its utility in approximating non-Gaussian information. The MCC has been applied in Gaussian filters to provide robust estimation in non-Gaussian environments, and its extension to the information form enables robust distributed estimation. In this paper, a new MCC-based diffusion information filter is developed for distributed multi-sensor estimation. Non-Gaussianity due to nonlinear dynamics and measurements is accounted for by incorporating both the state estimation error and the measurement uncertainty into the correntropy. A numerical example demonstrates the effectiveness of the proposed MCC-based diffusion information filter.
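
For readers unfamiliar with the criterion, the standard definitions from the information-theoretic learning literature are recalled below; the Gaussian kernel and its bandwidth \sigma are generic choices commonly used with the MCC, not values specific to this paper.

V(X, Y) = \mathrm{E}\!\left[\kappa_\sigma(X - Y)\right], \qquad
\kappa_\sigma(e) = \exp\!\left(-\frac{e^2}{2\sigma^2}\right), \qquad
\hat{V} = \frac{1}{N}\sum_{i=1}^{N} \kappa_\sigma(x_i - y_i).

The MCC selects the estimate that maximizes the (sample) correntropy of the estimation errors; because the Gaussian kernel saturates for large errors, outliers are down-weighted relative to the quadratic minimum-mean-square-error cost.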
Date of Conference: 10-12 July 2019
Date Added to IEEE Xplore: 29 August 2019
Conference Location: Philadelphia, PA, USA

I. Introduction

Estimation is of great importance to many engineering and science fields such as tracking, communication, navigation, robotics, and finance. The Kalman filter (KF) [1] is the most widely used optimal recursive estimator based on the minimum mean square error for linear Gaussian systems. The extended Kalman filter (EKF) [1], the unscented Kalman filter (UKF) [2], the Gauss-Hermite quadrature filter [3], the sparse-grid quadrature filter [4], and others extend the capability of the KF to nonlinear estimation problems. Nevertheless, these Gaussian filters may degrade under highly non-Gaussian uncertainties. Research has been conducted on the design of filters that are robust to non-Gaussian problems, such as the Gaussian mixture filter (GMF) [5] and the particle filter (PF) [6], [7]. The former approximates the probability density function (pdf) with a mixture of Gaussian distributions, while the latter applies Monte Carlo (MC) sampling for estimation. Both methods suffer from the curse of dimensionality and incur excessive computational load. A pdf correction method in [8] applies the Edgeworth expansion to better approximate a non-Gaussian pdf by modifying the weights of quadrature point-based filters; however, it requires the computation of higher-order moments for multi-dimensional problems.
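
For context only (not part of the paper), the following is a minimal sketch of the standard linear KF predict/update recursion referenced above; the constant-velocity model and the matrices F, H, Q, R used here are illustrative assumptions.

# Minimal sketch of the linear Kalman filter recursion (predict/update),
# the baseline recursive MMSE estimator for linear Gaussian systems.
# The model below is an illustrative assumption, not taken from the paper.
import numpy as np

def kf_predict(x, P, F, Q):
    """Time update: propagate the mean and covariance through the linear dynamics."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    """Measurement update: correct the prediction with the new observation z."""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred
    return x, P

if __name__ == "__main__":
    # Illustrative 1-D constant-velocity model: state = [position, velocity]
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])            # position-only measurement
    Q = 0.01 * np.eye(2)
    R = np.array([[0.5]])
    x, P = np.zeros(2), np.eye(2)
    rng = np.random.default_rng(0)
    for k in range(5):
        x, P = kf_predict(x, P, F, Q)
        z = H @ x + rng.normal(0.0, np.sqrt(R[0, 0]), size=1)
        x, P = kf_update(x, P, z, H, R)
    print("final state estimate:", x)

Under heavy-tailed (non-Gaussian) measurement noise, the quadratic update above is sensitive to outliers, which is the degradation the MCC-based filters discussed in this paper aim to mitigate.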
