The choice of the number of parameters is an important problem in adaptive filtering. In the general framework of maximum likelihood (ML) parameter estimation, the Akaike criterion has proved to be a very useful tool. Significant results are discussed, along with the importance of the Kullback information measure as the implicit measure of distance associated with ML estimation. Application of these results to adaptive filtering is then considered for time-invariant and time-varying parametric models, and the corresponding practical forms of the Akaike criterion are given. In the time-varying case, the criterion may be used as a way to fit the model to the nonstationarity of an exact filter. A possible generalization of the Akaike criterion to non-ML estimation is discussed and applied to weighted least squares estimation. A more general approach to the nonstationary case is obtained by assuming a linear stochastic model for the vector of filter coefficients, in which case the ML estimate is computed recursively using the Kalman filtering equations. The random sequence of Kalman gains is then close to a deterministic average sequence; this sequence may be precomputed and gives an optimal choice for the gain matrices in a gradient-type adaptive filter.
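The abstract does not give the paper's experimental setup, but the role of the Akaike criterion in choosing the number of parameters can be illustrated with a minimal sketch: fit autoregressive models of increasing order p to simulated data by least squares, and select the order minimizing AIC = N log(σ̂²) + 2p (Gaussian innovations). The AR(2) process, the sample size, and the candidate orders below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(2) process: x[n] = 1.5 x[n-1] - 0.7 x[n-2] + e[n]
# (illustrative choice; the true model order is 2).
N = 2000
x = np.zeros(N)
for n in range(2, N):
    x[n] = 1.5 * x[n - 1] - 0.7 * x[n - 2] + rng.standard_normal()

def fit_ar_ls(x, p):
    """Least-squares fit of an AR(p) model; returns the residual variance."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return np.mean(resid ** 2)

def aic(x, p):
    """Akaike criterion for an AR(p) fit: N log(sigma^2_hat) + 2 p."""
    n_eff = len(x) - p
    return n_eff * np.log(fit_ar_ls(x, p)) + 2 * p

orders = list(range(1, 8))
scores = [aic(x, p) for p in orders]
best = orders[int(np.argmin(scores))]
print("AIC-selected order:", best)
```

The 2p penalty is what keeps the criterion from always preferring larger models: the residual variance never increases with p, so without the penalty the highest candidate order would always win.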
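The closing idea, that the random Kalman-gain sequence is close to a deterministic average sequence which can be precomputed and used in a gradient-type filter, can also be sketched. Assuming (hypothetically, not from the paper) a random-walk model for the coefficient vector and white Gaussian regressors, the averaged Riccati recursion replaces the random outer product φφᵀ by its expectation (the identity here), giving a deterministic scalar gain sequence μ[n]:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: a 2-tap filter whose coefficient vector theta follows a
# random walk, observed through y[n] = phi[n]^T theta[n] + v[n].
d, N = 2, 5000
q, r = 1e-4, 0.1            # state-noise and observation-noise variances

# Averaged Riccati recursion: with E[phi phi^T] = I the covariance stays a
# multiple of the identity, P = p I, and the gain sequence is deterministic.
p = 1.0
mu = np.empty(N)
for n in range(N):
    p = p + q                    # prediction step
    mu[n] = p / (p * d + r)      # averaged Kalman gain
    p = p - mu[n] * p * d        # averaged measurement update

# Gradient-type adaptive filter driven by the precomputed gain sequence.
theta = np.zeros(d)              # true (drifting) coefficients
theta_hat = np.zeros(d)          # adaptive estimate
err = np.empty(N)
for n in range(N):
    phi = rng.standard_normal(d)
    y = phi @ theta + np.sqrt(r) * rng.standard_normal()
    theta_hat += mu[n] * phi * (y - phi @ theta_hat)
    err[n] = np.sum((theta_hat - theta) ** 2)
    theta += np.sqrt(q) * rng.standard_normal(d)   # coefficients drift
```

The gain sequence decays from its large initial value to a positive steady-state level set by the ratio of state noise to observation noise, which is what lets the filter keep tracking the nonstationary coefficients instead of freezing as a decreasing-gain algorithm would.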