A general consistency theorem for stationary nonlinear prediction error estimators is presented. Since this theorem does not require the existence of a parameterized system generating the observations, it applies to the practical problem of modeling complex systems with simple parameterized models. In order to measure the quality of fit between a set of observed processes and a given candidate set of predictors, the notion of predictor set completeness is introduced. Several examples are given to illustrate this idea; in particular, a negative result concerning the completeness of certain sets of linear predictors is presented. The relationship of Ljung's definitions of identifiability to various notions of predictor set completeness is examined, and the strong consistency of maximum likelihood estimators for Gaussian autoregressive moving average systems is obtained via an application of our techniques. Finally, problems for future research are described.