In this paper, a prediction method for nonlinear time series based on a set membership (SM) approach is proposed. The method does not require the choice of a functional form for the prediction model, but only assumes a bound on the rate of variation of the regression function defining the model. On the contrary, most existing prediction methods require choosing a functional form for the regression function or the state equations (piecewise linear, quadratic, etc.), and this choice is usually the result of heuristic searches. These searches may be quite time consuming and lead only to approximate model structures, whose errors may be responsible for poor propagation of prediction errors, especially in multi-step-ahead prediction. Moreover, the method proposed in this paper assumes only that the noise is bounded, in contrast with statistical approaches, which rely on noise assumptions such as stationarity, ergodicity, uncorrelatedness, type of distribution, etc. The validity of these assumptions may be difficult to test reliably in many applications and is certainly lost in the presence of approximate modeling. In the present SM approach, using a result developed in a previous paper, the values of the bounds on the gradient of the regression function and on the noise can be suitably assessed so as to pass the validity tests. Two almost optimal prediction algorithms are then derived, the second having improved optimality properties over the first at the expense of increased computational complexity. The method is tested and compared with other methods from the literature on the well-known Wolf Sunspot Numbers series, widely used in the time series literature as a benchmark, and on the prediction of the vertical dynamics of vehicles with controlled suspensions.
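To convey the flavor of the bounds involved, the following is a minimal sketch of a central set membership estimator of the kind used in this line of work (the function name, the choice of Euclidean norm, and the interface are our assumptions, not the paper's exact algorithms). Given samples (x_k, y_k) with y_k = f(x_k) + e_k, a gradient bound ‖∇f‖ ≤ γ, and a noise bound |e_k| ≤ ε, the tightest upper and lower bounds on f(x) consistent with the data are available in closed form, and their midpoint serves as the prediction:

```python
import numpy as np

def sm_central_predict(X, y, x, gamma, eps):
    """Central set membership estimate of f(x) (illustrative sketch).

    X: (N, d) regressor samples; y: (N,) noisy observations;
    gamma: bound on the gradient norm of f; eps: noise bound.
    Returns (central prediction, lower bound, upper bound).
    """
    dists = np.linalg.norm(X - x, axis=1)    # ||x - x_k|| for each sample
    upper = np.min(y + eps + gamma * dists)  # tightest upper bound on f(x)
    lower = np.max(y - eps - gamma * dists)  # tightest lower bound on f(x)
    return 0.5 * (upper + lower), lower, upper
```

The half-width of the interval [lower, upper] directly bounds the prediction error, which is what makes the central estimate almost optimal in the SM sense.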
A simulation example is also presented to investigate how conservative the SM approach may be in the most adverse situation, where the data are generated by a linear autoregressive (AR) model driven by i.i.d. Gaussian white noise, and the SM prediction is compared with the optimal statistical predictor, which exploits exact knowledge of these assumptions.
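A miniature version of such a comparison can be set up as follows; all numerical choices (the AR(1) coefficient, the noise level, and the bounds γ and ε) are illustrative assumptions, not the paper's simulation settings:

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma, N = 0.8, 0.1, 400            # AR(1) coefficient and noise std (assumed)
y = np.zeros(N)
for t in range(1, N):                  # y_t = a*y_{t-1} + e_t, e_t ~ N(0, sigma^2)
    y[t] = a * y[t - 1] + sigma * rng.standard_normal()

split = N // 2
x_tr, y_tr = y[:split - 1], y[1:split]  # training pairs (y_t, y_{t+1})

def sm_central(x, gamma=1.0, eps=3 * sigma):
    """SM central one-step predictor: uses only the (conservative) bounds."""
    d = np.abs(x_tr - x)
    upper = np.min(y_tr + eps + gamma * d)
    lower = np.max(y_tr - eps - gamma * d)
    return 0.5 * (upper + lower)

test_t = range(split, N - 1)
opt_err = np.array([y[t + 1] - a * y[t] for t in test_t])         # knows the true model
sm_err = np.array([y[t + 1] - sm_central(y[t]) for t in test_t])  # knows only the bounds
print("optimal predictor RMSE:", np.sqrt(np.mean(opt_err**2)))
print("SM predictor RMSE:     ", np.sqrt(np.mean(sm_err**2)))
```

The gap between the two RMSE values quantifies the conservatism incurred by trading the exact statistical assumptions for the weaker bounded-noise and bounded-gradient ones.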