In this correspondence, we propose two hypothesis tests (HT) for nonlinearity detection. These tests are based on a moving average (MA) model, which allows us to model signals whose spectral density takes null values [unlike the autoregressive (AR) models used in the usual parametric tests for nonlinearity detection]. The resulting indexes are evaluated on simulated linear and nonlinear time series. Their performance and drawbacks are discussed with respect to the robustness of the indexes and to the difference between the theoretical and estimated distributions under the linearity hypothesis.
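The MA-versus-AR distinction can be made concrete: a finite-order MA spectrum is a polynomial in exp(-2i*pi*f) and can therefore vanish at some frequency, whereas a stable finite-order AR spectrum is the reciprocal of such a polynomial and is strictly positive. The following sketch (an illustration of this spectral property only, not the paper's proposed tests) compares an MA(1) and an AR(1) spectral density:

```python
import numpy as np

def ma1_spectrum(f, theta, sigma2=1.0):
    """Spectral density of x_t = e_t + theta*e_{t-1}:
    S(f) = sigma^2 * |1 + theta*exp(-2i*pi*f)|^2."""
    return sigma2 * np.abs(1 + theta * np.exp(-2j * np.pi * f)) ** 2

def ar1_spectrum(f, phi, sigma2=1.0):
    """Spectral density of x_t = phi*x_{t-1} + e_t (|phi| < 1):
    S(f) = sigma^2 / |1 - phi*exp(-2i*pi*f)|^2."""
    return sigma2 / np.abs(1 - phi * np.exp(-2j * np.pi * f)) ** 2

f = np.linspace(0.0, 0.5, 501)
s_ma = ma1_spectrum(f, theta=1.0)  # null at f = 0.5: |1 + e^{-i*pi}|^2 = 0
s_ar = ar1_spectrum(f, phi=0.9)    # strictly positive for all f

print(f"MA(1) minimum spectral value: {s_ma.min():.3e}")  # essentially zero
print(f"AR(1) minimum spectral value: {s_ar.min():.3e}")  # bounded away from zero
```

The parameter choices (theta = 1, phi = 0.9) are illustrative; any stable AR model shares the strictly positive spectrum, which is why AR-based parametric tests cannot represent signals with spectral nulls.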