A new approach for measuring the predictability of a process is proposed. The predictor is defined as the median of the distribution conditioned on a sequence of L-1 previous samples (i.e., a pattern). A function referred to as the corrected mean squared predictor error is defined to prevent perfect adaptation to the data (i.e., the decrease of the prediction error to zero), thus avoiding the need to divide the data into learning and test sets. This function exhibits a minimum, and this minimum is taken as a measure of the predictability of the series. The minimization procedure avoids fixing the pattern length L a priori. This approach permits a reliable measurement of predictability on short data sequences (around 300 samples). Moreover, this method, in connection with a surrogate data approach, is useful for detecting nonlinear dynamics. The analysis indicates that, in both simulated and real data, predictability and nonlinearity measures provide different information. The application of this approach to the analysis of cardiovascular variability series of the heart period (RR interval) and systolic arterial pressure (SAP) shows that: (1) the SAP series is more predictable than the RR interval series; (2) predictability of the RR interval series is larger during tilt, during controlled respiration at 10 breaths/min (bpm), and after high-dose administration of atropine; (3) the SAP series is dominated by linear correlations; (4) the RR interval series exhibits nonlinear dynamics during controlled respiration at 10 bpm and after low-dose administration of atropine, while it is linear during sympathetic activation produced by tilt and after peripheral parasympathetic blockade caused by high-dose administration of atropine.
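The median-based predictor and corrected error described above can be sketched roughly as follows. The abstract does not specify how continuous samples are grouped into patterns or the exact form of the correction term, so this sketch makes two labeled assumptions: patterns are formed by coarse-graining the series into a small number of amplitude bins, and the correction adds a penalty proportional to the fraction of samples predicted from patterns seen only once, scaled by the series variance. The function names and parameters (`corrected_mse`, `n_bins`) are illustrative, not the authors'.

```python
import numpy as np

def corrected_mse(x, L, n_bins=6):
    """Corrected mean squared predictor error for pattern length L.

    Assumed scheme (not from the paper): samples are quantized into
    n_bins amplitude levels to define patterns of L-1 previous symbols;
    the predictor for each pattern is the median of the samples that
    follow it in the series.
    """
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    sym = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)

    # Group each target sample by its pattern of L-1 previous symbols.
    groups = {}
    for i in range(L - 1, len(x)):
        key = tuple(sym[i - L + 1:i])
        groups.setdefault(key, []).append(x[i])

    sq_err, n, singles = 0.0, 0, 0
    for vals in groups.values():
        med = np.median(vals)          # median-based predictor
        sq_err += sum((v - med) ** 2 for v in vals)
        n += len(vals)
        if len(vals) == 1:
            singles += 1               # singleton pattern: error trivially zero

    mse = sq_err / n
    # Assumed corrective term: penalize singleton patterns with the series
    # variance so the error cannot collapse to zero as L grows.
    return mse + (singles / n) * np.var(x)

def predictability_index(x, max_L=8):
    """Minimum of the corrected error over L (lower = more predictable)."""
    return min(corrected_mse(x, L) for L in range(1, max_L + 1))
```

On a short series (about 300 samples), scanning L and taking the minimum replaces an a-priori choice of pattern length: for a regular signal the error stays low at moderate L, while for white noise the singleton penalty drives the corrected error back up as L increases.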