Order-selection criteria for vector autoregressive (AR) modeling are discussed. The performance of an order-selection criterion is optimal if the model of the selected order is the most accurate model in the considered set of estimated models: here, vector AR models. Suboptimal performance results from either underfit or overfit. The Akaike (1969) information criterion (AIC) is an asymptotically unbiased estimator of the Kullback-Leibler discrepancy (KLD) that can be used as an order-selection criterion. AIC is known to suffer from overfit: the selected model order can be greater than the optimal model order. Overfit has two causes: finite sample effects and asymptotic effects. As a consequence of finite sample effects, AIC underestimates the KLD for higher model orders, leading to overfit. Asymptotically, overfit is the result of statistical variations in the order-selection criterion. To derive an accurate order-selection criterion, both causes of overfit have to be addressed. Moreover, the cost of underfit has to be taken into account. The combined information criterion (CIC) for vector signals is robust to finite sample effects and has the optimal asymptotic penalty factor. This penalty factor results from a tradeoff between the costs of underfit and overfit, and it depends on the number of estimated parameters per model order. The CIC is compared to other criteria such as the AIC, the corrected Akaike information criterion (AICc), and the consistent minimum description length (MDL).
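As a concrete illustration of penalized order selection, the following sketch fits scalar AR models of increasing order by least squares and selects the order minimizing AIC, N·ln(σ̂²) + 2p, where the penalty factor is 2. This is a minimal, hypothetical example using NumPy, not the paper's CIC or its vector-signal setting; the simulated AR(2) process and all parameter choices are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a scalar AR(2) process: x[t] = 1.5 x[t-1] - 0.7 x[t-2] + e[t]
# (hypothetical example process, not from the paper)
N = 500
x = np.zeros(N)
e = rng.standard_normal(N)
for t in range(2, N):
    x[t] = 1.5 * x[t - 1] - 0.7 * x[t - 2] + e[t]

def ar_residual_variance(x, p):
    """Least-squares fit of an AR(p) model; returns residual variance."""
    if p == 0:
        return float(np.var(x))
    # Regression matrix of lagged samples: column k holds x[t-(k+1)]
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return float(np.mean(resid ** 2))

max_order = 10
aic = []
for p in range(max_order + 1):
    s2 = ar_residual_variance(x, p)
    # AIC with asymptotic penalty factor 2 per estimated parameter
    aic.append(N * np.log(s2) + 2 * p)

best = int(np.argmin(aic))
print("selected order:", best)
```

With a well-separated AR(2) process the selected order is usually correct or slightly too high; the occasional overshoot is exactly the overfit behavior of AIC that the abstract describes, which a larger (or finite-sample-corrected) penalty factor suppresses.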