
Regularized least square support vector machines for order and structure selection of LPV-ARX models


Abstract:

Least Squares Support Vector Machine (LS-SVM) is a computationally efficient kernel-based regression approach which has recently been applied to non-parametric identification of Linear Parameter-Varying (LPV) systems. In contrast to parametric LPV identification approaches, LS-SVM-based methods obviate the need to parameterize the scheduling dependence of the LPV model coefficients in terms of a priori specified basis functions. However, accurate selection of the underlying model order (in terms of the number of input lags, output lags and input delay) remains a critical issue in the identification of LPV systems in the LS-SVM setting. In this paper, we address this issue by extending the LS-SVM method to sparse LPV model identification which, besides non-parametric estimation of the model coefficients, achieves data-driven model order selection via convex optimization. The main idea of the proposed method is to first estimate the coefficients of an over-parameterized LPV model through LS-SVM. The estimated coefficients are then scaled by polynomial weights, which are shrunk towards zero to enforce sparsity in the final LPV model estimate. The properties of the proposed approach are illustrated via simulation examples.
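
The sketch below is a rough, minimal illustration of the two-step idea described in the abstract, not the authors' implementation. It assumes an RBF kernel on the scheduling signal, a plain kernel-ridge (LS-SVM without bias term) fit of an over-parameterized LPV-ARX model, and it replaces the paper's polynomial weights with scalar per-lag weights shrunk by a simple L1 (ISTA) step. All names and parameter values (rbf_kernel, na_max, nb_max, gamma, lam, ...) are illustrative assumptions.

import numpy as np

def rbf_kernel(p, q, width=0.5):
    # RBF kernel matrix between scheduling points p (N,) and q (M,) -- assumed choice
    d = p[:, None] - q[None, :]
    return np.exp(-(d ** 2) / (2.0 * width ** 2))

def lpv_lssvm_sparse(u, y, p, na_max=5, nb_max=5, gamma=1e3, lam=0.1, n_ista=500):
    # Two-step sketch: (1) LS-SVM fit of an over-parameterized LPV-ARX model,
    # (2) L1-shrunk per-lag weights to select the model order/structure.
    N = len(y)
    k0 = max(na_max, nb_max)
    idx = np.arange(k0, N)
    # Over-parameterized regressor signals: lagged outputs, then lagged inputs
    X = [y[idx - i] for i in range(1, na_max + 1)] + \
        [u[idx - j] for j in range(0, nb_max + 1)]
    labels = [f"y(k-{i})" for i in range(1, na_max + 1)] + \
             [f"u(k-{j})" for j in range(0, nb_max + 1)]
    yk, pk = y[idx], p[idx]

    # Step 1: LS-SVM dual solution with one kernel term per lag,
    # Omega[k,l] = sum_i x_i(k) x_i(l) K(p_k, p_l)
    K = rbf_kernel(pk, pk)
    Omega = sum(np.outer(x, x) * K for x in X)
    alpha = np.linalg.solve(Omega + np.eye(len(yk)) / gamma, yk)

    # Per-lag contributions c_i(k) = [sum_l alpha_l x_i(l) K(p_l, p_k)] * x_i(k)
    C = np.column_stack([(K @ (alpha * x)) * x for x in X])

    # Step 2 (simplified): shrink scalar per-lag weights beta with an L1 penalty
    # via ISTA, so weights of irrelevant lags go to zero (the paper uses
    # polynomial weights and a convex program instead)
    beta = np.ones(C.shape[1])
    step = 1.0 / np.linalg.norm(C, 2) ** 2
    for _ in range(n_ista):
        z = beta - step * (C.T @ (C @ beta - yk))
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return dict(zip(labels, beta))

# Toy usage: a first-order LPV system identified from an over-parameterized model
rng = np.random.default_rng(0)
N = 300
u = rng.standard_normal(N)
p = np.sin(np.linspace(0, 4 * np.pi, N))        # scheduling signal
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.5 * p[k] * y[k - 1] + (1.0 + 0.3 * p[k]) * u[k - 1] \
           + 0.01 * rng.standard_normal()
print(lpv_lssvm_sparse(u, y, p, na_max=3, nb_max=3))

In this toy run only the weights attached to y(k-1) and u(k-1) should remain clearly non-zero, which is the order/structure-selection effect the paper targets.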
Date of Conference: 29 June 2016 - 01 July 2016
Date Added to IEEE Xplore: 09 January 2017
Conference Location: Aalborg, Denmark

