In many realistic applications, process noise is known to be neither white nor normally distributed. When identifying models in these cases, it may be more effective to minimize a penalty function other than the standard sum of squared errors (as in a least-squares identification method). This paper investigates model identification based on two alternative penalty functions: the 1-norm of the prediction errors and a Huber-type penalty function. For data characteristic of some realistic applications, identification based on these two penalty functions is shown to yield more accurate parameter estimates than the standard least-squares solution, as well as more accurate model predictions on test data. The identification techniques are demonstrated on a toy problem and on a physiological model of type 1 diabetes.
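The three penalty functions compared in the paper can be illustrated with a minimal sketch. The code below is an assumption-laden toy example (not the paper's model or data): it fits a hypothetical linear model to data corrupted by heavy-tailed Laplace noise, minimizing the sum of squared errors, the 1-norm of the prediction errors, and a Huber-type penalty in turn.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical linear model y = a*x + b with heavy-tailed (Laplace) noise,
# standing in for a process whose noise is neither white nor Gaussian.
a_true, b_true = 2.0, -1.0
x = np.linspace(0.0, 10.0, 200)
y = a_true * x + b_true + rng.laplace(scale=1.0, size=x.size)

def predict(theta):
    return theta[0] * x + theta[1]

def sse(theta):
    # Standard least-squares penalty: sum of squared prediction errors.
    return np.sum((y - predict(theta)) ** 2)

def l1(theta):
    # 1-norm of the prediction errors.
    return np.sum(np.abs(y - predict(theta)))

def huber(theta, delta=1.0):
    # Huber-type penalty: quadratic for small residuals, linear for large ones.
    r = np.abs(y - predict(theta))
    clipped = np.minimum(r, delta)
    return np.sum(0.5 * clipped**2 + delta * (r - clipped))

theta0 = np.zeros(2)
fits = {name: minimize(f, theta0, method="Nelder-Mead").x
        for name, f in [("SSE", sse), ("L1", l1), ("Huber", huber)]}
for name, th in fits.items():
    print(f"{name:5s} a={th[0]:.3f} b={th[1]:.3f}")
```

Under heavy-tailed noise such as this, the 1-norm and Huber fits are less sensitive to outlying residuals than the least-squares fit, which is the qualitative effect the paper exploits.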