The literature dealing with the convergence of the least squares (LS) identification algorithm usually relies on properties of the sequential estimator; e.g., the fact that the sequence of estimates is a martingale process when the noise is an independent sequence has been used to establish convergence. In this paper, emphasis is put on the fact that the least squares estimates are obtained by minimizing a quadratic cost functional. Convergence results are presented for a sequence of random variables obtained by minimizing a parameterized random sequence with respect to its parameter. These results are in turn utilized to establish strong convergence (w.p.1 and in mean square) of the LS procedure under milder conditions than those in previous proofs. Landau's recursive algorithms are shown to be variations of the LS procedure, and thus their convergence is also established. The self-tuning regulator is also discussed, and the importance of the use of the LS procedure in it is demonstrated. The importance of this paper, beyond extending previous convergence results, lies in its approach: utilizing the foundation on which LS procedures are based.
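To make the abstract's central point concrete, the sketch below illustrates (in a hypothetical scalar setting not taken from the paper) how an LS estimate arises as the minimizer of a quadratic cost functional V_N(θ) = Σ_t (y_t − θ u_t)², whose closed-form minimizer is θ̂_N = Σ_t u_t y_t / Σ_t u_t². The model, variable names, and function name are illustrative assumptions, not the paper's notation.

```python
def ls_estimate(u, y):
    """Minimizer of the quadratic cost V_N(theta) = sum_t (y_t - theta*u_t)^2.

    Setting dV_N/dtheta = 0 gives the closed-form LS estimate
    theta_hat = sum(u_t * y_t) / sum(u_t^2).
    (Scalar-parameter illustration only; the paper treats the general case.)
    """
    num = sum(ut * yt for ut, yt in zip(u, y))
    den = sum(ut * ut for ut in u)
    return num / den


# Noise-free data generated by y_t = 0.5 * u_t: the minimizer of the
# quadratic cost recovers the true parameter exactly.
u = [1.0, 2.0, 3.0, 4.0]
y = [0.5 * ut for ut in u]
theta_hat = ls_estimate(u, y)
```

With independent zero-mean noise added to y, θ̂_N is random, and it is the minimizing-a-random-cost viewpoint above, rather than the martingale structure of the recursive estimator, that the paper exploits to prove convergence.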