
2-D system theory based iterative learning control for linear continuous systems with time delays

3 Author(s)
Xiao-Dong Li, T. W. S. Chow, and J. K. L. Ho (Dept. of Manufacturing Engineering & Engineering Management, City University of Hong Kong, Hong Kong, China)

This paper presents two-dimensional (2-D) system theory based iterative learning control (ILC) methods for linear continuous multivariable systems with time delays in the state or in the input. Necessary and sufficient conditions are given for the convergence of the proposed ILC rules. We demonstrate that the 2-D linear continuous-discrete Roesser model can describe the ILC process of linear continuous time-delay systems. Three numerical examples illustrate the effectiveness of the proposed ILC methods.
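The paper's update laws are derived in the 2-D Roesser framework, which is not reproduced here. As a loose illustration of the iteration-domain convergence idea only, the sketch below applies a simple discrete-time P-type ILC law to a hypothetical first-order system with an input delay; every parameter (the system `a`, `b`, the delay `tau`, the gain `L`, the reference) is invented for illustration and is not taken from the paper.

```python
import numpy as np

# Illustrative sketch (NOT the paper's algorithm): P-type ILC on a
# hypothetical first-order system with input delay,
#     x'(t) = a*x(t) + b*u(t - tau),  y = x,
# simulated by forward Euler. The learning gain L is chosen so that
# |1 - L*dt*b| < 1, the usual contraction condition in this simple case.
a, b = -10.0, 1.0
tau = 0.1                      # input delay (s), invented for illustration
dt, T = 0.05, 1.0
n = int(T / dt)                # samples per trial
d = int(round(tau / dt))       # delay in samples
t = np.arange(n) * dt
# Reference that respects the delay: the output cannot move before t = tau.
ref = np.where(t >= tau, np.sin(np.pi * (t - tau)), 0.0)

L = 1.0 / (dt * b)             # makes 1 - L*dt*b = 0 (fast learning)
u = np.zeros(n)
errors = []                    # max tracking error per trial
for k in range(40):            # iteration (trial) axis of the 2-D process
    x = np.zeros(n)            # identical initial state every trial
    for i in range(n - 1):     # time axis: one trial of the plant
        ui = u[i - d] if i >= d else 0.0
        x[i + 1] = x[i] + dt * (a * x[i] + b * ui)
    e = ref - x
    errors.append(np.max(np.abs(e)))
    # u[j] first influences x[j + d + 1], so the error is shifted by
    # d + 1 samples when updating the input for the next trial.
    u[:n - d - 1] += L * e[d + 1:]

print(f"max error: trial 0 = {errors[0]:.3f}, trial 39 = {errors[-1]:.2e}")
```

The delay shows up only as an index shift in the learning update: the correction must be aligned with the input sample that caused the error, which is the same alignment issue the paper's 2-D analysis handles rigorously for state and input delays.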

Published in:

IEEE Transactions on Circuits and Systems I: Regular Papers (Volume 52, Issue 7)