Discrete-time LQ-optimal control problems for infinite Markov jump parameter systems

Authors: O.L.V. Costa (Escola Politecnica, Sao Paulo Univ., Brazil); M.D. Fragoso

Optimal control problems for discrete-time linear systems subject to Markovian jumps in the parameters are considered for the case in which the Markov chain takes values in a countably infinite set. Two situations are considered: the noiseless case and the case in which an additive noise is appended to the model. The solution of these problems relies, in part, on the study of a countably infinite set of coupled algebraic Riccati equations (ICARE). Conditions for existence and uniqueness of a positive semidefinite solution to the ICARE are obtained via the extended concepts of stochastic stabilizability (SS) and stochastic detectability (SD), which turn out to be equivalent to the spectral radius of certain infinite-dimensional linear operators in a Banach space being less than one. For the long-run average cost, SS and SD guarantee existence and uniqueness of a stationary measure and consequently existence of an optimal stationary control policy. Furthermore, an extension of a Lyapunov equation result is derived for the countably infinite Markov state-space case.
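As a rough illustration of the coupled-Riccati structure referred to in the abstract, the sketch below iterates the standard coupled algebraic Riccati equations of a discrete-time Markov jump linear system truncated to finitely many modes (the paper treats a countably infinite mode set, which a numerical sketch cannot do directly). The function name, the fixed-point iteration, and the example matrices are assumptions for illustration only, not the authors' construction.

```python
import numpy as np

def coupled_riccati(A, B, Q, R, P, iters=500, tol=1e-10):
    """Fixed-point iteration on the coupled algebraic Riccati equations.

    A, B, Q, R: lists of per-mode system and cost matrices (one entry per mode).
    P: mode transition probability matrix (finite truncation of the Markov chain).
    Returns the per-mode solutions X_i and stationary gains K_i.

    For each mode i, with E_i(X) = sum_j P[i, j] * X[j]:
        X_i = Q_i + A_i' E_i(X) A_i
              - A_i' E_i(X) B_i (R_i + B_i' E_i(X) B_i)^{-1} B_i' E_i(X) A_i
    """
    n_modes = len(A)
    n = A[0].shape[0]
    X = [np.zeros((n, n)) for _ in range(n_modes)]
    for _ in range(iters):
        # Coupling term E_i(X) = sum_j p_ij X_j for every mode i.
        EX = [sum(P[i, j] * X[j] for j in range(n_modes)) for i in range(n_modes)]
        X_new = []
        for i in range(n_modes):
            S = R[i] + B[i].T @ EX[i] @ B[i]
            G = np.linalg.solve(S, B[i].T @ EX[i] @ A[i])
            X_new.append(Q[i] + A[i].T @ EX[i] @ A[i] - A[i].T @ EX[i] @ B[i] @ G)
        if max(np.linalg.norm(Xn - Xo) for Xn, Xo in zip(X_new, X)) < tol:
            X = X_new
            break
        X = X_new
    # Stationary feedback u_k = -K_{theta_k} x_k, one gain per mode.
    EX = [sum(P[i, j] * X[j] for j in range(n_modes)) for i in range(n_modes)]
    K = [np.linalg.solve(R[i] + B[i].T @ EX[i] @ B[i], B[i].T @ EX[i] @ A[i])
         for i in range(n_modes)]
    return X, K

# Hypothetical two-mode scalar example:
# A = [np.array([[1.2]]), np.array([[0.8]])]
# B = [np.array([[1.0]]), np.array([[1.0]])]
# Q = [np.eye(1), np.eye(1)]
# R = [np.eye(1), np.eye(1)]
# P = np.array([[0.9, 0.1], [0.3, 0.7]])
# X, K = coupled_riccati(A, B, Q, R, P)
```

Under stochastic stabilizability and detectability as discussed in the paper, such an iteration would be expected to converge to the positive semidefinite solution; the truncation step itself is an assumption made here purely to obtain a runnable sketch.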

Published in: IEEE Transactions on Automatic Control (Volume: 40, Issue: 12)