In this paper we consider the stochastic optimal control problem under a mean-variance criterion for discrete-time linear systems subject to Markov jumps and multiplicative noise. First, we analyze an unconstrained mean-variance trade-off performance criterion over time. Next, we consider the problem of minimizing the variance of an output over time subject to constraints on the expectation of that output. We present explicit necessary and sufficient conditions for the existence of an optimal control strategy for these problems, generalizing previous results in the literature.
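To make the system class concrete, the following is a minimal simulation sketch of a discrete-time Markov jump linear system with multiplicative noise under a fixed linear feedback, estimating the mean and variance of the state by Monte Carlo. All numerical values (mode dynamics, noise intensities, transition matrix, feedback gain) are illustrative assumptions, not taken from the paper, and the fixed-gain feedback is only a stand-in for the optimal strategies derived there.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar system (values are made up for demonstration):
#   x(k+1) = A[theta(k)] x(k) + B[theta(k)] u(k) + sigma[theta(k)] x(k) w(k),
# where theta(k) is a two-state Markov chain and w(k) is i.i.d. standard normal.
A = np.array([0.9, 1.1])        # mode-dependent dynamics
B = np.array([1.0, 0.5])        # mode-dependent input gains
sigma = np.array([0.1, 0.2])    # multiplicative-noise intensities
P = np.array([[0.95, 0.05],     # Markov chain transition probabilities
              [0.10, 0.90]])

def simulate(K=-0.5, T=50, n_paths=2000, x0=1.0):
    """Simulate closed-loop paths under the fixed feedback u(k) = K x(k)
    and return the empirical mean and variance of x(T)."""
    x = np.full(n_paths, x0)
    theta = np.zeros(n_paths, dtype=int)   # all paths start in mode 0
    for _ in range(T):
        u = K * x
        w = rng.standard_normal(n_paths)
        x = A[theta] * x + B[theta] * u + sigma[theta] * x * w
        # advance each path's Markov mode using row theta of P
        r = rng.random(n_paths)
        theta = np.where(r < P[theta, 0], 0, 1)
    return x.mean(), x.var()

mean_T, var_T = simulate()
```

Sweeping the gain `K` and plotting the resulting (mean, variance) pairs traces out an empirical trade-off curve of the kind the unconstrained mean-variance criterion formalizes; the constrained problem instead fixes the expectation and minimizes the variance.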