This paper treats the problem of optimal control of finite-state Markov processes observed in noise. Two types of noisy observations are considered: additive white Gaussian noise and jump-type observations. Sufficient conditions for the optimality of a control law are obtained, analogous to the stochastic Hamilton-Jacobi equation for perfectly observed Markov processes. An illustrative example concludes the paper.
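For context, the perfectly observed baseline the abstract alludes to is the standard dynamic-programming (Hamilton-Jacobi) equation for a controlled finite-state Markov process. A sketch, under assumed generic notation not taken from the paper (states $i \in S$, controls $u$, controlled transition rates $\lambda_{ij}(u)$, running cost $c(i,u)$, terminal cost $g$):

```latex
% Value function V(i,t) for a controlled Markov chain on [0,T].
% Dynamic-programming (Hamilton-Jacobi) equation under full observation:
\frac{\partial V}{\partial t}(i,t)
  + \min_{u}\Bigl[\, c(i,u) + \sum_{j \in S} \lambda_{ij}(u)\, V(j,t) \Bigr] = 0,
\qquad V(i,T) = g(i), \quad i \in S.
% A control law u^*(i,t) attaining the minimum above is optimal.
```

Under partial (noisy) observation, the analogous conditions are stated in terms of the conditional distribution of the state given the observations rather than the state itself; the paper's contribution is such conditions for the two observation models named above.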