This paper is concerned with model reduction for complex Markov chain models. The Kullback-Leibler divergence rate is employed as a metric to measure the difference between the Markov model and its approximation. For a certain relaxation of the bi-partition model reduction problem, the solution is shown to be characterized by an associated eigenvalue problem, whose form is closely related to the Markov spectral theory for model reduction. This result forms the basis of a proposed heuristic for the m-ary partition problem, which yields a practical recursive algorithm. The results are illustrated with examples.
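As an illustrative sketch (not from the paper itself), the Kullback-Leibler divergence rate between an ergodic Markov chain with transition matrix P and an approximating chain Q on the same state space can be computed as R(P||Q) = Σ_i π_i Σ_j P_ij log(P_ij / Q_ij), where π is the stationary distribution of P. The function names below are hypothetical, and the sketch assumes Q_ij > 0 wherever P_ij > 0:

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of P: the left eigenvector for eigenvalue 1,
    normalized to sum to one."""
    vals, vecs = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, idx])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    """KL divergence rate R(P || Q) between two Markov chains on the same
    state space, weighting each row's KL term by the stationary mass of P.
    Assumes absolute continuity: Q[i, j] > 0 whenever P[i, j] > 0."""
    pi = stationary_distribution(P)
    rate = 0.0
    n = P.shape[0]
    for i in range(n):
        for j in range(n):
            if P[i, j] > 0:  # zero-probability transitions contribute nothing
                rate += pi[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
    return rate

# Example: a two-state chain and a memoryless (uniform) approximation.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])
```

The rate is zero exactly when the two chains agree on all transitions visited with positive stationary probability, and positive otherwise, which is what makes it usable as a model-reduction objective.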