Updating probabilistic belief matrices as new observations arrive, in the presence of noise, is a critical part of many algorithms for target tracking in sensor networks. These updates have to be carried out while preserving sum constraints, arising, for example, from probabilities. This paper addresses the problem of updating belief matrices to satisfy sum constraints using scaling algorithms. We show that the convergence behavior of the Sinkhorn scaling process, used for scaling belief matrices, can vary dramatically depending on whether the prior unscaled matrix is exactly scalable or only almost scalable. We give an efficient polynomial-time algorithm based on the maximum-flow algorithm that determines whether a given matrix is exactly scalable, thus determining the convergence properties of the Sinkhorn scaling process. We prove that the Sinkhorn scaling process always provides a solution to the problem of minimizing the Kullback-Leibler distance of the physically feasible scaled matrix from the prior constraint-violating matrix, even when the matrices are not exactly scalable. We pose the scaling process as a linearly constrained convex optimization problem, and solve it using an interior-point method. We prove that even in cases in which the matrices are not exactly scalable, the problem can be solved to ε-optimality in strongly polynomial time, improving the best known bound for the problem of scaling arbitrary nonnegative rectangular matrices to prescribed row and column sums.
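To make the Sinkhorn scaling process concrete, the following is a minimal illustrative sketch of the classic iteration: alternately rescale the rows and then the columns of a nonnegative matrix toward prescribed row and column sums, stopping once the column sums are already (nearly) on target before their scaling step. This is only the textbook iteration, not the paper's algorithm; the function and parameter names (`sinkhorn_scale`, `tol`, `max_iters`) are our own, and the example targets (unit row and column sums, i.e. a doubly stochastic matrix) are an assumption for illustration.

```python
def sinkhorn_scale(a, row_sums, col_sums, max_iters=1000, tol=1e-9):
    """Iteratively scale nonnegative matrix `a` toward prescribed
    row and column sums (classic Sinkhorn iteration; illustrative only)."""
    m = [row[:] for row in a]  # work on a copy
    for _ in range(max_iters):
        # Row step: scale each row to its target sum.
        for i, target in enumerate(row_sums):
            s = sum(m[i])
            if s > 0:
                m[i] = [x * target / s for x in m[i]]
        # Column step: scale each column to its target sum,
        # tracking how far the column sums were off beforehand.
        err = 0.0
        for j, target in enumerate(col_sums):
            s = sum(row[j] for row in m)
            if s > 0:
                err = max(err, abs(s - target))
                factor = target / s
                for row in m:
                    row[j] *= factor
        if err < tol:
            break
    return m

# Example: a strictly positive (hence exactly scalable) matrix,
# scaled toward unit row and column sums.
scaled = sinkhorn_scale([[1.0, 2.0], [3.0, 4.0]], [1.0, 1.0], [1.0, 1.0])
```

For a strictly positive matrix like this one, the iteration converges; the distinction the abstract draws is that for matrices that are only almost scalable (zeros in problematic positions), the same iteration still minimizes the Kullback-Leibler distance but its convergence behavior degrades, which is what motivates the exact-scalability test and the interior-point formulation.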