Video transmitted over an error-prone network may be received at the decoder with degradations due to packet losses. No-reference quality monitoring algorithms are the most practical way to measure the quality of the received video, since they require no changes to the network architecture. Conventionally, these methods assume that the corrupted bitstream is available. In some situations this is not possible, e.g., because the bitstream is encrypted or processed by third-party decoders, and only the decoded pixel values can be used. The major issue in this scenario is the lack of knowledge about which regions of the video have actually been lost, which is a fundamental ingredient for estimating channel-induced distortion. In this paper, we propose a maximum a posteriori (MAP) estimation of the pattern of lost macroblocks that relies only on the decoded pixel values. This information can be used as input to a no-reference quality monitoring system, which produces an accurate estimate of the mean-square-error (MSE) distortion introduced by channel errors. The results of the proposed method are well correlated with the MSE distortion computed in full-reference mode, with a linear correlation coefficient equal to 0.9 at the frame level and 0.98 at the sequence level.
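As a rough illustration of the general idea (not the paper's actual algorithm), the sketch below makes a per-macroblock MAP-style decision, lost vs. received, from decoded pixels only. It uses boundary discontinuity between a macroblock and its neighbours as a hypothetical likelihood cue for concealment artifacts; the 16×16 macroblock size, the exponential likelihood model, and the `scale` and `prior_loss` parameters are all assumptions chosen for illustration.

```python
import numpy as np

MB = 16  # assumed macroblock size (H.264-style 16x16)

def boundary_discontinuity(frame, r, c):
    """Mean absolute pixel difference across the macroblock's outer border.

    A lost-then-concealed macroblock tends to exhibit stronger
    discontinuities with its neighbours than a correctly decoded one.
    """
    h, w = frame.shape
    top, left = r * MB, c * MB
    diffs = []
    if top > 0:  # top border vs. row above
        diffs.append(np.abs(frame[top, left:left + MB].astype(float)
                            - frame[top - 1, left:left + MB]))
    if top + MB < h:  # bottom border vs. row below
        diffs.append(np.abs(frame[top + MB - 1, left:left + MB].astype(float)
                            - frame[top + MB, left:left + MB]))
    if left > 0:  # left border vs. column to the left
        diffs.append(np.abs(frame[top:top + MB, left].astype(float)
                            - frame[top:top + MB, left - 1]))
    if left + MB < w:  # right border vs. column to the right
        diffs.append(np.abs(frame[top:top + MB, left + MB - 1].astype(float)
                            - frame[top:top + MB, left + MB]))
    return float(np.mean(np.concatenate(diffs))) if diffs else 0.0

def map_loss_pattern(frame, prior_loss=0.1, scale=4.0):
    """Per-macroblock MAP decision from decoded pixels only.

    Hypothetical model: the log-likelihood ratio grows linearly with the
    boundary discontinuity d (i.e., an exponential likelihood in d), and
    prior_loss is the assumed a priori packet-loss probability.
    """
    h, w = frame.shape
    rows, cols = h // MB, w // MB
    pattern = np.zeros((rows, cols), dtype=bool)
    log_prior_ratio = np.log(prior_loss / (1.0 - prior_loss))
    for r in range(rows):
        for c in range(cols):
            d = boundary_discontinuity(frame, r, c)
            # log P(lost | d) - log P(received | d), up to a constant
            pattern[r, c] = (d / scale + log_prior_ratio) > 0
    return pattern
```

On a smooth luminance frame with one macroblock overwritten by a constant value (simulating crude concealment), the corrupted macroblock produces a large boundary discontinuity and is flagged as lost, while smooth macroblocks fall below the prior-weighted threshold.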