Due to the rapid growth of multimedia communication, video image deblurring has become a hot topic in image processing. As image sizes increase, it becomes difficult to perform large-size image deblurring on a single computer within a limited time; parallel computing is one of the most efficient remedies. In this paper, we first introduce a variational equation for the blurred image, analyze its computing scheme, and improve the discretization; we then detail a parallel implementation using multithreaded OpenMP and MPI programming on a dual-core cluster. Experimental results show that this method can handle large-size video image deblurring.