Lossless turbo source coding employs an iterative encoding algorithm to search for the smallest codeword length that guarantees zero distortion. Although such an encoder achieves promising compression rates, running the iterative algorithm for each individual message block imposes a large delay on the system. To reduce this delay, we propose a two-stage encoding algorithm for turbo source coding. We show that convergence to zero distortion after a given number of iterations can be predicted from the earlier behavior of the distortion function. This enables us to produce a quick, yet sufficiently accurate, estimate of the codeword length in the first encoding stage. In the second stage, we iteratively increase this estimated codeword length until zero distortion is reached. We also show that employing an auxiliary distortion measure in the first encoding stage may allow for better estimates and further decrease the delay. Numerical results show that the proposed algorithm decreases the encoding delay by up to 19%. Although there are previous works in the literature on delay reduction for turbo source coding, those works achieve lower delays by reducing the message block length. In contrast, the proposed algorithm achieves lower delays for the same block length, so the actual "per bit" encoding delay is decreased.
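The two-stage idea can be sketched as follows. This is a minimal toy model, not the paper's actual encoder: the `distortion` function below is a hypothetical stand-in for a real iterative turbo decoder (its geometric decay, the probe budget, the threshold, and all constants are illustrative assumptions), but it captures the structure of stage 1 (predict convergence from early distortion behavior) and stage 2 (increase the estimated length until zero distortion).

```python
MAX_ITERS = 30     # full iteration budget per candidate length (assumed)
PROBE_ITERS = 5    # short probe used in stage 1 (assumed)
THRESHOLD = 0.2    # early-distortion level taken to predict convergence (assumed)
TRUE_MIN_LEN = 12  # smallest length achieving zero distortion in this toy model

def distortion(length, iters):
    """Hypothetical stand-in for the per-iteration distortion of a turbo decoder."""
    if length < TRUE_MIN_LEN:
        return 0.5  # insufficient length: distortion stalls, never reaches zero
    # sufficient length: distortion shrinks geometrically, reaching zero eventually
    d = 0.5 * (0.5 ** iters)
    return 0.0 if d < 1e-4 else d

def two_stage_encode(lengths):
    """Return (codeword_length, total_decoder_iterations_spent)."""
    spent = 0
    # Stage 1: quick estimate -- probe each candidate length for a few
    # iterations and predict convergence from the early distortion behavior.
    estimate = lengths[-1]
    for n in lengths:
        spent += PROBE_ITERS
        if distortion(n, PROBE_ITERS) < THRESHOLD:
            estimate = n
            break
    # Stage 2: increase the estimated length until zero distortion is reached.
    n = estimate
    while True:
        for it in range(1, MAX_ITERS + 1):
            spent += 1
            if distortion(n, it) == 0.0:
                return n, spent
        n += 1

length, iters = two_stage_encode(list(range(8, 20)))
```

In this toy setting, the short probes at insufficient lengths cost only `PROBE_ITERS` iterations each, whereas a one-stage search would spend a full iteration budget on each of them, which is the source of the delay reduction the abstract describes.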