In this paper, we investigate the throughput and decoding-delay performance of random linear network coding as a function of the coding window size and the network size in an unreliable single-hop broadcast setting. Our model consists of a source transmitting packets of a single flow to a set of N receivers over independent erasure channels. The source performs random linear network coding (RLNC) over K packets (the coding window size) and broadcasts them to the receivers. We note that the broadcast throughput of RLNC must vanish with increasing N for any fixed K. Hence, in contrast to other works in the literature, we investigate how the coding window size K must scale with increasing N. By appealing to the Central Limit Theorem, we approximate the Negative Binomial random variable arising in our analysis by a Gaussian random variable. We then obtain tight upper and lower bounds on the mean decoding delay and throughput in terms of K and N. Our analysis reveals that a coding window size scaling as ln(N) constitutes a phase transition: below this rate the throughput converges to zero, while above it the throughput converges to the broadcast capacity. Our numerical investigations show that the bounds obtained using the Gaussian approximation also apply to the real system performance, thus illustrating the accuracy of the analysis.
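The delay model described above can be illustrated with a short Monte Carlo sketch. Under the standard simplifying assumption that the field size is large enough for every received coded packet to be innovative, each receiver decodes after collecting K successful receptions, so its delay is a Negative Binomial random variable, and the broadcast delay is the maximum over the N receivers. The function name, parameters, and trial count below are illustrative choices, not the paper's notation:

```python
import random

def broadcast_delay(K, N, eps, trials=2000):
    """Monte Carlo estimate of the mean decoding delay (in slots) for
    RLNC broadcast of K packets to N receivers over independent erasure
    channels with erasure probability eps.  Assumes every received
    coded packet is innovative (large field size)."""
    total = 0
    for _ in range(trials):
        worst = 0
        for _ in range(N):
            got, slots = 0, 0
            while got < K:           # slots until K successful receptions:
                slots += 1           # a Negative Binomial random variable
                if random.random() > eps:
                    got += 1
            worst = max(worst, slots)  # the slowest receiver determines the delay
        total += worst
    return total / trials
```

The corresponding throughput estimate is K divided by the mean delay, which is upper-bounded by the erasure-channel capacity 1 - eps; sweeping K against N in such a simulation is one way to observe the ln(N) threshold numerically.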