Transmitting large packets over wireless networks reduces header overhead, but may increase the loss rate because longer packets are more likely to be corrupted on the radio link. Packet loss in the lower layers, however, is typically hidden from the upper protocol layers by link- or MAC-layer retransmission protocols. Consequently, physical-layer errors are observed by the application as higher variance in end-to-end delay rather than as an increased packet loss rate. We study the effect of packet size on the loss rate and delay characteristics of a wireless real-time application, and derive an analytical model for the dependency between packet length and delay characteristics. We validate our theoretical analysis through experiments in an ad hoc network using WLAN technologies. We show that careful design of application-layer packetization schemes may significantly improve radio link resource utilization in delay-sensitive media streaming under difficult wireless network conditions.
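The tradeoff the abstract describes can be illustrated with a minimal sketch. This is not the paper's analytical model; it assumes independent bit errors (so the packet error rate is 1 − (1 − BER)^bits), persistent link-layer retransmission of corrupted frames, and arbitrary example values for the header size, bit error rate, and link rate:

```python
# Illustrative sketch, NOT the paper's model: larger packets amortize header
# overhead but are more likely to be corrupted; link-layer retransmissions
# hide the resulting loss from the application as extra delay.
# All parameter defaults below (40 B header, BER, 2 Mbit/s link) are assumptions.

def packet_error_rate(payload_bytes, header_bytes=40, ber=1e-5):
    """Probability that at least one bit of the frame is corrupted,
    assuming independent bit errors with bit error rate `ber`."""
    total_bits = 8 * (payload_bytes + header_bytes)
    return 1.0 - (1.0 - ber) ** total_bits

def expected_transmissions(per):
    """Mean number of link-layer (re)transmissions until success,
    assuming the link layer retransmits corrupted frames indefinitely."""
    return 1.0 / (1.0 - per)

def expected_delay(payload_bytes, header_bytes=40, ber=1e-5, rate_bps=2e6):
    """Mean transmission delay (seconds) for one packet: loss is hidden
    from upper layers, reappearing as added delay from retransmissions."""
    total_bits = 8 * (payload_bytes + header_bytes)
    per = packet_error_rate(payload_bytes, header_bytes, ber)
    return expected_transmissions(per) * total_bits / rate_bps

if __name__ == "__main__":
    for payload in (100, 500, 1500):
        per = packet_error_rate(payload)
        print(f"{payload:5d} B payload: PER={per:.3f}, "
              f"mean delay={expected_delay(payload) * 1e3:.2f} ms")
```

Under this toy model, small packets minimize per-byte delay when the bit error rate is high, while large packets win when the channel is clean, which is the kind of dependency a packetization scheme can exploit.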