In this study, we examine the effects of channel and time-slot allocation algorithms on data loss in multi-hop wireless networks. We consider networks in which every node has a single radio, the MAC layer uses synchronized OFDM, and each channel is shared via TDMA. In such networks, data collisions can be avoided by reserving data time slots. Data loss can still occur, however, when a receiving node switches its assigned channel and therefore fails to receive the data on the link completely; we call this type of loss data corruption. Our simulations show that when several time slots are available for reservation, a proper channel and time-slot selection algorithm is required to prevent data corruption. In particular, a node should change channels as infrequently as possible, and any necessary change should take place at an appropriate time. The same holds for switching between receiving and sending modes: mode switches also cause data loss, so they should be avoided whenever possible. Accordingly, we propose a new channel and time-slot allocation algorithm that reduces data loss.
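To make the selection criterion concrete, the following is a minimal sketch of a greedy reservation policy that penalizes channel switches and TX/RX mode toggles between adjacent time slots. All names, the scoring weights, and the candidate representation are illustrative assumptions, not the algorithm proposed in the paper.

```python
def switch_cost(schedule, slot, channel, mode,
                channel_penalty=2, mode_penalty=1):
    """Cost of reserving (slot, channel, mode) given an existing schedule.

    schedule: dict mapping slot index -> (channel, mode) already reserved.
    A candidate pays a penalty for each neighboring slot where it would
    force a channel change (retuning mid-frame risks data corruption)
    or a TX/RX mode switch (the turnaround also loses data).
    """
    cost = 0
    for neighbor in (slot - 1, slot + 1):
        if neighbor in schedule:
            n_channel, n_mode = schedule[neighbor]
            if n_channel != channel:
                cost += channel_penalty
            if n_mode != mode:
                cost += mode_penalty
    return cost

def pick_reservation(schedule, candidates, mode):
    """Pick the free (slot, channel) pair with the lowest switching cost."""
    return min(candidates,
               key=lambda sc: switch_cost(schedule, sc[0], sc[1], mode))

# Usage: the node already receives on channel 3 in slots 4 and 6,
# so reserving slot 5 on channel 3 avoids any retuning or mode switch.
schedule = {4: (3, "rx"), 6: (3, "rx")}
candidates = [(5, 1), (5, 3), (9, 2)]
print(pick_reservation(schedule, candidates, "rx"))  # -> (5, 3)
```

In this sketch, a candidate slot sandwiched between same-channel, same-mode reservations scores zero, capturing the abstract's guidance that channel and mode changes should be as infrequent as possible and scheduled at appropriate times.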