We present an algorithm that assigns time slots to nodes in a TDMA network so as to minimize the jitter in the slot assignments. By reducing the jitter of a node's time slots around the TDMA frame, we can provide more consistent network access and reduce the overall delay seen by an application with time-varying traffic patterns, such as normal web traffic. Based on our numerical analysis, the algorithm can reduce the average delay seen by all nodes in a network by up to 51%. The algorithm is designed for TDMA MAC layers in which a node may reserve some number of time slots out of a larger set of available slots, and one wishes to choose the slots that are most evenly distributed around the TDMA ring. The algorithm solves the Minimum Variance Placement problem for the special case of a directed ring. After describing an exact solution using dynamic programming, we present a much faster heuristic that closely approximates the exact solution.
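To make the optimization objective concrete, the following is a minimal brute-force sketch of the Minimum Variance Placement problem on a directed ring as described above: given the set of available slot indices in a frame, choose k of them so that the variance of the circular gaps between consecutive chosen slots is minimized. This is only an illustrative restatement of the objective; the function names and the exhaustive search are our own, not the paper's dynamic-programming solution or its fast heuristic.

```python
from itertools import combinations


def gap_variance(chosen, frame_len):
    """Variance of the circular gaps between consecutive chosen slots.

    Gaps are measured going around the directed ring of frame_len slots,
    so the gap from the last chosen slot wraps back to the first one.
    """
    s = sorted(chosen)
    gaps = [(s[(i + 1) % len(s)] - s[i]) % frame_len for i in range(len(s))]
    mean = sum(gaps) / len(gaps)
    return sum((g - mean) ** 2 for g in gaps) / len(gaps)


def best_slots(available, k, frame_len):
    """Exhaustively pick the k available slots with minimum gap variance.

    Exponential in k; shown only to define the objective, not as a
    practical algorithm for real TDMA frames.
    """
    return min(combinations(available, k),
               key=lambda c: gap_variance(c, frame_len))
```

For example, with all 8 slots of an 8-slot frame available and k = 4, the minimizer is an evenly spaced placement such as slots 0, 2, 4, 6, whose gap variance is zero; a clustered choice like slots 0, 1, 2, 3 has gaps 1, 1, 1, 5 and therefore much higher variance.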