Request-Grant (RG) matching algorithms are widely used for scheduling in high-speed packet switches. An RG algorithm is iterated for a few cycles within each time slot to achieve a larger matching size. In each iteration cycle, every input sends a request to all or some of the outputs; each output then sends a grant to one of its requesting inputs according to an input priority sequence. In this paper, we analyze existing RG matching algorithms and show that they suffer from fairness and convergence problems under a range of input load distributions. Our analysis and simulations show that our proposed algorithm, called TRGA, is a high-throughput matching algorithm that achieves optimal fairness and convergence under all load distributions.
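As a rough illustration of the request-grant scheme described above, the sketch below runs one iteration cycle for an n×n switch: each unmatched output grants one requesting unmatched input, chosen by a round-robin priority pointer. The accept step that resolves an input receiving several grants, and the pointer-update rule, are assumptions for illustration only, not the specific mechanism of TRGA or of any particular published RG algorithm.

```python
def rg_iteration(requests, pointers, in_matched, out_matched):
    """One request-grant iteration for an n-by-n switch.

    requests[i][j] is True if input i has a cell for output j.
    pointers[j] is output j's round-robin priority pointer.
    in_matched / out_matched are the sets of already-matched ports.
    """
    n = len(requests)

    # Grant phase: each unmatched output grants the unmatched requesting
    # input closest (mod n) to its priority pointer.
    grants = {}  # input -> list of outputs that granted it
    for j in range(n):
        if j in out_matched:
            continue
        candidates = [i for i in range(n)
                      if requests[i][j] and i not in in_matched]
        if not candidates:
            continue
        winner = min(candidates, key=lambda i: (i - pointers[j]) % n)
        grants.setdefault(winner, []).append(j)

    # Accept phase (an assumption here): an input holding several grants
    # keeps the lowest-indexed one, so the matching stays conflict-free.
    for i, outs in grants.items():
        j = min(outs)
        in_matched.add(i)
        out_matched.add(j)
        pointers[j] = (i + 1) % n  # advance pointer past the matched input

    return in_matched, out_matched
```

With uniform requests on a 2×2 switch and all pointers at 0, both outputs grant input 0 in the first iteration, so only one edge is matched; a second iteration cycle is needed to complete the matching, which is why RG algorithms run multiple iterations per time slot.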