This paper presents a hardware implementation of a feature density and distribution algorithm for autonomous robot navigation. The algorithm is implemented on a Memec Virtex-II board, exploiting the hardware's parallelism to reduce both processing time and algorithmic complexity. By comparing consecutive frames and calculating the expansion rate of the features found in the images, the system determines how far the robot is from the objects in view. The expansion rate is found by a linear search between the two images, applying a scale factor and a shift to find the best match of each feature across the frames. Once a match is found, a time to impact, measured in frames, is calculated; from this, the robot can estimate the distance to the object and plan accordingly.
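The scale-and-shift search and the resulting time-to-impact estimate can be sketched in software as follows. This is a minimal illustrative model, not the paper's FPGA design: the 1-D feature signals, the candidate scale/shift grids, and the sum-of-squared-differences match criterion are all assumptions for the sake of the example, and the time-to-impact formula (an object expanding by factor s per frame reaches the camera in roughly 1/(s - 1) frames) is the standard looming approximation.

```python
import numpy as np

def expansion_rate(prev, curr, scales, shifts):
    """Linear search for the scale factor and shift that best map the
    previous frame's feature signal onto the current frame's.

    prev, curr : 1-D arrays of equal length (hypothetical feature profiles).
    scales, shifts : candidate expansion factors and pixel offsets to try.
    """
    best_scale, best_shift, best_err = None, None, np.inf
    xs = np.arange(len(prev))
    for s in scales:
        # Resample the previous signal as if it had expanded by factor s.
        scaled = np.interp(xs / s, xs, prev)
        for d in shifts:
            # Sum-of-squared-differences match score (assumed criterion).
            err = np.sum((np.roll(scaled, d) - curr) ** 2)
            if err < best_err:
                best_scale, best_shift, best_err = s, d, err
    return best_scale, best_shift

def frames_to_impact(scale):
    # Looming approximation: expansion by factor s per frame implies
    # contact after about 1 / (s - 1) frames.
    return 1.0 / (scale - 1.0)
```

For example, a feature that grows 10% per frame (scale 1.1) yields a time to impact of about ten frames, which the robot can convert to a distance given its speed and frame rate.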