Traditional optical flow techniques applied to object tracking generally perform a global search over the brightness and light-intensity values of the object in the image, and they assume that light intensity remains constant across a series of consecutive images. The goal is to obtain the displacement and moving direction of an object across the image sequence. However, most of the important information lies in regions where the optical flow varies significantly; regions with relatively small optical flow variations usually carry little useful information. Because traditional optical flow techniques employ a global search to obtain optical flow values, the computation is time consuming and most of the time is spent on unimportant regions. If part of the unimportant information can be excluded, the algorithm can omit the corresponding computations and thus shorten the time needed to calculate the optical flow field. To speed up the optical flow calculation, this study proposes an edge-based algorithm for obtaining optical flows. The main idea is to segment out the objects in each of the consecutive images and then compare every object's centroid and circumference to identify matching objects across images. From the movement of the corresponding objects between images, the optical flow field can be formed, and the objects can thereby be tracked. Experiments show that the proposed algorithm effectively decreases computation time while preserving useful information.
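The centroid-and-circumference matching idea described above can be sketched as follows. This is a minimal illustration under several assumptions of our own: objects are already available as binary masks, "circumference" is approximated by a boundary-pixel count, and matching uses a nearest-centroid rule with a fixed perimeter tolerance. None of these choices are specified by the paper; they only make the matching step concrete.

```python
import numpy as np

def label_components(binary):
    """4-connected component labeling via BFS (illustrative segmentation stand-in)."""
    labels = np.zeros(binary.shape, dtype=int)
    count = 0
    for i in range(binary.shape[0]):
        for j in range(binary.shape[1]):
            if binary[i, j] and labels[i, j] == 0:
                count += 1
                labels[i, j] = count
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                                and binary[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = count
                            stack.append((ny, nx))
    return labels, count

def object_features(binary):
    """Per-object centroid and boundary-pixel count (a proxy for 'circumference')."""
    labels, n = label_components(binary)
    feats = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        pix = set(zip(ys.tolist(), xs.tolist()))
        # boundary pixels: object pixels with at least one background 4-neighbour
        perim = sum(
            1 for (y, x) in pix
            if any((y + dy, x + dx) not in pix
                   for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)))
        )
        feats.append({"centroid": (ys.mean(), xs.mean()), "perimeter": perim})
    return feats

def match_and_flow(frame_a, frame_b, perim_tol=2):
    """Match each object in frame_a to the nearest-centroid object in frame_b
    with a similar perimeter; each match yields one flow vector (dy, dx)."""
    feats_b = object_features(frame_b)
    flows = []
    for a in object_features(frame_a):
        candidates = [b for b in feats_b
                      if abs(b["perimeter"] - a["perimeter"]) <= perim_tol]
        if not candidates:
            continue  # no plausible correspondence for this object
        best = min(candidates,
                   key=lambda b: (b["centroid"][0] - a["centroid"][0]) ** 2
                               + (b["centroid"][1] - a["centroid"][1]) ** 2)
        flows.append((a["centroid"],
                      (best["centroid"][0] - a["centroid"][0],
                       best["centroid"][1] - a["centroid"][1])))
    return flows
```

For example, a 3x3 square mask that moves from rows/cols (2..4, 2..4) in one frame to (4..6, 3..5) in the next yields a single flow vector of (2.0, 1.0) anchored at the first centroid. Because only one vector per object is produced, the per-pixel global search of traditional techniques is avoided, which is the source of the claimed speedup.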