The paper discusses a panoramic vision system for autonomous-navigation purposes. It describes an economical PC-based method for integrating data from multiple camera sources in real time. The views from adjacent cameras are visualized together as a panorama of the scene using a modified correlation-based stitching algorithm. A separate operator is presented with the slice of the panorama that matches his or her viewing direction. Additionally, a simulated environment is created in which the operator can choose to augment the video by simultaneously viewing an artificial three-dimensional (3-D) view of the scene. Potential applications of this system include enhancing the quality and range of visual cues, and navigation under hostile circumstances where a direct view of the environment is not possible or desirable.
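The abstract does not detail the modified correlation-based stitching algorithm, but the general idea behind correlation-based stitching can be sketched as follows: for two images from adjacent cameras with a horizontally overlapping field of view, slide one image over the other, score each candidate overlap by normalized cross-correlation, and join the images at the best-scoring offset. The code below is a minimal illustrative sketch of that idea, not the authors' method; the function names, the horizontal-only search, and the NumPy grayscale representation are all assumptions for the example.

```python
import numpy as np

def find_overlap_offset(left, right, max_shift):
    """Estimate how many columns of `right` overlap the right edge of
    `left` by maximizing normalized cross-correlation (illustrative
    sketch; real stitching also handles vertical shift, lens
    distortion, and exposure differences)."""
    best_shift, best_score = 0, -np.inf
    for shift in range(1, max_shift + 1):
        # Candidate overlap: rightmost `shift` columns of the left
        # image against the leftmost `shift` columns of the right one.
        a = left[:, -shift:].astype(float).ravel()
        b = right[:, :shift].astype(float).ravel()
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        if denom == 0:
            continue  # flat (zero-variance) region, correlation undefined
        score = float(np.dot(a, b)) / denom
        if score > best_score:
            best_score, best_shift = score, shift
    return best_shift

def stitch(left, right, max_shift):
    """Join two horizontally overlapping views into one panorama strip."""
    overlap = find_overlap_offset(left, right, max_shift)
    return np.hstack([left, right[:, overlap:]])

# Usage: split a synthetic grayscale scene into two overlapping
# camera views and recover the original by stitching.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(20, 60))
left, right = scene[:, :40], scene[:, 30:]   # 10-column overlap
panorama = stitch(left, right, max_shift=20)
```

In a multi-camera rig like the one the paper describes, this pairwise step would be repeated across each pair of adjacent cameras per frame, which is why an inexpensive correlation search (rather than full feature matching) keeps the method feasible on a PC in real time.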