Calculating and rendering a stereo disparity map in real time at video rate is a challenging problem. In approaches built on a raster-scan video system, correlations are typically measured from a point in the left image to a point in the right image along the 1D raster, and a stereo correspondence is then searched for every pixel. The time-warp algorithm based on dynamic programming (DP) optimizes this search over an entire raster scan line. To ensure accuracy down to one pixel, a pixel-to-pixel similarity matrix must be computed, which makes it nearly impossible to produce a dense stereo disparity map at a video rate such as 30 frames/sec. In this paper, a method is proposed to reduce the enormous time needed to compute the pixel-by-pixel similarity matrix. The idea is to apply coarse quantization to the luma and chroma channels of an image represented in the YUV color space, capturing the global transition points along a 1D raster line, together with reduced sampling in plateau regions. Such feature sampling naturally yields a sparse representation of feature points at both edges and plateaus. The size of the similarity matrix for the time-warp algorithm can thus be reduced dramatically, from, say, 352 × 288 in CIF, by almost two orders of magnitude.
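The two ideas above can be sketched in a few lines of code. The sketch below is a minimal illustration, not the paper's implementation: it coarsely quantizes one intensity scanline, keeps indices where the quantized value changes (edges) plus a subsampled set inside plateaus, and then runs a classic scanline-DP (time-warp) alignment over the sparse feature sequences instead of over every pixel. The quantization step, plateau stride, and occlusion penalty are assumed values for illustration.

```python
def quantize(line, step=32):
    """Coarsely quantize intensity values (e.g. 8-bit luma) into bins."""
    return [v // step for v in line]

def feature_points(line, step=32, plateau_stride=8):
    """Indices where the quantized value changes (transition points),
    plus sparse samples inside plateau regions."""
    q = quantize(line, step)
    pts = [0]
    for i in range(1, len(q)):
        if q[i] != q[i - 1]:                    # edge: quantized level changes
            pts.append(i)
        elif i - pts[-1] >= plateau_stride:     # sparse sample inside a plateau
            pts.append(i)
    return pts

def dp_match(left_feats, right_feats, left, right, skip=10.0):
    """DP (time-warp) alignment cost between the two sparse feature
    sequences; moves are match / skip-left / skip-right, with absolute
    intensity difference as the match cost (an assumed, illustrative cost)."""
    n, m = len(left_feats), len(right_feats)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if cost[i][j] == INF:
                continue
            if i < n and j < m:                 # match two feature points
                d = abs(left[left_feats[i]] - right[right_feats[j]])
                cost[i + 1][j + 1] = min(cost[i + 1][j + 1], cost[i][j] + d)
            if i < n:                           # occlusion in right image
                cost[i + 1][j] = min(cost[i + 1][j], cost[i][j] + skip)
            if j < m:                           # occlusion in left image
                cost[i][j + 1] = min(cost[i][j + 1], cost[i][j] + skip)
    return cost[n][m]

# A synthetic 352-pixel scanline (CIF width) with two step edges: the sparse
# feature set is far smaller than the pixel count, so the DP matrix shrinks
# quadratically in that reduction factor.
line = [0] * 100 + [128] * 100 + [255] * 152
pts = feature_points(line)
print(len(line), len(pts))   # feature count is a small fraction of 352
```

Since the similarity matrix is built over feature points rather than pixels, an N-fold reduction in points shrinks the matrix by roughly N², which is how the abstract's near two-orders-of-magnitude reduction arises.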