A scene can be defined as one of the subdivisions of a play in which the setting is fixed, or which presents continuous action in one place. We propose a novel two-pass algorithm for scene boundary detection that utilizes the motion content, shot length, and color properties of shots as features. In our approach, shots are first clustered by computing Backward Shot Coherence (BSC), a shot color similarity measure that detects Potential Scene Boundaries (PSBs) in the videos. In the second pass we compute Scene Dynamics (SD), a function of shot length and the motion content in the potential scenes. In this pass, a scene merging criterion is developed to remove weak PSBs and thereby reduce over-segmentation. We also propose a method to describe the content of each scene by selecting one representative image. The segmentation of video data into scenes facilitates improved browsing of videos in electronic form, such as video on demand, digital libraries, and the Internet. The proposed algorithm has been tested on a variety of videos, including five Hollywood movies, one sitcom, and one interview program, and promising results have been obtained.
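The two-pass structure summarized above might be sketched as follows. This is a minimal illustration only: the histogram-intersection similarity, the BSC windowing, the SD ratio (motion per unit shot length), and both thresholds are simplified placeholders chosen for the sketch, not the paper's exact definitions.

```python
import numpy as np

def backward_shot_coherence(hists, window=5):
    """For each shot, the maximum color similarity (here, histogram
    intersection) with the shots in a backward window -- a simplified
    stand-in for the paper's BSC measure."""
    bsc = np.ones(len(hists))
    for i in range(1, len(hists)):
        prev = hists[max(0, i - window):i]
        bsc[i] = max(float(np.minimum(h, hists[i]).sum()) for h in prev)
    return bsc

def scene_dynamics(motion, lengths, segment):
    """Motion content per unit shot length over a candidate scene
    (a placeholder for the paper's SD function)."""
    idx = list(segment)
    return sum(motion[i] for i in idx) / sum(lengths[i] for i in idx)

def detect_scenes(hists, lengths, motion, bsc_thresh=0.5, sd_thresh=0.1):
    n = len(hists)
    # Pass 1: a shot whose BSC dips below the threshold is coherent with
    # none of its recent predecessors -> potential scene boundary (PSB).
    bsc = backward_shot_coherence(hists)
    psbs = [i for i in range(1, n) if bsc[i] < bsc_thresh]
    bounds = [0] + psbs + [n]
    scenes = [range(bounds[k], bounds[k + 1]) for k in range(len(bounds) - 1)]
    # Pass 2: merge a candidate scene into its predecessor when its
    # dynamics are weak, reducing over-segmentation from spurious PSBs.
    merged = [scenes[0]]
    for seg in scenes[1:]:
        if scene_dynamics(motion, lengths, seg) < sd_thresh:
            merged[-1] = range(merged[-1].start, seg.stop)
        else:
            merged.append(seg)
    return [(s.start, s.stop) for s in merged]
```

With eight synthetic shots whose color histograms change twice, the first pass proposes two PSBs; if the shots after the second change have low motion, the second pass merges that weak candidate scene back into its predecessor, leaving two scenes.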