This paper presents a new keypoint-based approach to near-duplicate image detection. The approach consists of three steps. First, keypoints are extracted from the images and matched. Second, the matched keypoints vote for an estimate of the affine transform, based on an affine-invariant ratio of normalized lengths. Finally, to further verify the match, the color histograms of the regions formed by the matched keypoints in the two images are compared. The method is particularly advantageous when only a few matched keypoints are available. The proposed algorithm was tested on the Columbia dataset and compared quantitatively with the RANdom SAmple Consensus (RANSAC) algorithm and the Scale-Rotation Invariant Pattern Entropy (SR-PE) algorithm. The experimental results show that the proposed method compares favorably against these state-of-the-art methods.
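As an illustration of the final verification step, the sketch below compares the color histograms of two image regions using histogram intersection. This is a minimal sketch under assumptions not stated in the abstract: 8-bit RGB regions, a joint per-channel quantization, and histogram intersection as the similarity measure; the function names are illustrative and do not reproduce the paper's exact formulation.

```python
import numpy as np

def color_histogram(region, bins=8):
    """Quantize an RGB region (H x W x 3 array, values 0-255) into a
    normalized joint color histogram with `bins` levels per channel.
    Assumption: joint RGB binning; the paper's histogram may differ."""
    q = (region.reshape(-1, 3).astype(int) * bins) // 256
    idx = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    hist = np.bincount(idx, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity of two normalized histograms, in [0, 1];
    1.0 means identical color distributions."""
    return float(np.minimum(h1, h2).sum())
```

In a pipeline following the abstract, the regions passed to `color_histogram` would be the areas delimited by matched keypoints in each image, and a high intersection score would confirm the candidate near-duplicate match.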