Current rover localization techniques such as visual odometry have proven effective on short- to medium-length traverses (e.g., up to a few kilometres). This paper deals with the problem of long-range rover localization (e.g., 10 km and beyond). A method is proposed to globally and autonomously localize a rover by matching features detected in a 3D orbital elevation map with features from rover-based 3D lidar scans. The accuracy and efficiency of the algorithm are enhanced with visual odometry and inclinometer/sun-sensor orientation measurements. The methodology was tested with real data, including 37 lidar scans of terrain from a Mars-Moon analogue site on Devon Island, Nunavut. When a scan contained a sufficient number of good topographic features, localization produced position errors of no more than 100 m, and as low as a few metres in many cases. On a 10 km traverse, the developed algorithm's localization estimates were shown to significantly outperform visual odometry estimates. It is believed that this architecture could be used to accurately and autonomously localize a rover on long-range traverses.
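To illustrate the core idea of globally localizing against an orbital elevation map, the following is a minimal toy sketch (not the paper's algorithm): a synthetic digital elevation map stands in for the orbital data, a noisy local elevation patch stands in for a rover-derived terrain model, and an exhaustive zero-mean normalized cross-correlation search recovers the patch's map position. All names, the synthetic terrain, and the correlation-based matcher are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an orbital elevation map: a smooth-ish
# synthetic terrain built from an integrated random field.
dem = rng.standard_normal((120, 120)).cumsum(axis=0).cumsum(axis=1)

# Hypothetical rover-derived local elevation patch: a window of the map
# at an unknown (to the matcher) location, corrupted with sensor noise.
true_r, true_c = 70, 45
patch = dem[true_r:true_r + 15, true_c:true_c + 15] \
        + 0.05 * rng.standard_normal((15, 15))

def ncc_localize(dem, patch):
    """Exhaustive zero-mean normalized cross-correlation search:
    slide the patch over the map and return the best-scoring cell."""
    ph, pw = patch.shape
    p = patch - patch.mean()
    best_score, best_rc = -np.inf, None
    for r in range(dem.shape[0] - ph + 1):
        for c in range(dem.shape[1] - pw + 1):
            w = dem[r:r + ph, c:c + pw]
            w = w - w.mean()
            denom = np.linalg.norm(w) * np.linalg.norm(p)
            if denom == 0:
                continue
            score = float((w * p).sum() / denom)
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc, best_score

(est_r, est_c), score = ncc_localize(dem, patch)
print("estimated map cell:", (est_r, est_c), "score:", round(score, 3))
```

In the paper's setting, the search would instead operate on discrete topographic features extracted from both the orbital map and the lidar scans, with visual odometry and inclinometer/sun-sensor orientation measurements constraining the candidate poses rather than an unconstrained brute-force scan of the whole map.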