This paper presents a wheel sinkage detection method for robotic lunar exploration tasks. The method extracts the boundary between a robot wheel and the lunar soil by segmenting wheel-soil images captured by a video camera that monitors the wheel-soil interaction. The detected boundary is then projected onto a soil-free image of the wheel to determine the wheel sinkage parameters. The segmentation method is based on graph theory: it first clusters a wheel-soil image into homogeneous regions called superpixels, constructs a graph over the superpixels, and then partitions the graph into segments using normalized cuts. Compared with existing wheel sinkage detection methods, the proposed algorithm is more robust to illumination conditions, shadows, and dust covering the wheel. The method's efficacy has been validated by experiments under various conditions.