An augmented reality (AR) cueing method designed to improve teleoperator performance under conditions of display-control misalignment is investigated. The teleoperation task mimics the operation of space robot arms, which are manipulated with hand controllers (HCs) to orient and translate the body-fixed coordinate frame at the end effector (EE), with cameras providing visual feedback. However, misalignment between the EE axes as displayed in the camera views and the HC axes hinders operator performance. In this paper, the EE coordinate frame is graphically overlaid in three dimensions on the video views using AR, with each axis uniquely colored, and the same color scheme labels the corresponding axes on the HCs. Operators use these color cues to map each HC axis to the matching colored axis of the augmented frame at the EE and thereby move the EE in the desired direction. Between-groups and within-participant experiments comparing EE trajectory distance, deviation from path, navigation errors, and HC axis usage were conducted to assess the effectiveness of the augmented coordinates against conventional teleoperation without them. Significant reductions in EE trajectory distance, deviation from path, navigation errors, and single-axis HC usage were observed when participants manipulated the remote robot with augmented coordinates. The results demonstrate that simple AR cues benefit operators in remote robot-arm teleoperation.
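The core overlay step described above can be sketched as projecting the EE's coordinate axes into the camera image so each axis can be drawn in its assigned color. This is a minimal illustration, not the paper's implementation; the pinhole intrinsics, axis length, and red/green/blue color scheme are assumptions for the sketch.

```python
import numpy as np

def project_point(p_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3-D point (camera frame) to pixel coordinates."""
    x, y, z = p_cam
    return np.array([fx * x / z + cx, fy * y / z + cy])

def ee_axis_overlays(R_cam_ee, t_cam_ee, axis_len=0.1,
                     fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Return pixel line segments for the EE's x/y/z axes, one color each.

    R_cam_ee: 3x3 rotation of the EE frame expressed in the camera frame.
    t_cam_ee: EE origin in camera coordinates (meters).
    """
    colors = {"x": "red", "y": "green", "z": "blue"}  # assumed color scheme
    origin_px = project_point(t_cam_ee, fx, fy, cx, cy)
    segments = {}
    for i, name in enumerate("xyz"):
        # Tip of each unit axis, scaled and moved into the camera frame.
        tip_cam = t_cam_ee + axis_len * R_cam_ee[:, i]
        segments[name] = (origin_px,
                          project_point(tip_cam, fx, fy, cx, cy),
                          colors[name])
    return segments
```

With an identity rotation and the EE one meter in front of the camera, the x axis projects to a horizontal segment starting at the image center, which a renderer would then draw in red over the video view; the same colors on the HC axes give the operator the cue.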