In this paper, we present a new system that automatically estimates the rotation axis (orientation) of robotic joints relative to distributed accelerometers. We designed, implemented, and tested a method for joint-orientation estimation that exploits basic movement patterns of a robotic segment and requires considerably less input data than related methods. As sensor input, it needs only the gravitational acceleration measured before and after a commanded joint rotation; dynamic acceleration components are not required. We evaluated the implementation on a Bioloid robot equipped with three Tactile Module prototypes. Our Tactile Modules are multimodal sensor systems that also feature a triaxial accelerometer. The robot successfully estimated the rotation axis of each degree of freedom (DOF) of its shoulder and elbow joints relative to the accelerometer frames of the Tactile Modules, which were randomly distributed on the corresponding segments.
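To illustrate the underlying geometry (a sketch of one possible approach, not necessarily the authors' exact algorithm): a rotation about an axis preserves the gravity component along that axis, so each before/after gravity difference measured by the accelerometer is perpendicular to the rotation axis. Two such differences from distinct commanded rotations then yield the axis via a cross product. A minimal NumPy sketch, with all function and variable names assumed:

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix for a rotation of `angle` radians about unit `axis`."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def estimate_axis(g_pairs):
    """Estimate a joint's rotation axis from (gravity_before, gravity_after) pairs.

    A rotation about axis n leaves the gravity component along n unchanged,
    so every difference (g_after - g_before) lies in the plane perpendicular
    to n. Two non-parallel differences determine n (up to sign) via their
    cross product.
    """
    diffs = [g_after - g_before for g_before, g_after in g_pairs]
    n = np.cross(diffs[0], diffs[1])
    return n / np.linalg.norm(n)

# Simulated example: rotate a gravity reading about a known axis twice.
axis_true = np.array([0.36, 0.48, 0.8])          # unit vector
g0 = np.array([0.0, 0.0, -9.81])                 # gravity in the sensor frame
pairs = [(g0, rodrigues(axis_true, 0.5) @ g0),
         (g0, rodrigues(axis_true, 1.2) @ g0)]
n_est = estimate_axis(pairs)
# n_est is parallel (up to sign) to axis_true.
```

In practice the measured accelerations are noisy, so a real implementation would average several static readings per pose and could fit the axis to more than two rotations in a least-squares sense.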