The paper presents a 3D Virtual Environment (VE) for neurorehabilitation of the upper limb. Patients move one of their arms to simulate concrete daily actions, such as grasping a bottle, opening a door, or putting a book on a shelf. They wear a special garment that integrates four inertial sensors, which provide real-time information on the orientation of the patient's shoulder, elbow, and wrist. The VE is a complete scenario integrating all the objects needed to perform virtually the actions the patients simulate. In the VE, patients are represented by 3D avatars that, using the data provided by the inertial sensors, reproduce their arm movements in real time. Since only the arm is tracked, and not the hand, trunk, or neck, the system must combine real movements with baked animations in order to produce realistic behavior of the 3D avatar. Moreover, it must take into account collisions between the 3D avatar and the virtual objects. Finally, it must detect when the patient is about to simulate an interaction with an object so that the interaction can be carried out virtually. We describe the strategies that we have designed to provide these functionalities.
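Two of the ingredients described above, blending a tracked joint orientation with a baked animation pose and detecting when the avatar's hand is close enough to an object to trigger a virtual interaction, can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the quaternion slerp, the `(w, x, y, z)` convention, and the `near_object` trigger radius are all assumptions made for the example.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z).
    A system like the one described could use this to blend the orientation
    reported by an inertial sensor (q0) toward a baked animation pose (q1)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:
        # Negate one quaternion so interpolation takes the shorter arc.
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:
        # Nearly identical orientations: normalized linear interpolation.
        out = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in out))
        return tuple(c / n for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def near_object(hand_pos, obj_pos, radius=0.15):
    """True when the avatar's hand is within `radius` metres of an object;
    a proximity test of this kind could decide when to start a baked
    grasp animation. The 0.15 m threshold is a placeholder value."""
    return math.dist(hand_pos, obj_pos) <= radius
```

As a usage example, each frame the blend weight `t` could ramp from 0 to 1 once `near_object` fires, so the avatar's wrist smoothly leaves the sensor-driven pose and settles into the prerecorded grasp.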