Correcting PET data for patient movement is an important pre-processing step in modern PET studies with high-resolution scanners. Many motion correction methods rely on external tracking of the patient's movement. An automatic, data-driven motion detection and correction scheme would greatly simplify the measurement setup and the whole workflow of a PET study. We propose a method that could be used as part of such a scheme. The method estimates the rigid-body motion parameters between two tracer distributions measured at different times. The estimation is performed directly on projection data, by registering two planar orthogonal views derived from the sinograms and by exploiting the angular shift property of the Radon transform. Results from two numerical phantom datasets and one physical phantom dataset show that all six parameters can be estimated and that the method is accurate for moderate amounts of motion. The method could potentially be used to estimate the pose and position of the patient's head without external equipment or image reconstruction.
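The angular shift property mentioned above says that rotating the object in the transverse plane shifts its sinogram along the angle axis, so an in-plane rotation can be read off as an angular offset without reconstructing an image. The sketch below is not the authors' implementation; it is a minimal illustration using an assumed toy setup: a simple Radon transform built from `scipy.ndimage.rotate` (rotate, then sum along columns), a hypothetical off-centre square phantom, and a 10-degree angular sampling. The in-plane rotation is recovered by matching each projection of the moved object to its best-fitting projection of the original object and taking the dominant angular offset.

```python
import numpy as np
from collections import Counter
from scipy.ndimage import rotate


def radon(image, angles_deg):
    # Toy Radon transform: rotate the image by -a and sum along columns,
    # yielding one parallel projection (one sinogram row) per angle.
    return np.stack([rotate(image, -a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])


# Hypothetical phantom: an off-centre square, so that its projections
# change noticeably from one sampled angle to the next.
img = np.zeros((64, 64))
img[20:30, 35:50] = 1.0

angles = np.arange(0, 180, 10)          # 10-degree angular sampling
sino = radon(img, angles)               # sinogram of the original pose

# Simulate an in-plane "patient" rotation of 20 degrees about the centre.
img_rot = rotate(img, 20, reshape=False, order=1)
sino_rot = radon(img_rot, angles)       # sinogram of the moved pose

# Angular shift property: each projection of the rotated object matches an
# original projection at a constant angular offset (here 2 steps = 20 deg).
offsets = []
for i in range(len(angles)):
    j = int(np.argmin(((sino - sino_rot[i]) ** 2).sum(axis=1)))
    offsets.append((i - j) % len(angles))

# A couple of rows wrap past 180 deg (where the sinogram mirrors), so take
# the dominant offset rather than requiring all rows to agree.
step, _ = Counter(offsets).most_common(1)[0]
est_deg = step * 10
print(est_deg)  # recovered in-plane rotation, in degrees
```

Note that this toy version only recovers the single in-plane rotation angle; the abstract's method estimates all six rigid-body parameters by additionally registering two planar orthogonal views formed from the sinograms, which the sketch does not attempt.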