Visualization of raw 3D list-mode PET and TOFPET data without tomographic reconstruction using virtual space

1 Author(s)
Sitek, A. ; Radiology Department, Brigham and Women's Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115, USA

Progress in positron emission tomography (PET) data-acquisition hardware and reconstruction/correction software has been tremendous in recent years; however, image visualization technology has not advanced as rapidly. We address this by proposing a new system for the visualization of 3D PET and time-of-flight PET (TOF-PET) scans. The system is based on a fundamentally novel approach to emission data handling that does not require tomographic reconstruction. Our system directly links the acquired data with the binocular human visual system through minimal intermediate processing. A virtual reality environment is constructed on the fly during interactive visual examination of the raw data. Users view the three-dimensional virtual scene interactively by modifying the virtual viewpoint location, zoom, contrast, and other viewing parameters. For every viewing frame, the raw data are analyzed using a massively parallel computing system to extract two views that, presented stereoscopically, create a virtual reality environment. Binocular stereoscopic vision enhances visual perception of the data by providing stereoscopic depth, and it naturally reduces apparent noise because the left- and right-eye views are fused in the visual cortex of the brain. We used a computer simulation of 3D list-mode PET acquisition to test the basic characteristics of the proposed system, and we implemented binocular viewing using a stereoscopic monitor. The system performed well in terms of computational efficiency, achieving 5 frames per second for 10 million events on a CPU. The number of displayable events varied from 2,000 to 200,000 per frame, depending on a parameter that controls the resolution-noise trade-off in the virtual image and on the position of the viewing point. We found that stereoscopic fusion of the resulting images in the brain was feasible using a stereoscopic display system. Further studies of the new visualization system and an evaluation of diagnostic task performance are needed.
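The core per-frame step the abstract describes, projecting raw list-mode events into a left-eye and a right-eye view from a movable virtual viewpoint, can be sketched in a few lines of NumPy. This is a minimal illustration under assumed conventions, not the paper's implementation: events are represented as 3D points (a real TOF-PET system would place or weight points along each line of response using timing information), the camera is a simple pinhole model, and all numeric values (interocular baseline, focal length, viewing distance) are hypothetical.

```python
import numpy as np

def project_events(points, eye, look_at, up, focal, size):
    """Project 3D event points through a pinhole camera at `eye`
    and accumulate them into a 2D count image (one virtual view)."""
    # Build an orthonormal camera basis looking from `eye` toward `look_at`.
    fwd = look_at - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)

    rel = points - eye            # event positions relative to the camera
    depth = rel @ fwd             # distance along the viewing axis
    front = depth > 1e-6          # keep only points in front of the camera
    u = focal * (rel[front] @ right) / depth[front]
    v = focal * (rel[front] @ true_up) / depth[front]

    # Bin projected coordinates into a count image; apparent brightness
    # is simply the number of events falling into each pixel.
    img, _, _ = np.histogram2d(u, v, bins=size, range=[[-1, 1], [-1, 1]])
    return img

rng = np.random.default_rng(0)
# Hypothetical stand-in for list-mode data: a cloud of annihilation points.
events = rng.normal(scale=5.0, size=(10_000, 3))

center = np.zeros(3)
up = np.array([0.0, 0.0, 1.0])
head = np.array([0.0, -60.0, 0.0])       # virtual viewpoint (assumed units)
baseline = np.array([3.0, 0.0, 0.0])     # interocular offset (assumed)

# Two slightly offset viewpoints yield the stereoscopic pair.
left_view = project_events(events, head - baseline / 2, center, up,
                           focal=5.0, size=128)
right_view = project_events(events, head + baseline / 2, center, up,
                            focal=5.0, size=128)
```

Recomputing both histograms on each interaction (viewpoint move, zoom) is what makes the approach reconstruction-free; the abstract's reported frame rates would come from parallelizing the projection over events.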

Published in:

Nuclear Science Symposium Conference Record, 2008. NSS '08. IEEE

Date of Conference:

19-25 Oct. 2008