IEEE Transactions on Medical Imaging

Issue 2 • June 1990

Displaying results 1-13 of 13
  • Mathematical simplification of a PET blood flow model

    Publication Year: 1990, Page(s): 172-176
    Cited by: Papers (3)

    The positron emission tomography (PET) H₂¹⁵O bolus injection model for cerebral blood flow (CBF) requires calculation of a double integral that provides the pixel values of a reconstructed image (PET number) in terms of the tissue flow, the arterial input function, a decay constant for ¹⁵O, the partition coefficient, and a camera calibration constant that relates the flow-dependent integrated tissue activity to the measured PET number (counts/pixel). The tissue activity is assumed to be zero at the time of injection. A mathematical simplification, changing the order of integration, enables the integration with respect to time to be performed analytically before the integration over the arterial input function. As a result, only single integrals remain to be evaluated numerically; cubic spline integration was used for these remaining integrals. This technique increases the accuracy and speed of evaluating blood flow without making simplifying assumptions. Similar simplifications may be applicable to other physiological models.
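
    A minimal numeric sketch of the order-of-integration trick, assuming a Kety-style kinetic model with decay (the model form, variable names, and sampling grid are illustrative, not taken from the paper):

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    def pet_counts(f, lam, k, t_a, ca_samples, T1, T2):
        """Integrated tissue activity over the scan window [T1, T2].

        Assumed double integral:
            N = integral_{T1}^{T2} f * integral_0^T Ca(t) e^{-a(T-t)} dt dT,
        with a = f/lam + k. Swapping the order of integration lets the
        T-integral be done analytically, leaving one integral over t.
        """
        a = f / lam + k
        ca = CubicSpline(t_a, ca_samples)              # spline model of Ca(t)
        t = np.linspace(0.0, T2, 2048)
        inner = np.exp(-a * np.maximum(t, T1)) - np.exp(-a * T2)  # analytic T-integral
        integrand = CubicSpline(t, ca(t) * np.exp(a * t) * inner)
        return (f / a) * integrand.integrate(0.0, T2)  # single numeric integral
    ```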

  • 3-D segmentation of MR images of the head for 3-D display

    Publication Year: 1990, Page(s): 177-183
    Cited by: Papers (83) | Patents (2)

    Algorithms for 3-D segmentation and reconstruction of anatomical surfaces from magnetic resonance imaging (MRI) data are presented. The 3-D extension of the Marr-Hildreth operator is described, and it is shown that its zero crossings are related to anatomical surfaces. For an improved surface definition, morphological filters (dilation and erosion) are applied. From these contours, 3-D reconstructions of skin, bone, brain, and the ventricular system can be generated. Results obtained with different segmentation parameters and surface rendering methods are presented. The fidelity of the generated images comes close to anatomical reality. It is noted that both the convolution and the morphological filtering are computationally expensive, and thus take a long time on a general-purpose computer. Another problem is assigning labels to the constituents of the head; in the current implementation, this is done interactively.
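
    A compact sketch of the described pipeline: a 3-D Laplacian-of-Gaussian (Marr-Hildreth) operator, zero-crossing detection, and a morphological closing. The sigma, neighbourhood, and structuring element are illustrative assumptions, not the paper's parameters:

    ```python
    import numpy as np
    from scipy import ndimage

    def marr_hildreth_surfaces(volume, sigma=2.0):
        log = ndimage.gaussian_laplace(volume.astype(float), sigma=sigma)
        # A voxel lies on a zero crossing if the sign of the LoG response
        # changes within its immediate neighbourhood.
        sign = log > 0
        edges = sign & ~ndimage.binary_erosion(sign)
        # Morphological filtering (dilation followed by erosion, i.e. a
        # closing) to improve surface definition, as the abstract describes.
        return ndimage.binary_closing(edges, structure=np.ones((3, 3, 3)))
    ```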

  • An efficient algorithm for MR image reconstruction without low spatial frequencies

    Publication Year: 1990, Page(s): 184-189
    Cited by: Papers (3)

    It is demonstrated that if the image to be reconstructed is known to have some zero-valued pixels, the dynamic range can be better used by disregarding the largest signals and applying signal restoration methods. Low-frequency and high-frequency signals are related, using the knowledge that some pixels are zero, by a set of linear equations in which the number of equations equals the number of zero pixels and the number of unknowns equals the number of rejected low-frequency signal samples. An improved Fourier transform (FT) magnetic resonance (MR) imaging method based on a least-square-error (LSE) technique and an efficient algorithm for signal restoration when the low-frequency components are discarded are presented. In this method, the regions of support in both the image domain and the frequency domain can have arbitrary shapes, and all zero pixels in the image domain can be taken into account. The algorithm has been tested on simulated and experimental data with acceptable results.
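
    A 1-D toy version of the restoration idea: each known zero pixel supplies one linear equation, and the discarded low-frequency DFT samples are recovered by least squares. The sizes, supports, and bin choices are illustrative only:

    ```python
    import numpy as np

    n = 64
    rng = np.random.default_rng(0)
    x = np.zeros(n)
    x[20:44] = rng.random(24)            # image; all other pixels are zero
    zeros = np.r_[0:20, 44:n]            # known zero-pixel locations
    low = np.r_[0:4, n - 3:n]            # "discarded" low-frequency bins

    X = np.fft.fft(x)
    X_meas = X.copy()
    X_meas[low] = 0.0                    # low frequencies were not acquired

    # Inverse-DFT matrix restricted to (zero pixels) x (low-frequency bins).
    k = np.arange(n)
    W = np.exp(2j * np.pi * np.outer(k, k) / n) / n
    A = W[np.ix_(zeros, low)]
    b = -np.fft.ifft(X_meas)[zeros]      # residual at the zero pixels
    X_low, *_ = np.linalg.lstsq(A, b, rcond=None)

    X_rec = X_meas.copy()
    X_rec[low] = X_low
    print(abs(np.fft.ifft(X_rec) - x).max())   # ~1e-15: exact restoration
    ```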

  • High-resolution NMR chemical-shift imaging with reconstruction by the chirp z-transform

    Publication Year: 1990, Page(s): 190-201
    Cited by: Papers (1) | Patents (2)

    A novel nuclear magnetic resonance (NMR) chemical-shift image reconstruction method is studied in which high chemical-shift resolution is achieved by the chirp z-transform (CZT). Phase encoding is used for the spatial coordinates x and y, and the frequency coordinate is reserved specifically for the chemical shift. The Fourier transform (FT) image reconstruction algorithm, which forms the basis of the new CZT image reconstruction method, is introduced. The novel method, which uses the CZT instead of the FT to evaluate the chemical-shift spectrum at a much higher resolution, is studied. The chemical-shift resolutions achieved by the FT and the CZT are analyzed theoretically in terms of the peak height and peak width of the chemical-shift spectra. The chemical-shift spectra calculated at a selected point in the image plane, and the chemical-shift images reconstructed by this method, are shown for a simple phantom containing ethanol and methanol at different locations. The results obtained by this method and by the FT method are compared and discussed. The experimental results show that a chemical shift as small as 39 Hz, relative to the proton resonance frequency of 21.34 MHz, can be resolved by this method without improvements in magnetic field homogeneity.
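
    A sketch of the resolution gain the CZT provides: the spectral axis is evaluated on a fine grid confined to the band of interest rather than on the fixed FFT grid. The sampling rate, line positions, and band edges below are illustrative, not the paper's acquisition values:

    ```python
    import numpy as np
    from scipy.signal import czt   # SciPy >= 1.8

    fs, n = 2000.0, 512
    t = np.arange(n) / fs
    # Synthetic FID: two resonances 39 Hz apart, with decay.
    fid = (np.exp(2j * np.pi * 100 * t) + np.exp(2j * np.pi * 139 * t)) * np.exp(-t / 0.2)

    # Evaluate m points on the unit-circle arc between f1 and f2.
    f1, f2, m = 80.0, 160.0, 1024
    w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs))  # per-step rotation
    a = np.exp(2j * np.pi * f1 / fs)                # arc start point
    zoom = np.abs(czt(fid, m=m, w=w, a=a))
    # FFT bin spacing: fs/n = ~3.9 Hz; CZT spacing: (f2-f1)/m = ~0.08 Hz.
    ```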

  • Noise impact on error-free image compression

    Publication Year: 1990, Page(s): 202-206
    Cited by: Papers (16)

    Radiological images with different levels of noise have been studied using various decomposition methods combined with Huffman and Lempel-Ziv coding. The more correlation that exists between pixels, the more efficient these techniques can be made. However, additional noise disrupts the correlation between adjacent pixels and leads to a less compressed result. Hence, prior to systematic compression in a picture archiving and communication system (PACS), two main issues must be addressed: the true information range that exists in a specific type of radiological image, and the costs and benefits of compression for the PACS. It is shown that laser-film-digitized magnetic resonance images produce 10-12 bits per pixel, although the lower 2-4 bits show the characteristics of random noise. These noise bits are shown to adversely affect the amount of compression achieved by various reversible compression techniques. The sensitivity of different techniques to different levels of noise is examined in order to suggest strategies for dealing with noise.
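
    A quick numeric illustration of the effect: random low-order bits destroy inter-pixel correlation and hurt lossless compression. zlib's DEFLATE (LZ77 plus Huffman coding) stands in here for the coders studied; the image and bit depths are synthetic:

    ```python
    import zlib
    import numpy as np

    rng = np.random.default_rng(1)
    xx, yy = np.meshgrid(np.arange(256), np.arange(256))
    clean = ((xx + yy) // 4).astype(np.uint16)        # smooth, correlated image
    noisy = (clean << 2) | rng.integers(0, 4, clean.shape).astype(np.uint16)

    for name, img in [("clean", clean), ("2 noise bits", noisy)]:
        raw = img.tobytes()
        ratio = len(zlib.compress(raw, 9)) / len(raw)
        print(f"{name}: compressed to {ratio:.2%} of original size")
    ```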

  • Effects of myocardial wall thickness on SPECT quantification

    Publication Year: 1990, Page(s): 144-150
    Cited by: Papers (20) | Patents (1)

    The effects of changing myocardial wall thickness in single photon emission computed tomography (SPECT) imaging are characterized, and a method that may be used to compensate for these effects is presented. The underlying principle is that the phenomena of attenuation, Compton scatter, and finite resolution can be separated and treated independently. Only finite resolution and its effects, along with a proposed method for correcting them, are addressed. A cardiac phantom with varying wall thickness (9-23 mm) was developed to characterize these effects on ²⁰¹Tl myocardial SPECT images. Correction factors in the form of recovery coefficients have been developed with the use of a convolution simulation, and are shown to substantially improve the agreement of counts extracted from SPECT images of the phantom with the actual ²⁰¹Tl concentration. The degree of improvement, however, is markedly affected by external attenuation. Clinical application of this method will require corrections for attenuation and scatter or the development of regional recovery coefficients that include these effects.
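
    A minimal sketch of how a convolution simulation yields recovery coefficients: a wall of finite width imaged with a Gaussian point-spread function recovers only a fraction of the true peak concentration. The 15 mm FWHM and the closed-form profile are illustrative assumptions, not the paper's measured system response:

    ```python
    from math import erf, sqrt

    def recovery_coefficient(wall_mm, fwhm_mm):
        # Peak of a unit-amplitude wall (rect of width w) convolved with a
        # Gaussian PSF: RC = erf(w / (2*sqrt(2)*sigma)), sigma = FWHM/2.355.
        sigma = fwhm_mm / 2.355
        return erf(wall_mm / (2.0 * sqrt(2.0) * sigma))

    for w in (9, 12, 16, 23):                       # phantom wall thicknesses, mm
        rc = recovery_coefficient(w, 15.0)          # assumed 15 mm FWHM system
        print(f"{w} mm wall: RC = {rc:.2f}, correction factor = {1/rc:.2f}")
    ```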

  • A fast Bayesian reconstruction algorithm for emission tomography with entropy prior converging to feasible images

    Publication Year: 1990, Page(s): 159-171
    Cited by: Papers (15)

    The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images through the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as a conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure, in the absence of a priori knowledge about the image configuration, is a uniform field.
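
    For orientation, a bare ML-EM iteration started from a uniform field, the base scheme on which the FMAPE algorithm layers its entropy prior and acceleration (those steps are not reproduced here; the toy system matrix is a placeholder):

    ```python
    import numpy as np

    def mlem(A, counts, n_iter=50):
        """A: system matrix (detectors x pixels); counts: Poisson data."""
        x = np.full(A.shape[1], counts.sum() / A.shape[1])   # uniform start
        sens = A.sum(axis=0)                                  # sensitivity image
        for _ in range(n_iter):
            proj = A @ x                                      # forward projection
            x *= (A.T @ (counts / np.maximum(proj, 1e-12))) / np.maximum(sens, 1e-12)
        return x

    rng = np.random.default_rng(3)
    A = rng.random((64, 16))                  # toy system matrix
    counts = rng.poisson(A @ (10 * rng.random(16)))
    x_hat = mlem(A, counts)
    ```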

  • On the validity of hypothesis testing for feasibility of image reconstructions

    Publication Year: 1990, Page(s): 226-230
    Cited by: Papers (5)

    Feasible images have been defined as those images that could have generated the original data by the statistical process that governs the measurement. In the case of emission tomography, the statistical process of emission is Poisson, and it is known that feasible images resulting from the maximum likelihood estimator (MLE) and from Bayesian methods with entropy priors can be of high quality. Tests for feasibility have been described that rest on one critical assumption: that the image being tested is independent of the data, even though the reconstruction algorithm has used those data to obtain the image. This could render the procedure unacceptable unless its effects on the test results are shown to be negligible. Experimental evidence is presented showing that images reconstructed by the MLE and stopped before convergence do indeed behave as if independent of the data. The results justify the use of hypothesis testing in practice, although they leave the problem of an analytical proof open.
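
    A sketch of a feasibility check in this spirit: test whether the measured counts are consistent with Poisson data generated by the reconstruction's forward projection. The chi-square statistic below is a simple surrogate, not the specific tests analyzed in the paper:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def feasible(A, x, counts, alpha=0.05):
        mean = A @ x                                   # forward projection
        stat = np.sum((counts - mean) ** 2 / np.maximum(mean, 1e-12))
        # Under H0, and treating x as independent of the data (the very
        # assumption the paper examines), stat is approximately chi-square
        # with one degree of freedom per detector bin.
        return stat < chi2.ppf(1 - alpha, df=len(counts))
    ```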

  • Incremental algorithm: a new fast backprojection scheme for parallel beam geometries

    Publication Year: 1990, Page(s): 207-217
    Cited by: Papers (19) | Patents (1)

    A fast backprojection scheme for parallel beam geometries is proposed. Known as the incremental algorithm, it performs backprojection on a ray-by-ray (beam-by-beam) basis rather than pixel by pixel as in the conventional algorithm. By restructuring a conventional backprojection algorithm, the interdependency of pixel computations (position and value) is transformed into a set of incremental relations for a beam, where a beam is the set of pixels enclosed by two adjacent rays in 2-D computed tomography (CT), and the set of voxels enclosed by four adjacent rays in 3-D CT. To minimize the overhead of searching for the next pixels, a searching flow technique has been developed to implement the first-order and second-order incremental relations for 2-D and 3-D CT, respectively. The values of all the pixels in each beam (except the first pixel) are computed with additions only, which is the key idea of the proposed backprojection scheme. The incremental algorithm has been implemented on two different machines and compared to L.A. Shepp and B.F. Logan's (1974) algorithm. The implementation results show the superiority of this approach over the conventional algorithm.
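
    A simplified 2-D illustration of the key incremental relation: for a fixed angle, the detector coordinate s = x·cosθ + y·sinθ changes by a constant increment from pixel to pixel, so the inner loops need additions only. This is not the paper's searching-flow implementation, just the underlying idea:

    ```python
    import numpy as np

    def backproject_incremental(sino, thetas, n):
        """sino: list of 1-D projections, one per angle in thetas (radians)."""
        img = np.zeros((n, n))
        c = (n - 1) / 2.0
        for p, th in zip(sino, thetas):
            cs, sn = np.cos(th), np.sin(th)
            s_row = -c * cs - c * sn + len(p) / 2.0  # s at pixel (0, 0)
            for i in range(n):
                s = s_row
                for j in range(n):
                    idx = int(round(s))              # nearest detector bin
                    if 0 <= idx < len(p):
                        img[i, j] += p[idx]
                    s += cs                          # addition only, along a row
                s_row += sn                          # addition only, down a column
        return img
    ```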

  • Simulated annealing image reconstruction method for a pinhole aperture single photon emission computed tomograph (SPECT)

    Publication Year: 1990, Page(s): 128-143
    Cited by: Papers (4)

    A series of computer experiments was performed to determine the relative performance of simulated annealing, quenched annealing, and a least-squares iterative technique for image reconstruction in single photon emission computed tomography (SPECT). The simulated SPECT geometry was of the pinhole aperture type, with 32 pinholes and 128 or 512 detectors. To test the robustness of the reconstruction techniques on arbitrary geometries, a 360-detector geometry with a random pixel-detector-factor matrix was also tested. Eight computer-simulated, 10-cm-diameter planar phantoms were used, with 1961 2-mm² reconstruction bins and a range of 3000 to 50,000,000 detected photon counts. Reconstruction quality was measured by a normalized squared-error picture distance measure. Over a wide range of noise, the simulated annealing method had slightly better reconstruction quality than the iterative method, although it required greater reconstruction time. Quenched annealing was faster than simulated annealing, with comparable reconstruction quality. Methods of efficiently controlling the simulated annealing algorithm are presented.
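
    A bare-bones annealing step for reconstruction: propose a random change to one pixel and accept it by the Metropolis rule on the squared projection error. The move size and geometric cooling schedule are illustrative, not the paper's tuned controls (quenched annealing corresponds to cooling much faster):

    ```python
    import numpy as np

    def anneal(A, counts, x0, T0=1.0, cool=0.999, n_steps=20000, seed=0):
        rng = np.random.default_rng(seed)
        x, T = x0.astype(float).copy(), T0
        r = A @ x - counts                        # current projection residual
        for _ in range(n_steps):
            j = rng.integers(len(x))
            dx = rng.normal(0.0, 0.1 * (x[j] + 1.0))
            if x[j] + dx < 0:                     # keep activity nonnegative
                continue
            col = A[:, j]
            dE = 2.0 * dx * (col @ r) + dx * dx * (col @ col)
            if dE < 0 or rng.random() < np.exp(-dE / T):   # Metropolis rule
                x[j] += dx
                r += col * dx
            T *= cool                             # geometric cooling schedule
        return x
    ```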

  • Reconstruction of two-dimensional permittivity distribution using the distorted Born iterative method

    Publication Year: 1990, Page(s): 218-225
    Cited by: Papers (221) | Patents (3)

    The distorted Born iterative method (DBIM) is used to solve two-dimensional inverse scattering problems, providing another general method for the two-dimensional imaging problem when the Born and Rytov approximations break down. Numerical simulations are performed using the DBIM and the method proposed previously by the authors (Int. J. Imaging Syst. Technol., vol. 1, no. 1, pp. 100-108, 1989), called the Born iterative method (BIM), for several cases in which the conditions for the first-order Born approximation are not satisfied. The results show that each method has its advantages: the DBIM converges faster than the BIM, while the BIM is more robust to noise contamination than the DBIM.
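
    A 1-D toy of the DBIM loop: each iteration solves the forward problem in the current estimated background, updates the background (distorted) Green's operator, and makes a regularized linearized update of the contrast. The Green's function, contrast profile, and regularization are illustrative; the paper's 2-D electromagnetic formulation is not reproduced:

    ```python
    import numpy as np

    n, k, dx = 64, 2.0, 0.1
    xs = np.arange(n) * dx
    # Discretized 1-D Helmholtz Green's function.
    G0 = np.exp(1j * k * np.abs(xs[:, None] - xs[None, :])) / (2j * k) * dx
    u_inc = np.exp(1j * k * xs)                       # incident plane wave

    chi_true = np.zeros(n)
    chi_true[24:40] = 0.8                             # true contrast profile

    def forward(chi):
        # Lippmann-Schwinger: (I - G0 diag(chi)) u = u_inc
        u = np.linalg.solve(np.eye(n) - G0 * chi[None, :], u_inc)
        return u, G0 @ (chi * u)                      # total field, scattered data

    _, data = forward(chi_true)

    chi = np.zeros(n)
    for _ in range(10):
        u, scat = forward(chi)                        # solve in current background
        Gb = np.linalg.solve(np.eye(n) - G0 * chi[None, :], G0)  # distorted Green's op
        J = Gb * u[None, :]                           # Frechet derivative wrt chi
        resid = data - scat
        dchi = np.linalg.solve(J.conj().T @ J + 1e-3 * np.eye(n), J.conj().T @ resid)
        chi = chi + dchi.real                         # regularized update
    ```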

  • Application of multiresolution spatial filters to long-axis tracking

    Publication Year: 1990, Page(s): 151-158
    Cited by: Papers (2) | Patents (1)

    A method was developed for automatically tracking the long axis of thin objects that have nonuniform width and arbitrary orientation in a two-dimensional image space. The method is used to determine the length of isolated contractile smooth muscle cells, but it has applications in other medical areas such as angiographic imaging. Pattern recognition techniques that determine object size and orientation are used to identify the long axis of an imaged object from its responses to difference-of-Gaussian and orientation filters. The method needs no a priori knowledge of object location, adapts to varying image magnification, requires little human interaction, and yields reproducible results.
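
    A sketch of the two filter banks the abstract names: difference-of-Gaussian responses as a proxy for local object width, and oriented (anisotropic) smoothing as a proxy for local axis direction. All filter parameters are illustrative assumptions:

    ```python
    import numpy as np
    from scipy import ndimage

    def dog_scale_map(img, sigmas=(1, 2, 4, 8)):
        """Per-pixel index of the strongest difference-of-Gaussian response,
        a proxy for local object width."""
        stack = [ndimage.gaussian_filter(img, s) - ndimage.gaussian_filter(img, 1.6 * s)
                 for s in sigmas]
        return np.argmax(np.abs(np.stack(stack)), axis=0)

    def orientation_map(img, n_angles=8, sigma=2.0):
        """Per-pixel index of the strongest oriented-filter response, using
        elongated Gaussian smoothing rotated through n_angles directions."""
        responses = []
        for a in np.linspace(0.0, 180.0, n_angles, endpoint=False):
            rot = ndimage.rotate(img, a, reshape=False, order=1)
            resp = ndimage.gaussian_filter(rot, sigma=(sigma, 4 * sigma))
            responses.append(ndimage.rotate(resp, -a, reshape=False, order=1))
        return np.argmax(np.stack(responses), axis=0)
    ```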

  • Information analysis of single photon emission computed tomography with count losses

    Publication Year: 1990, Page(s): 117-127
    Cited by: Papers (3) | Patents (10)

    An analysis is presented of the information transfer from emitter space to detector space in single photon emission computed tomography (SPECT) systems. The analysis takes into account the fact that count-loss side information is generally not available at the detector; this side information corresponds to the number of γ-rays deleted through lack of interaction with the detector. It is shown that the information transfer depends on the structure of the likelihood function of the emitter locations associated with the detector data. This likelihood function is the average of a set of ideal-detection likelihood functions, each matched to a particular set of possible deleted γ-ray paths. A lower bound is derived for the information gain due to incorporating the count-loss side information at the detector. This gain is shown to be significant when the mean emission rate is small or when the γ-ray deletion probability depends strongly on emitter location. Numerical evaluations of the mutual information, with and without side information, are presented for information-optimal apertures and uniform parallel-hole collimators.
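
    A toy computation of the mutual information between emitter location and detector outcome for a small system with location-dependent photon deletion. Here the deletion event is modeled as a distinct observable outcome (i.e. side information available); the paper's no-side-information case, which requires averaging over unknown loss counts, is not reproduced. All probabilities are illustrative:

    ```python
    import numpy as np

    def mutual_info_bits(p_src, P):      # P[d, s] = P(outcome d | emitter pixel s)
        joint = P * p_src[None, :]
        pd = joint.sum(axis=1, keepdims=True)     # outcome marginal
        ps = joint.sum(axis=0, keepdims=True)     # source marginal
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log2(joint[nz] / (pd @ ps)[nz])))

    rng = np.random.default_rng(2)
    S, D = 8, 6                                   # source pixels, detector bins
    p_src = np.full(S, 1.0 / S)
    A = rng.random((D, S))
    det = 0.3 + 0.6 * rng.random(S)               # location-dependent detection prob
    A = A / A.sum(axis=0) * det                   # columns sum to P(detected | s)
    with_side = np.vstack([A, 1.0 - det])         # deletion as its own outcome
    print(mutual_info_bits(p_src, with_side))
    ```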


Aims & Scope

IEEE Transactions on Medical Imaging (T-MI) encourages the submission of manuscripts on imaging of body structures, morphology and function, and imaging of microscopic biological entities. The journal publishes original contributions on medical imaging achieved by various modalities, such as ultrasound, X-rays (including CT), magnetic resonance, radionuclides, microwaves, and light, as well as medical image processing and analysis, visualization, pattern recognition, and related methods. Studies involving highly technical perspectives are most welcome. The journal focuses on a unified common ground where instrumentation, systems, components, hardware and software, mathematics, and physics contribute to the studies.

Meet Our Editors

Editor-in-Chief
Michael Insana
Beckman Institute for Advanced Science and Technology
Department of Bioengineering
University of Illinois at Urbana-Champaign
Urbana, IL 61801 USA
m.f.i@ieee.org