
IEEE Transactions on Medical Imaging

Volume 11, Issue 1 • March 1992


Displaying Results 1 - 17 of 17
  • Comparison of chi-square and join-count methods for evaluating digital image data

    Page(s): 28 - 33
    PDF (468 KB)

    The significance of each bit of the pixel in digital image data from various radiological modalities is tested to determine the contrast resolution. Two statistical methods, the join-count statistic and the chi-square goodness-of-fit test, are used to perform the test. The join-count statistic measures the spatial coherence among pixels, while the chi-square test determines whether the bit data are randomly distributed. A residual image is formed by subtracting the original image from its smoothed version. The contrast resolution is determined by applying both statistics to each bit plane of the residual image, starting from the least significant bit and continuing up to the bit plane whose statistic does not show a random pattern. Images from three digital modalities, computerized tomography, magnetic resonance, and computed radiography, are used to evaluate the gray-level dynamic range. Both methods are easy to implement and fast to perform.
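The two tests described above can be sketched for a single bit plane. The code below is a minimal illustration, not the authors' implementation; it assumes a 2-D uint8 image and an expected 50/50 split of zeros and ones under randomness, and the function names are illustrative:

```python
import numpy as np

def bitplane(img, b):
    """Extract bit plane b (0 = least significant) as a 0/1 array."""
    return (img >> b) & 1

def chi2_randomness(plane):
    """Chi-square goodness-of-fit statistic against an expected 50/50
    split of zeros and ones; a large value rejects randomness."""
    n = plane.size
    ones = int(plane.sum())
    zeros = n - ones
    expected = n / 2.0
    return (ones - expected) ** 2 / expected + (zeros - expected) ** 2 / expected

def join_count(plane):
    """Number of equal-valued horizontal and vertical neighbour pairs,
    a simple measure of spatial coherence among pixels."""
    h = np.sum(plane[:, :-1] == plane[:, 1:])
    v = np.sum(plane[:-1, :] == plane[1:, :])
    return int(h + v)
```

A bit plane whose chi-square statistic is small and whose join count is near the value expected for independent pixels would be treated as random noise in the scheme the abstract describes.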

  • Correction of T2 distortion in multi-excitation RARE sequence

    Page(s): 123 - 128
    PDF (480 KB)

    Correction schemes have been implemented to correct for T2 distortions in a multiexcitation RARE (rapid acquisition with relaxation enhancement) sequence, where data from multiple echoes and multiple excitations are combined. Computer simulation studies and human imaging studies have been conducted to develop and test the correction procedures. A direct method and an iterative technique have been investigated. The direct technique utilizes the Hermitian symmetry of the T2-weighted data and is shown to reduce distortions in T2-weighted images. The iterative scheme begins with an estimate of T2, from which k-space data are computed and compared to the true data to provide error images. The error images are then used to iteratively refine the reconstructed images at a specified echo time. The iterative procedure has been used to improve T1-weighted images acquired through a sequence based on acquisition of two half-plane Fourier samples. These correction techniques should enable a practical implementation of RARE for producing T1- and T2-weighted images comparable to standard spin-echo images.

  • Neuromagnetic localization using magnetic resonance images

    Page(s): 129 - 134
    PDF (728 KB)

    Ionic flow associated with neural activation of the brain produces a magnetic field, called the neuromagnetic field, that can be measured outside the head using a highly sensitive superconducting quantum interference device (SQUID)-based neuromagnetometer. Under certain conditions, the sources producing the neuromagnetic field can be localized from a sampling of that field. Neuromagnetic measurements alone, however, do not contain sufficient information to visualize brain structure. Thus, it is necessary to combine neuromagnetic localization with an anatomical imaging technique such as magnetic resonance imaging (MRI) to visualize both function and anatomy in vivo. Using experimentally measured human neuromagnetic fields and magnetic resonance images, the authors have developed a technique to accurately register these two modalities and have applied the registration procedure to portray the spatiotemporal distribution of neural activity evoked by auditory stimulation.

  • CT fan beam reconstruction with a nonstationary axis of rotation

    Page(s): 111 - 116
    PDF (428 KB)

    To reconstruct an image using computed tomography (CT), the axis of rotation must pivot at a fixed point on the reconstruction plane as the X-ray source and CT detector assembly rotate around the imaged object. This pivot point is used as a reference point for backprojecting pixel values to their proper coordinates; reconstructing an image with a nonstationary axis of rotation would backproject pixel values to incorrect coordinate points. A convolution filtered backprojection algorithm has been derived for correcting images that were acquired with a nonstationary axis of rotation using fan-beam geometry with a collinear (flat) detector. The correction method accounts for vertical displacements of the axis of rotation as the CT scanner rotates around the imaged object, as may be the case when sagging occurs. Software simulations are performed to show how the algorithm corrects for the shift in the axis of rotation.

  • A cone-beam filtered backprojection reconstruction algorithm for cardiac single photon emission computed tomography

    Page(s): 91 - 101
    PDF (996 KB)

    A filtered backprojection reconstruction algorithm was developed for cardiac single photon emission computed tomography with cone-beam geometry. The algorithm reconstructs cone-beam projections collected from 'short scan' acquisitions of a detector traversing a noncircular planar orbit. Since the algorithm does not correct for photon attenuation, it is designed to reconstruct data collected over an angular range of slightly more than 180°, with the assumption that the range of angles is oriented so as not to acquire the highly attenuated posterior projections of emissions from cardiac radiopharmaceuticals. This sampling scheme minimizes the attenuation artifacts that result from reconstructing posterior projections. From computer simulations, it is found that reconstruction of attenuated projections has a greater effect on quantitation and image quality than any potential cone-beam reconstruction artifacts resulting from insufficient sampling of cone-beam projections. With nonattenuated projection data, cone-beam reconstruction errors in the heart are shown to be small (at most 2%).

  • Detection of edges from projections

    Page(s): 76 - 80
    PDF (452 KB)

    In a number of applications of computerized tomography, the ultimate goal is to detect and characterize objects within a cross section, and detecting the edges of regions of different contrast yields the required information. The problem of detecting edges from projection data is addressed. It is shown that the class of linear edge-detection operators used on images can be applied directly to projection data. This not only reduces the computational burden but also avoids the difficulties of postprocessing a reconstructed image. The detection is accomplished by a convolution backprojection operation. For example, with the Marr-Hildreth edge-detection operator, the filtering function applied to the projection data is the Radon transform of the Laplacian of the 2-D Gaussian function, combined with the reconstruction filter. Simulation results showing the efficacy of the proposed method and a comparison with edges detected from the reconstructed image are presented.
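The filter-then-backproject idea can be sketched roughly as follows. This is a simplification, not the paper's exact filter: each parallel projection is convolved with a 1-D Laplacian-of-Gaussian kernel (standing in for the Radon transform of the 2-D operator combined with the reconstruction filter), and the filtered projections are backprojected with a hypothetical nearest-neighbour backprojector:

```python
import numpy as np

def log_kernel_1d(sigma, half_width):
    """1-D Laplacian-of-Gaussian filter (second derivative of a Gaussian)."""
    t = np.arange(-half_width, half_width + 1, dtype=float)
    g = np.exp(-t**2 / (2 * sigma**2))
    return (t**2 / sigma**4 - 1 / sigma**2) * g

def backproject(filtered_sino, angles, size):
    """Smear each already-filtered parallel projection back across the
    image grid, using nearest-neighbour lookup along the detector axis."""
    recon = np.zeros((size, size))
    c = (size - 1) / 2.0
    y, x = np.mgrid[0:size, 0:size]
    for proj, theta in zip(filtered_sino, angles):
        s = (x - c) * np.cos(theta) + (y - c) * np.sin(theta) + c
        idx = np.clip(np.round(s).astype(int), 0, size - 1)
        recon += proj[idx]
    return recon
```

Zero crossings of the backprojected result then mark candidate edges, as in the Marr-Hildreth scheme applied to images.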

  • Statistical approach to X-ray CT imaging and its applications in image analysis. I. Statistical analysis of X-ray CT imaging

    Page(s): 53 - 61
    PDF (688 KB)

    A statistical description of X-ray CT (computerized tomography) imaging, from the projection data to the reconstructed image, is presented. The Gaussianity of the pixel image generated by the convolution (image reconstruction) algorithm is justified. The conditions for two pixel images to be statistically independent (for a given probability) and the conditions for a group of pixel images to be a spatially stationary random process, ergodic in mean and autocorrelation, are derived. These properties provide the basis for establishing the stochastic image model and conducting the statistical image analysis of X-ray CT images.

  • A simple method for determining the modulation transfer function in digital radiography

    Page(s): 34 - 39
    PDF (488 KB)

    The authors developed a simple method for determining the presampling modulation transfer function (MTF), which includes the unsharpness of the detector and the effect of the sampling aperture, in digital radiographic (DR) systems. With this method, the presampling MTF is determined by the Fourier transform of a 'finely sampled' line spread function (LSF) obtained with a slightly angulated slit in a single exposure. Since the effective sampling distance becomes much smaller than the original sampling distance of the DR system, the effect of aliasing on the MTF calculation can be eliminated. The authors applied this method to the measurement of the presampling MTF of a computed radiographic system and examined the directional dependence, the effect of exponential extrapolation, and the effect of different sampling distances. It is shown that the technique of multiple slit exposures and exponential extrapolation of the LSF tail, which has been commonly used with analog screen-film systems, can be employed in DR systems. The authors also determined the glare fraction in order to estimate the component of the low-frequency drop mainly due to 'glare'.
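The final step of such a method, once the finely sampled LSF has been assembled from the angulated-slit exposure (that resampling step is omitted here), reduces to a Fourier transform. A minimal sketch with illustrative names, assuming the LSF is already background-corrected:

```python
import numpy as np

def presampling_mtf(lsf, spacing):
    """MTF as the magnitude of the Fourier transform of the LSF,
    normalized to unity at zero frequency.

    `lsf` is the finely sampled line spread function and `spacing`
    its effective sampling distance (mm), which sets the frequency axis.
    """
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()                         # unit-area LSF
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(lsf.size, d=spacing)  # cycles/mm
    return freqs, mtf
```

Because the effective spacing from the angulated slit is much finer than the detector's pixel pitch, the frequency axis extends well past the system's Nyquist frequency, which is what suppresses aliasing in the computed MTF.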

  • Correction of distortion in endoscope images

    Page(s): 117 - 122
    PDF (556 KB)

    Images formed with endoscopes suffer from a spatial distortion due to the wide-angle nature of the endoscope's objective lens. This change in the size of objects with position precludes quantitative measurement of the area of the objects, which is important in endoscopy for accurately measuring ulcer and lesion sizes over time. A method for correcting the distortion characteristic of endoscope images is presented. A polynomial correction formula was developed for the endoscope lens and validated by comparing quantitative test areas before and after the distortion correction. The distortion correction has been incorporated into a computer program that could readily be applied to electronic images obtained at endoscopy using a desk-top computer. The research presented here is a key step toward the quantitative determination of the area of regions of interest in endoscopy.
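The paper's specific polynomial is not reproduced here. As a hedged sketch, a common radial model maps an undistorted radius r_u to the distorted radius r_d = r_u * (1 + k1*r_u^2 + k2*r_u^4); correcting an image then amounts to inverse mapping each output pixel back into the distorted input. The coefficients and nearest-neighbour sampling below are illustrative assumptions:

```python
import numpy as np

def undistort(img, k1, k2=0.0):
    """Correct radial (e.g. barrel) distortion: for each undistorted output
    pixel, sample the distorted input at r_d = r_u * (1 + k1*r2 + k2*r2**2),
    where r2 is the squared radius in normalized image coordinates."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.mgrid[0:h, 0:w]
    xn, yn = (x - cx) / cx, (y - cy) / cy      # normalized coordinates
    r2 = xn**2 + yn**2
    scale = 1.0 + k1 * r2 + k2 * r2**2
    xs = np.clip(np.round(xn * scale * cx + cx).astype(int), 0, w - 1)
    ys = np.clip(np.round(yn * scale * cy + cy).astype(int), 0, h - 1)
    return img[ys, xs]
```

With k1 = k2 = 0 the mapping is the identity; positive coefficients pull samples outward, compensating barrel distortion so that areas measured in the corrected image are position-independent.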

  • Improved iterative image reconstruction with automatic noise artifact suppression

    Page(s): 21 - 27
    PDF (636 KB)

    A method for stabilizing iterative image reconstruction techniques has been developed to improve the image quality of positron emission tomography. A damping matrix is introduced, which suppresses noisy corrections on a pixel-by-pixel basis, depending on the statistical precision of the iterative correction. The precision is evaluated by comparing a number of correction submatrices, each of which is formed from a subset of the projection data. Simulation studies showed that statistical noise is effectively suppressed, while the image of the source object is reconstructed with high resolution, as long as the signal level is higher than the local noise level. In application to the MLE (maximum likelihood estimator), the minimum RMS error of the image was reduced to 84% for 500 k total counts, and the RMS error increased more slowly with further iterations as compared with the simple MLE. The method was also applied to the FIR (filtered iterative reconstruction) algorithm, and the images were found to be better than those obtained by the convolution backprojection method.

  • Segmentation of tomographic data without image reconstruction

    Page(s): 102 - 110
    PDF (1728 KB)

    Geometric tomography (GT), a technique for processing tomographic projections in order to reconstruct the external and internal boundaries of objects, is presented. GT does not require the reconstruction of an image of the slice of the object. It is shown that the segmentation can be performed directly on the raw data (the sinogram produced by the scanner) and that the segmented shapes can be geometrically transformed into reconstructed shapes in the usual space. If one is interested only in the boundaries of the objects, no image need be reconstructed, and the method therefore requires much less computation than traditional computed tomography techniques. Experimental results are presented for both synthesized and real data, leading to subpixel positioning of the reconstructed boundaries. GT gives its best results for sparse, highly contrasted objects such as bones or blood vessels in angiograms; it allows 'on the fly' processing of the data and real-time tracking of object boundaries.

  • Vector-extrapolated fast maximum likelihood estimation algorithms for emission tomography

    Page(s): 9 - 20
    PDF (1508 KB)

    A new class of fast maximum-likelihood estimation (MLE) algorithms for emission computed tomography (ECT) is developed. In these cyclic iterative algorithms, vector extrapolation techniques are integrated with the iterations in gradient-based MLE algorithms, with the objective of accelerating the convergence of the base iterations. This results in a substantial reduction in the effective number of base iterations required for obtaining an emission density estimate of specified quality. The mathematical theory behind the minimal polynomial and reduced rank vector extrapolation techniques, in the context of emission tomography, is presented. These extrapolation techniques are implemented in a positron emission tomography system. The new algorithms are evaluated using computer experiments, with measurements taken from simulated phantoms. It is shown that, with minimal additional computations, the proposed approach results in substantial improvement in reconstruction.
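The base iteration that such extrapolation schemes accelerate is, in the EM/MLE family for emission tomography, typically the multiplicative MLEM update. A minimal sketch of that base iteration (not the extrapolation itself), assuming a dense system matrix A with positive column sums and Poisson-distributed counts y:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Multiplicative MLEM update for emission tomography:
    x <- (x / sum_i A_ij) * A^T (y / (A x))."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)               # sensitivity: column sums of A
    for _ in range(n_iter):
        proj = A @ x                   # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x = x / sens * (A.T @ ratio)   # backproject the ratio and rescale
    return x
```

Vector extrapolation methods such as reduced rank extrapolation take several successive iterates x_k produced by a loop like this one and combine them into a better estimate, cutting the effective number of base iterations needed.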

  • The GEM MAP algorithm with 3-D SPECT system response

    Page(s): 81 - 90
    PDF (880 KB)

    In single photon emission computed tomography (SPECT), every reconstruction algorithm must use some model for the response of the gamma camera to emitted γ-rays. The true camera response is both spatially variant and object dependent. These two properties result from the effects of scatter, septal penetration, and attenuation, and they forestall determination of the true response with any precision. This motivates investigating the performance of reconstruction algorithms when there are errors between the camera response used in the reconstruction algorithm and the true response of the gamma camera. In this regard, the authors compare the filtered backprojection algorithm, the expectation-maximization maximum likelihood algorithm, and the generalized expectation maximization (GEM) maximum a posteriori (MAP) algorithm, a Bayesian algorithm which uses a Markov random field prior.

  • Developments with maximum likelihood X-ray computed tomography

    Page(s): 40 - 52
    PDF (1300 KB)

    An approach to the maximum-likelihood estimation of attenuation coefficients in transmission tomography is presented as an extension of earlier theoretical work by K. Lange and R. Carson (J. Comput. Assist. Tomogr., vol. 8, pp. 306-316, 1984). The reconstruction algorithm is based on the expectation-maximization (EM) algorithm. Several simplifying approximations are introduced that make the maximization step of the algorithm tractable. Computer simulations are presented using noise-free and Poisson-randomized projections. The images obtained with the EM-type method are compared to those reconstructed with the EM method of Lange and Carson and with filtered backprojection. Preliminary results show that there are potential advantages in using the maximum-likelihood approaches in situations where a high-contrast object, such as bone, is embedded in low-contrast soft tissue.

  • Two iterative image restoration algorithms with applications to nuclear medicine

    Page(s): 2 - 8
    PDF (532 KB)

    Two methods for recovering an image that has been degraded while being processed are presented. The restoration problem is formulated as a constrained optimization problem in which a measure of smoothness based on the second derivatives of the restored image is maximized, subject to the constraint that the noise energy equals the energy in the difference between the observed (distorted) image and the blurred restored image. The approach is based on the Lagrange multiplier method. The first algorithm reduces the problem to the computation of a few discrete Fourier transforms and allows control of the degree of sharpness and smoothness of the restored image. The second algorithm, with weight matrices included, handles edges and flat regions in the image in a manner pleasing to the human visual system; in this case, the iterative conjugate gradient method is used in conjunction with the discrete Fourier transform to minimize the Lagrangian function. The application of these algorithms to nuclear medicine images is presented.
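As a rough, hypothetical stand-in for the first algorithm (a closed-form Fourier-domain solve, with a fixed regularization weight `lam` playing the role the Lagrange multiplier plays in the paper, not the authors' exact formulation):

```python
import numpy as np

def regularized_deconv(observed, psf, lam):
    """Tikhonov-style restoration in the DFT domain:
    X = conj(H) * Y / (|H|^2 + lam).

    Larger `lam` trades sharpness for smoothness; lam -> 0 approaches
    the (noise-amplifying) inverse filter."""
    H = np.fft.fft2(psf, s=observed.shape)      # blur transfer function
    Y = np.fft.fft2(observed)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))
```

In the constrained formulation the abstract describes, the multiplier would be chosen so that the residual energy matches the noise energy rather than fixed in advance; here `lam` is simply a user parameter.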

  • Statistical approach to X-ray CT imaging and its applications in image analysis. II. A new stochastic model-based image segmentation technique for X-ray CT image

    Page(s): 62 - 69
    PDF (680 KB)

    For pt. I, see ibid., vol. 11, no. 1, pp. 53-61 (1992). Based on the statistical properties of X-ray CT imaging given in pt. I, an unsupervised stochastic model-based image segmentation technique for X-ray CT images is presented. This technique utilizes a finite normal mixture distribution and an underlying Gaussian random field (GRF) as the stochastic image model. The number of image classes in the observed image is detected by information-theoretic criteria (AIC or MDL). The parameters of the model are estimated by expectation-maximization (EM) and classification-maximization (CM) algorithms. Image segmentation is performed by a Bayesian classifier. Results from simulated and real X-ray computerized tomography (CT) image data are presented to demonstrate the promise and effectiveness of the proposed technique.
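The EM step for the finite normal mixture can be sketched in one dimension, ignoring the GRF spatial prior and the AIC/MDL order selection; the quantile initialization and function name are illustrative choices, not from the paper:

```python
import numpy as np

def em_gmm_1d(x, k, n_iter=100):
    """EM for a 1-D finite normal mixture over samples x.

    Returns (weights, means, variances). Means are initialized at
    evenly spaced quantiles of the data for a deterministic start.
    """
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, np.var(x))
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample
        d = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = w * d
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from the responsibilities
        n = r.sum(axis=0)
        w = n / x.size
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return w, mu, var
```

A Bayesian classifier then assigns each pixel to the component with the highest responsibility; the full technique additionally regularizes these assignments through the GRF model.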

  • Multicriterion maximum entropy image reconstruction from projections

    Page(s): 70 - 75
    PDF (444 KB)

    A solution algorithm is presented for the image reconstruction problem with three criteria: maximum entropy; minimum nonuniformity and peakedness; and least-squares error between the original projection data and the projections due to the reconstruction. Theoretical precedence properties, which are respected by all noninferior solutions, are first derived. These precedence properties are then incorporated into a multiple-criteria optimization framework to improve computational efficiency. Comparisons of the new algorithm with the MART and MENT algorithms are carried out using computer-generated noise-free and Gaussian-noisy projections. Results of the computational experiments and the efficiency of the multiobjective entropy optimization algorithm (MEOA) are reported.


Aims & Scope

IEEE Transactions on Medical Imaging (T-MI) encourages the submission of manuscripts on imaging of body structures, morphology and function, and imaging of microscopic biological entities. The journal publishes original contributions on medical imaging achieved by various modalities, such as ultrasound, X-rays (including CT), magnetic resonance, radionuclides, microwaves, and light, as well as on medical image processing and analysis, visualization, pattern recognition, and related methods. Studies involving highly technical perspectives are most welcome. The journal focuses on a unified common ground where instrumentation, systems, components, hardware and software, mathematics, and physics contribute to the studies.


Meet Our Editors

Editor-in-Chief
Milan Sonka
Iowa Institute for Biomedical Imaging
3016B SC, Department of Electrical and Computer Engineering
The University of Iowa
Iowa City, IA 52242 USA
milan-sonka@uiowa.edu