
Nuclear Science Symposium Conference Record, 2001 IEEE

Date 4-10 Nov. 2001


Displaying Results 1 - 25 of 139
  • 2001 IEEE Nuclear Science Symposium Conference Record [front matter]

    Page(s): i - l
    Freely Available from IEEE
  • Non-negative matrix factorization of dynamic images in nuclear medicine

    Page(s): 2027 - 2030 vol.4

    The recently suggested non-negative matrix factorization (NMF) seems to overcome fundamental limitations of factor analysis, at least in theoretical aspects. The NMF cost function uses Poisson statistics as a noise model rather than Gaussian statistics, and provides a simple learning rule, in contrast to the tricky optimization in factor analysis. To study the feasibility of NMF for the analysis of dynamic image sequences in nuclear medicine, NMF was applied to H₂¹⁵O dynamic myocardial PET images acquired from dog studies, and the results were compared with those obtained by a conventional factor analysis method. Using NMF, we could obtain basis images corresponding to the major cardiac components, and their time-activity curves showed reasonable, familiar shapes. With a properly chosen number of factors, NMF gave results at least comparable to those of factor analysis. Our results show that NMF is feasible for image segmentation and factor extraction from dynamic image sequences in nuclear medicine.

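    The update rule the abstract alludes to is not spelled out there, but a minimal sketch of Poisson/KL-divergence NMF with the standard Lee-Seung multiplicative updates, applied to a dynamic sequence reshaped into a (pixels × frames) matrix, illustrates the idea; the array sizes and factor count below are illustrative assumptions, not values from the paper.

      import numpy as np

      def nmf_kl(V, n_factors, n_iter=200, eps=1e-12, seed=0):
          # V ~ W @ H with all entries non-negative; KL/Poisson cost,
          # Lee-Seung multiplicative updates.
          rng = np.random.default_rng(seed)
          n_pix, n_frames = V.shape
          W = rng.random((n_pix, n_factors)) + eps       # basis (factor) images
          H = rng.random((n_factors, n_frames)) + eps    # factor time-activity curves
          for _ in range(n_iter):
              WH = W @ H + eps
              H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
              WH = W @ H + eps
              W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
          return W, H

      # Illustrative use: 64x64 frames over 20 time points, 3 assumed factors.
      V = np.random.default_rng(1).random((64 * 64, 20)) + 1e-3
      W, H = nmf_kl(V, n_factors=3)
      basis_images = W.reshape(64, 64, 3)                # basis_images[:, :, k] is factor k
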
  • Author index

    Page(s): 2505 - 2518
    Freely Available from IEEE
  • Regularized one-pass list-mode EM algorithm for high resolution 3D PET image reconstruction into large arrays

    Page(s): 1853 - 1858

    High resolution 3D PET scanners with high count rate performance, such as the quad-HIDAC, place new demands on image reconstruction algorithms due to the large quantities of high precision list-mode data which are produced. A reconstruction algorithm is therefore required which can, in a practical time frame, reconstruct into very large image arrays (submillimetre voxels ranging over a large field of view) whilst preferably retaining the precision of the data. This work presents an algorithm which meets these demands: Regularized One-Pass List-mode EM (ROPLE). The algorithm operates directly on list-mode data, passes through the data once only, accounts for finite resolution effects in the system model and also includes regularization. It performs multiple image updates during its single pass through the list-mode data, corresponding to the number of subsets that the data have been split into. The algorithm has been assessed using list-mode data from a quad-HIDAC, and is compared to the analytic reconstruction method 3D RP.

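    As a rough illustration of a one-pass list-mode EM with subsets (not the ROPLE implementation itself; the paper's resolution model and regularization are omitted), the sketch below splits the event list into subsets and updates the image once per subset during a single pass. The names system_row and sens_image are hypothetical inputs standing in for the scanner's system model and sensitivity image.

      import numpy as np

      def one_pass_listmode_em(events, system_row, sens_image, image_shape, n_subsets=20):
          # events     : list of detected coincidences (list-mode data)
          # system_row : hypothetical callback -> (voxel_indices, weights) for one event
          # sens_image : precomputed sensitivity (normalisation) image, flattened
          lam = np.ones(int(np.prod(image_shape)))       # current image estimate
          for s in range(n_subsets):                     # single pass through the data
              subset = events[s::n_subsets]              # interleaved event subsets
              back = np.zeros_like(lam)
              for ev in subset:
                  idx, w = system_row(ev)                # forward model for this event
                  fwd = float(np.dot(w, lam[idx]))       # expected counts along the LOR
                  if fwd > 0:
                      back[idx] += w / fwd               # backproject the ratio
              nz = sens_image > 0
              # EM update for this subset; regularization (e.g. inter-update
              # smoothing) is omitted from this sketch.
              lam[nz] *= back[nz] * n_subsets / sens_image[nz]
          return lam.reshape(image_shape)
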
  • Iterative reconstruction of SPECT data with adaptive regularization

    Page(s): 1859 - 1863

    A least-squares reconstruction criterion is proposed for simultaneously estimating a SPECT (single photon emission computed tomography) emission distribution corrected for attenuation together with its degree of regularization. Only a regularization trend has to be defined and tuned, once and for all, on a reference study. Given this regularization trend, the precise regularization weight, which is usually fixed a priori, is automatically computed for each data set to adapt to the noise content of the data. We demonstrate that this adaptive process yields better results when the noise conditions change than when the regularization weight is kept constant. The adaptation is illustrated on simulated cardiac data for noise variations due to changes in the acquisition duration, in the background intensity and in the attenuation map.

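    The paper's precise rule for deriving the regularization weight is not given in the abstract; the sketch below only illustrates the general idea of a regularized least-squares criterion whose weight is computed from the data rather than fixed a priori. The adaptive_weight rule and the 1-D roughness penalty are placeholders, not the authors' choices.

      import numpy as np

      def adaptive_weight(proj, trend=1.0):
          # Hypothetical rule: the weight grows with the relative Poisson noise
          # level of the projection data, so noisier data are smoothed more.
          noise_level = np.sqrt(np.maximum(proj, 0.0) + 1.0).mean()
          return trend * noise_level / max(float(proj.mean()), 1e-9)

      def criterion(f, A, proj, beta):
          # Regularized least-squares objective: data fidelity plus a simple
          # quadratic roughness penalty (1-D neighbours, for brevity).
          residual = A @ f - proj
          roughness = np.sum(np.diff(f) ** 2)
          return float(residual @ residual + beta * roughness)

      # Usage: beta = adaptive_weight(proj); then minimize criterion(f, A, proj, beta) over f.
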
  • Lesion detection and quantitation of positron emission mammography

    Page(s): 2248 - 2252

    A Positron Emission Mammography (PEM) scanner dedicated to breast imaging is being developed at our laboratory. We have developed a list-mode likelihood reconstruction algorithm for this scanner. Here we theoretically study lesion detection and quantitation. Lesion detectability is studied theoretically using computer observers. We found that, for the zero-order quadratic prior, the region-of-interest observer can achieve the performance of the prewhitening observer with a properly selected smoothing parameter. We also study lesion quantitation using the test statistic of the region-of-interest observer. Theoretical expressions for the bias, variance, and ensemble mean squared error of the quantitation are derived. Computer simulations show that the theoretical predictions are in good agreement with the Monte Carlo results for both lesion detection and quantitation.

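    For readers unfamiliar with the observers mentioned, a minimal sketch of a region-of-interest observer follows: its test statistic is simply the summed intensity inside the ROI, and its detectability SNR is estimated from lesion-present and lesion-absent ensembles. This is a generic construction, not the paper's derivation.

      import numpy as np

      def roi_statistic(images, roi_mask):
          # images: (n_images, ny, nx) reconstructions; roi_mask: boolean (ny, nx).
          return images[:, roi_mask].sum(axis=1)

      def detection_snr(t_present, t_absent):
          # Detectability SNR of the test statistic over the two ensembles.
          diff = t_present.mean() - t_absent.mean()
          pooled = 0.5 * (t_present.var(ddof=1) + t_absent.var(ddof=1))
          return float(diff / np.sqrt(pooled))
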
  • A new pileup-prevention front-end electronic design for high resolution PET and gamma camera

    Page(s): 1969 - 1973

    A new method for processing signals from Anger position-sensitive detectors used in gamma cameras and PET (positron emission tomography) is proposed for very high count-rate imaging. It follows the same concept as the HYPER (high-yield pileup-event recovery) method we introduced previously: (a) dynamically integrating the present event, with integration stopping immediately before the next event is detected; (b) estimating a weighted value to indicate the total energy deposited in the scintillation detector; and (c) remnant correction to remove the residual energy of all previous events from the weighted value. This paper introduces two improved practical techniques for obtaining a better weighted value with low noise sensitivity, in order to improve the final pileup-free energy resolution. One is applying a low-pass filter combined with multiple sampling to a weighted sum of the instantaneous signal and the integrated signal. The other is weighting the integration value of the incoming signal, where the weighting also includes exponential distortion compensation. This paper also describes the application of the improved HYPER electronics to a high resolution, low cost PET camera with 12 PQS (PMT-quadrant-sharing) detector modules that can decode 38,016 BGO crystal elements using 924 PMTs. Each detector module has 4 HYPER circuits to further increase the count rate. To use the HYPER circuit in coincidence imaging applications, there is a serious synchronization problem between the arrival time of an event and the end of integration, which varies from event to event. This synchronization problem is solved by an FPGA circuit with real-time remnant correction and a high-resolution trigger delay unit with small dead time that recovers the synchronization of data and event trigger.

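    A minimal sketch of the HYPER-style recovery idea described above, under the assumptions of a single-exponential BGO decay (tau of about 300 ns) and ideal triggering; the filtering and weighting refinements that are the subject of the paper are not modelled here.

      import math

      TAU = 300e-9        # assumed BGO decay time constant (s), single-exponential

      def recover_energies(triggers, integrals):
          # triggers  : event arrival times (s), sorted
          # integrals : charge integrated from each trigger to the next trigger
          energies, remnant = [], 0.0
          for i, (t, q) in enumerate(zip(triggers, integrals)):
              if i > 0:
                  # Light from all earlier events decays exponentially to time t.
                  remnant = (remnant + energies[-1]) * math.exp(-(t - triggers[i - 1]) / TAU)
              gate = (triggers[i + 1] - t) if i + 1 < len(triggers) else 10 * TAU
              collected = 1.0 - math.exp(-gate / TAU)    # fraction of light inside the gate
              q_event = q - remnant * collected          # remnant correction for pileup
              energies.append(max(q_event, 0.0) / collected)   # scale up for the lost tail
          return energies
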
  • Simulation of maximum-likelihood position estimation in small gamma camera with position-sensitive photomultiplier tube (PSPMT)

    Page(s): 1915 - 1918

    For a miniature scintillation camera composed of a position-sensitive photomultiplier tube (PSPMT) and a scintillation crystal, we have studied a computer simulation of maximum-likelihood position estimation (MLPE) to obtain better image correction than a conventional algorithm provides. MLPE has been proposed in several papers, and in most cases experiments to determine a look-up table (LUT) are necessary to apply the method. In this paper, MLPE based on a simulation instead of experiments, which can account for all the stochastic processes that convert the location of a gamma-ray interaction in the scintillator into position signals, is proposed, implemented, and compared to the conventional algorithm. From the analysis and comparison of images, the image quality was better than that resulting from the conventional algorithm. We verified the simulation method by comparing the simulated images with experimental images obtained using a small gamma camera consisting of a NaI(Tl) crystal and a PSPMT. From these results, we propose that this simulation method could be applied to any PSPMT with a single simulation, instead of a tedious experiment to obtain the LUT for each PSPMT, even for tubes of the same specification.

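    A minimal sketch of the MLPE step itself, assuming a look-up table of simulated mean anode responses is already available: each event is assigned the candidate position that maximizes the Poisson log-likelihood of the measured signals.

      import numpy as np

      def mlpe(event_signals, lut_means):
          # event_signals : (n_channels,) measured anode signals for one event
          # lut_means     : (n_positions, n_channels) simulated mean responses (the LUT)
          mu = np.maximum(lut_means, 1e-9)
          # Poisson log-likelihood up to terms independent of the position index.
          loglike = (event_signals * np.log(mu) - mu).sum(axis=1)
          return int(np.argmax(loglike))   # index of the most likely interaction position
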
  • Lead tungstate crystals of high light yield for medical imaging

    Page(s): 1910 - 1914

    Because of their high stopping power and fast scintillation, lead tungstate crystals have attracted much attention in the high energy physics and nuclear physics communities. The use of lead tungstate, however, is limited by its low light output. An effort has been made at the Shanghai Institute of Ceramics to improve this. The results indicate that a factor of ten increase in the light output, mainly in the microsecond decay component, may be achieved. The X-ray diffraction pattern, photoluminescence spectrum, light output, decay kinetics and transmittance spectrum of new samples are presented. The longitudinal uniformity of a sample of 22 radiation lengths is studied. This new type of lead tungstate crystal has attracted broad interest in the field of medical imaging.

  • An investigation of projection sampling for Ga-67 tumor detection

    Page(s): 2263 - 2267

    Psychophysical studies were used to investigate the effect of the number of projection angles Φ on images reconstructed from simultaneous emission-transmission scans of fixed duration. The specific task was detection and localization of Ga-67 tumors in simulated images of the thorax, and a Tc-99m transmission source was modeled in the data acquisition. Reconstructions with the rescaled block-iterative (RBI) expectation-maximization algorithm included corrections for nonuniform attenuation and collimator response, but not scatter. A combination of channelized Hotelling observer (CHO) ROC and human-observer localization ROC (LROC) studies was conducted. The CHO was used to optimize the number of RBI iterations and the post-reconstruction filtering level for reconstruction strategies based on Φ ∈ {20, 30, 40, 60, 90, 120} angles. The LROC study compared these optimized strategies. No significant differences in human detection-task performance were found for Φ ⩾ 30. This finding is attributed largely to collimator effects, the presence of uncorrected scatter in the data, and the post-filtering of the reconstructed images.

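    A minimal sketch of the channelized Hotelling observer figure of merit used for the optimization, computed from lesion-present and lesion-absent image ensembles; the channel definitions and ensembles are assumed inputs, not those of the study.

      import numpy as np

      def cho_snr(imgs_present, imgs_absent, channels):
          # imgs_* : (n_images, n_pixels) flattened reconstructions
          # channels : (n_pixels, n_channels) channel templates
          v1, v0 = imgs_present @ channels, imgs_absent @ channels   # channel outputs
          dv = v1.mean(axis=0) - v0.mean(axis=0)
          K = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
          w = np.linalg.solve(K, dv)               # Hotelling template in channel space
          return float(np.sqrt(dv @ w))            # detectability SNR
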
  • PET/SPECT detectors with light intensifiers and fiber coding

    Page(s): 1919 - 1923

    NaI scintillators for PET/SPECT detectors with light intensifier and fiber readout are considered using a Monte Carlo model. A coding algorithm which uses the light detected at both ends of the fibers is proposed and the simulated results are evaluated. The scheme looks very promising.

  • Performance of large-area CZT detectors for hard X-ray imaging

    Page(s): 2286 - 2290

    We have developed large, monolithic CZT detectors for hard X-ray imaging applications. Detectors measure 32×32 mm² and 25×25 mm², 2 mm thick. Crossed-strip readout gives 0.5 mm spatial resolution. Signals are read out by four 32-input ASICs, with spectroscopic analysis performed on the anodes. Results are presented on the detectors' spatial accuracy, spectral response, and effective area. Also presented are imaging results with the detectors employed in coded-mask imagers.

  • Collimator blurring reduction method using fine angular sampling projection data in SPECT

    Page(s): 2139 - 2142

    Due to the collimator aperture, the spatial resolution of SPECT data varies with source-to-detector distance. Since the radius of detector rotation is larger when scanning larger patients, the spatial resolution is degraded in these cases. Emitted gamma rays travel not only along the central axis of the collimator hole but also off-axis because of the collimator aperture. However, an off-axis ray at one angle would be a central-axis ray at another angle; therefore, the raw projection data at one angle can be thought of as an ensemble of central-axis rays collected from a small arc equal to the collimator aperture. Thus, fine angular sampling can compensate for collimator blurring. Using a sampling pitch of less than half the collimator aperture angle, compensation was performed by subtracting the weighted sum of the projection data from the raw projection data. The collimator geometry and the detector rotation radius determined the weighting function. A cylindrical phantom with four different-sized rods and a torso phantom for Tl-201 cardiac SPECT simulation were used for evaluation. The aperture angle of the collimator was 7 degrees and the projection sampling pitch was 2 degrees. In both phantom studies, the proposed method showed improved contrast and a reduced partial volume effect, indicating that it can adequately compensate for image blurring caused by the collimator aperture.

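    A minimal sketch of the compensation step, under an assumed triangular angular weighting and an assumed subtraction strength (the paper derives the weights from the collimator geometry and rotation radius, which are not reproduced here): a weighted sum of projections at neighbouring, finely sampled angles is subtracted from each raw projection.

      import numpy as np

      def deblur_projections(proj, aperture_deg=7.0, pitch_deg=2.0, strength=0.5):
          # proj : (n_angles, n_bins) finely sampled projection data over 360 degrees
          n_side = int(aperture_deg / (2.0 * pitch_deg))          # neighbours per side
          offsets = [k for k in range(-n_side, n_side + 1) if k != 0]
          # Hypothetical weights: fall off linearly with angular offset.
          w = np.array([1.0 - abs(k) * pitch_deg / aperture_deg for k in offsets])
          w = strength * w / w.sum()
          out = proj.copy()
          for k, wk in zip(offsets, w):
              out -= wk * np.roll(proj, k, axis=0)                 # neighbouring angles
          return np.clip(out, 0.0, None)
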
  • Optimization of time and energy resolution at high count rates with a large volume coaxial high purity germanium detector

    Page(s): 2420 - 2423

    Timing and energy resolution versus count rate studies were performed using a 50% relative efficiency, cylindrical, coaxial high-purity germanium (HPGe) detector. In the search for chemical agents, explosives or illicit drugs using neutrons as a probe, it can be beneficial to trade off some of the excellent HPGe energy resolution for better timing resolution on the gamma-ray signal. We have measured timing resolutions (sigma) of better than 2.5 ns for rates up to 125 kHz, with measured energy resolutions (FWHM) of 6 keV at a gamma-ray energy of 1.17 MeV.

  • Estimation of image noise in PET using the bootstrap method

    Page(s): 2075 - 2079

    The bootstrap method applied to PET data was evaluated as a technique to determine regional image noise in PET. To validate the method, 250 scans (5 min each) of a uniform cylinder filled with ⁶⁸Ge were acquired and reconstructed using filtered backprojection (FBP). A single 5 min list-mode scan was also acquired. From the list-mode data, 250 bootstrap replicates were generated by randomly drawing, with replacement, prompt and random events. In each replicate, the total numbers of prompt and random events were kept identical to the numbers in the original list-mode data set. The 250 individual scans, and the bootstrap replicates, were reconstructed using FBP and OSEM. Mean and standard deviation (SD) images were generated from the reconstructed images, and the mean and SD were also calculated in a central region of the image sets. Visual inspection showed no appreciable difference between the SD images derived from the repeated scans and those derived from the bootstrap replicates. Profiles through the images showed no significant difference between image sets. Using an increased number of bootstrap replicates produced less noise in the standard deviation images. ROI analysis showed that the standard deviations derived from the bootstrap replicates were very close to those derived from the repeated scans, independent of the reconstruction algorithm. The results indicate that the bootstrap method can accurately estimate regional image noise in PET. This could provide a method to accurately compare image noise in phantom and patient data under various imaging and processing conditions, without the need for repeated scans.

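    A minimal sketch of the bootstrap resampling step described above; reconstruct is a hypothetical callback standing in for the FBP or OSEM reconstruction, and the event arrays are assumed to hold decoded list-mode prompts and randoms.

      import numpy as np

      def bootstrap_replicates(prompts, randoms, n_rep, reconstruct, seed=0):
          # prompts, randoms : arrays of list-mode events
          # reconstruct      : hypothetical callback (prompts, randoms) -> image
          rng = np.random.default_rng(seed)
          images = []
          for _ in range(n_rep):
              # Redraw with replacement, keeping the prompt and random totals fixed.
              p = prompts[rng.integers(0, len(prompts), len(prompts))]
              r = randoms[rng.integers(0, len(randoms), len(randoms))]
              images.append(reconstruct(p, r))
          images = np.stack(images)
          return images.mean(axis=0), images.std(axis=0, ddof=1)   # mean and SD images
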
  • Monte-Carlo modeling of scintillator crystal performance for stratified PET detectors with DETECT2000

    Page(s): 1997 - 2001

    To determine the theoretical performance of a multi-layer scintillator crystal used in small PET scanners, we have used DETECT2000, a Monte Carlo simulation of light transport in scintillation crystals. The results given by this software demonstrate that some of the individual crystals in the block cannot be distinguished. We also found that the layer connecting the different crystals degrades the performance of the scintillation block. Simulation of thinner interconnecting layers suggests that much better crystal identification could be obtained.

  • Simulation studies on the detection efficiency for a phoswich detector with background rejection capability

    Page(s): 1924 - 1927

    Simulation studies were performed to calculate the detection efficiency of a phoswich detector with background rejection capability for a continuous blood sampling system. The detector consists of a plastic scintillator, a BGO scintillator, and a photomultiplier tube (PMT). Pulse-shape information is used to distinguish whether a detected event is a true event or a background event. First, we calculated the transmission efficiency of positrons through the tube. We then calculated the detection efficiency of positrons in the plastic scintillator, and also the detection efficiency of 511 keV gamma photons in the BGO.

  • New ways for purifying lead iodide appropriate as spectrometric grade material

    Page(s): 2340 - 2343

    In order to overcome some unsolved problems of lead iodide purification, such as lack of stoichiometry, decomposition and polytype admixture, several purification methods were compared. Lead iodide from Alfa Aesar, as well as lead iodide synthesized from lead nitrate and potassium iodide, was purified by zone refining, zone refining followed by sublimation, repeated sublimation, and repeated evaporation, at different conditions. Zone refining was performed at 420°C, 3 cm/hr and 100 passes; repeated sublimation at 390°C under vacuum (10⁻⁵ mmHg) or Ar atmosphere (500 to 580 mmHg); and repeated evaporation at 10⁻⁵ mmHg or in Ar atmosphere (150 to 600 mmHg) at temperatures from 450°C to 600°C. The purification methods were evaluated by studying parameters of the purified material such as decomposition, stoichiometry, purity and polytype composition, and also by taking into account purification yield and rate. Stoichiometry was determined by wet procedures, purity by inductively coupled plasma (ICP), and polytypes by powder X-ray diffraction. Evaporation of lead iodide at the highest temperature and moderate Ar pressure (600°C and 500-600 mmHg) proved to be the best way to avoid material decomposition. Sublimation and evaporation give the best stoichiometry (PbI₁.₉₀), especially when compared with zone refining (PbI₁.₄₀). Whatever the purification method, the material has an appreciable polytype content. Purity was similar for 100 zone-refining passes and for three evaporations. Furthermore, evaporation exhibits the maximum yield and rate (16%/day). The work performed therefore points to evaporation in Ar atmosphere as the quickest and most efficient purification method for producing spectrometric grade lead iodide, avoiding material decomposition and achieving high purity while maintaining the best stoichiometry.

  • The effect of attenuation on lesion detection in PET oncology

    Page(s): 2124 - 2128

    Using simulation and patient studies, we compare lesion detection in whole-body PET oncology with and without attenuation correction (AC and non-AC). In addition, we discuss the prediction of lesion distortion in non-AC PET images. In all of the simulation studies, the OSEM algorithm is used to reconstruct 2D noise-free data. Lesions of different size, density, and uptake are simulated in the lungs. Lesions with the density of soft tissue and different uptakes are simulated in the mediastinum and abdomen. The patient studies are performed using an ADAC dual-head hybrid SPECT-PET system. The lesion-to-background ratio (LBR) in the reconstructed images is used for quantitative analysis. The simulation studies and a total of 40 patient studies show that the LBR in AC images is higher in the lungs but lower in the mediastinum and abdomen than in non-AC images. In addition, the simulation studies show that, in non-AC images, for given lesion and lung uptakes, the LBR increases when the lesion density decreases, and spherical lesions appear elongated in the direction of minimal attenuation and squeezed in the direction of maximal attenuation.

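    For reference, the quantitative figure used throughout the comparison reduces to a ratio of ROI means; a minimal sketch, with hypothetical masks, is:

      import numpy as np

      def lesion_to_background_ratio(image, lesion_mask, background_mask):
          # Mean reconstructed value in the lesion ROI divided by the mean in a
          # background ROI; masks are boolean arrays of the image shape.
          return float(image[lesion_mask].mean() / image[background_mask].mean())
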
  • Compensation for head movement in 3D PET

    Page(s): 2013 - 2017

    Motion correction of neurological 3D PET data by the multiple acquisition frame (MAF) method presents some challenges that do not apply to the same extent in 2D. One of the most significant is that 3D PET scans have a much higher scatter component, and the inclusion of valid scatter correction is essential for accurate quantification. This paper describes a technique for motion correction of PET data acquired in 3D that uses information provided by an optical motion tracking system and includes corrections for temporal variations in attenuation and scatter. The technique is demonstrated in 3D dynamic and list-mode scans of a Hoffman 3D brain phantom performed on a Siemens/CTI ECAT EXACT HR+ scanner, in which multiple arbitrary movements were applied to the phantom during data acquisition. The MAF method cannot correct for motion that occurs during the acquisition of an individual frame. We investigated the feasibility of avoiding this limitation by acquiring the data in list mode and re-framing it into a dynamic scan in which a new frame is commenced whenever significant motion occurs. Both approaches provided a major reduction in motion distortion compared to a conventional reconstruction without motion correction. It appears that effective motion compensation is achievable in 3D neurological PET.

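    A minimal sketch of the motion-driven re-framing idea, under assumed inputs from an optical tracker: a new frame is started whenever the tracked pose departs from the frame's reference pose by more than a threshold, and list-mode events are then binned into those frames by arrival time. The pose metric and threshold are illustrative, not the paper's.

      import numpy as np

      def reframe_by_motion(event_times, pose_times, poses, pose_distance, threshold_mm=2.0):
          # poses         : tracked rigid-body poses sampled at pose_times
          # pose_distance : hypothetical metric (pose_a, pose_b) -> displacement in mm
          frames, start, ref = [], 0, poses[0]
          for i in range(1, len(pose_times)):
              if pose_distance(poses[i], ref) > threshold_mm:
                  frames.append((pose_times[start], pose_times[i]))   # close current frame
                  start, ref = i, poses[i]
          frames.append((pose_times[start], pose_times[-1]))
          # Bin each list-mode event into its frame by arrival time.
          ends = np.array([end for _, end in frames[:-1]])
          return frames, np.searchsorted(ends, event_times, side='right')
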
  • Single detectors and pixel arrays based on TlBr

    Page(s): 2415 - 2419

    TlBr crystals, grown by hydrothermal and Bridgman-Stockbarger methods, were investigated as potential single and pixel detectors of X- and gamma-radiation. The technology of detector manufacture from TlBr crystals is reported. The spectra of the radionuclides Fe-55, Am-241, Cd-109, Co-57 and Cs-137, measured with single TlBr detectors, are presented. FWHM energy resolutions of 500 eV at 5.9 keV, 518 eV at 9.87 keV, 670 eV at 13.92 keV, 777 eV at 22.1 keV, 2.7 keV at 59.5 keV, 3.7 keV at 88 keV, 4.4 keV at 122 keV and 29 keV at 662 keV have been achieved. Small-format (3×3) pixel detectors with 0.35×0.35 mm pads and a 0.1 mm gap were also fabricated on single TlBr crystals of dimensions 2.7×2.7×1.0 mm³. The interpixel resistivities measured at a bias of 50 V were 400-600 GΩ. The pixel leakage currents at a bias of 250 V were typically less than 0.5 nA. The best spectra were obtained at a temperature of -30°C, a shaping time constant of 6 μs and a high voltage of 400 V. Energy resolutions of 2.2, 3.0, 3.7 and 29 keV were measured at input energies of 59.5, 88, 122 and 662 keV, respectively. Values of μτ have been evaluated for two different ingots produced by the Bridgman-Stockbarger method: μeτe = 2.5×10⁻⁴ cm²V⁻¹ and μhτh ~ 10⁻⁶ cm²V⁻¹ for one ingot at 30°C, and μeτe = 7×10⁻⁵ cm²V⁻¹ and μhτh = 1.5×10⁻⁵ cm²V⁻¹ for the other ingot at -10°C.

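    As a worked example (not from the paper), the single-carrier Hecht relation turns the quoted mobility-lifetime products into an approximate charge collection efficiency for an assumed detector thickness and bias:

      import math

      def hecht_cce(mu_tau_cm2_per_V, bias_V, thickness_cm):
          # Single-carrier Hecht equation: CCE = (mu*tau*E/d) * (1 - exp(-d/(mu*tau*E)))
          E = bias_V / thickness_cm                 # field in V/cm
          x = mu_tau_cm2_per_V * E / thickness_cm   # carrier drift length / thickness
          return x * (1.0 - math.exp(-1.0 / x))

      # Electrons with mu_e*tau_e = 2.5e-4 cm^2/V, assuming 250 V across a 1 mm crystal:
      print(hecht_cce(2.5e-4, 250.0, 0.1))          # about 0.92 under these assumptions
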
  • A study of scintillation beta microprobes

    Page(s): 2002 - 2007

    Several types of scintillation microprobes have recently been developed to directly measure positron activity from radiotracers in live animals. These probes consist of either a small LSO crystal or a plastic scintillator coupled to an optical fiber, which is read out with a photomultiplier tube operated in single-photon counting mode. Each type of probe has certain advantages and disadvantages in its ability to detect positrons and reject background gamma rays, owing to the different relative conversion probabilities and light output of each scintillator. In this paper, the two types of probe are compared in terms of their relative detection efficiency for positrons and gamma rays, and in their ability to localize positron decays using pulse-height discrimination. Results are also given on the use of the microprobe on live laboratory animals.

  • Effect of including detector response in SPECT quantification of focal I-131 therapy

    Page(s): 2179 - 2182

    With a regularized strip-integral (1D) SAGE reconstruction, circular-orbit SPECT estimates of phantom focal I-131 activity vary with changes in the level of uniform background. They also vary with changes in image resolution due to different settings of the radius of rotation. To address these problems, we investigated the effect of employing two different depth-dependent detector-response models. A regularized plane-by-plane (2D) SAGE algorithm reduced the dependence of the counts-to-activity conversion factor on relative background concentration by 37% compared to 1D SAGE. With unregularized multi-plane (3D) OSEM reconstruction, initial results showed: 1) a conversion factor that was independent of relative background concentration, and 2) a recovery coefficient that was approximately 1 for any sphere volume down to 20 cc. We conclude that using a 3D detector-response model has the potential to eliminate these bias problems. For a patient, the preliminary changes in activity estimates using 3D OSEM compared to 1D SAGE were: 1) +16% for a large tumor, and 2) -35% for a small tumor, for which errors in recovery-coefficient-based correction factors can be large.

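    The quantification step underlying the comparison can be summarized as a counts-to-activity conversion with an optional recovery-coefficient correction; a minimal sketch with hypothetical arguments is:

      def activity_estimate(voi_counts, conversion_factor, recovery_coefficient=1.0):
          # conversion_factor    : calibrated counts per unit activity
          # recovery_coefficient : ~1 for large volumes, <1 for small ones (partial volume)
          return voi_counts / (conversion_factor * recovery_coefficient)
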
  • High resolution Schottky CdTe diode detector

    Page(s): 2464 - 2468

    We describe recent progress on the use of Schottky CdTe diode detectors for spectrometry. The low leakage current of the CdTe diode allows us to apply a much higher bias voltage than was possible with previous CdTe detectors. For a relatively thin detector, 0.5-1 mm thick, the high bias voltage results in a high electric field in the device. The improved charge collection efficiency and the low leakage current together lead to an energy resolution better than 600 eV FWHM at 60 keV for a 2×2 mm² device without any charge-loss correction electronics. Large-area detectors with dimensions of 21×21 mm² are now available with an energy resolution of ~2.8 keV. Long-term stability can easily be attained for relatively thin (< 1 mm) detectors if they are cooled or operated under a high bias voltage.

  • Correction methods for random coincidences in 3D wholebody PET imaging

    Page(s): 2080 - 2084

    With the advantages of the increased sensitivity of 3D PET for wholebody imaging come the challenges of more complicated quantitative corrections, and in particular an increase in the random coincidence field of view (FOV) relative to the true coincidence FOV. The most common method of correcting for random coincidences is the on-line subtraction of a delayed coincidence channel, which does not add bias but increases noise. An alternative approach is the post-acquisition subtraction of a low-noise random coincidence estimate, which can come from a smoothed delayed coincidence channel, a calibration scan, or a direct estimate. Each method makes different tradeoffs between noise amplification, bias, and data processing requirements. These tradeoffs depend on the injected activity, the local imaging environment (e.g. near the bladder), and the reconstruction algorithm. Using 3D wholebody simulations and phantom studies, we show that the gains in sinogram NEC from using a noiseless random coincidence estimation method translate into improvements in image SNR. The image SNR, however, depends on the image reconstruction method and the local imaging environment. For 3D wholebody imaging, a low-noise estimate of random coincidences based on the single-photon rates improves sinogram and image SNRs by approximately 15% compared to on-line subtraction of delayed coincidences, and performs only slightly worse than a 3D extension of the Casey-Hoffman smoothing of a separately acquired delayed coincidence sinogram.

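    A minimal sketch of the singles-based randoms estimate referred to above, using the standard relation R_ij = 2τ·S_i·S_j for a coincidence window of width 2τ; the window value below is an assumption, not the scanner's.

      import numpy as np

      def randoms_from_singles(singles_rates, tau_s=6e-9):
          # singles_rates : (n_detectors,) measured singles rates in counts/s
          # returns       : (n_detectors, n_detectors) expected random rates per LOR
          S = np.asarray(singles_rates, dtype=float)
          return 2.0 * tau_s * np.outer(S, S)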