Search Results

Noise propagation in iterative reconstruction algorithms with line searches

Description: In this paper we analyze the propagation of noise in iterative image reconstruction algorithms. We derive theoretical expressions for the general form of preconditioned gradient algorithms with line searches. The results are applicable to a wide range of iterative reconstruction problems, such as emission tomography, transmission tomography, and image restoration. A unique contribution of this paper compared with our previous work [1] is that the line search is explicitly modeled and we do not use the approximation that the gradient of the objective function is zero. As a result, the error in the noise estimate at early iterations is significantly reduced.
Date: November 15, 2003
Creator: Qi, Jinyi
Partner: UNT Libraries Government Documents Department
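
The class of algorithms analyzed above can be illustrated with a small numerical sketch. The following is a minimal NumPy example of a preconditioned gradient update with an exact line search on a quadratic, penalized least-squares objective; the system matrix, penalty, preconditioner, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch (not the paper's code): preconditioned gradient steps with an
# exact line search on a penalized weighted least-squares objective
#   f(x) = 0.5*(y - A x)' W (y - A x) + 0.5*beta*||R x||^2
rng = np.random.default_rng(0)
n_pix, n_bins = 16, 32
A = rng.random((n_bins, n_pix))             # toy system matrix
x_true = rng.random(n_pix)
y = A @ x_true + 0.05 * rng.standard_normal(n_bins)
W = np.eye(n_bins)                          # data weighting (identity here)
R = np.eye(n_pix)                           # quadratic penalty
beta = 0.1
H = A.T @ W @ A + beta * R.T @ R            # Hessian of the quadratic objective

def precond_gradient_step(x, C):
    """One update x_{k+1} = x_k + alpha_k * d_k, with d_k = -C * grad f(x_k)."""
    grad = A.T @ W @ (A @ x - y) + beta * (R.T @ R @ x)
    d = -C @ grad
    # Exact line search for a quadratic objective: alpha = -g'd / (d'Hd)
    alpha = -(grad @ d) / (d @ H @ d)
    return x + alpha * d

x = np.zeros(n_pix)
C = np.diag(1.0 / np.diag(H))               # simple diagonal preconditioner
for _ in range(20):
    x = precond_gradient_step(x, C)
print("relative error vs. x_true:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```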

Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

Description: Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal to noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of the image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.
Date: May 2003
Creator: Qi, Jinyi
Partner: UNT Libraries Government Documents Department

Optimization of PET system design for lesion detection

Description: Traditionally, the figures of merit used in designing a PET scanner are spatial resolution, noise equivalent count rate, noise equivalent sensitivity, etc. These measures, however, do not directly reflect the lesion detectability achievable with the PET scanner. Here we propose to optimize PET scanner design directly for lesion detection. The signal-to-noise ratio (SNR) of lesion detection can be easily computed using the theoretical expressions that we have previously derived. Because no time consuming Monte Carlo simulation is needed, the theoretical expressions allow evaluation of a large range of parameters. The PET system parameters can then be chosen to achieve the maximum SNR for lesion detection. The simulation study shown in this paper was focused on a single-ring PET scanner without depth of interaction measurement. Randoms and scatters were also ignored.
Date: October 13, 2000
Creator: Qi, Jinyi
Partner: UNT Libraries Government Documents Department

Image reconstruction for a Positron Emission Tomograph optimized for breast cancer imaging

Description: The author performs image reconstruction for a novel Positron Emission Tomography camera that is optimized for breast cancer imaging. This work addresses, for the first time, the problem of fully 3-D tomographic reconstruction using a septa-less, stationary (i.e., no rotation or linear motion), rectangular camera whose Field of View (FOV) encompasses the entire volume enclosed by detector modules capable of measuring Depth of Interaction (DOI) information. The camera is rectangular in shape in order to accommodate breasts of varying sizes while allowing for soft compression of the breast during the scan. This non-standard geometry of the camera exacerbates two problems: (a) radial elongation due to crystal penetration and (b) reconstructing images from irregularly sampled data. Packing considerations also give rise to regions in projection space that are not sampled, which leads to missing information. The author presents new Fourier-method-based image reconstruction algorithms that incorporate DOI information and accommodate the irregular sampling of the camera in a consistent manner by defining lines of response (LORs) between the measured interaction points instead of rebinning the events into predefined crystal-face LORs, which is the only other method proposed thus far for handling DOI information. The new procedures maximize the use of the increased sampling provided by the DOI while minimizing interpolation in the data. The new algorithms use fixed-width, evenly spaced radial bins in order to take advantage of the speed of the Fast Fourier Transform (FFT), which necessitates the use of irregular angular sampling in order to minimize the number of unnormalizable Zero-Efficiency Bins (ZEBs). In order to address the persisting ZEBs and the issue of missing information originating from packing considerations, the algorithms (a) perform nearest neighbor smoothing in 2D in the radial bins, (b) employ a semi-iterative procedure in order to estimate the unsampled data, and (c) ...
Date: April 1, 2000
Creator: Virador, Patrick R.G.
Partner: UNT Libraries Government Documents Department

Prospective on the potential of imaging gene expression

Description: The feasibility of the non-invasive imaging of gene expression is explored. Calculations of the possibility of the direct imaging of specific messenger RNA with radiolabeled antisense are discussed. In addition, possible mechanisms for amplifying the biological signal to enhance image detection are discussed.
Date: June 1, 2000
Creator: Taylor, Scott E & Budinger, Thomas F.
Partner: UNT Libraries Government Documents Department

Trends in PET imaging

Description: Positron Emission Tomography (PET) imaging is a well established method for obtaining information on the status of certain organs within the human body or in animals. This paper presents an overview of recent trends in PET instrumentation. Significant effort is being expended to develop new PET detector modules, especially those capable of measuring depth of interaction. This is aided by recent advances in scintillator and pixellated photodetector technology. The other significant area of effort is the development of special-purpose PET cameras (such as for imaging breast cancer or small animals) or cameras that have the ability to image in more than one modality (such as PET / SPECT or PET / X-Ray CT).
Date: November 1, 2000
Creator: Moses, William W.
Partner: UNT Libraries Government Documents Department

A compact, discrete CsI(Tl) scintillator/Si photodiode gamma camera for breast cancer imaging

Description: Recent clinical evaluations of scintimammography (radionuclide breast imaging) are promising and suggest that this modality may prove a valuable complement to X-ray mammography and traditional breast cancer detection and diagnosis techniques. Scintimammography, however, typically has difficulty revealing tumors that are less than 1 cm in diameter, are located in the medial part of the breast, or are located in the axillary nodes. These shortcomings may in part be due to the use of large, conventional Anger cameras not optimized for breast imaging. In this thesis I present compact single photon camera technology designed specifically for scintimammography which strives to alleviate some of these limitations by allowing better and closer access to sites of possible breast tumors. Specific applications are outlined. The design is modular, thus a camera of the desired size and geometry can be constructed from an array (or arrays) of individual modules and a parallel hole lead collimator for directional information. Each module consists of: (1) an array of 64 discrete, optically-isolated CsI(Tl) scintillator crystals 3 x 3 x 5 mm^3 in size, (2) an array of 64 low-noise Si PIN photodiodes matched 1-to-1 to the scintillator crystals, (3) an application-specific integrated circuit (ASIC) that amplifies the 64 photodiode signals and selects the signal with the largest amplitude, and (4) connectors and hardware for interfacing the module with a motherboard, thereby allowing straightforward computer control of all individual modules within a camera.
Date: Autumn 2000
Creator: Gruber, Gregory J.
Partner: UNT Libraries Government Documents Department

List mode reconstruction for PET with motion compensation: A simulation study

Description: Motion artifacts can be a significant factor that limits the image quality in high-resolution PET. Surveillance systems have been developed to track the movements of the subject during a scan. Development of reconstruction algorithms that are able to compensate for the subject motion will increase the potential of PET. In this paper we present a list mode likelihood reconstruction algorithm with the ability to compensate for motion. The subject motion is explicitly modeled in the likelihood function. The detections of each detector pair are modeled as a Poisson process with a time-varying rate function. The proposed method has several advantages over the existing methods. It uses all detected events and does not introduce any interpolation error. Computer simulations show that the proposed method can compensate for simulated subject movements and that the reconstructed images have no visible motion artifacts.
Date: July 3, 2002
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
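
A toy sketch of the modeling idea, assuming a 1-D object, a made-up Gaussian detector response, and a known rigid shift as the "subject motion"; this is not the authors' implementation, only an illustration of folding the per-event motion into each event's system-matrix row in a list-mode EM-style update.

```python
import numpy as np

# Illustrative sketch: list-mode EM in which each event's projection row is
# evaluated with the subject motion known at that event's arrival time
# (here, a simple 1-D shift).  All names and the geometry are invented.
rng = np.random.default_rng(1)
n_pix, n_det = 32, 32
x_true = np.zeros(n_pix); x_true[10:14] = 5.0

def system_row(det, shift):
    """Toy detection profile of detector `det` for an object shifted by `shift`."""
    centers = np.arange(n_pix) + shift
    return np.exp(-0.5 * (centers - det) ** 2)

# Simulate a short list of events; the known shift (motion) varies with time.
events = []
for t in np.linspace(0.0, 1.0, 2000):
    shift = int(round(3 * t))
    rates = np.array([system_row(d, shift) @ x_true for d in range(n_det)])
    det = rng.choice(n_det, p=rates / rates.sum())
    events.append((det, shift))

# List-mode EM update with motion folded into each event's system row.
x = np.ones(n_pix)
sens = sum(system_row(d, 0) for d in range(n_det))   # sensitivity (motion-free approximation)
for _ in range(20):
    back = np.zeros(n_pix)
    for det, shift in events:
        row = system_row(det, shift)
        back += row / (row @ x + 1e-12)
    x *= back / sens
print("reconstructed activity peaks at voxel", int(np.argmax(x)))
```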

Current trends in scintillator detectors and materials

Description: The last decade has seen a renaissance in inorganic scintillator development for gamma ray detection. Lead tungstate (PbWO4) has been developed for high energy physics experiments, and possesses exceptionally high density and radiation hardness, albeit with low luminous efficiency. Lutetium orthosilicate or LSO (Lu2SiO5:Ce) possesses a unique combination of high luminous efficiency, high density, and reasonably short decay time, and is now incorporated in commercial positron emission tomography (PET) cameras. There have been advances in understanding the fundamental mechanisms that limit energy resolution, and several recently discovered materials (such as LaBr3:Ce) possess energy resolution that approaches that of direct solid state detectors. Finally, there are indications that a neglected class of scintillator materials that exhibit near band-edge fluorescence could provide scintillators with sub-nanosecond decay times and high luminescent efficiency.
Date: October 23, 2001
Creator: Moses, William W.
Partner: UNT Libraries Government Documents Department

Synergies between electromagnetic calorimetry and PET

Description: The instrumentation used for the nuclear medical imaging technique of Positron Emission Tomography (PET) shares many features with the instrumentation used for electromagnetic calorimetry. Both fields can certainly benefit from technical advances in many common areas, and this paper discusses both the commonalities and the differences between the instrumentation needs of the two fields. The overall aim is to identify where synergistic development opportunities exist. While such opportunities exist in inorganic scintillators, photodetectors, amplification and readout electronics, and high-speed computing, it is important to recognize that while the requirements of the two fields are similar, they are not identical, and so it is unlikely that advances specific to one field can be transferred without modification to the other.
Date: July 30, 2002
Creator: Moses, William W.
Partner: UNT Libraries Government Documents Department

Propagation of errors from the sensitivity image in list mode reconstruction

Description: List mode image reconstruction is attracting renewed attention. It eliminates the storage of empty sinogram bins. However, a single back projection of all LORs is still necessary for the pre-calculation of a sensitivity image. Since the detection sensitivity is dependent on the object attenuation and detector efficiency, it must be computed for each study. Exact computation of the sensitivity image can be a daunting task for modern scanners with huge numbers of LORs. Thus, some fast approximate calculation may be desirable. In this paper, we theoretically analyze the error propagation from the sensitivity image into the reconstructed image. The theoretical analysis is based on the fixed point condition of the list mode reconstruction. The non-negativity constraint is modeled using the Kuhn-Tucker condition. With certain assumptions and the first order Taylor series approximation, we derive a closed form expression for the error in the reconstructed image as a function of the error in the sensitivity image. The result provides insights on what kind of error might be allowable in the sensitivity image. Computer simulations show that the theoretical results are in good agreement with the measured results.
Date: November 15, 2003
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
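
In the spirit of the analysis above, the following toy check compares the image error induced by a perturbed sensitivity image against a simple first-order prediction. The prediction used here, dx ~ -F^{-1} ds with F = A' diag(1/y) A, covers only the unconstrained, noise-free special case and is an assumption of this sketch, not the paper's closed-form result; all matrices and sizes are invented.

```python
import numpy as np

# Toy numerical check (NOT the paper's result): for an unconstrained EM fixed
# point with noise-free, consistent data, a first-order expansion gives
#   dx ~ -F^{-1} ds,   F = A' diag(1/y) A.
rng = np.random.default_rng(2)
n_pix, n_bins = 20, 400
A = rng.random((n_bins, n_pix))
x_true = 1.0 + rng.random(n_pix)
y = A @ x_true                                   # noise-free, consistent data

def em_reconstruct(sens, n_iter=5000):
    x = np.ones(n_pix)
    for _ in range(n_iter):
        x *= (A.T @ (y / (A @ x))) / sens
    return x

sens_exact = A.sum(axis=0)
d_sens = 5e-4 * sens_exact * rng.standard_normal(n_pix)   # small sensitivity error

measured = em_reconstruct(sens_exact + d_sens) - em_reconstruct(sens_exact)
F = A.T @ np.diag(1.0 / y) @ A
predicted = -np.linalg.solve(F, d_sens)          # first-order prediction
# For a small perturbation the two should roughly agree.
print("relative mismatch:", np.linalg.norm(measured - predicted) / np.linalg.norm(predicted))
```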

Image properties of list mode likelihood reconstruction for a rectangular positron emission mammography with DOI measurements

Description: A positron emission mammography scanner is under development at our Laboratory. The tomograph has a rectangular geometry consisting of four banks of detector modules. For each detector, the system can measure the depth of interaction information inside the crystal. The rectangular geometry leads to irregular radial and angular sampling and spatially variant sensitivity that are different from conventional PET systems. Therefore, it is of importance to study the image properties of the reconstructions. We adapted the theoretical analysis that we had developed for conventional PET systems to the list mode likelihood reconstruction for this tomograph. The local impulse response and covariance of the reconstruction can be easily computed using FFT. These theoretical results are also used with computer observer models to compute the signal-to-noise ratio for lesion detection. The analysis reveals the spatially variant resolution and noise properties of the list mode likelihood reconstruction. The theoretical predictions are in good agreement with Monte Carlo results.
Date: October 1, 2000
Creator: Qi, Jinyi; Klein, Gregory J. & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department

Fast approach to evaluate MAP reconstruction for lesion detection and localization

Description: Lesion detection is an important task in emission tomography. Localization ROC (LROC) studies are often used to analyze lesion detection and localization performance. Most researchers rely on Monte Carlo reconstruction samples to obtain LROC curves, which can be very time-consuming for iterative algorithms. In this paper we develop a fast approach to obtain LROC curves that does not require Monte Carlo reconstructions. We use a channelized Hotelling observer model to search for lesions, and the results can be easily extended to other numerical observers. We theoretically analyzed the mean and covariance of the observer output. Assuming the observer outputs are multivariate Gaussian random variables, an LROC curve can be directly generated by integrating the conditional probability density functions. The high-dimensional integrals are calculated using a Monte Carlo method. The proposed approach is very fast because no iterative reconstruction is involved. Computer simulations show that the results of the proposed method match well with those obtained using traditional LROC analysis.
Date: February 1, 2004
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
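
A minimal channelized Hotelling observer (CHO) computation of the kind the abstract above builds on. The channel definitions, the 1-D image model, and the sample counts are all illustrative assumptions, not the paper's observer or data.

```python
import numpy as np

# Illustrative CHO sketch: project lesion-present/absent image samples onto a
# few frequency channels, form the Hotelling template, and report the SNR.
rng = np.random.default_rng(3)
n = 64                                           # 1-D "image" for simplicity

def channels(n, n_ch=4):
    """Simple square (octave-band) frequency channels, returned as spatial profiles."""
    freqs = np.fft.rfftfreq(n)
    U = np.zeros((n, n_ch))
    edges = 0.5 * 2.0 ** -np.arange(n_ch, -1, -1.0)
    for c in range(n_ch):
        band = (freqs >= edges[c]) & (freqs < edges[c + 1])
        U[:, c] = np.fft.irfft(band.astype(float), n)
    return U

U = channels(n)
lesion = np.exp(-0.5 * ((np.arange(n) - n // 2) / 2.0) ** 2)   # known signal profile

# Simulated image samples with and without the lesion (shared backgrounds).
background = 10.0 + rng.standard_normal((500, n)) @ np.diag(np.linspace(1, 2, n))
v0 = background @ U                              # channel outputs, lesion absent
v1 = (background + lesion) @ U                   # channel outputs, lesion present

dv = v1.mean(axis=0) - v0.mean(axis=0)
K = 0.5 * (np.cov(v0.T) + np.cov(v1.T))          # pooled channel covariance
w = np.linalg.solve(K, dv)                       # Hotelling template
print("CHO detection SNR:", np.sqrt(dv @ w))
```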

Fundamental limits of positron emission mammography

Description: We explore the causes of performance limitation in positron emission mammography cameras. We compare two basic camera geometries containing the same volume of 511 keV photon detectors, one with a parallel plane geometry and another with a rectangular geometry. We find that both geometries have similar performance for the phantom imaged (in Monte Carlo simulation), even though the solid angle coverage of the rectangular camera is about 50 percent higher than the parallel plane camera. The reconstruction algorithm used significantly affects the resulting image; iterative methods significantly outperform the commonly used focal plane tomography. Finally, the characteristics of the tumor itself, specifically the absolute amount of radiotracer taken up by the tumor, will significantly affect the imaging performance.
Date: June 1, 2001
Creator: Moses, William W. & Qi, Jinyi
Partner: UNT Libraries Government Documents Department

Wavelet crosstalk matrix and its application to assessment of shift-variant imaging systems

Description: The objective assessment of image quality is essential for the design of imaging systems. Barrett and Gifford [1] introduced the Fourier crosstalk matrix. Because it is diagonal for continuous linear shift-invariant imaging systems, the Fourier crosstalk matrix is a powerful tool for analyzing discrete imaging systems that are close to shift-invariant. However, for a system that is intrinsically shift-variant, Fourier techniques are not particularly effective. Because Fourier bases have no localization property, the shift-variance of the imaging system cannot be seen in the response of individual Fourier bases; rather, it appears in the correlations between the Fourier coefficients. This makes the analysis and optimization quite difficult. In this paper, we introduce a wavelet crosstalk matrix based on wavelet series expansions. The wavelet crosstalk matrix allows simultaneous study of the imaging system in both the frequency and spatial domains. Hence it is well suited for shift-variant systems. We compared the wavelet crosstalk matrix with the Fourier crosstalk matrix for several simulated imaging systems, namely the interior and exterior tomography problems, limited angle tomography, and a rectangular geometry positron emission tomograph. The results demonstrate the advantages of the wavelet crosstalk matrix in analyzing shift-variant imaging systems.
Date: November 1, 2002
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
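
A hedged sketch of the comparison: here the crosstalk matrix of a basis under a system operator H is simplified to the Gram matrix B_kl = <H b_k, H b_l> (the paper's definitions may include additional weighting), and the toy shift-variant system, the Haar basis, and the "unsampled region" are assumptions chosen only to make the shift-variance visible on the wavelet diagonal.

```python
import numpy as np

# Toy shift-variant system: position-dependent Gaussian blur, with the right
# quarter of the object never sampled at all.
n = 64
pos = np.arange(n)
H = np.array([np.exp(-0.5 * ((pos - i) / (1.0 + 3.0 * i / n)) ** 2) for i in range(n)])
H[:, 3 * n // 4:] = 0.0

def haar_matrix(m):
    """Orthonormal Haar basis for length-m signals (m a power of 2); rows are basis vectors."""
    if m == 1:
        return np.array([[1.0]])
    h = haar_matrix(m // 2)
    top = np.kron(h, [1.0, 1.0])
    bot = np.kron(np.eye(m // 2), [1.0, -1.0])
    return np.vstack([top, bot]) / np.sqrt(2.0)

def crosstalk(basis_cols):
    """Gram matrix of the system responses to the basis vectors (columns)."""
    HB = H @ basis_cols
    return HB.conj().T @ HB

fourier = np.fft.fft(np.eye(n)) / np.sqrt(n)          # columns: Fourier basis
wavelet = haar_matrix(n).T                            # columns: Haar basis
Bf, Bw = crosstalk(fourier), crosstalk(wavelet)

# Diagonal of the wavelet crosstalk matrix at the finest scale, swept across
# position: entries fall off in the blurred region and vanish where unsampled.
fine = np.arange(n // 2, n)
print("finest-scale wavelet diagonal (position sweeps left to right):")
print(np.round(np.diag(Bw)[fine].reshape(4, -1), 3))
print("fraction of Fourier crosstalk energy off the diagonal:",
      round(np.linalg.norm(Bf - np.diag(np.diag(Bf))) / np.linalg.norm(Bf), 3))
```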

Scatter correction for positron emission mammography

Description: In this paper we present a scatter correction method for a regularized list mode maximum likelihood reconstruction algorithm for the positron emission mammograph (PEM) that is being developed at our laboratory. The scatter events inside the object are modeled as additive Poisson random variables in the forward model of the reconstruction algorithm. The mean scatter sinogram is estimated using a Monte Carlo simulation program. With the assumption that the background activity is nearly uniform, the Monte Carlo scatter simulation only needs to be run once for each PEM configuration. This saves computation time. The crystal scatters are modeled as a shift-invariant blurring in the image domain because they are more localized. Thus, the useful information in the crystal scatters can be recovered by deconvolution in high-resolution reconstructions. The propagation of the noise from the estimated scatter sinogram into the reconstruction is analyzed theoretically. The results provide an easy way to calculate the required number of events in the Monte Carlo scatter simulation for a given noise level in the image. The analysis is also applicable to other scatter estimation methods, provided that the covariance of the estimated scatter sinogram is available.
Date: April 1, 2002
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
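
The forward-model idea can be sketched as follows: the estimated mean scatter enters as an additive term, so the EM-type update divides the measured data by (A x + scatter mean) rather than by A x alone. The toy system, scatter shape, and counts below are illustrative assumptions, not the PEM geometry or the authors' algorithm.

```python
import numpy as np

# Minimal sketch: ML-EM with an additive mean-scatter term in the forward model.
rng = np.random.default_rng(4)
n_pix, n_bins = 32, 96
A = 0.2 * rng.random((n_bins, n_pix))                       # toy system matrix
x_true = np.full(n_pix, 5.0); x_true[8:12] = 55.0           # warm background + hot region
scatter_mean = 2.0 + np.sin(np.linspace(0, np.pi, n_bins))  # stand-in for the MC scatter estimate
y = rng.poisson(A @ x_true + scatter_mean)                  # Poisson data including scatter

sens = A.sum(axis=0)
x = np.ones(n_pix)
for _ in range(200):
    x *= (A.T @ (y / (A @ x + scatter_mean))) / sens        # scatter stays in the denominator
print("reconstructed hot-region mean:", round(x[8:12].mean(), 1), "(true value 55.0)")
```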

Quantum mechanical cluster calculations of critical scintillation processes

Description: This paper describes the use of commercial quantum chemistry codes to simulate several critical scintillation processes. The crystal is modeled as a cluster of typically 50 atoms embedded in an array of typically 5,000 point charges designed to reproduce the electrostatic field of the infinite crystal. The Schrodinger equation is solved for the ground, ionized, and excited states of the system to determine the energy and electron wavefunction. Computational methods for the following critical processes are described: (1) the formation and diffusion of relaxed holes, (2) the formation of excitons, (3) the trapping of electrons and holes by activator atoms, (4) the excitation of activator atoms, and (5) thermal quenching. Examples include hole diffusion in CsI, the exciton in CsI, the excited state of CsI:Tl, the energy barrier for the diffusion of relaxed holes in CaF2 and PbF2, and prompt hole trapping by activator atoms in CaF2:Eu and CdS:Te leading to an ultra-fast (<50 ps) scintillation risetime.
Date: February 22, 2000
Creator: Derenzo, Stephen E.; Klintenberg, Mattias K. & Weber, Marvin J.
Partner: UNT Libraries Government Documents Department

A unified noise analysis for iterative image estimation

Description: Iterative image estimation methods have been widely used in emission tomography. An accurate estimate of the uncertainty of the reconstructed images is essential for quantitative applications. While theoretical approaches have been developed to analyze the noise propagation from iteration to iteration, the current results are limited to only a few iterative algorithms that have an explicit multiplicative update equation. This paper presents a theoretical noise analysis that is applicable to a wide range of preconditioned gradient-type algorithms. One advantage is that the proposed method does not require an explicit expression for the preconditioner, and hence it is applicable to some algorithms that involve line searches. By deriving a fixed-point expression from the iteration-based results, we show that the iteration-based noise analysis is consistent with the fixed-point-based analysis. Examples in emission tomography and transmission tomography are shown.
Date: July 3, 2003
Creator: Qi, Jinyi
Partner: UNT Libraries Government Documents Department

Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

Description: The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image-based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of spatiotemporal model parameter estimates, and use Monte Carlo simulations to validate a fast algorithm for computing the covariance matrix for the parameters. Given this covariance matrix, the covariance between ...
Date: April 9, 2001
Creator: Reutter, Bryan W.; Gullberg, Grant T. & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
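
A generic sketch of the quantity being computed: for a least-squares fit of projection data to a linear spatiotemporal model, the parameter covariance follows from standard error propagation. The design matrix and data covariance below are made-up stand-ins, not the segmentation/least-squares model of the paper.

```python
import numpy as np

# Generic covariance propagation for an ordinary least-squares fit y ~ X @ theta:
#   theta_hat = (X'X)^-1 X' y,  so  cov(theta_hat) = (X'X)^-1 X' Sigma_y X (X'X)^-1.
rng = np.random.default_rng(8)
n_proj, n_par = 300, 12
X = rng.random((n_proj, n_par))                     # illustrative spatiotemporal design matrix
Sigma_y = np.diag(1.0 + rng.random(n_proj))         # illustrative projection-data covariance

pinv = np.linalg.solve(X.T @ X, X.T)                # (X'X)^-1 X'
cov_theta = pinv @ Sigma_y @ pinv.T                 # covariance of the parameter estimates
print("parameter standard deviations:", np.sqrt(np.diag(cov_theta)).round(3))
```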

Spectral analysis for evaluation of myocardial tracers for medical imaging

Description: Kinetic analysis of dynamic tracer data is performed with the goal of evaluating myocardial radiotracers for cardiac nuclear medicine imaging. Data from experiments utilizing the isolated rabbit heart model are acquired by sampling the venous blood after introduction of a tracer of interest and a reference tracer. We have taken the approach that the kinetics are properly characterized by an impulse response function which describes the difference between the reference molecule (which does not leave the vasculature) and the molecule of interest which is transported across the capillary boundary and is made available to the cell. Using this formalism we can model the appearance of the tracer of interest in the venous output of the heart as a convolution of the appearance of the reference tracer with the impulse response. In this work we parameterize the impulse response function as the sum of a large number of exponential functions whose predetermined decay constants form a spectrum, and each is required only to have a nonnegative coefficient. This approach, called spectral analysis, has the advantage that it allows conventional compartmental analysis without prior knowledge of the number of compartments which the physiology may require or which the data will support.
Date: October 11, 2000
Creator: Huesman, Ronald H.; Reutter, Bryan W. & Marshall, Robert C.
Partner: UNT Libraries Government Documents Department
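
An illustrative spectral-analysis fit, assuming SciPy is available: the impulse response is expanded on a fixed grid of decaying exponentials with nonnegative coefficients obtained by nonnegative least squares, and the model output is the reference curve convolved with the fitted impulse response. The time grid, decay-rate grid, and curves are invented for the example.

```python
import numpy as np
from scipy.optimize import nnls

# Spectral analysis sketch: fit nonnegative coefficients on an exponential grid.
t = np.arange(0.0, 60.0, 0.5)                       # sample times (s)
dt = t[1] - t[0]
reference = np.exp(-0.5 * ((t - 10.0) / 3.0) ** 2)  # reference (intravascular) curve

rates = np.logspace(-3, 0, 40)                      # predetermined decay constants (1/s)
basis = np.exp(-np.outer(t, rates))                 # candidate exponential responses

# Each design column is the reference curve convolved with one candidate exponential.
design = np.column_stack([np.convolve(reference, b)[: len(t)] * dt for b in basis.T])

true_h = 0.7 * np.exp(-0.05 * t) + 0.2 * np.exp(-0.5 * t)          # "true" impulse response
measured = np.convolve(reference, true_h)[: len(t)] * dt
measured += 0.002 * np.random.default_rng(5).standard_normal(len(t))

coeffs, residual = nnls(design, measured)           # nonnegative spectral coefficients
h_est = basis @ coeffs                              # recovered impulse response (not unique in general)
print("active spectral components:", np.count_nonzero(coeffs > 1e-6), " fit residual:", residual)
```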

Effects of temporal modeling on the statistical uncertainty of spatiotemporal distributions estimated directly from dynamic SPECT projections

Description: Artifacts can result when reconstructing a dynamic image sequence from inconsistent single photon emission computed tomography (SPECT) projections acquired by a slowly rotating gantry. The artifacts can lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying volumes of interest on the images. To overcome these biases in conventional image based dynamic data analysis, we have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view. In previous work we developed computationally efficient methods for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions [1] and their statistical uncertainties [2] from dynamic SPECT projection data, using a spatial segmentation and temporal B-splines. In addition, we studied the bias that results from modeling various orders of temporal continuity and using various time samplings [1]. In the present work, we use the methods developed in [1, 2] and Monte Carlo simulations to study the effects of the temporal modeling on the statistical variability of the reconstructed distributions.
Date: April 30, 2001
Creator: Reutter, Bryan W.; Gullberg, Grant T. & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department

Computationally efficient nonlinear edge preserving smoothing of n-D medical images via scale-space fingerprint analysis

Description: Nonlinear edge preserving smoothing often is performed prior to medical image segmentation. The goal of the nonlinear smoothing is to improve the accuracy of the segmentation by preserving changes in image intensity at the boundaries of structures of interest, while smoothing random variations due to noise in the interiors of the structures. Methods include median filtering and morphology operations such as gray scale erosion and dilation, as well as spatially varying smoothing driven by local contrast measures. Rather than irreversibly altering the image data prior to segmentation, the approach described here has the potential to unify nonlinear edge preserving smoothing with segmentation based on differential edge detection at multiple scales. The analysis of n-D image data is decomposed into independent 1-D problems that can be solved quickly. Smoothing in various directions along 1-D profiles through the n-D data is driven by a measure of local structure separation, rather than by a local contrast measure. Isolated edges are preserved independent of their contrast, given an adequate contrast to noise ratio.
Date: October 11, 2000
Creator: Reutter, B.W.; Algazi, V.R. & Huesman, R.H.
Partner: UNT Libraries Government Documents Department
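
For contrast, a generic 1-D edge-preserving smoother (a Perona-Malik-style diffusion, named plainly because it is not the scale-space fingerprint method of this record) shows the basic goal: smooth the interior of a noisy profile while keeping the step edge. The profile and parameters are invented.

```python
import numpy as np

# Generic 1-D edge-preserving smoothing (Perona-Malik-style explicit diffusion).
rng = np.random.default_rng(6)
profile = np.concatenate([np.full(50, 1.0), np.full(50, 3.0)]) + 0.2 * rng.standard_normal(100)

def diffuse_1d(f, n_iter=200, kappa=0.5, step=0.2):
    f = f.copy()
    for _ in range(n_iter):
        grad_r = np.diff(f, append=f[-1])           # forward differences
        grad_l = np.diff(f, prepend=f[0])           # backward differences
        c_r = np.exp(-(grad_r / kappa) ** 2)        # conductance: small across strong edges
        c_l = np.exp(-(grad_l / kappa) ** 2)
        f += step * (c_r * grad_r - c_l * grad_l)   # smooths flat regions, preserves the step
    return f

smoothed = diffuse_1d(profile)
print("noise std before/after:", profile[:40].std().round(3), smoothed[:40].std().round(3))
print("edge height before/after:", round(profile[60:].mean() - profile[:40].mean(), 2),
      round(smoothed[60:].mean() - smoothed[:40].mean(), 2))
```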

Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

Description: Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal to noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.
Date: January 10, 2003
Creator: Qi, Jinyi
Partner: UNT Libraries Government Documents Department
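
The bias/variance/EMSE trade-off described above can be sketched with a linear, penalized weighted least-squares estimator, for which the ROI bias and variance have simple closed forms; the Poisson/MAP case in the paper instead relies on the approximate local impulse response and covariance expressions it cites. All matrices, the data covariance, and the ROI below are toy assumptions.

```python
import numpy as np

# Toy EMSE sweep for a penalized least-squares estimator x_hat = (A'A + beta R)^-1 A' y.
rng = np.random.default_rng(7)
n_pix, n_bins = 40, 120
A = rng.random((n_bins, n_pix))
x_true = 1.0 + 4.0 * np.exp(-0.5 * ((np.arange(n_pix) - 20) / 3.0) ** 2)
y_bar = A @ x_true
Sigma = np.diag(y_bar)                               # Poisson-like data covariance
R = 2 * np.eye(n_pix) - np.eye(n_pix, k=1) - np.eye(n_pix, k=-1)   # roughness penalty
roi = np.zeros(n_pix); roi[17:24] = 1.0              # ROI indicator (sums voxels 17-23)

def roi_emse(beta):
    G = np.linalg.solve(A.T @ A + beta * R, A.T)     # linear reconstruction operator
    bias = roi @ (G @ y_bar - x_true)                # ROI bias at the mean data
    var = roi @ G @ Sigma @ G.T @ roi                # ROI variance from data covariance
    return bias ** 2 + var

betas = np.logspace(-2, 3, 30)
emse = np.array([roi_emse(b) for b in betas])
print("smoothing parameter minimizing the ROI EMSE:", betas[np.argmin(emse)])
```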