1,056 Matching Results

Search Results


Comparing GPU Implementations of Bilateral and Anisotropic Diffusion Filters for 3D Biomedical Datasets

Description: We compare the performance of hand-tuned CUDA implementations of bilateral and anisotropic diffusion filters for denoising 3D MRI datasets. Our tests sweep comparable parameters for the two filters and measure total runtime, memory bandwidth, computational throughput, and mean squared errors relative to a noiseless reference dataset.
Date: May 6, 2010
Creator: Howison, Mark
Partner: UNT Libraries Government Documents Department
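
As a rough illustration of the kind of computation benchmarked above, the sketch below implements a naive 3D bilateral filter and the mean-squared-error metric in NumPy. It is not the hand-tuned CUDA code from the report; the window radius and the spatial/range sigmas are arbitrary illustrative values, and np.roll wraps at the volume boundaries.

```python
import numpy as np

def bilateral_filter_3d(vol, sigma_s=1.5, sigma_r=0.2, radius=2):
    """Naive 3D bilateral filter: each weight combines spatial distance and
    intensity difference. Illustrative only; np.roll wraps at the boundaries
    and this is far slower than a hand-tuned CUDA kernel."""
    acc = np.zeros_like(vol, dtype=np.float64)
    wsum = np.zeros_like(vol, dtype=np.float64)
    for dz in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                shifted = np.roll(vol, shift=(dz, dy, dx), axis=(0, 1, 2))
                w_spatial = np.exp(-(dz * dz + dy * dy + dx * dx) / (2 * sigma_s ** 2))
                w_range = np.exp(-((shifted - vol) ** 2) / (2 * sigma_r ** 2))
                w = w_spatial * w_range
                acc += w * shifted
                wsum += w
    return acc / wsum

def mse(a, b):
    """Mean squared error against a noiseless reference."""
    return float(np.mean((a - b) ** 2))

# toy example: denoise a synthetic noisy volume and compare MSE before/after
ref = np.zeros((32, 32, 32)); ref[8:24, 8:24, 8:24] = 1.0
noisy = ref + 0.1 * np.random.randn(*ref.shape)
denoised = bilateral_filter_3d(noisy)
print("MSE noisy:", mse(noisy, ref), "MSE denoised:", mse(denoised, ref))
```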

Synthesis of 2-D images from the Wigner distribution with applications to mammography and edge extraction

Description: A new method for the general application of quadratic spatial/spatial-frequency domain filtering to imagery is presented in this dissertation. The major contribution of this research is the development of an original algorithm for approximating the inverse pseudo-Wigner distribution through synthesis of an image in the spatial domain which approximates the result of filtering an original image in the DPWD domain.
Date: December 1995
Creator: Pettit, Elaine J. (Elaine Joyce)
Partner: UNT Libraries
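
For background only (this is not drawn from the dissertation itself): the spatial/spatial-frequency representation referred to above is built on the Wigner distribution, whose standard continuous one-dimensional form is

    W_f(x, u) = \int_{-\infty}^{\infty} f\left(x + \tfrac{\tau}{2}\right) \, f^{*}\left(x - \tfrac{\tau}{2}\right) \, e^{-j 2\pi u \tau} \, d\tau .

The dissertation works with a windowed, discrete pseudo-Wigner variant (the DPWD) applied to 2-D imagery, whose exact sampled definition is not reproduced here.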

Joint inversion of geophysical data for site characterization and restoration monitoring

Description: The purpose of this project is to develop a computer code for joint inversion of seismic and electrical data, to improve underground imaging for site characterization and remediation monitoring. The computer code developed in this project will invert geophysical data to obtain direct estimates of porosity and saturation underground, rather than inverting for seismic velocity and electrical resistivity or other geophysical properties. This is intended to be a significant improvement in the state-of-the-art of underground imaging, since interpretation of data collected at a contaminated site would become much less subjective. Potential users include DOE scientists and engineers responsible for characterizing contaminated sites and monitoring remediation of contaminated sites. In this three-year project, we use a multi-phase approach consisting of theoretical and numerical code development, laboratory investigations, testing on available laboratory and borehole geophysics data sets, and a controlled field experiment, to develop practical tools for joint electrical and seismic data interpretation.
Date: May 28, 1998
Creator: Berge, P. A.
Partner: UNT Libraries Government Documents Department

Sunfall: a collaborative visual analytics system for astrophysics

Description: Computational and experimental sciences produce and collect ever larger and more complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.
Date: July 7, 2008
Creator: Aragon, Cecilia R.; Bailey, Stephen J.; Poon, Sarah; Runge, Karl & Thomas, Rollin C.
Partner: UNT Libraries Government Documents Department

Adaptive optics and phase diversity imaging for responsive space applications.

Description: The combination of phase diversity and adaptive optics offers great flexibility. Phase diverse images can be used to diagnose aberrations and then provide feedback control to the optics to correct the aberrations. Alternatively, phase diversity can be used to partially compensate for aberrations during post-detection image processing. The adaptive optic can produce simple defocus or more complex types of phase diversity. This report presents an analysis, based on numerical simulations, of the efficiency of different modes of phase diversity with respect to compensating for specific aberrations during post-processing. It also comments on the efficiency of post-processing versus direct aberration correction. The construction of a bench top optical system that uses a membrane mirror as an active optic is described. The results of characterization tests performed on the bench top optical system are presented. The work described in this report was conducted to explore the use of adaptive optics and phase diversity imaging for responsive space applications.
Date: November 1, 2004
Creator: Smith, Mark William & Wick, David Victor
Partner: UNT Libraries Government Documents Department

Distributed video coding for arrays of remote sensing nodes: final report.

Description: This document is the final report for the Sandia National Laboratory funded Student Fellowship position at New Mexico State University (NMSU) from 2008 to 2010. Ivan Mecimore, the PhD student in Electrical Engineering at NMSU, was conducting research into image and video processing techniques to identify features and correlations within images without requiring the decoding of the data compression. Such an analysis technique would operate on the encoded bit stream, potentially saving considerable processing time when operating on a platform that has limited computational resources. Unfortunately, the student elected mid-year not to continue with his research or the fellowship position. The student is unavailable to provide any details of his research for inclusion in this final report. As such, this final report serves solely to document the information provided in the previous end-of-year summary.
Date: June 1, 2010
Creator: Mecimore, Ivan (New Mexico State University); Creusere, Chuck D. (New Mexico State University) & Merchant, Bion John
Partner: UNT Libraries Government Documents Department

Poblano v1.0: a Matlab toolbox for gradient-based optimization.

Description: We present Poblano v1.0, a Matlab toolbox for solving gradient-based unconstrained optimization problems. Poblano implements three optimization methods (nonlinear conjugate gradients, limited-memory BFGS, and truncated Newton) that require only first-order derivative information. In this paper, we describe the Poblano methods, provide numerous examples of how to use Poblano, and present results of Poblano used in solving problems from a standard test collection of unconstrained optimization problems.
Date: March 1, 2010
Creator: Dunlavy, Daniel M.; Acar, Evrim (Sandia National Laboratories, Livermore, CA) & Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)
Partner: UNT Libraries Government Documents Department
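
Poblano itself is a Matlab toolbox, so the snippet below is only an analogous illustration in Python: it minimizes the Rosenbrock test function with SciPy's limited-memory BFGS, the same kind of first-order-only method Poblano provides. The function, gradient, and starting point are standard test values, not taken from the Poblano paper.

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function and its gradient: only first-order information is
# supplied, as in the methods Poblano implements (NCG, L-BFGS, truncated Newton).
def f(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0] ** 2)])

res = minimize(f, x0=np.array([-1.2, 1.0]), jac=grad, method="L-BFGS-B")
print(res.x, res.fun)   # converges to the minimizer (1, 1)
```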

A fast contour descriptor algorithm for supernova image classification

Description: We describe a fast contour descriptor algorithm and its application to a distributed supernova detection system (the Nearby Supernova Factory) that processes 600,000 candidate objects in 80 GB of image data per night. Our shape-detection algorithm reduced the number of false positives generated by the supernova search pipeline by 41% while producing no measurable impact on running time. Fourier descriptors are an established method of numerically describing the shapes of object contours, but transform-based techniques are ordinarily avoided in this type of application due to their computational cost. We devised a fast contour descriptor implementation for supernova candidates that meets the tight processing budget of the application. Using the lowest-order descriptors (F₁ and F₋₁) and the total variance in the contour, we obtain one feature representing the eccentricity of the object and another denoting its irregularity. Because the number of Fourier terms to be calculated is fixed and small, the algorithm runs in linear time, rather than the O(n log n) time of an FFT. Constraints on object size allow further optimizations so that the total cost of producing the required contour descriptors is about 4n addition/subtraction operations, where n is the length of the contour.
Date: July 16, 2006
Creator: Aragon, Cecilia R. & Aragon, David Bradburn
Partner: UNT Libraries Government Documents Department
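
The sketch below computes the lowest-order Fourier descriptors of a closed contour in the spirit of the description above. The specific "eccentricity" and "irregularity" formulas are illustrative assumptions (the abstract does not give the exact normalization), and the direct O(n) sums mirror the fixed-small-number-of-terms argument rather than the 4n add/subtract implementation.

```python
import numpy as np

def low_order_descriptors(contour_xy):
    """Lowest-order Fourier descriptors of a closed contour, plus the total
    variance of the contour. The 'eccentricity' and 'irregularity' features
    below are illustrative normalizations, not the paper's exact definitions."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    n = len(z)
    m = np.arange(n)
    c0 = z.mean()
    c1 = np.mean(z * np.exp(-2j * np.pi * m / n))    # F_1
    cm1 = np.mean(z * np.exp(2j * np.pi * m / n))    # F_-1
    total_var = np.mean(np.abs(z - c0) ** 2)
    eccentricity = abs(cm1) / (abs(c1) + 1e-12)      # 0 for a circle traversed CCW
    irregularity = 1.0 - (abs(c1) ** 2 + abs(cm1) ** 2) / (total_var + 1e-12)
    return eccentricity, irregularity

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.c_[np.cos(theta), np.sin(theta)]
ellipse = np.c_[2 * np.cos(theta), 0.5 * np.sin(theta)]
print(low_order_descriptors(circle))    # ~(0, 0): round and regular
print(low_order_descriptors(ellipse))   # eccentricity ~0.6, irregularity ~0
```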

Techniques for Handling and Processing Emulsion Stacks

Description: The techniques for assembling, processing and handling large nuclear-emulsion stacks are discussed. Results of experiments varying the development procedure are presented.
Date: September 8, 1954
Creator: Birge, Robert W.; Kerth, Leroy T.; Richman, Chaim; Stork, Donald H. & Whetstone, Stanley L.
Partner: UNT Libraries Government Documents Department

Total Sky Imager (TSI) Handbook

Description: The total sky imager (TSI) provides time series of hemispheric sky images during daylight hours and retrievals of fractional sky cover for periods when the solar elevation is greater than 10 degrees.
Date: June 1, 2005
Creator: Morris, VR
Partner: UNT Libraries Government Documents Department

Learning algorithms for stack filter classifiers

Description: Stack Filters define a large class of increasing filters that are used widely in image and signal processing. The motivations for using an increasing filter instead of an unconstrained filter have been described as: (1) fast and efficient implementation, (2) the relationship to mathematical morphology, and (3) more precise estimation with finite sample data. This last motivation is related to methods developed in machine learning, and the relationship was explored in an earlier paper. In this paper we investigate this relationship by applying Stack Filters directly to classification problems. This provides a new perspective on how monotonicity constraints can help control estimation and approximation errors, and also suggests several new learning algorithms for Boolean function classifiers when they are applied to real-valued inputs.
Date: January 1, 2009
Creator: Porter, Reid B; Hush, Don & Zimmer, Beate G
Partner: UNT Libraries Government Documents Department
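
Stack filters are conventionally defined by threshold decomposition: the input is sliced into binary signals, a positive (monotone) Boolean function is applied at every level, and the binary outputs are summed. The sketch below is a minimal illustration using the 3-sample majority as the Boolean function, which makes the stack filter a window-3 median; it is not one of the paper's learning algorithms.

```python
import numpy as np

def stack_filter_median3(x, levels):
    """Stack filter realized by threshold decomposition. The positive Boolean
    function here is the 3-sample majority, so the filter is a window-3 median
    (boundaries wrap via np.roll)."""
    x = np.asarray(x)
    out = np.zeros_like(x)
    for t in range(1, levels + 1):
        b = (x >= t).astype(int)                       # binary threshold signal
        left, right = np.roll(b, 1), np.roll(b, -1)
        maj = ((left + b + right) >= 2).astype(int)    # positive Boolean function
        out += maj                                     # stacking: sum over levels
    return out

x = np.array([3, 3, 7, 2, 3, 3, 0, 3, 3])
print(stack_filter_median3(x, levels=7))   # impulse (7) and dropout (0) removed
```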

Tracking Non-rigid Structures in Computer Simulations

Description: A key challenge in tracking moving objects is the correspondence problem, that is, the correct propagation of object labels from one time step to another. This is especially true when the objects are non-rigid structures, changing shape, and merging and splitting over time. In this work, we describe a general approach to tracking thousands of non-rigid structures in an image sequence. We show how we can minimize memory requirements and generate accurate results while working with only two frames of the sequence at a time. We demonstrate our results using data from computer simulations of a fluid mix problem.
Date: January 10, 2008
Creator: Gezahegne, A & Kamath, C
Partner: UNT Libraries Government Documents Department
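
The abstract above does not spell out the matching rule, so the sketch below uses one common heuristic, maximum area overlap between consecutive frames, to propagate labels while keeping only two frames in memory. Function and variable names are hypothetical, not from the paper.

```python
import numpy as np
from scipy import ndimage

def propagate_labels(prev_labels, curr_mask):
    """Give each structure in the current frame the label of the previous-frame
    structure it overlaps most; unmatched structures receive new labels.
    Only two frames are held in memory at a time."""
    curr_cc, n = ndimage.label(curr_mask)
    out = np.zeros_like(prev_labels)
    next_label = prev_labels.max() + 1
    for c in range(1, n + 1):
        region = curr_cc == c
        overlaps = prev_labels[region]
        overlaps = overlaps[overlaps > 0]
        if overlaps.size:
            out[region] = np.bincount(overlaps).argmax()   # dominant previous label
        else:
            out[region] = next_label                       # newly appearing structure
            next_label += 1
    return out

# toy example: a blob that moves one pixel to the right between frames
prev = np.zeros((8, 8), int); prev[2:5, 2:5] = 1
curr = np.zeros((8, 8), bool); curr[2:5, 3:6] = True
print(propagate_labels(prev, curr))
```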

Stack filter classifiers

Description: Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification, their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing, and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers and it shows that the approach is interesting from both a theoretical and a practical perspective.
Date: January 1, 2009
Creator: Porter, Reid B & Hush, Don
Partner: UNT Libraries Government Documents Department
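
To make the mean/median analogy in the description concrete, the short sketch below contrasts a weighted average with a weighted median (the simplest weighted order statistic); the element-selection convention used for the weighted median is one common choice, not necessarily the paper's.

```python
import numpy as np

def weighted_mean(x, w):
    return np.dot(w, x) / np.sum(w)

def weighted_median(x, w):
    """Smallest x value at which the cumulative weight reaches half the total."""
    order = np.argsort(x)
    x, w = np.asarray(x)[order], np.asarray(w)[order]
    cum = np.cumsum(w)
    return x[np.searchsorted(cum, 0.5 * cum[-1])]

x = [1.0, 2.0, 3.0, 100.0]
w = [1.0, 1.0, 1.0, 1.0]
print(weighted_mean(x, w))    # 26.5 -- pulled by the outlier
print(weighted_median(x, w))  # 2.0  -- robust order-statistic analogue
```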

A generalized vector-valued total variation algorithm

Description: We propose a simple but flexible method for solving the generalized vector-valued TV (VTV) functional, which includes both the ℓ²-VTV and ℓ¹-VTV regularizations as special cases, to address the problems of deconvolution and denoising of vector-valued (e.g. color) images with Gaussian or salt-and-pepper noise. This algorithm is the vectorial extension of the Iteratively Reweighted Norm (IRN) algorithm [1] originally developed for scalar (grayscale) images. This method offers competitive computational performance for denoising and deconvolving vector-valued images corrupted with Gaussian (ℓ²-VTV case) and salt-and-pepper noise (ℓ¹-VTV case).
Date: January 1, 2009
Creator: Wohlberg, Brendt & Rodriguez, Paul
Partner: UNT Libraries Government Documents Department
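
As a hedged sketch of the kind of functional being described (the paper's exact notation may differ), a generalized vector-valued TV objective for a multichannel image u with channels u_c, data b, forward operator A, and regularization weight \lambda can be written as

    T(u) = \frac{1}{p} \left\lVert A u - b \right\rVert_p^p + \lambda \sum_{x} \sqrt{\sum_{c} \left| (\nabla u_c)(x) \right|^2},

with p = 2 giving the ℓ²-VTV case (Gaussian noise) and p = 1 the ℓ¹-VTV case (salt-and-pepper noise). The IRN strategy replaces the non-smooth norms with iteratively reweighted quadratic surrogates that can be minimized by standard linear solvers.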

Hierarchical image feature extraction by an irregular pyramid of polygonal partitions

Description: We present an algorithmic framework for hierarchical image segmentation and feature extraction. We build a successive fine-to-coarse hierarchy of irregular polygonal partitions of the original image. This multiscale hierarchy forms the basis for object-oriented image analysis. The framework incorporates the Gestalt principles of visual perception, such as proximity and closure, and exploits spectral and textural similarities of polygonal partitions, while iteratively grouping them until dissimilarity criteria are exceeded. Seed polygons are built upon a triangular mesh composed of irregularly sized triangles, whose spatial arrangement is adapted to the image content. This is achieved by building the triangular mesh on top of detected spectral discontinuities (such as edges), which form a network of constraints for the Delaunay triangulation. The image is then represented as a spatial network in the form of a graph with vertices corresponding to the polygonal partitions and edges reflecting their relations. The iterative agglomeration of partitions into object-oriented segments is formulated as Minimum Spanning Tree (MST) construction. An important characteristic of the approach is that the agglomeration of polygonal partitions is constrained by the detected edges; thus the shapes of agglomerated partitions are more likely to correspond to the outlines of real-world objects. The constructed partitions and their spatial relations are characterized using spectral, textural and structural features based on proximity graphs. The framework allows searching for object-oriented features of interest across multiple levels of detail of the built hierarchy and can be generalized to the multi-criteria MST to account for multiple criteria important for an application.
Date: January 1, 2008
Creator: Skurikhin, Alexei N
Partner: UNT Libraries Government Documents Department
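
The sketch below illustrates, on synthetic data, the two graph ingredients named above: a Delaunay triangulation that defines adjacency between seed partitions, and a minimum spanning tree over a dissimilarity-weighted adjacency graph that fixes an agglomeration order. The random points and intensity values stand in for polygon centroids and spectral features; none of this is the paper's actual pipeline.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

# Seed points stand in for polygon centroids; edge weights stand in for
# spectral/textural dissimilarity between adjacent partitions.
rng = np.random.default_rng(1)
pts = rng.random((50, 2))
values = rng.random(50)                      # e.g. mean intensity of each partition

tri = Delaunay(pts)
# collect the unique edges of the triangulation (the region adjacency graph)
edges = set()
for a, b, c in tri.simplices:
    edges.update({tuple(sorted((a, b))), tuple(sorted((b, c))), tuple(sorted((a, c)))})
i, j = np.array(sorted(edges)).T
# small offset keeps zero-dissimilarity edges from being dropped as missing entries
w = np.abs(values[i] - values[j]) + 1e-9

graph = coo_matrix((w, (i, j)), shape=(len(pts), len(pts)))
mst = minimum_spanning_tree(graph)           # agglomeration order for merging
print(mst.nnz, "MST edges linking", len(pts), "partitions")
```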

Improving Image Quality and Reducing Drift Problems via Automated Data Acquisition and Averaging in Cs-corrected TEM

Description: Image acquisition with a CCD camera is a single-press-button activity: after selecting exposure time and adjusting illumination, a button is pressed and the acquired image is perceived as the final, unmodified proof of what was seen in the microscope. Thus it is generally assumed that the image processing steps of, e.g., 'dark-current correction' and 'gain normalization' do not alter the information content of the image, but rather eliminate unwanted artifacts. Image quality therefore is, among a long list of other parameters, defined by the dynamic range of the CCD camera as well as the maximum allowable exposure time depending on sample drift (ignoring sample damage). Despite the fact that most microscopists are satisfied with present, standard image quality, we found that it is relatively easy to improve on existing routines in at least two aspects: (1) suppression of lateral image drift during acquisition by using significantly shorter exposure times with a plurality of exposures (a 3D data set); and (2) improvement in the signal-to-noise ratio by averaging over a given data set, thereby exceeding the dynamic range of the camera.
Date: August 29, 2008
Creator: Voelkl, E; Jiang, B; Dai, Z R & Bradley, J P
Partner: UNT Libraries Government Documents Department
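
A minimal illustration of points (1) and (2) above: acquire many short exposures, estimate the drift of each frame by FFT cross-correlation against the first frame, shift, and average in floating point so the accumulated signal can exceed a single exposure's dynamic range. Integer-pixel alignment with wrap-around np.roll is a simplification of what real acquisition software would do.

```python
import numpy as np

def align_and_average(frames):
    """Average many short exposures after integer-pixel drift correction.
    Shifts are estimated by FFT cross-correlation against the first frame;
    averaging in float extends the effective dynamic range beyond one exposure."""
    ref = frames[0].astype(np.float64)
    F_ref = np.fft.fft2(ref)
    acc = np.zeros_like(ref)
    for frame in frames:
        f = frame.astype(np.float64)
        xcorr = np.fft.ifft2(F_ref * np.conj(np.fft.fft2(f))).real
        dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
        # wrap the circular shifts into the range [-N/2, N/2)
        dy = dy - ref.shape[0] if dy > ref.shape[0] // 2 else dy
        dx = dx - ref.shape[1] if dx > ref.shape[1] // 2 else dx
        acc += np.roll(f, shift=(dy, dx), axis=(0, 1))
    return acc / len(frames)

# toy example: a drifting Gaussian spot with shot-like (Poisson) noise
yy, xx = np.mgrid[0:64, 0:64]
spot = lambda y0, x0: np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / 8.0)
frames = [np.random.poisson(50 * spot(32 + d, 32 + d)) for d in range(8)]
avg = align_and_average(frames)
print(avg.max())
```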

Nanometer-scale imaging and pore-scale fluid flow modeling in chalk

Description: For many rocks of high economic interest such as chalk, diatomite, tight gas sands or coal, nanometer scale resolution is needed to resolve the 3D pore structure, which controls the flow and trapping of fluids in the rocks. Such resolutions cannot be achieved with existing tomographic technologies. A new 3D imaging method, based on serial sectioning and using the Focused Ion Beam (FIB) technology, has been developed. FIB allows for the milling of layers as thin as 10 nanometers by using accelerated Ga+ ions to sputter atoms from the sample surface. After each milling step, as a new surface is exposed, a 2D image of this surface is generated. Next, the 2D images are stacked to reconstruct the 3D pore or grain structure. Resolutions as high as 10 nm are achievable using this technique. A new image processing method uses direct morphological analysis of the pore space to characterize the petrophysical properties of diverse formations. In addition to estimation of the petrophysical properties (porosity, permeability, relative permeability and capillary pressures), the method is used for simulation of fluid displacement processes, such as those encountered in various improved oil recovery (IOR) approaches. Capillary pressure curves computed with the new method are in good agreement with laboratory data. The method has also been applied for visualization of the fluid distribution at various saturations from the new FIB data.
Date: August 23, 2005
Creator: Tomutsa, Liviu; Silin, Dmitriy & Radmilovich, Velimir
Partner: UNT Libraries Government Documents Department
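
The sketch below shows the basic stacking-and-analysis step described above: segmented 2D slices are stacked into a 3D volume, porosity is the pore-voxel fraction, and a connected-component pass indicates how much of the pore space forms a single connected (potentially flow-carrying) cluster. The threshold-based pore segmentation and the random stand-in slices are illustrative assumptions, not the paper's workflow.

```python
import numpy as np
from scipy import ndimage

def porosity_and_connectivity(slices, pore_threshold):
    """Stack segmented 2D FIB slices into a 3D volume, compute porosity as the
    pore-voxel fraction, and report the fraction of pore voxels belonging to
    the largest connected pore cluster."""
    volume = np.stack(slices, axis=0)           # (n_slices, ny, nx)
    pores = volume < pore_threshold             # assumption: dark voxels = pore space
    porosity = pores.mean()
    labels, n = ndimage.label(pores)
    sizes = np.bincount(labels.ravel())[1:]     # ignore background label 0
    connected_fraction = sizes.max() / sizes.sum() if n else 0.0
    return porosity, connected_fraction

# toy data: random grayscale slices in place of real FIB images
rng = np.random.default_rng(0)
slices = [rng.random((64, 64)) for _ in range(32)]
print(porosity_and_connectivity(slices, pore_threshold=0.3))
```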

Quasar target selection fiber efficiency

Description: We present estimates of the efficiency for finding QSOs as a function of limiting magnitude and galactic latitude. From these estimates, we have formulated a target selection strategy that should net 80,000 QSOs in the north galactic cap with an average of 70 fibers per plate, not including fibers reserved for high-redshift quasars. With this plan, we expect 54% of the targets to be QSOs. The North Galactic Cap is divided into two zones of high and low stellar density. We use about five times as many fibers for QSO candidates in the half of the survey with the lower stellar density as we use in the half with higher stellar density. The current plan assigns 15% of the fibers to FIRST radio sources; if these are not available, those fibers would be allocated to lower probability QSO sources, dropping the total number of QSOs by a small factor (5%). We will find about 17,000 additional quasars in the southern strips, and maybe a few more at very high redshift. Use was made of two data sets: the star and quasar simulated test data generated by Don Schneider, and the data from UJFN plate surveys by Koo (1986) and Kron (1980). This data was compared to results from the Palomar-Green Survey and a recent survey by Pat Osmer and collaborators.
Date: May 1, 1996
Creator: Newberg, H. & Yanny, B.
Partner: UNT Libraries Government Documents Department
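
A back-of-the-envelope check of the fiber budget quoted above, purely illustrative: it ignores the 15% of fibers reserved for FIRST radio sources, the fibers for high-redshift candidates, and the southern strips.

```python
# Illustrative arithmetic derived only from the numbers quoted in the abstract.
fibers_per_plate = 70      # QSO-candidate fibers per plate
qso_success_rate = 0.54    # fraction of targets expected to be QSOs
target_qsos = 80_000       # desired North Galactic Cap sample

qsos_per_plate = fibers_per_plate * qso_success_rate   # ~37.8 QSOs per plate
plates_needed = target_qsos / qsos_per_plate           # ~2,100 plates
print(round(qsos_per_plate, 1), round(plates_needed))
```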