
ANALYZE Users' Guide

Description: This report is a reproduction of the visuals that were used in the ANALYZE Users' Guide lectures of the videotaped LLNL Continuing Education Course CE2018-H, State Space Lectures. The course was given in Spring 1982 through the EE Department Education Office. Since ANALYZE is menu-driven, interactive, and has self-explanatory questions (sort of), these visuals and the two 50-minute videotapes are the only documentation which comes with the code. More information about the algorithms contained in ANALYZE can be obtained from the IEEE book on Programs for Digital Signal Processing.
Date: October 1, 1982
Creator: Azevedo, S.
Partner: UNT Libraries Government Documents Department

On Optimal Bilinear Quadrilateral Meshes

Description: The novelty of this work is in presenting interesting error properties of two types of asymptotically "optimal" quadrilateral meshes for bilinear approximation. The first type of mesh has an error-equidistributing property: the maximum interpolation error is asymptotically the same over all elements. The second type has a faster-than-expected "super-convergence" property for certain saddle-shaped data functions. The "super-convergent" mesh may be an order of magnitude more accurate than the error-equidistributing mesh. Both types of mesh are generated by a coordinate transformation of a regular mesh of squares. The coordinate transformation is derived by interpreting the Hessian matrix of a data function as a metric tensor. The insights in this work may have application in mesh design near corner or point singularities.
Date: March 17, 2000
Creator: D'Azevedo, E.
Partner: UNT Libraries Government Documents Department

Experiments in the application of ultrasound diffraction tomography for nondestructive testing

Description: We have designed computer programs to simulate ultrasound projection scans and to reconstruct the tomographic planar image. We have also used the reconstruction algorithm on actual test data and have obtained a crude but promising image. 11 refs., 9 figs.
Date: July 1, 1988
Creator: Azevedo, S.G. & Fitch, J.P.
Partner: UNT Libraries Government Documents Department

Unique portable signal acquisition/processing station

Description: At Lawrence Livermore National Laboratory, there are experimental applications requiring digital signal acquisition as well as data reduction and analysis. A prototype Signal Acquisition/Processing Station (SAPS) has been constructed and is currently undergoing tests. The system employs an LSI-11/23 computer with Data Translation analog-to-digital hardware. SAPS is housed in a roll-around cart designed to withstand most subtle EMI/RFI environments. A user-friendly menu allows a user to access powerful data acquisition packages with a minimum of training. The software architecture of SAPS involves two operating systems, each transparent to the user. Since this is a general-purpose workstation with several units in use, low cost, reliability, and ease of maintenance were emphasized during conception and design. The system is targeted for mid-range-frequency data acquisition, between a data logger and a transient digitizer.
Date: May 16, 1983
Creator: Garron, R.D. & Azevedo, S.G.
Partner: UNT Libraries Government Documents Department

On Optimal Bilinear Quadrilateral Meshes

Description: The novelty of this work is in presenting interesting error properties of two types of asymptotically optimal quadrilateral meshes for bilinear approximation. The first type of mesh has an error-equidistributing property: the maximum interpolation error is asymptotically the same over all elements. The second type has a faster-than-expected super-convergence property for certain saddle-shaped data functions. The super-convergent mesh may be an order of magnitude more accurate than the error-equidistributing mesh. Both types of mesh are generated by a coordinate transformation of a regular mesh of squares. The coordinate transformation is derived by interpreting the Hessian matrix of a data function as a metric tensor. The insights in this work may have application in mesh design near known corner or point singularities.
Date: October 26, 1998
Creator: D'Azevedo, E.
Partner: UNT Libraries Government Documents Department

Preliminary 2D design study for A&PCT

Description: Lawrence Livermore National Laboratory is currently designing and constructing a tomographic scanner to obtain the most accurate possible assays of radioactivity in barrels of nuclear waste in a limited amount of time. This study demonstrates a method for exploring different designs using laboratory experiments and numerical simulations. In particular, we examine the trade-off between spatial resolution and signal-to-noise ratio. The simulations are conducted in two dimensions as a preliminary study for three-dimensional imaging. We find that the optimal design depends entirely on the expected source sizes and activities. For nuclear waste barrels, preliminary results indicate that collimators with widths of 1 to 3 inches and aspect ratios of 5:1 to 10:1 should perform well. This type of study will be repeated in more detail in 3D to optimize the final design.
Date: March 1, 1995
Creator: Keto, E.; Azevedo, S. & Roberson, P.
Partner: UNT Libraries Government Documents Department

Region-of-interest cone-beam computed tomography

Description: A methodology for solving the general cone-beam region-of-interest (ROI) problem on a circular trajectory is presented using the mathematical framework described by Grangeat. The algorithm, called Radon-ROI, takes scans at two different resolutions (low resolution covering the entire object and high resolution covering only the ROI) and combines the scans in both projection and Radon spaces so that the ROI is reconstructed at high resolution without artifacts from missing data, undersampling, or cone-beam errors. A circular source trajectory is assumed, and the object must have low spatial frequencies outside the ROI. Simulated and experimental results of the Radon-ROI code show marked improvement in resolution within the ROI.
Date: June 1, 1995
Creator: Azevedo, S.; Rizo, P. & Grangeat, P.
Partner: UNT Libraries Government Documents Department

Bridge diagnosis at 55 mph

Description: The Federal Highway Administration (FHWA) has helped sponsor a research project at Lawrence Livermore that produced a beneficial new tool as well as an R&D 100 Award. The HERMES Bridge Inspector will provide an invaluable capability to diagnose the problems of deteriorating bridge decks and do it accurately, efficiently, nondestructively, and, perhaps most important to motorists, without closing bridges. Almost 30% of the 600,000 large highway bridges in the U.S. are classified "deficient" by the FHWA, and HERMES can make a significant contribution toward solving the problem of infrastructure assessment and repair. With further development, HERMES holds promise for other concrete inspection problems, such as railroads, tunnels, and runways. HERMES, or High-performance Electromagnetic Roadway Mapping and Evaluation System, is a radar-based sensing system mounted in a trailer. It can be pulled by a vehicle at traffic speeds over a bridge deck to collect information about the roadway subsurface -- its sensors gathering data 30 centimeters or more into concrete. An onboard computer system processes the data into three-dimensional images that pinpoint problems in the roadway concrete and give engineers quantitative information about deterioration in the bridge deck. Engineers can then better assess what repairs or reconstruction are necessary and avoid the cost overruns and delays that result from inexact problem diagnoses.
Date: October 1, 1998
Creator: Azevedo, S.
Partner: UNT Libraries Government Documents Department

Multiple-energy techniques in industrial computerized tomography

Description: Considerable effort is being applied to develop multiple-energy industrial CT techniques for materials characterization. Multiple-energy CT can provide reliable estimates of effective Z (Z_eff), weight fraction, and rigorous calculations of absolute density, all at the spatial resolution of the scanner. Currently, a wide variety of techniques exist for CT scanners, but each has certain problems and limitations. Ultimately, the best multi-energy CT technique would combine the qualities of accuracy, reliability, and wide range of application, and would require the smallest number of additional measurements. We have developed techniques for calculating material properties of industrial objects that differ somewhat from currently used methods. In this paper, we present our methods for calculating Z_eff, weight fraction, and density. We begin with the simplest case -- methods for multiple-energy CT using isotopic sources -- and proceed to multiple-energy work with x-ray machine sources. The methods discussed here are illustrated on CT scans of PBX-9502 high explosives, a Lexan-aluminum phantom, and a cylinder of glass beads used in a preliminary study to determine if CT can resolve three phases: air, water, and a high-Z oil. In the CT project at LLNL, we have constructed several CT scanners of varying scanning geometries using gamma- and x-ray sources. In our research, we employed two of these scanners: pencil-beam CAT for CT data using isotopic sources and video-CAT equipped with an IRT micro-focal x-ray machine source.
Date: August 1, 1990
Creator: Schneberk, D.; Martz, H. & Azevedo, S.
Partner: UNT Libraries Government Documents Department

Adaptation of a cubic smoothing spline algorithm for multi-channel data stitching at the National Ignition Facility

Description: Some diagnostics at the National Ignition Facility (NIF), including the Gamma Reaction History (GRH) diagnostic, require multiple channels of data to achieve the required dynamic range. These channels need to be stitched together into a single time series, and they may have non-uniform and redundant time samples. We chose to apply the popular cubic smoothing spline technique to our stitching problem because we needed a general non-parametric method. We adapted one of the algorithms in the literature, by Hutchinson and de Hoog, to our needs. The modified algorithm and the resulting code perform a cubic smoothing spline fit to multiple data channels with redundant time samples and missing data points. The data channels can have different, time-varying, zero-mean white noise characteristics. The method we employ automatically determines an optimal smoothing level by minimizing the Generalized Cross Validation (GCV) score. To validate the smoothing level selection automatically, the Weighted Sum-Squared Residual (WSSR) and zero-mean tests are performed on the residuals. Further, confidence intervals, both analytical and Monte Carlo, are calculated. In this paper, we describe the derivation of our cubic smoothing spline algorithm. We outline the algorithm and test it with simulated and experimental data.
Date: December 28, 2010
Creator: Brown, C.; Adcock, A.; Azevedo, S.; Liebman, J. & Bond, E.
Partner: UNT Libraries Government Documents Department
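The GCV-based selection of a smoothing level described in the record above can be illustrated with a minimal discrete analogue: a Whittaker-style smoother whose penalty weight is chosen by minimizing the GCV score on synthetic noisy data. This sketch is not the paper's Hutchinson-de Hoog spline algorithm and omits the multi-channel stitching, redundant samples, and confidence intervals; it only demonstrates the automatic smoothing-level selection idea.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = np.linspace(0.0, 10.0, n)
signal = np.sin(t)
y = signal + 0.1 * rng.standard_normal(n)  # zero-mean white noise

# Second-difference penalty matrix D: (n-2) x n, so D @ z is the vector
# of discrete second differences of z (roughness of the fit).
D = np.diff(np.eye(n), n=2, axis=0)
P = D.T @ D

def gcv_score(lam):
    """Smooth y with penalty weight lam; return (GCV score, fit)."""
    H = np.linalg.inv(np.eye(n) + lam * P)   # smoother ("hat") matrix
    z = H @ y
    rss = np.sum((y - z) ** 2)
    dof = n - np.trace(H)                    # residual degrees of freedom
    return n * rss / dof**2, z

# Grid search over smoothing levels; the GCV minimizer is the
# automatically selected smoothing level.
lams = np.logspace(-2, 4, 25)
best_score, smoothed = min((gcv_score(l) for l in lams), key=lambda s: s[0])

rms_before = np.sqrt(np.mean((y - signal) ** 2))
rms_after = np.sqrt(np.mean((smoothed - signal) ** 2))
```

With the GCV-chosen penalty, the RMS error of the fit against the true signal drops well below the raw noise level.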

National Ignition Facility Shot Data Analysis Module Guidelines

Description: This document provides the guidelines for software development of modules to be included in Shot Data Analysis (SDA) for the National Ignition Facility (NIF). An Analysis Module is a software entity that groups a set of (typically cohesive) functions, procedures, and data structures for performing an analysis task relevant to NIF shot operations. Each module must have its own unique identification (module name), clear interface specifications (data inputs and outputs), and internal documentation. It is vitally important to the NIF Program that all shot-related data be processed and analyzed in a consistent way that is reviewed by scientific and engineering experts. SDA is part of a NIF Integrated Product Team (IPT) whose goal is to provide timely and accurate reporting of shot results to NIF campaign experimentalists. Other elements of the IPT include the Campaign Management Tool (CMT) for configuring experiments, a data archive and provisioning system called CMS, a calibration and configuration database (CDMS), and a shot data visualization tool (SDV). We restrict our scope at this time to guidelines for modules written in Interactive Data Language, or IDL. This document has sections describing example IDL modules and where to find them, how to set up a development environment, IDL programming guidelines, shared IDL procedures for general use, and revision control.
Date: October 3, 2007
Creator: Azevedo, S.; Glenn, S.; Lopez, A.; Warrick, A. & Beeler, R.
Partner: UNT Libraries Government Documents Department

Automotive Underhood Thermal Management Analysis Using 3-D Coupled Thermal-Hydrodynamic Computer Models: Thermal Radiation Modeling

Description: The goal of the radiation modeling effort was to develop and implement a radiation algorithm that is fast and accurate for the underhood environment. As part of this CRADA, a net-radiation model was chosen to simulate radiative heat transfer in the underhood of a car. The assumptions (diffuse-gray and uniform radiative properties in each element) reduce the problem tremendously, and all the view factors for radiation thermal calculations can be calculated once and for all at the beginning of the simulation. The cost for online integration of heat exchanges due to radiation is found to be less than 15% of the baseline CHAD code and thus very manageable. The off-line view factor calculation is constructed to be very modular and has been completely integrated to read CHAD grid files, and the output from this code can be read into the latest version of CHAD. Further integration has to be performed to accomplish the same with STAR-CD. The main outcome of this effort is a highly scalable and portable simulation capability to model view factors for the underhood environment (e.g., a view factor calculation that took 14 hours on a single processor took only 14 minutes on 64 processors). The code has also been validated using a simple test case where analytical solutions are available. This simulation capability gives underhood designers in the automotive companies the ability to account for thermal radiation -- which usually is critical in the underhood environment and also turns out to be one of the most computationally expensive components of underhood simulations. This report starts off with the original work plan as elucidated in the proposal in section B. This is followed by the technical work plan to accomplish the goals of the project in section C. In section D, background to the current work ...
Date: February 26, 2002
Creator: Pannala, S.; D'Azevedo, E. & Zacharia, T.
Partner: UNT Libraries Government Documents Department

DONIO: Distributed Object Network I/O Library

Description: This report describes the use and implementation of DONIO (Distributed Object Network I/O), a library of routines that provide fast file I/O capabilities in the Intel iPSC/860 and Paragon distributed memory parallel environments. DONIO caches a copy of the file in memory distributed across all processors. Disk I/O routines (such as read, write, and lseek) are replaced by calls to DONIO routines, which translate these operations into message communication to update the cached data. Experiments on the Intel Paragon show that the cost of concurrent disk I/O using DONIO for large files can be 15-30 times smaller than using standard disk I/O.
Date: January 1, 1994
Creator: D'Azevedo, E.F.
Partner: UNT Libraries Government Documents Department
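The interposition idea in the record above, replacing read/write/lseek calls with operations on a cached in-memory copy of the file, can be pictured with a toy single-process stand-in (a hypothetical `CachedFile` class, not DONIO's API; DONIO itself distributes the cache across processors and keeps it coherent with message passing):

```python
class CachedFile:
    """Toy single-process sketch of I/O interposition over a cached copy."""

    def __init__(self, data=b""):
        self.buf = bytearray(data)  # the "cached copy" of the file
        self.pos = 0                # current file offset

    def lseek(self, offset):
        # Seeking just moves the offset into the cached buffer.
        self.pos = offset

    def read(self, nbytes):
        # Reads are served from the cache, never from disk.
        out = bytes(self.buf[self.pos:self.pos + nbytes])
        self.pos += len(out)
        return out

    def write(self, data):
        # Writes update the cached data; a real library would flush
        # the buffer back to disk (and, in DONIO, send update messages).
        end = self.pos + len(data)
        if end > len(self.buf):
            self.buf.extend(b"\x00" * (end - len(self.buf)))
        self.buf[self.pos:end] = data
        self.pos = end

f = CachedFile(b"hello world")
f.lseek(6)
f.write(b"DONIO")
f.lseek(0)
```

Because every operation touches only memory, concurrent "disk" access becomes cheap, which is the effect the reported 15-30x speedup exploits.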

Parallelization of a multiregion flow and transport code using software emulated global shared memory and high performance FORTRAN

Description: The objectives of this research are (1) to parallelize a suite of multiregion groundwater flow and solute transport codes that use Galerkin and Lagrangian-Eulerian finite element methods, (2) to test the compatibility of a global shared memory emulation software with a High Performance FORTRAN (HPF) compiler, and (3) to obtain performance characteristics and scalability of the parallel codes. The suite of multiregion flow and transport codes, 3DMURF and 3DMURT, were parallelized using the DOLIB shared memory emulation, in conjunction with the PGI HPF compiler, to run on the Intel Paragons at the Oak Ridge National Laboratory (ORNL) and on a network of workstations. The novelty of this effort lies, first, in the concurrent use of HPF and global shared memory emulation to facilitate the conversion of a serial code to a parallel code and, second, in the use of the shared memory library to implement Lagrangian particle tracking along flow characteristics efficiently. The latter allows long-time-step simulation with particle tracking and dynamic particle redistribution for load balancing, thereby reducing the number of time steps needed for most transient problems. The parallel codes were applied to a pumping well problem to test the efficiency of the domain decomposition and particle tracking algorithms. The full problem domain consists of over 200,000 degrees of freedom with highly nonlinear soil property functions. Relatively good scalability was obtained for a preliminary test run on the Intel Paragons at the Center for Computational Sciences (CCS), ORNL. However, due to the difficulties we encountered with the PGI HPF compiler, as of the writing of this manuscript we are able to report results from 3DMURF only.
Date: February 1, 1997
Creator: D'Azevedo, E.F. & Gwo, Jin-Ping
Partner: UNT Libraries Government Documents Department

Packed storage extension for ScaLAPACK

Description: The authors describe a new extension to ScaLAPACK for computing with symmetric (Hermitian) matrices stored in a packed form. The new code is built upon the ScaLAPACK routines for full dense storage for a high degree of software reuse. The original ScaLAPACK stores a symmetric matrix as a full matrix but accesses only the lower or upper triangular part. The new code enables more efficient use of memory by storing only the lower or upper triangular part of a symmetric (Hermitian) matrix. The packed storage scheme distributes the matrix by block column panels. Within each panel, the matrix is stored as a regular ScaLAPACK matrix. This storage arrangement simplifies the subroutine interface and code reuse. Routines PxPPTRF/PxPPTRS implement the Cholesky factorization and solution for symmetric (Hermitian) linear systems in packed storage. Routines PxSPEV/PxSPEVX (PxHPEV/PxHPEVX) implement the computation of eigenvalues and eigenvectors for symmetric (Hermitian) matrices in packed storage. Routines PxSPGVX (PxHPGVX) implement the expert driver for the generalized eigenvalue problem for symmetric (Hermitian) matrices in packed storage. Performance results on the Intel Paragon suggest that the packed storage scheme incurs only a small time overhead over the full storage scheme.
Date: January 1, 1997
Creator: D'Azevedo, E.F. & Dongarra, J.J.
Partner: UNT Libraries Government Documents Department
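The packed triangular layout the record above builds on can be illustrated with the conventional LAPACK-style column-major lower-triangular packing. This sketch shows only the serial indexing idea; it ignores the ScaLAPACK extension's block-column-panel distribution, and the indexing helpers here are illustrative, not the routines' actual internals.

```python
import numpy as np

def pack_lower(A):
    """Pack the lower triangle of a symmetric matrix, column by column."""
    n = A.shape[0]
    ap = np.empty(n * (n + 1) // 2)
    for j in range(n):
        for i in range(j, n):
            # Column j starts after the j previously packed columns,
            # whose lengths are n, n-1, ..., n-j+1.
            ap[i + j * (2 * n - j - 1) // 2] = A[i, j]
    return ap

def packed_index(i, j, n):
    """Linear index of A[i, j] (with i >= j) in the packed array."""
    return i + j * (2 * n - j - 1) // 2

n = 4
A = np.arange(n * n, dtype=float).reshape(n, n)
A = (A + A.T) / 2              # symmetrize
ap = pack_lower(A)
# Full storage needs n*n entries; packed storage needs only n*(n+1)/2,
# which is the memory saving the abstract describes.
```

For n = 4 the packed array holds 10 entries instead of 16; the saving approaches a factor of two as n grows.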

Prediction of buried mine-like target radar signatures using wideband electromagnetic modeling

Description: Current ground penetrating radars (GPR) have been tested for land mine detection, but they have generally been costly and have performed poorly. Comprehensive modeling and experimentation must be done to predict the electromagnetic (EM) signatures of mines, to assess the effect of clutter on the EM signature of the mine, and to understand the merits and limitations of using radar for various mine detection scenarios. This modeling can provide a basis for advanced radar design and detection techniques leading to superior performance. Lawrence Livermore National Laboratory (LLNL) has developed a radar technology that, when combined with comprehensive modeling and detection methodologies, could be the basis of an advanced mine detection system. Micropower Impulse Radar (MIR) technology exhibits a combination of properties including wideband operation, extremely low power consumption, extremely small size and low cost, array configurability, and noise-encoded pulse generation. LLNL is in the process of developing an optimal processing algorithm to use with the MIR sensor. In this paper, we use classical numerical models to obtain the signature of mine-like targets and examine the effect of surface roughness on the reconstructed signals. These results are then qualitatively compared to experimental data.
Date: April 6, 1998
Creator: Warrick, A.L.; Azevedo, S.G. & Mast, J.E.
Partner: UNT Libraries Government Documents Department

Research on computed tomography reconstructions from one or two radiographs: A report and the application to FXR radiography

Description: This report documents some cooperative research into volumetric image reconstruction from single radiographs. Imaging dynamic events is the most important application for this type of work, but the techniques have possible extensions to other applications. Two general objectives guide this work. The first objective is to gain an understanding of the assumptions and limitations of single-view methods for representing internal features. Second, we endeavor to obtain and/or develop techniques for performing image reconstructions with FXR radiographs. If possible, we seek to obtain some quantitative measure of the accuracy of this class of image reconstructions in two respects: (i) the dimensional accuracy of feature boundaries, and (ii) the accuracy of the voxel intensities. Dynamic events are not always self-calibrating, and it is important to establish the reconstruction accuracy of single-view methods in order to place bounds on the kinds of conclusions that can be advanced from single-view reconstructed images. Computed tomographic image reconstructions provide dimensional detail of internal structures of objects and provide a measure of the per-voxel attenuation of material in the object. When the assumptions behind a reconstruction algorithm are not satisfied, or are satisfied only in a limited way, the accuracy of the reconstructed image is compromised. It is the goal of CT analysis to discern the "real" features of the internals of an object in the midst of a certain level of artifactual content in the image. By understanding the ways in which CT reconstructions from a single radiograph can produce misleading results, we hope to develop some measure of the benefits and limitations of single-view techniques. 31 refs., 20 figs.
Date: January 26, 1995
Creator: Back, N.; Schneberk, D.; McMillan, C.; Azevedo, S. & Gorvad, M.
Partner: UNT Libraries Government Documents Department

Wall surveyor project report

Description: A report is made on the demonstration of a first-generation Wall Surveyor that is capable of surveying the interior and thickness of a stone, brick, or cement wall. LLNL's Micropower Impulse Radar (MIR rangefinder) is used, based on emitting and detecting very-low-amplitude, short microwave impulses. Six test walls were used. While the demonstrator MIR Wall Surveyor is not yet fieldable, it has successfully scanned the test walls and produced real-time images identifying the walls. It is planned to optimize and package the evaluation wall surveyor into a hand-held unit.
Date: February 22, 1996
Creator: Mullenhoff, D.J.; Johnston, B.C. & Azevedo, S.G.
Partner: UNT Libraries Government Documents Department