Search Results


Alpha Dithering to Correct Low-Opacity 8 Bit Compositing Errors

Description: This paper describes and analyzes a dithering technique for accurately specifying small values of opacity (α) that would otherwise not be representable because of the limited number of bits available in the alpha channel of graphics hardware. The technique addresses problems that arise when numerous low-opacity semitransparent polygons are composited to create volumetric effects with graphics hardware. The paper also describes the causes of, and a possible solution to, artifacts that arise in parallel or distributed volume rendering using bricking on multiple GPUs.
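The abstract does not spell out the paper's dithering scheme; as a hedged illustration of the underlying idea, the sketch below uses simple stochastic rounding so that an alpha value below the smallest 8-bit step is still correct in expectation across many composited fragments (the function name `dither_alpha` and the parameters are illustrative, not the paper's):

```python
import random

def dither_alpha(alpha, bits=8, rng=random):
    """Stochastically round a fractional alpha to the quantized grid.

    With only `bits` bits of alpha, the smallest representable nonzero
    opacity is 1/(2**bits - 1).  Deterministic rounding of a smaller
    alpha would clamp it to 0 (or overstate it); rounding up or down at
    random, with probability chosen so the *expected* quantized value
    equals the requested alpha, preserves the intended opacity on
    average across many composited fragments.
    """
    levels = 2 ** bits - 1
    scaled = alpha * levels
    lo = int(scaled)                  # nearest representable level below
    frac = scaled - lo                # distance to the level above
    q = lo + (1 if rng.random() < frac else 0)
    return q / levels

# Averaged over many fragments, the dithered alpha approximates a target
# far below the smallest 8-bit step (1/255 ~ 0.0039):
rng = random.Random(0)
target = 0.0007
samples = [dither_alpha(target, rng=rng) for _ in range(200_000)]
mean = sum(samples) / len(samples)
```

Each individual fragment still gets a representable alpha (here 0 or 1/255); only the ensemble average recovers the sub-quantum opacity.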
Date: March 31, 2003
Creator: Williams, P L; Frank, R J & LaMar, E C
Partner: UNT Libraries Government Documents Department

Viscosity of Aqueous Sodium Chloride Solutions From 0 - 150° C

Description: A critical evaluation of data on the viscosity of aqueous sodium chloride solutions is presented. The literature was screened through October 1977, and a databank of evaluated data was established. Data were converted when necessary to units of degrees centigrade, centipoise, and molal concentration. The data were correlated with the aid of an empirical equation to facilitate interpolation and computer calculations. The result of the evaluation includes a table containing smoothed values for the viscosity of NaCl solutions to 150 °C.
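The report's actual correlating equation and coefficients are not given in this abstract; the sketch below only illustrates the general approach of smoothing tabulated viscosities with a simple empirical fit so values can be interpolated at any temperature in range. The data points are rough, water-like placeholders, not the evaluated NaCl databank:

```python
import numpy as np

# Illustrative placeholder values (deg C, centipoise), roughly water-like;
# NOT the report's evaluated NaCl data.
T = np.array([0.0, 25.0, 50.0, 75.0, 100.0, 125.0, 150.0])
eta = np.array([1.79, 0.95, 0.59, 0.42, 0.32, 0.26, 0.22])

# Correlate with a simple empirical form, log(eta) = c0 + c1*T + c2*T^2,
# so smoothed viscosities can be interpolated between tabulated points.
c = np.polyfit(T, np.log(eta), deg=2)

def eta_smooth(t):
    """Smoothed viscosity (cP) at temperature t (deg C) from the fit."""
    return float(np.exp(np.polyval(c, t)))
```

Fitting the logarithm keeps the correlation positive and captures the strong decrease of viscosity with temperature using only three coefficients.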
Date: October 1, 1977
Creator: Ozbek, H.; Fair, J.A. & Phillips, S.L.
Partner: UNT Libraries Government Documents Department

On the computation of CMBR anisotropies from simulations of topological defects

Description: Techniques for computing the CMBR anisotropy from simulations of topological defects are discussed with an eye to getting as much information from a simulation as possible. Here we consider the practical details of which sums and multiplications to do and how many terms there are.
Date: May 1, 1997
Creator: Stebbins, A. & Dodelson, S.
Partner: UNT Libraries Government Documents Department

Implementation of Frictional Contact Conditions in Surface to Surface, Mortar Based Computational Frameworks

Description: A number of recent works have established the mortar technique as an accurate and robust spatial discretization method for contact problems in computational solid mechanics. Since methods based on this idea rely on an integral, non-local representation of the contact operators, their formulation is somewhat more involved than is true for more traditional "point to surface" contact algorithms; in particular, the integral projections have nontrivial linearizations in the fully large deformation context. In this work, we concentrate on another aspect of formulations of this type: the definition and implementation of frictional contact operators within the mortar contact framework. Issues associated with frame indifference of frictional tractions and kinematics are discussed, and a numerical demonstration of the technique is given.
Date: April 1, 2004
Creator: Laursen, T A; Yang, B & Puso, M A
Partner: UNT Libraries Government Documents Department

Visualization Tools for Adaptive Mesh Refinement Data

Description: Adaptive Mesh Refinement (AMR) is a highly effective method for simulations that span a large range of spatiotemporal scales, such as astrophysical simulations that must accommodate ranges from interstellar to sub-planetary. Most mainstream visualization tools still lack support for AMR as a first class data type and AMR code teams use custom built applications for AMR visualization. The Department of Energy's (DOE's) Science Discovery through Advanced Computing (SciDAC) Visualization and Analytics Center for Enabling Technologies (VACET) is currently working on extending VisIt, which is an open source visualization tool that accommodates AMR as a first-class data type. These efforts will bridge the gap between general-purpose visualization applications and highly specialized AMR visual analysis applications. Here, we give an overview of the state of the art in AMR visualization research and tools and describe how VisIt currently handles AMR data.
Date: May 9, 2007
Creator: Weber, Gunther H.; Beckner, Vincent E.; Childs, Hank; Ligocki,Terry J.; Miller, Mark C.; Van Straalen, Brian et al.
Partner: UNT Libraries Government Documents Department

Dose refinement: ARAC's role

Description: The Atmospheric Release Advisory Capability (ARAC), located at the Lawrence Livermore National Laboratory, has been involved since the late 1970s in assessing consequences from releases of nuclear and other hazardous material into the atmosphere. ARAC's primary role has been emergency response. However, after the emergency phase there is still a significant role for dispersion modeling. This work usually involves refining the source term, and hence the dose to the affected populations, as additional information becomes available in the form of source term estimates (release rates, mix of material, and release geometry) and any measurements from passage of the plume and deposition on the ground. Many of the ARAC responses have been documented elsewhere [1]. Some of the more notable radiological releases for which ARAC has participated in the post-emergency phase are the 1979 Three Mile Island nuclear power plant (NPP) accident outside Harrisburg, PA; the 1986 Chernobyl NPP accident in Ukraine; and the 1996 Japan Tokai nuclear processing plant explosion. ARAC has also done post-emergency phase analyses for the 1978 reentry of the Russian satellite COSMOS 954, whose onboard nuclear reactor partially burned up and deposited radioactive materials on the ground in Canada; the 1986 uranium hexafluoride spill in Gore, OK; the 1993 Russian Tomsk-7 nuclear waste tank explosion; and lesser releases, mostly of tritium. In addition, ARAC has played a key role in contingency planning for possible accidental releases during the launch of spacecraft carrying radioisotope thermoelectric generators (RTGs) (i.e., Galileo, Ulysses, Mars Pathfinder, and Cassini), and routinely exercises with the Federal Radiological Monitoring and Assessment Center (FRMAC) in preparation for offsite consequences of radiological releases from NPPs and nuclear weapon accidents or incidents.
Several accident post-emergency phase assessments are discussed in this paper in order to illustrate ARAC's role in dose refinement. A brief description of the …
Date: June 1, 1998
Creator: Baskett, R L; Ellis, J S & Sullivan, T J
Partner: UNT Libraries Government Documents Department


Description: The values of orbit functions for accelerator lattices, as computed with accelerator design programs, may differ from one program to another. For a simple lattice, consisting of identical constant-gradient bending magnets, the functions (horizontal and vertical betatron tunes, dispersions, closed orbit offsets, orbit lengths, chromaticities, etc.) can be evaluated analytically. This lattice was studied with the accelerator physics tools SYNCH [1], COSY INFINITY [2], MAD [3], and TEAPOT [4]. It was found that while all the programs give identical results at the central design momentum, the results differ substantially among the various lattice tools for non-zero momentum deviations. Detailed results and comparisons are presented.
Date: May 12, 2003
Partner: UNT Libraries Government Documents Department

LDRD final report : leveraging multi-way linkages on heterogeneous data.

Description: This report summarizes the accomplishments of the 'Leveraging Multi-way Linkages on Heterogeneous Data' project, which ran from FY08 through FY10. The goal was to investigate scalable and robust methods for multi-way data analysis. We developed a new optimization-based method called CPOPT for fitting a particular type of tensor factorization to data; CPOPT was compared against existing methods and found to be more accurate than any faster method and faster than any equally accurate method. We extended this method to computing tensor factorizations for problems with incomplete data; our results show that scientifically meaningful factorizations can be recovered even with large amounts of missing data (50% or more). The project involved 5 members of the technical staff, 2 postdocs, and 1 summer intern. It resulted in a total of 13 publications, 2 software releases, and over 30 presentations. Several follow-on projects have already begun, with more potential projects in development.
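CPOPT itself fits the factorization by all-at-once gradient-based optimization, and its implementation is not reproduced in this abstract. As a hedged stand-in, the sketch below implements the classical alternating-least-squares (CP-ALS) baseline against which CPOPT-type methods are typically compared, for a dense 3-way tensor:

```python
import numpy as np

def cp_als(T, rank, iters=300, seed=0):
    """Classical CP-ALS for a dense 3-way tensor: cycle through the
    three factor matrices, solving a linear least-squares problem for
    each one while the other two are held fixed."""
    rng = np.random.default_rng(seed)
    dims = T.shape
    F = [rng.standard_normal((d, rank)) for d in dims]
    for _ in range(iters):
        for n in range(3):
            others = [F[m] for m in range(3) if m != n]
            # Khatri-Rao product of the two fixed factors, with rows in
            # the same row-major order as the mode-n unfolding below.
            kr = np.einsum('jr,kr->jkr', *others).reshape(-1, rank)
            unf = np.moveaxis(T, n, 0).reshape(dims[n], -1)
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            F[n] = unf @ kr @ np.linalg.pinv(gram)
    return F

# Recover an exactly rank-2 synthetic tensor:
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
rel_err = (np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T)
           / np.linalg.norm(T))
```

Unlike this per-factor alternation, an all-at-once method of the CPOPT kind optimizes all factor matrices simultaneously with a gradient-based solver, which the report found to be more robust.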
Date: September 1, 2010
Creator: Dunlavy, Daniel M. & Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)
Partner: UNT Libraries Government Documents Department

Empirical model for shell-corrected level densities

Description: An empirical model for calculating level densities of closed and near-closed shell nuclei has been developed and tested. The method is based on the calculation of shell plus pairing corrections for each relevant nuclide. A new version of the ALICE code is used to extract these corrections from the Myers-Swiatecki mass formula and to apply them to the calculation of effective excitations in level densities. The corrections are applied in a backshifted fashion to assure correct threshold dependence. We compare our calculated results with experimental data for the production of 56Ni and 88Y to test shell corrections near the f7/2 closure and the N=50 neutron shell. We also compare our results with those using pure Fermi gas (plus pairing) level densities, and with the more computationally intensive model of Kataria and Ramamurthy.
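The specific ALICE implementation is not given in the abstract, but the backshifting idea it describes can be illustrated with the textbook backshifted Fermi-gas form, in which the pairing-plus-shell correction shifts the effective excitation energy (spin-dependence omitted; parameter values below are illustrative):

```python
import math

def bsfg_level_density(E, a, delta):
    """Schematic backshifted Fermi-gas level density (spin factor omitted).

    U = E - delta is the effective excitation energy after backshifting
    by the pairing (plus shell) correction delta, and
    rho ~ exp(2*sqrt(a*U)) / (12*sqrt(2) * a**0.25 * U**1.25).
    This is the standard textbook form, not the specific ALICE
    implementation described in the abstract.
    """
    U = E - delta
    if U <= 0:
        return 0.0  # backshifting enforces the correct threshold: no levels below delta
    return math.exp(2 * math.sqrt(a * U)) / (12 * math.sqrt(2) * a**0.25 * U**1.25)

# Below E = delta the density vanishes; above it, it grows steeply.
rho_low = bsfg_level_density(1.0, a=8.0, delta=1.5)    # U < 0
rho_high = bsfg_level_density(10.0, a=8.0, delta=1.5)  # U = 8.5 MeV
```

Applying the correction as a shift of U, rather than as a modification of the density formula itself, is what "assures correct threshold dependence" in the abstract's sense.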
Date: April 29, 1997
Creator: Ross, M.A. & Blann, M.
Partner: UNT Libraries Government Documents Department

Possible effects of competition on electricity consumers in the Pacific Northwest

Description: In part, the impetus for restructuring the U.S. electricity industry stems from the large regional disparities in electricity prices. Indeed, industry reforms are moving most rapidly in high-cost states, such as California and those in the Northeast. Legislators, regulators, and many others in states that enjoy low electricity prices, on the other hand, ask whether increased competition will benefit consumers in their states. This report quantifies the effects of increased competition on electricity consumers and producers in two regions, the Pacific Northwest and California. California's generating costs are roughly double those of the Northwest. We use a new strategic-planning model called Oak Ridge Competitive Electricity Dispatch (ORCED) to conduct these analyses. Specifically, we analyzed four cases: a pre-competition base case intended to represent conditions as they might exist under current regulation in the year 2000, a post-competition case in which customer loads and load shapes respond to real-time electricity pricing, a sensitivity case in which natural-gas prices are 20% higher than in the base case, and a sensitivity case in which the hydroelectric output in the Northwest is 20% less than in the base case. The ORCED analyses suggest that, absent regulatory intervention, retail competition would increase profits for producers in the Northwest and lower prices for consumers in California at the expense of consumers in the Northwest and producers in California. However, state regulators may be able to capture some or all of the increased profits and use them to lower electricity prices in the low-cost region. Perhaps the most straightforward way to allocate the costs and benefits to retail customers is through development of transition-cost charges or credits. With this option, the consumers in both regions can benefit from competition.
The magnitude and even direction of bulk-power trading between regions depends strongly on the amount of hydroelectric power …
Date: January 1, 1998
Creator: Hadley, S. & Hirst, E.
Partner: UNT Libraries Government Documents Department

Improving performance via mini-applications.

Description: Application performance is determined by a combination of many choices: hardware platform, runtime environment, languages and compilers used, algorithm choice and implementation, and more. In this complicated environment, we find that the use of mini-applications - small self-contained proxies for real applications - is an excellent approach for rapidly exploring the parameter space of all these choices. Furthermore, use of mini-applications enriches the interaction between application, library and computer system developers by providing explicit functioning software and concrete performance results that lead to detailed, focused discussions of design trade-offs, algorithm choices and runtime performance issues. In this paper we discuss a collection of mini-applications and demonstrate how we use them to analyze and improve application performance on new and future computer platforms.
Date: September 1, 2009
Creator: Crozier, Paul Stewart; Thornquist, Heidi K.; Numrich, Robert W. (University of Minnesota, Minneapolis, MN); Williams, Alan B.; Edwards, Harold Carter; Keiter, Eric Richard et al.
Partner: UNT Libraries Government Documents Department

On how to make the fastest gun in the west

Description: A new gasdynamic launcher is described, in which intact projectiles weighing at least one gram can be accelerated to mass velocities of 20 km/s. The system employs a conventional 2-stage light gas gun, with the barrel modified and filled with helium to act as a pump tube for a third stage. It is demonstrated that inter-stage kinetic energy efficiencies of 45% are possible and that these results can be achieved while maintaining the peak pressure applied to the projectile below 2.5 GPa. A simple analysis of this system is given, from which design parameters can be readily derived, and hydrocode calculations are presented to validate the model.
Date: January 1, 1997
Creator: Glenn, L.A.
Partner: UNT Libraries Government Documents Department

Synergistic approach to modeling X-ray spectra

Description: Plasma emission models used in X-ray astronomy need to simulate X-ray spectra from at least thirteen elements. Development of comprehensive models requires large-scale calculations; for example, Fe M-shell spectra, Kα fluorescence from near-neutral ions, and dielectronic recombination satellite spectra from L-shell ions. Current and recent missions (EUVE, ASCA, DXS, etc.) have already demonstrated the need for major, rapid improvements in spectral models. The high-resolution spectra to be acquired with the next generation of X-ray observatories (AXAF, XMM, Astro-E) promise to push spectral models to their limits. Essential to ensuring the quality of calculations used in spectral codes is corroboration in the laboratory, where controlled and precisely measured plasma conditions can be attained. To this end, we are capitalizing on a three-way synergistic relationship that links astrophysical observations, atomic modeling, and experiments using the LLNL Electron Beam Ion Trap (EBIT). After providing a brief orientation concerning the role of plasma emission models in X-ray astronomy, we discuss one example of this interplay.
Date: July 1, 1998
Creator: Liedahl, D.A., LLNL
Partner: UNT Libraries Government Documents Department

Analytic and experimental validation of thermo elastic plastic material response calculation

Description: We compare the thermo-elastic-plastic response of fissionable metals calculated by the solid mechanics code DYNA to an analytic model for the case of a uniformly heated thin spherical shell and to experimental data for the case of a thin rod heated in a pulsed reactor. In both cases, the materials are volumetrically heated by neutron exposure. We find good agreement between the code and the analytic model and experimental data for the first and second case, respectively. For very fast heating times, macroscopic displacement may be replaced by microscopic plastic flow. To verify this behavior, an experiment to be done at SNLA SPR III is described. Validation of the code in these simple geometries is a necessary step if calculations involving more complicated geometries are to be understood and trusted.
Date: October 2, 1997
Creator: DiPeso, G.
Partner: UNT Libraries Government Documents Department

The gene identification problem: An overview for developers

Description: The gene identification problem is the problem of interpreting nucleotide sequences by computer, in order to provide tentative annotation on the location, structure, and functional class of protein-coding genes. This problem is of self-evident importance, and is far from being fully solved, particularly for higher eukaryotes. Thus it is not surprising that the number of algorithm and software developers working in this area is rapidly increasing. The present paper is an overview of the field, with an emphasis on eukaryotes, for such developers.
Date: March 27, 1995
Creator: Fickett, J.W.
Partner: UNT Libraries Government Documents Department


Description: The project, "Calibration of Seismic Attributes for Reservoir Characterization", is on schedule as planned, with only minor departures from plan. They have been working on multiple data sets, including two public-domain sets, one proprietary data set with a corporate partner, and one other proprietary data set as a member of a consortium. They have expanded the use, on a regular basis, of high-end software well beyond that anticipated in the original work plan. The use of these high-end software packages has greatly enhanced their ability to identify, study, and evaluate potential attributes in the seismic data. In addition, the high-end software has served the purpose of pointing them in the right direction to make simple and straightforward relationships between the rock physical parameters and the seismic data. They required the use of this software to initially discover those relationships, but the understanding of those relationships is, so far, very straightforward, and does not require the use of high-end software.
Date: May 1, 2000
Creator: Pennington, Wayne D.
Partner: UNT Libraries Government Documents Department

Shifted power method for computing tensor eigenpairs.

Description: Recent work on eigenvalues and eigenvectors for tensors of order m ≥ 3 has been motivated by applications in blind source separation, magnetic resonance imaging, molecular conformation, and more. In this paper, we consider methods for computing real symmetric-tensor eigenpairs of the form Ax^(m-1) = λx subject to ||x|| = 1, which is closely related to optimal rank-1 approximation of a symmetric tensor. Our contribution is a novel shifted symmetric higher-order power method (SS-HOPM), which we show is guaranteed to converge to a tensor eigenpair. SS-HOPM can be viewed as a generalization of the power iteration method for matrices or of the symmetric higher-order power method. Additionally, using fixed point analysis, we can characterize exactly which eigenpairs can and cannot be found by the method. Numerical examples are presented, including examples from an extension of the method to finding complex eigenpairs.
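Using the eigenpair definition quoted in the abstract, a minimal SS-HOPM iteration for the order-3 case can be sketched as follows; the shift value, iteration count, and helper names are illustrative choices, not the paper's:

```python
import numpy as np

def ss_hopm(A, x0, alpha=4.0, iters=1000):
    """Shifted symmetric higher-order power method, order-3 case only:
    iterate  x <- normalize(A x^2 + alpha * x),  where
    (A x^2)_i = sum_{j,k} A[i,j,k] * x[j] * x[k].  For a sufficiently
    large positive shift alpha, the iteration converges to a tensor
    eigenpair  A x^2 = lambda * x  with ||x|| = 1."""
    x = np.asarray(x0, dtype=float)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        Ax2 = np.einsum('ijk,j,k->i', A, x, x)
        y = Ax2 + alpha * x
        x = y / np.linalg.norm(y)
    lam = x @ np.einsum('ijk,j,k->i', A, x, x)
    return lam, x

# Build a random symmetric order-3 tensor and run the iteration.
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3, 3))
A = sum(np.transpose(B, p) for p in
        [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]) / 6
lam, x = ss_hopm(A, x0=np.ones(3))
residual = np.linalg.norm(np.einsum('ijk,j,k->i', A, x, x) - lam * x)
```

With alpha = 0 this reduces to the plain symmetric higher-order power method; the positive shift is what supplies the convergence guarantee the abstract refers to.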
Date: October 1, 2010
Creator: Mayo, Jackson R. & Kolda, Tamara Gibson
Partner: UNT Libraries Government Documents Department