Search Results

Alpha Dithering to Correct Low-Opacity 8 Bit Compositing Errors

Description: This paper describes and analyzes a dithering technique for accurately specifying small values of opacity (α) that would otherwise be unrepresentable given the limited number of bits available in the alpha channel of graphics hardware. The technique addresses problems that arise when compositing numerous low-opacity semitransparent polygons to create volumetric effects with graphics hardware. The paper also describes the causes of, and a possible solution to, artifacts that arise from parallel or distributed volume rendering using bricking on multiple GPUs.
Date: March 31, 2003
Creator: Williams, P L; Frank, R J & LaMar, E C
Partner: UNT Libraries Government Documents Department
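
The core idea (stochastic rounding so that the expected quantized opacity equals the requested one) can be sketched as follows. This is an illustrative toy, not the paper's exact algorithm, and `dither_alpha` is an invented name:

```python
import random

def dither_alpha(alpha, bits=8, rng=random.random):
    """Stochastically round a small opacity to one of the two nearest
    representable levels, so that the expected quantized value equals
    the requested alpha (sketch; the paper's scheme may distribute the
    rounding error spatially instead)."""
    levels = (1 << bits) - 1          # 255 for an 8-bit alpha channel
    scaled = alpha * levels           # e.g. 0.001 * 255 = 0.255
    low = int(scaled)                 # lower representable level
    frac = scaled - low               # probability of rounding up
    q = low + (1 if rng() < frac else 0)
    return q / levels

# Plain rounding of alpha = 0.001 to 8 bits yields 0, killing the
# contribution entirely; averaged over many dithered samples, the
# effective opacity stays close to the requested 0.001.
```

Averaged over many composited polygons, the result behaves as if opacities below 1/255 were representable, which is what makes hardware-composited volumetric effects feasible.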

Possible effects of competition on electricity consumers in the Pacific Northwest

Description: In part, the impetus for restructuring the U.S. electricity industry stems from the large regional disparities in electricity prices. Indeed, industry reforms are moving most rapidly in high-cost states, such as California and those in the Northeast. Legislators, regulators, and many others in states that enjoy low electricity prices, on the other hand, ask whether increased competition will benefit consumers in their states. This report quantifies the effects of increased competition on electricity consumers and producers in two regions, the Pacific Northwest and California. California's generating costs are roughly double those of the Northwest. We use a new strategic-planning model called Oak Ridge Competitive Electricity Dispatch (ORCED) to conduct these analyses. Specifically, we analyzed four cases: a pre-competition base case intended to represent conditions as they might exist under current regulation in the year 2000, a post-competition case in which customer loads and load shapes respond to real-time electricity pricing, a sensitivity case in which natural-gas prices are 20% higher than in the base case, and a sensitivity case in which the hydroelectric output in the Northwest is 20% less than in the base case. The ORCED analyses suggest that, absent regulatory intervention, retail competition would increase profits for producers in the Northwest and lower prices for consumers in California at the expense of consumers in the Northwest and producers in California. However, state regulators may be able to capture some or all of the increased profits and use them to lower electricity prices in the low-cost region. Perhaps the most straightforward way to allocate the costs and benefits to retail customers is through development of transition-cost charges or credits. With this option, the consumers in both regions can benefit from competition.
The magnitude and even direction of bulk-power trading between regions depends strongly on the amount of hydroelectric power ...
Date: January 1, 1998
Creator: Hadley, S. & Hirst, E.
Partner: UNT Libraries Government Documents Department

On the computation of CMBR anisotropies from simulations of topological defects

Description: Techniques for computing the CMBR anisotropy from simulations of topological defects are discussed with an eye to getting as much information from a simulation as possible. Here we consider the practical details of which sums and multiplications to do and how many terms there are.
Date: May 1, 1997
Creator: Stebbins, A. & Dodelson, S.
Partner: UNT Libraries Government Documents Department

Empirical model for shell-corrected level densities

Description: An empirical model for calculating level densities of closed and near-closed shell nuclei has been developed and tested. This method is based on the calculation of shell plus pairing corrections for each relevant nuclide. A new version of the ALICE code is used to extract these corrections from the Myers-Swiatecki mass formula and to apply them to the calculation of effective excitations in level densities. The corrections are applied in a backshifted fashion to assure correct threshold dependence. We compare our calculated results with experimental data for the production of 56Ni and 88Y to test shell corrections near the f7/2 closure and the N=50 neutron shell. We also compare our results with those using pure Fermi gas (plus pairing) level densities, and with the more computationally intensive model of Kataria and Ramamurthy.
Date: April 29, 1997
Creator: Ross, M.A. & Blann, M.
Partner: UNT Libraries Government Documents Department
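
The "backshifted" application of the corrections can be written in the standard backshifted Fermi-gas form (a textbook sketch; the ALICE implementation may differ in detail):

```latex
% Level density at excitation energy E, with the shell-plus-pairing
% correction applied as a backshift of the effective excitation U:
\rho(E) \approx \frac{\sqrt{\pi}}{12\, a^{1/4} U^{5/4}}\, e^{2\sqrt{aU}},
\qquad U = E - \delta_{\mathrm{shell}} - \delta_{\mathrm{pair}}
```

Shifting the excitation energy rather than modifying the level-density parameter a is what preserves the correct threshold dependence noted in the abstract.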

Dose refinement: ARAC's role

Description: The Atmospheric Release Advisory Capability (ARAC), located at the Lawrence Livermore National Laboratory, has been involved since the late 1970s in assessing consequences from nuclear and other hazardous material releases into the atmosphere. ARAC's primary role has been emergency response. However, after the emergency phase, there is still a significant role for dispersion modeling. This work usually involves refining the source term, and hence the dose to the affected populations, as additional information becomes available in the form of source term estimates (release rates, mix of material, and release geometry) and any measurements from passage of the plume and deposition on the ground. Many of the ARAC responses have been documented elsewhere [1]. Some of the more notable radiological releases in which ARAC has participated during the post-emergency phase are the 1979 Three Mile Island nuclear power plant (NPP) accident outside Harrisburg, PA; the 1986 Chernobyl NPP accident in Ukraine; and the 1996 Japan Tokai nuclear processing plant explosion. ARAC has also done post-emergency phase analyses for the 1978 reentry of the Russian satellite COSMOS 954, whose on-board nuclear reactor partially burned up and deposited radioactive materials on the ground in Canada; the 1986 uranium hexafluoride spill in Gore, OK; the 1993 Russian Tomsk-7 nuclear waste tank explosion; and lesser releases, mostly of tritium. In addition, ARAC has performed a key role in contingency planning for possible accidental releases during the launch of spacecraft with radioisotope thermoelectric generators (RTGs) on board (i.e., Galileo, Ulysses, Mars Pathfinder, and Cassini), and routinely exercises with the Federal Radiological Monitoring and Assessment Center (FRMAC) in preparation for offsite consequences of radiological releases from NPPs and nuclear weapon accidents or incidents.
Several accident post-emergency phase assessments are discussed in this paper in order to illustrate ARAC's role in dose refinement. A brief description of the ...
Date: June 1, 1998
Creator: Baskett, R L; Ellis, J S & Sullivan, T J
Partner: UNT Libraries Government Documents Department


Description: A critical evaluation of data on the viscosity of aqueous sodium chloride solutions is presented. The literature was screened through October 1977, and a databank of evaluated data was established. Values were converted, when necessary, to degrees Celsius, centipoise, and molal concentration. The data were correlated with the aid of an empirical equation to facilitate interpolation and computer calculations. The result of the evaluation includes a table containing smoothed values for the viscosity of NaCl solutions to 150°C.
Date: October 1, 1977
Creator: Ozbek, H.; Fair, J.A. & Phillips, S.L.
Partner: UNT Libraries Government Documents Department

Visualization Tools for Adaptive Mesh Refinement Data

Description: Adaptive Mesh Refinement (AMR) is a highly effective method for simulations that span a large range of spatiotemporal scales, such as astrophysical simulations that must accommodate scales from interstellar to sub-planetary. Most mainstream visualization tools still lack support for AMR as a first-class data type, and AMR code teams use custom-built applications for AMR visualization. The Department of Energy's (DOE's) Scientific Discovery through Advanced Computing (SciDAC) Visualization and Analytics Center for Enabling Technologies (VACET) is currently extending VisIt, an open source visualization tool, to accommodate AMR as a first-class data type. These efforts will bridge the gap between general-purpose visualization applications and highly specialized AMR visual analysis applications. Here, we give an overview of the state of the art in AMR visualization research and tools and describe how VisIt currently handles AMR data.
Date: May 9, 2007
Creator: Weber, Gunther H.; Beckner, Vincent E.; Childs, Hank; Ligocki,Terry J.; Miller, Mark C.; Van Straalen, Brian et al.
Partner: UNT Libraries Government Documents Department
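
To illustrate why AMR needs first-class support in a visualization tool, here is a minimal, invented 1-D model of an AMR hierarchy: overlapping patches at several refinement levels, where a reader must prefer the finest patch covering a query point:

```python
def sample_amr(levels, x):
    """Sample a toy 1-D AMR hierarchy at position x.

    `levels` is ordered coarse to fine; each level is a list of
    patches (lo, hi, f), where f evaluates the field on that patch.
    Finer patches overwrite coarser ones, which is the overlap rule a
    visualization tool must honor when treating AMR as a first-class
    data type."""
    result = None
    for patches in levels:            # coarse to fine
        for lo, hi, f in patches:
            if lo <= x < hi:
                result = f(x)         # finer levels win
    return result

# One coarse patch covering [0, 1), refined on [0.4, 0.6).
hierarchy = [
    [(0.0, 1.0, lambda x: 0.0)],      # level 0 (coarse)
    [(0.4, 0.6, lambda x: 1.0)],      # level 1 (fine)
]
```

Tools without this logic either resample everything to the coarse grid or render the overlap region twice, which is exactly the gap the abstract describes.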

Implementation of Frictional Contact Conditions in Surface to Surface, Mortar Based Computational Frameworks

Description: A number of recent works have established the mortar technique as an accurate and robust spatial discretization method for contact problems in computational solid mechanics. Since methods based on this idea rely on an integral, non-local representation of the contact operators, their formulation is somewhat more involved than is true for more traditional ''point to surface'' contact algorithms; in particular, the integral projections have nontrivial linearizations in the fully large deformation context. In this work, we concentrate on another aspect of formulations of this type--definition and implementation of frictional contact operators within the mortar contact framework. Issues associated with frame indifference of frictional tractions and kinematics are discussed, and a numerical demonstration of the technique is given.
Date: April 1, 2004
Creator: Laursen, T A; Yang, B & Puso, M A
Partner: UNT Libraries Government Documents Department


Description: The values of orbit functions for accelerator lattices as computed with accelerator design programs may differ between different programs. For a simple lattice, consisting of identical constant-gradient bending magnets, the functions (horizontal and vertical betatron tunes, dispersions, closed orbit offsets, orbit lengths, chromaticities etc.) can be evaluated analytically. This lattice was studied with the accelerator physics tools SYNCH [1], COSY INFINITY [2], MAD [3], and TEAPOT [4]. It was found that while all the programs give identical results at the central design momentum, the results differ substantially among the various lattice tools for non-zero momentum deviations. Detailed results and comparisons are presented.
Date: May 12, 2003
Partner: UNT Libraries Government Documents Department
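
For a ring built from identical constant-gradient bends, the orbit functions these programs compute can be cross-checked analytically. A sketch of the horizontal tune calculation, assuming the standard weak-focusing transfer matrices (for field index n, nu_x = sqrt(1 - n)):

```python
import math

def sector_bend(rho, theta, n):
    """Horizontal 2x2 transfer matrix of a constant-gradient sector
    bend: bending radius rho, bend angle theta, field index n."""
    kx = (1.0 - n) / rho**2              # horizontal focusing strength
    phi = math.sqrt(kx) * rho * theta    # phase advance through magnet
    return [[math.cos(phi), math.sin(phi) / math.sqrt(kx)],
            [-math.sqrt(kx) * math.sin(phi), math.cos(phi)]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def fractional_tune(m):
    """From cos(2*pi*nu) = Tr(M)/2; acos folds the tune into [0, 0.5],
    so nu and 1 - nu are indistinguishable here."""
    return math.acos((m[0][0] + m[1][1]) / 2.0) / (2.0 * math.pi)

# Ring of 8 identical magnets closing the full circle, n = 0.36,
# so nu_x = sqrt(0.64) = 0.8, reported by acos as its alias 0.2.
N, rho, n = 8, 10.0, 0.36
cell = sector_bend(rho, 2.0 * math.pi / N, n)
one_turn = cell
for _ in range(N - 1):
    one_turn = matmul(cell, one_turn)
```

This kind of closed-form benchmark is what makes the simple lattice useful for exposing off-momentum discrepancies among SYNCH, COSY INFINITY, MAD, and TEAPOT.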

LDRD final report : leveraging multi-way linkages on heterogeneous data.

Description: This report summarizes the accomplishments of the 'Leveraging Multi-way Linkages on Heterogeneous Data' project, which ran from FY08 through FY10. The goal was to investigate scalable and robust methods for multi-way data analysis. We developed a new optimization-based method called CPOPT for fitting a particular type of tensor factorization to data; CPOPT was compared against existing methods and found to be more accurate than any faster method and faster than any equally accurate method. We extended this method to computing tensor factorizations for problems with incomplete data; our results show that scientifically meaningful factorizations can be recovered even with large amounts of missing data (50% or more). The project involved 5 members of the technical staff, 2 postdocs, and 1 summer intern. It resulted in a total of 13 publications, 2 software releases, and over 30 presentations. Several follow-on projects have already begun, with more potential projects in development.
Date: September 1, 2010
Creator: Dunlavy, Daniel M. & Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)
Partner: UNT Libraries Government Documents Department
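
The kind of factorization being fitted can be illustrated with a deliberately tiny stand-in: rank-1 CP via plain gradient descent on the least-squares loss. CPOPT itself handles general rank and uses a more sophisticated gradient-based solver; this sketch only shows the shape of the objective:

```python
def cp_rank1_fit(X, iters=10000, lr=0.02):
    """Fit a rank-1 CP model X[i][j][k] ~ a[i]*b[j]*c[k] by gradient
    descent on the sum-of-squares loss (toy illustration only)."""
    I, J, K = len(X), len(X[0]), len(X[0][0])
    a, b, c = [0.5] * I, [0.5] * J, [0.5] * K
    for _ in range(iters):
        ga, gb, gc = [0.0] * I, [0.0] * J, [0.0] * K
        for i in range(I):
            for j in range(J):
                for k in range(K):
                    r = a[i] * b[j] * c[k] - X[i][j][k]   # residual
                    ga[i] += 2 * r * b[j] * c[k]
                    gb[j] += 2 * r * a[i] * c[k]
                    gc[k] += 2 * r * a[i] * b[j]
        a = [ai - lr * g for ai, g in zip(a, ga)]
        b = [bi - lr * g for bi, g in zip(b, gb)]
        c = [ci - lr * g for ci, g in zip(c, gc)]
    return a, b, c

# Recover the factors of an exactly rank-1 2x2x3 tensor.
at, bt, ct = [1.0, 0.8], [0.5, 1.2], [0.9, 0.6, 1.1]
X = [[[ai * bj * ck for ck in ct] for bj in bt] for ai in at]
a, b, c = cp_rank1_fit(X)
```

Treating the factorization as a single optimization problem, rather than alternating over factor matrices, is what makes extensions such as fitting with missing entries (simply drop those residuals from the loss) natural.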

Synergistic approach to modeling X-ray spectra

Description: Plasma emission models used in X-ray astronomy need to simulate X-ray spectra from at least thirteen elements. Development of comprehensive models requires large-scale calculations; for example, Fe M-shell spectra, Kα fluorescence from near-neutral ions, and dielectronic recombination satellite spectra from L-shell ions. Current and recent missions (EUVE, ASCA, DXS, etc.) have already demonstrated the need for major, rapid improvements in spectral models. The high-resolution spectra to be acquired with the next generation of X-ray observatories (AXAF, XMM, Astro-E) promise to push spectral models to their limits. Essential to ensuring the quality of calculations used in spectral codes is corroboration in the laboratory, where controlled and precisely measured plasma conditions can be attained. To this end, we are capitalizing on a three-way synergistic relationship that links astrophysical observations, atomic modeling, and experiments using the LLNL Electron Beam Ion Trap (EBIT). After providing a brief orientation concerning the role of plasma emission models in X-ray astronomy, we discuss one example of this interplay.
Date: July 1, 1998
Creator: Liedahl, D.A., LLNL
Partner: UNT Libraries Government Documents Department

Evaluation Techniques and Properties of an Exact Solution to a Subsonic Free Surface Jet Flow

Description: Computational techniques for the evaluation of steady plane subsonic flows represented by Chaplygin series in the hodograph plane are presented. These techniques are utilized to examine the properties of the free surface wall jet solution. This solution is a prototype for the shaped charge jet, a problem which is particularly difficult to compute properly using general purpose finite element or finite difference continuum mechanics codes. The shaped charge jet is a classic validation problem for models involving high explosives and material strength. Therefore, the problem studied in this report represents a useful verification problem associated with shaped charge jet modeling.
Date: April 1, 2002
Partner: UNT Libraries Government Documents Department

Grid computing : enabling a vision for collaborative research.

Description: In this paper the authors provide a motivation for Grid computing based on a vision to enable a collaborative research environment. The authors' vision goes beyond the connection of hardware resources. They argue that with an infrastructure such as the Grid, new modalities for collaborative research are enabled. They provide an overview showing why Grid research is difficult, and they present a number of management-related issues that must be addressed to make Grids a reality. They list projects that provide solutions to subsets of these issues.
Date: April 9, 2002
Creator: von Laszewski, G.
Partner: UNT Libraries Government Documents Department

Linearly convergent inexact proximal point algorithm for minimization. Revision 1

Description: In this paper, we propose a linearly convergent inexact PPA for minimization, where the inner loop stops when the relative reduction on the residue (defined as the objective value minus the optimal value) of the inner loop subproblem meets some preassigned constant. This inner loop stopping criterion can be achieved in a fixed number of iterations if the inner loop algorithm has a linear rate on the regularized subproblems. Therefore the algorithm is able to avoid the computationally expensive process of solving the inner loop subproblems exactly or asymptotically accurately, a process required by most of the other linearly convergent PPAs. As applications of this inexact PPA, we develop linearly convergent iteration schemes for minimizing functions with singular Hessian matrices, and for solving hemiquadratic extended linear-quadratic programming problems. We also prove that Correa-Lemarechal's "implementable form" of PPA converges linearly under mild conditions.
Date: August 1, 1993
Creator: Zhu, C.
Partner: UNT Libraries Government Documents Department
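
The structure of the method (an outer proximal iteration whose regularized subproblems are solved only approximately) can be sketched in one dimension. The inner stopping rule below uses gradient-norm reduction as a stand-in for the paper's relative-residue criterion on objective values:

```python
def inexact_ppa(fprime, x0, lam=1.0, sigma=0.1, outer=50,
                inner_lr=0.2, inner_max=200):
    """Inexact proximal point iteration in one dimension:
        x_{k+1} ~ argmin_x f(x) + (x - x_k)**2 / (2*lam),
    where each regularized subproblem is solved by gradient descent
    only until its gradient has been cut by the fixed factor sigma
    (a toy stand-in for the paper's stopping rule)."""
    x = x0
    for _ in range(outer):
        xk = x
        gprime = lambda y: fprime(y) + (y - xk) / lam  # subproblem gradient
        g0 = abs(gprime(x))
        if g0 == 0.0:
            break                      # already at the prox fixed point
        for _ in range(inner_max):
            if abs(gprime(x)) <= sigma * g0:
                break                  # inexact inner solve: stop early
            x -= inner_lr * gprime(x)
    return x

# Minimize f(x) = (x - 3)^2, i.e. f'(x) = 2*(x - 3); minimizer is 3.
xstar = inexact_ppa(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Because the regularization makes each subproblem well conditioned, the fixed relative reduction is reached in a bounded number of inner steps, which is exactly what yields the overall linear rate claimed in the abstract.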

Comparisons of TORT and MCNP dose calculations for BNCT treatment planning

Description: The relative merit of using a deterministic code to calculate dose distributions for BNCT applications was examined. The TORT discrete ordinates deterministic code was used, in comparison with MCNP4A, to calculate dose distributions for BNCT applications.
Date: December 31, 1996
Creator: Ingersol, D.T.; Slater, C.O.; Williams, L.R.; Redmond, E.L., II & Zamenhof, R.G.
Partner: UNT Libraries Government Documents Department

Modeling and evaluation of HE driven shock effects in copper with the MTS model

Description: Many experimental studies have investigated the effect of shock pressure on the post-shock mechanical properties of OFHC copper. These studies have shown that significant hardening occurs during shock loading due to dislocation processes and twinning. It has been demonstrated that when an appropriate initial value of the Mechanical Threshold Stress (MTS) is specified, the post-shock flow stress of OFE copper is well described by relationships derived independently for unshocked materials. In this study we consider the evolution of the MTS during HE-driven shock loading processes and the effect on the subsequent flow stress of the copper. An increased post-shock flow stress results in a higher material temperature due to an increase in the plastic work. An increase in temperature leads to thermal softening, which reduces the flow stress. These coupled effects will determine whether there is melting in a shaped charge jet or a necking instability in an EFP. The critical factor is the evolution path followed, combined with the 'current' temperature, plastic strain, and strain rate. Preliminary studies indicate that in simulations of HE-driven shock with very high resolution zoning, the MTS saturates because of the rate dependence in the evolution law. Ongoing studies are addressing this and other issues with the goal of developing a version of the MTS model that treats HE-driven shock loading, temperature, strain, and rate effects a priori.
Date: March 17, 1997
Creator: Murphy, M.J. & Lassila, D.F.
Partner: UNT Libraries Government Documents Department