878 Matching Results

Search Results


Empirical model for shell-corrected level densities

Description: An empirical model for calculating level densities of closed and near-closed shell nuclei has been developed and tested. This method is based on the calculation of shell plus pairing corrections for each relevant nuclide. A new version of the ALICE code is used to extract these corrections from the Myers-Swiatecki mass formula and to apply them to the calculation of effective excitations in level densities. The corrections are applied in a backshifted fashion to assure correct threshold dependence. We compare our calculated results with experimental data for the production of 56Ni and 88Y to test shell corrections near the f7/2 closure and the N=50 neutron shell. We also compare our results with those using pure Fermi gas (plus pairing) level densities, and with the more computationally intensive model of Kataria and Ramamurthy.
Date: April 29, 1997
Creator: Ross, M.A. & Blann, M.
Partner: UNT Libraries Government Documents Department
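The backshifted prescription described above can be sketched as follows. This is a generic backshifted Fermi-gas form, not the authors' ALICE implementation, and the parameter values (level-density parameter a, backshift delta, spin cutoff sigma) are illustrative:

```python
import math

def bsfg_level_density(E, a, delta, sigma):
    """Backshifted Fermi-gas level density (levels per MeV).

    E      excitation energy (MeV)
    a      level-density parameter (1/MeV)
    delta  backshift (MeV): pairing plus shell correction applied to
           the effective excitation U = E - delta
    sigma  spin-cutoff parameter
    """
    U = E - delta
    if U <= 0.0:
        return 0.0
    return math.exp(2.0 * math.sqrt(a * U)) / (
        12.0 * math.sqrt(2.0) * sigma * a**0.25 * U**1.25)

# A larger positive backshift (e.g. at a closed shell) suppresses the
# level density near threshold, which is the effect being tested.
rho_mid = bsfg_level_density(10.0, a=8.0, delta=1.0, sigma=4.0)
rho_closed = bsfg_level_density(10.0, a=8.0, delta=3.0, sigma=4.0)
```

The backshift guarantees the correct threshold behavior: the density vanishes for excitations below the effective backshifted threshold.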

Possible effects of competition on electricity consumers in the Pacific Northwest

Description: In part, the impetus for restructuring the U.S. electricity industry stems from the large regional disparities in electricity prices. Indeed, industry reforms are moving most rapidly in high-cost states, such as California and those in the Northeast. Legislators, regulators, and many others in states that enjoy low electricity prices, on the other hand, ask whether increased competition will benefit consumers in their states. This report quantifies the effects of increased competition on electricity consumers and producers in two regions, the Pacific Northwest and California. California's generating costs are roughly double those of the Northwest. We use a new strategic-planning model called Oak Ridge Competitive Electricity Dispatch (ORCED) to conduct these analyses. Specifically, we analyzed four cases: a pre-competition base case intended to represent conditions as they might exist under current regulation in the year 2000, a post-competition case in which customer loads and load shapes respond to real-time electricity pricing, a sensitivity case in which natural-gas prices are 20% higher than in the base case, and a sensitivity case in which the hydroelectric output in the Northwest is 20% less than in the base case. The ORCED analyses suggest that, absent regulatory intervention, retail competition would increase profits for producers in the Northwest and lower prices for consumers in California at the expense of consumers in the Northwest and producers in California. However, state regulators may be able to capture some or all of the increased profits and use them to lower electricity prices in the low-cost region. Perhaps the most straightforward way to allocate the costs and benefits to retail customers is through development of transition-cost charges or credits. With this option, the consumers in both regions can benefit from competition.
The magnitude and even direction of bulk-power trading between regions depends strongly on the amount of hydroelectric power ...
Date: January 1, 1998
Creator: Hadley, S. & Hirst, E.
Partner: UNT Libraries Government Documents Department
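The direction of the gains and losses described above can be illustrated with a deliberately simplified two-region sketch. The costs, loads, and market price below are hypothetical round numbers, not ORCED inputs or results:

```python
# Toy two-region electricity market: a low-cost region (NW) and a
# high-cost region (CA). All numbers are hypothetical round figures.
nw_cost, ca_cost = 20.0, 40.0          # $/MWh average generating cost
nw_load = ca_load = 100.0              # MWh, held fixed for simplicity

# Pre-competition: regulated prices track each region's own cost.
nw_price_pre, ca_price_pre = nw_cost, ca_cost

# Post-competition: a single market price clears both regions somewhere
# between the two regional costs (here simply the midpoint).
common_price = (nw_cost + ca_cost) / 2.0
nw_price_post = ca_price_post = common_price

# Producers in the low-cost region gain; those in the high-cost region lose.
nw_producer_gain = (common_price - nw_cost) * nw_load
ca_producer_gain = (common_price - ca_cost) * ca_load
```

Consumers in the low-cost region pay more and those in the high-cost region pay less, matching the report's qualitative result; a transition-cost charge or credit, as discussed above, would redistribute part of `nw_producer_gain` back to Northwest consumers.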

On the computation of CMBR anisotropies from simulations of topological defects

Description: Techniques for computing the CMBR anisotropy from simulations of topological defects are discussed with an eye to getting as much information from a simulation as possible. Here we consider the practical details of which sums and multiplications to do and how many terms there are.
Date: May 1, 1997
Creator: Stebbins, A. & Dodelson, S.
Partner: UNT Libraries Government Documents Department
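One of the elementary "sums" in any CMBR anisotropy computation is assembling an angular power spectrum from multipole coefficients. The sketch below shows only this generic final step, with mock coefficients, not the defect-simulation pipeline itself:

```python
import numpy as np

rng = np.random.default_rng(1)

def power_spectrum(alm_by_l):
    """C_l = 1/(2l+1) * sum_m |a_lm|^2: the standard estimator that
    compresses the 2l+1 multipole coefficients at each l into one
    number per multipole."""
    return {l: (np.abs(alm)**2).sum() / (2 * l + 1)
            for l, alm in alm_by_l.items()}

# Mock coefficients: for each l, the 2l+1 values a_lm, m = -l..l.
alms = {l: rng.normal(size=2 * l + 1) for l in range(2, 6)}
cl = power_spectrum(alms)
```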

Dose refinement: ARAC's role

Description: The Atmospheric Release Advisory Capability (ARAC), located at the Lawrence Livermore National Laboratory, since the late 1970�s has been involved in assessing consequences from nuclear and other hazardous material releases into the atmosphere. ARAC�s primary role has been emergency response. However, after the emergency phase, there is still a significant role for dispersion modeling. This work usually involves refining the source term and, hence, the dose to the populations affected as additional information becomes available in the form of source term estimates�release rates, mix of material, and release geometry�and any measurements from passage of the plume and deposition on the ground. Many of the ARAC responses have been documented elsewhere. 1 Some of the more notable radiological releases that ARAC has participated in the post-emergency phase have been the 1979 Three Mile Island nuclear power plant (NPP) accident outside Harrisburg, PA, the 1986 Chernobyl NPP accident in the Ukraine, and the 1996 Japan Tokai nuclear processing plant explosion. ARAC has also done post-emergency phase analyses for the 1978 Russian satellite COSMOS 954 reentry and subsequent partial burn up of its on board nuclear reactor depositing radioactive materials on the ground in Canada, the 1986 uranium hexafluoride spill in Gore, OK, the 1993 Russian Tomsk-7 nuclear waste tank explosion, and lesser releases of mostly tritium. In addition, ARAC has performed a key role in the contingency planning for possible accidental releases during the launch of spacecraft with radioisotope thermoelectric generators (RTGs) on board (i.e. Galileo, Ulysses, Mars-Pathfinder, and Cassini), and routinely exercises with the Federal Radiological Monitoring and Assessment Center (FRMAC) in preparation for offsite consequences of radiological releases from NPPs and nuclear weapon accidents or incidents. 
Several accident post-emergency phase assessments are discussed in this paper in order to illustrate ARAC�s roll in dose refinement. A brief description of the ...
Date: June 1, 1998
Creator: Baskett, R L; Ellis, J S & Sullivan, T J
Partner: UNT Libraries Government Documents Department

LDRD final report: leveraging multi-way linkages on heterogeneous data.

Description: This report is a summary of the accomplishments of the 'Leveraging Multi-way Linkages on Heterogeneous Data' project, which ran from FY08 through FY10. The goal was to investigate scalable and robust methods for multi-way data analysis. We developed a new optimization-based method called CPOPT for fitting a particular type of tensor factorization to data; CPOPT was compared against existing methods and found to be more accurate than any faster method and faster than any equally accurate method. We extended this method to computing tensor factorizations for problems with incomplete data; our results show that one can recover scientifically meaningful factorizations with large amounts of missing data (50% or more). The project has involved 5 members of the technical staff, 2 postdocs, and 1 summer intern. It has resulted in a total of 13 publications, 2 software releases, and over 30 presentations. Several follow-on projects have already begun, with more potential projects in development.
Date: September 1, 2010
Creator: Dunlavy, Daniel M. & Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)
Partner: UNT Libraries Government Documents Department
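The incomplete-data result can be illustrated by fitting a CP (CANDECOMP/PARAFAC) model to only the observed entries of a tensor. This is a plain gradient-descent sketch of the masked CP objective, not the CPOPT algorithm itself, which uses more capable first-order optimizers:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 6, 6, 6, 2

# Ground-truth rank-2 tensor and a mask hiding roughly 50% of its entries.
A, B, C = (rng.normal(size=(n, R)) for n in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
mask = rng.random(X.shape) < 0.5

# Fit factor matrices to the observed entries only, by gradient descent
# on the masked squared error.
Ah, Bh, Ch = (rng.normal(size=(n, R)) for n in (I, J, K))
lr = 0.01
losses = []
for _ in range(3000):
    resid = mask * (np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch) - X)
    losses.append(float((resid**2).sum()))
    gA = np.einsum('ijk,jr,kr->ir', resid, Bh, Ch)
    gB = np.einsum('ijk,ir,kr->jr', resid, Ah, Ch)
    gC = np.einsum('ijk,ir,jr->kr', resid, Ah, Bh)
    Ah, Bh, Ch = Ah - lr * gA, Bh - lr * gB, Ch - lr * gC
```

With a generic low-rank tensor, far more observed entries than factor parameters makes the observed-entry fit well posed, which is the intuition behind recovery at 50% or more missing data.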

Visualization Tools for Adaptive Mesh Refinement Data

Description: Adaptive Mesh Refinement (AMR) is a highly effective method for simulations that span a large range of spatiotemporal scales, such as astrophysical simulations that must accommodate ranges from interstellar to sub-planetary. Most mainstream visualization tools still lack support for AMR as a first-class data type, and AMR code teams use custom-built applications for AMR visualization. The Department of Energy's (DOE's) Scientific Discovery through Advanced Computing (SciDAC) Visualization and Analytics Center for Enabling Technologies (VACET) is currently working on extending VisIt, an open source visualization tool, to accommodate AMR as a first-class data type. These efforts will bridge the gap between general-purpose visualization applications and highly specialized AMR visual analysis applications. Here, we give an overview of the state of the art in AMR visualization research and tools and describe how VisIt currently handles AMR data.
Date: May 9, 2007
Creator: Weber, Gunther H.; Beckner, Vincent E.; Childs, Hank; Ligocki,Terry J.; Miller, Mark C.; Van Straalen, Brian et al.
Partner: UNT Libraries Government Documents Department
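The core AMR idea, refining only where the solution demands it, can be sketched in one dimension with a gradient-based flagging criterion. This is a generic illustration, not VisIt's or any production code's refinement logic:

```python
import numpy as np

def flag_for_refinement(x, u, threshold):
    """Mark coarse cells whose local gradient exceeds a threshold.
    Flagged cells would be subdivided on the next AMR level."""
    grad = np.abs(np.diff(u) / np.diff(x))
    return grad > threshold

# A step-like profile: only the cells straddling the step need refining,
# so most of the domain stays coarse.
x = np.linspace(0.0, 1.0, 21)
u = np.tanh((x - 0.5) / 0.02)
flags = flag_for_refinement(x, u, threshold=5.0)
```

Repeating this flag/subdivide cycle level by level produces the hierarchy of nested patches that AMR-aware visualization tools must treat as one logical dataset.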


Description: A critical evaluation of data on the viscosity of aqueous sodium chloride solutions is presented. The literature was screened through October 1977, and a databank of evaluated data was established. Viscosity values were converted when necessary to units of centigrade, centipoise, and molal concentration. The data were correlated with the aid of an empirical equation to facilitate interpolation and computer calculations. The result of the evaluation includes a table containing smoothed values for the viscosity of NaCl solutions to 150 °C.
Date: October 1, 1977
Creator: Ozbek, H.; Fair, J.A. & Phillips, S.L.
Partner: UNT Libraries Government Documents Department
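The kind of empirical correlation used for interpolation can be sketched with a Jones-Dole-type form fitted by least squares. The (molality, relative viscosity) pairs below are placeholder numbers chosen only to demonstrate the machinery, not the evaluated NaCl databank values:

```python
import numpy as np

# Fit the empirical form  eta_r = 1 + A*sqrt(m) + B*m  (Jones-Dole type)
# to ILLUSTRATIVE placeholder data, not evaluated NaCl viscosities.
m = np.array([0.1, 0.5, 1.0, 2.0, 4.0])           # molality, mol/kg
eta_r = np.array([1.01, 1.05, 1.10, 1.20, 1.42])  # relative viscosity

design = np.column_stack([np.sqrt(m), m])
coefs, *_ = np.linalg.lstsq(design, eta_r - 1.0, rcond=None)
A, B = coefs

def eta_relative(molality):
    """Interpolate with the fitted correlation."""
    return 1.0 + A * np.sqrt(molality) + B * molality
```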

Comparisons of TORT and MCNP dose calculations for BNCT treatment planning

Description: The relative merit of using a deterministic code to calculate dose distributions for BNCT applications was examined. The TORT discrete ordinates deterministic code was compared with the MCNP4A Monte Carlo code for calculating dose distributions for BNCT applications.
Date: December 31, 1996
Creator: Ingersol, D.T.; Slater, C.O.; Williams, L.R.; Redmond, E.L., II & Zamenhof, R.G.
Partner: UNT Libraries Government Documents Department
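The deterministic-versus-Monte-Carlo comparison can be seen on the simplest transport problem, uncollided transmission through a slab, where the deterministic answer is exp(-sigma_t * x) and Monte Carlo must sample free-flight distances. This is a toy, not TORT or MCNP:

```python
import math
import random

def transmission_deterministic(sigma_t, thickness):
    """Uncollided flux fraction through a slab: exp(-sigma_t * x)."""
    return math.exp(-sigma_t * thickness)

def transmission_monte_carlo(sigma_t, thickness, n, seed=42):
    """Sample exponential free-flight distances and count the fraction
    of particles that cross the slab without colliding."""
    rng = random.Random(seed)
    crossed = sum(1 for _ in range(n)
                  if -math.log(1.0 - rng.random()) / sigma_t > thickness)
    return crossed / n

det = transmission_deterministic(1.0, 2.0)            # exact: e**-2
mc = transmission_monte_carlo(1.0, 2.0, n=200_000)    # statistical estimate
```

The deterministic result is exact but requires discretizing the problem; the Monte Carlo result carries statistical noise that shrinks only as 1/sqrt(n), which is the trade-off the abstract's comparison explores in a realistic setting.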

Calculating Contained Firing Facility (CFF) explosive firing zone

Description: The University of California awarded LLNL contract No. B345381 for the design of the facility to Parsons Infrastructure & Technology, Inc., of Pasadena, California. The Laboratory specified that the firing chamber be able to withstand repeated firings of 60 Kg of explosive located in the center of the chamber, 4 feet above the floor, and repeated firings of 35 Kg of explosive at the same height and located anywhere within 2 feet of the edge of a region on the floor called the anvil. Other requirements were that the chamber be able to accommodate the penetrations of the existing bullnose of the Bunker 801 flash X-ray machine and the roof of the underground camera room. These requirements and provisions for blast-resistant doors formed the essential basis for the design. The design efforts resulted in a steel-reinforced concrete structure measuring (on the inside) 55 x 51 feet by 30 feet high. The walls and ceiling are to be approximately 6 feet thick. Because the 60-Kg charge is not located in the geometric center of the volume and a 35-Kg charge could be located anywhere in a prescribed area, there will be different dynamic pressures and impulses on the various walls, floor, and ceiling, depending upon the weights and locations of the charges. The detailed calculations and specifications to achieve the design criteria were performed by Parsons and are included in Reference 1. Reference 2, Structures to Resist the Effects of Accidental Explosions (TM5-1300), is the primary design manual for structures of this type. It includes an analysis technique for the calculation of blast loadings within a cubicle or containment-type structure. Parsons used the TM5-1300 methods to calculate the loadings on the various firing chamber surfaces for the design criteria explosive weights and locations. At LLNL the same methods ...
Date: October 20, 1998
Creator: Lyle, J. W.
Partner: UNT Libraries Government Documents Department
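Design manuals like TM5-1300 organize blast loadings around cube-root (Hopkinson) scaling of the charge weight. A minimal helper is sketched below; the pressure and impulse design charts themselves are not reproduced here, so the function stops at the scaled distance:

```python
def scaled_distance(standoff, charge_mass):
    """Hopkinson-scaled distance Z = R / W**(1/3), in consistent units
    (e.g. m and kg). TM5-1300-style design charts give peak pressure
    and impulse as functions of Z; those charts are not reproduced."""
    return standoff / charge_mass ** (1.0 / 3.0)

# Cube-root scaling: doubling the standoff while scaling the charge by
# 8 leaves Z, and hence the predicted blast parameters, unchanged.
z1 = scaled_distance(10.0, 60.0)
z2 = scaled_distance(20.0, 480.0)
```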

Effective pure states for bulk quantum computation

Description: In bulk quantum computation one can manipulate a large number of indistinguishable quantum computers by parallel unitary operations and measure expectation values of certain observables with limited sensitivity. The initial state of each computer in the ensemble is known but not pure. Methods for obtaining effective pure input states by a series of manipulations have been described by Gershenfeld and Chuang (logical labeling) and by Cory et al. (spatial averaging) for the case of quantum computation with nuclear magnetic resonance. We give a different technique called temporal averaging. This method is based on classical randomization, requires no ancilla qubits, and can be implemented in nuclear magnetic resonance without using gradient fields. We introduce several temporal averaging algorithms suitable for both high-temperature and low-temperature bulk quantum computing and analyze the signal-to-noise behavior of each.
Date: November 1, 1997
Creator: Knill, E.; Chuang, I. & Laflamme, R.
Partner: UNT Libraries Government Documents Department
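The temporal-averaging principle can be demonstrated numerically for two qubits: averaging the results of runs whose inputs are cyclically permuted over the three excited basis states leaves an effective pure state. This sketches only the principle, with hypothetical populations, not the NMR pulse-sequence implementation:

```python
import numpy as np

# Diagonal 2-qubit "thermal" state in the basis |00>,|01>,|10>,|11>;
# the populations are hypothetical.
rho = np.diag([0.4, 0.3, 0.2, 0.1])

# Cyclic permutation of the three excited states: |01>->|10>->|11>->|01>.
P = np.eye(4)[[0, 3, 1, 2]]

# Temporal average: run the experiment three times with cyclically
# permuted input populations and average the results.
rho_avg = (rho + P @ rho @ P.T + P @ P @ rho @ P.T @ P.T) / 3.0

# The three excited populations are now equal, so the averaged state is
# (1 - e) * I/4 + e * |00><00|: an effective pure state.
pops = np.diag(rho_avg)
```

Only the deviation from the maximally mixed background contributes to the measured signal, which is why such a state behaves like the pure input |00>.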

MCNP(TM) criticality primer and training experiences

Description: With the closure of many experimental facilities, the nuclear criticality safety analyst is increasingly required to rely on computer calculations to identify safe limits for the handling and storage of fissile materials. However, the analyst may have little experience with the specific codes available at his or her facility. Usually, the codes are quite complex black boxes capable of analyzing numerous problems with a myriad of input options. Documentation for these codes is designed to cover all the possible configurations and types of analyses but does not give much detail on any particular type of analysis. For criticality calculations, the user of a code is primarily interested in the value of the effective multiplication factor for a system (k_eff). Most codes will provide this, and truckloads of other information that may be less pertinent to criticality calculations. Based on discussions with code users in the nuclear criticality safety community, it was decided that a simple document discussing the ins and outs of criticality calculations with specific codes would be quite useful. The Transport Methods Group, XTM, at Los Alamos National Laboratory (LANL) decided to develop a primer for criticality calculations with their Monte Carlo code, MCNP. This was a joint task between LANL, with a knowledge and understanding of the nuances and capabilities of MCNP, and the University of New Mexico, with a knowledge and understanding of nuclear criticality safety calculations and of educating first-time users of neutronics calculations. The initial problem was that the MCNP manual just contained too much information. Almost everything one needs to know about MCNP can be found in the manual; the problem is that there is more information than a user requires to do a simple k_eff calculation. The basic concept of the primer was to distill the manual to create a ...
Date: September 1, 1995
Creator: Briesmeister, J.; Forster, R.A. & Busch, R.
Partner: UNT Libraries Government Documents Department
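The quantity of interest, k_eff, can be illustrated with a deliberately minimal generation-based Monte Carlo estimate for an infinite homogeneous medium. This is a pedagogical toy, not MCNP: there is no geometry, no leakage, and no energy dependence, so it estimates k-infinity:

```python
import random

def k_infinity_mc(sigma_f, sigma_c, nu, n_per_gen=10_000,
                  generations=10, seed=7):
    """Estimate k_inf = nu * sigma_f / (sigma_f + sigma_c) by following
    neutron generations: in an infinite medium every neutron is
    eventually absorbed, causing fission with probability
    sigma_f / sigma_a and yielding nu offspring on average."""
    rng = random.Random(seed)
    p_fission = sigma_f / (sigma_f + sigma_c)
    k_estimates = []
    for _ in range(generations):
        offspring = 0
        for _ in range(n_per_gen):
            if rng.random() < p_fission:
                # Integer multiplicity with mean nu (e.g. 2 or 3 for 2.5).
                offspring += int(nu) + (rng.random() < nu - int(nu))
        k_estimates.append(offspring / n_per_gen)
    return sum(k_estimates) / generations

# Analytic answer for these (illustrative) cross sections: 1.25.
k = k_infinity_mc(sigma_f=0.05, sigma_c=0.05, nu=2.5)
```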

Modeling words with subword units in an articulatorily constrained speech recognition algorithm

Description: The goal of speech recognition is to find the most probable word given the acoustic evidence, i.e. a string of VQ codes or acoustic features. Speech recognition algorithms typically take advantage of the fact that the probability of a word, given a sequence of VQ codes, can be calculated.
Date: November 20, 1997
Creator: Hogden, J.
Partner: UNT Libraries Government Documents Department
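The "most probable word given the acoustic evidence" criterion is Bayes' rule: choose the word maximizing P(word) times the product of per-code likelihoods. A toy decoder over a hypothetical two-word vocabulary with made-up VQ-code probabilities:

```python
import math

# Hypothetical word priors P(w) and per-code likelihoods P(code | w).
prior = {"yes": 0.6, "no": 0.4}
likelihood = {
    "yes": {0: 0.5, 1: 0.3, 2: 0.2},
    "no":  {0: 0.1, 1: 0.3, 2: 0.6},
}

def decode(codes):
    """argmax_w P(w) * prod_t P(code_t | w), computed in log space to
    avoid underflow on long code sequences."""
    def score(w):
        return math.log(prior[w]) + sum(math.log(likelihood[w][c])
                                        for c in codes)
    return max(prior, key=score)

word = decode([2, 2, 1])
```

Real recognizers replace the independent per-code likelihoods with hidden Markov or, as in this work, articulatorily constrained models, but the decision rule is the same.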

How do plasma flow switches scale with current? Issues in the 6 MA to 30 MA regime

Description: Point mass calculations are used to model switched implosions on several pulsed power machines. The model includes a lumped circuit representation of the pulsed power source. A simple switching model is used to describe a standard plasma flow switch. Implosion kinetic energies are obtained at a convergence ratio of 20 to 1. Heuristic arguments are used to estimate the plasma temperature at pinch, the total x-ray output and the radiation pulse width. Switched models are presented for Pegasus II, Shiva Star, Procyon and Atlas.
Date: September 1, 1995
Creator: Bowers, R.L.; Greene, A.E.; Nakafuji, G.; Peterson, D.L. & Roderick, N.F.
Partner: UNT Libraries Government Documents Department
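A point-mass implosion calculation of the kind described can be sketched as a thin cylindrical shell driven inward by the magnetic pressure of a constant current. The parameters below are illustrative round numbers, not data for Pegasus II, Shiva Star, Procyon, or Atlas, and a real calculation would couple this to the lumped circuit model:

```python
import math

MU0 = 4.0e-7 * math.pi

def implode(r0=0.05, m_per_len=0.1, current=1.0e7,
            convergence=20.0, dt=1.0e-9):
    """Thin-shell implosion under the force per unit length
    F = mu0*I**2/(4*pi*r), integrated with semi-implicit Euler until
    the stated convergence ratio is reached. Returns kinetic energy
    per unit length (J/m)."""
    r, v = r0, 0.0
    r_stop = r0 / convergence
    while r > r_stop:
        a = -MU0 * current**2 / (4.0 * math.pi * r * m_per_len)
        v += a * dt
        r += v * dt
    return 0.5 * m_per_len * v * v

ke = implode()
# Work done by the magnetic force: W = (mu0*I**2/(4*pi)) * ln(convergence).
work = MU0 * (1.0e7)**2 / (4.0 * math.pi) * math.log(20.0)
```

The kinetic energy at a 20:1 convergence ratio is what such models convert, via heuristic arguments, into pinch temperature and x-ray output estimates.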

Modeling and evaluation of HE driven shock effects in copper with the MTS model

Description: Many experimental studies have investigated the effect of shock pressure on the post-shock mechanical properties of OFHC copper. These studies have shown that significant hardening occurs during shock loading due to dislocation processes and twinning. It has been demonstrated that when an appropriate initial value of the Mechanical Threshold Stress (MTS) is specified, the post-shock flow stress of OFE copper is well described by relationships derived independently for unshocked materials. In this study we consider the evolution of the MTS during HE-driven shock loading processes and the effect on the subsequent flow stress of the copper. An increased post-shock flow stress results in a higher material temperature due to an increase in the plastic work. An increase in temperature leads to thermal softening, which reduces the flow stress. These coupled effects will determine if there is melting in a shaped charge jet or a necking instability in an EFP. The critical factor is the evolution path followed, combined with the 'current' temperature, plastic strain, and strain rate. Preliminary studies indicate that in simulations of HE-driven shock with very high resolution zoning, the MTS saturates because of the rate dependence in the evolution law. Ongoing studies are addressing this and other issues with the goal of developing a version of the MTS model that treats HE-driven shock loading, temperature, strain, and rate effects a priori.
Date: March 17, 1997
Creator: Murphy, M.J. & Lassila, D.F.
Partner: UNT Libraries Government Documents Department
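The saturation behavior discussed above can be sketched with a generic Voce-type evolution law, d(sigma)/d(eps) = theta0 * (1 - sigma/sigma_sat), in which the threshold stress hardens with plastic strain but approaches a saturation value. The constants are illustrative, not fitted OFHC-copper MTS parameters:

```python
def evolve_threshold(sigma0, theta0, sigma_sat, d_eps, steps):
    """Integrate a Voce-type hardening law: the threshold stress rises
    with plastic strain at initial rate theta0 but saturates at
    sigma_sat. Units and values are illustrative (e.g. MPa)."""
    sigma = sigma0
    history = [sigma]
    for _ in range(steps):
        sigma += theta0 * (1.0 - sigma / sigma_sat) * d_eps
        history.append(sigma)
    return history

# Large accumulated strain at high rate drives the threshold stress
# toward its saturation value, the effect seen in the fine-zoned runs.
h = evolve_threshold(sigma0=50.0, theta0=2000.0, sigma_sat=450.0,
                     d_eps=1e-3, steps=5000)
```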

Steering object-oriented computations with Python

Description: We have described current approaches and future plans for steering C++ applications, running Python on parallel platforms, and combining a Tk interface with the Python interpreter to steer computations. In addition, there has been significant enhancement of the Gist module. Tk mega-widgets have been implemented for a few physics applications. We have also written a Python interface to SILO, a data storage package used as an interface to a visualization system named MeshTV. Python is being used to control large-scale simulations (molecular dynamics in particular) running on the CM-5 and T3D at LANL as well. A few other code development projects at LLNL are either using or considering Python as their steering shells. In summary, the merits of Python have been appreciated by more and more people in the scientific computation community.
Date: October 1, 1996
Creator: Yang, T.-Y.B.; Dubois, P.F.; Furnish, G. & Beazley, D.M.
Partner: UNT Libraries Government Documents Department
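The steering pattern described, a compiled simulation kernel driven from an interactive Python interpreter, reduces to a loop in which Python inspects and adjusts parameters between time steps. This is a schematic in pure Python with a made-up diffusion kernel; the codes discussed bind C++ or Fortran kernels instead:

```python
class DiffusionSim:
    """Stand-in for a compiled simulation kernel exposed to Python."""
    def __init__(self, u0, alpha):
        self.u = list(u0)
        self.alpha = alpha   # steerable parameter

    def step(self):
        """Explicit 1D diffusion update with fixed boundary values."""
        u = self.u
        self.u = [u[0]] + [
            u[i] + self.alpha * (u[i-1] - 2.0 * u[i] + u[i+1])
            for i in range(1, len(u) - 1)] + [u[-1]]

# Steering loop: Python monitors the run and adjusts alpha mid-flight,
# the kind of intervention a Tk widget or interactive prompt would make.
sim = DiffusionSim(u0=[0.0] * 10 + [1.0] + [0.0] * 10, alpha=0.4)
for n in range(200):
    sim.step()
    if n == 100:
        sim.alpha = 0.1   # user intervention partway through the run

total = sum(sim.u)
```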

Theoretical and computer models of detonation in solid explosives

Description: Recent experimental and theoretical advances in understanding energy transfer and chemical kinetics have led to improved models of detonation waves in solid explosives. The Nonequilibrium Zeldovich - von Neumann - Doring (NEZND) model is supported by picosecond laser experiments and molecular dynamics simulations of the multiphonon up-pumping and internal vibrational energy redistribution (IVR) processes by which the unreacted explosive molecules are excited to the transition state(s) preceding reaction behind the leading shock front(s). High temperature, high density transition state theory calculates the induction times measured by laser interferometric techniques. Exothermic chain reactions form product gases in highly excited vibrational states, which have been demonstrated to rapidly equilibrate via supercollisions. Embedded gauge and Fabry-Perot techniques measure the rates of reaction product expansion as thermal and chemical equilibrium is approached. Detonation reaction zone lengths in carbon-rich condensed phase explosives depend on the relatively slow formation of solid graphite or diamond. The Ignition and Growth reactive flow model based on pressure dependent reaction rates and Jones-Wilkins-Lee (JWL) equations of state has reproduced this nanosecond time resolved experimental data and thus has yielded accurate average reaction zone descriptions in one-, two- and three- dimensional hydrodynamic code calculations. The next generation reactive flow model requires improved equations of state and temperature dependent chemical kinetics. Such a model is being developed for the ALE3D hydrodynamic code, in which heat transfer and Arrhenius kinetics are intimately linked to the hydrodynamics.
Date: October 1, 1997
Creator: Tarver, C.M. & Urtiew, P.A.
Partner: UNT Libraries Government Documents Department
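The JWL product equation of state mentioned above has the standard form p(V, E) = A(1 - w/(R1 V))exp(-R1 V) + B(1 - w/(R2 V))exp(-R2 V) + wE/V. The coefficients below are illustrative placeholders, not a fitted parameter set for any real explosive:

```python
import math

def jwl_pressure(V, E, A, B, R1, R2, omega):
    """Jones-Wilkins-Lee product equation of state.
    V: relative volume v/v0; E: internal energy per unit initial volume.
    A, B (pressure units), R1, R2, omega: fitted constants; the values
    used below are illustrative placeholders."""
    return (A * (1.0 - omega / (R1 * V)) * math.exp(-R1 * V)
            + B * (1.0 - omega / (R2 * V)) * math.exp(-R2 * V)
            + omega * E / V)

# Pressure falls as the detonation products expand at fixed energy.
params = dict(A=6.0, B=0.1, R1=4.5, R2=1.2, omega=0.3)
p1 = jwl_pressure(1.0, 0.07, **params)
p2 = jwl_pressure(3.0, 0.07, **params)
```

In an Ignition and Growth calculation this expression supplies the product pressure that both drives the hydrodynamics and enters the pressure-dependent reaction rates.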

Building a programmable interface for physics codes using numeric python

Description: With its portability, the ease of adding built-in functions and objects in C, and its fast array facility, among many other features, Python proved to be an excellent language for creating programmable scientific applications. In addition to the two modules presented, there is also other progress at LLNL in using Python. For example, Python interfaces are being developed for at least three graphics packages, and the Python interpreter and applications have been built on distributed platforms such as the Meiko and the Cray T3D.
Date: April 16, 1996
Creator: Yang, T.-Y.B.; Dubois, P.F. & Motteler, Z.C.
Partner: UNT Libraries Government Documents Department

Adventures in supercomputing: An innovative program for high school teachers

Description: Within the realm of education, seldom does an innovative program become available with the potential to change an educator's teaching methodology. Adventures in Supercomputing (AiS), sponsored by the U.S. Department of Energy (DOE), is such a program. It is a program for high school teachers that changes the teacher paradigm from a teacher-directed approach to a student-centered approach. "A student-centered classroom offers better opportunities for development of internal motivation, planning skills, goal setting and perseverance than does the traditional teacher-directed mode". Not only is the process of teaching changed, but the cross-curricular integration within the AiS materials is remarkable. Written from a teacher's perspective, this paper describes the AiS program and its effects on teachers and students, primarily at Wartburg Central High School in Wartburg, Tennessee. The AiS program in Tennessee is sponsored by Oak Ridge National Laboratory (ORNL).
Date: December 31, 1994
Creator: Oliver, C.E.; Hicks, H.R.; Summers, B.G. & Staten, D.G.
Partner: UNT Libraries Government Documents Department