58 Matching Results


SULFUR POLYMER STABILIZATION/SOLIDIFICATION (SPSS) TREATABILITY OF LOS ALAMOS NATIONAL LABORATORY MERCURY WASTE.

Description: Brookhaven National Laboratory's Sulfur Polymer Stabilization/Solidification (SPSS) process was used to treat approximately 90 kg of elemental mercury mixed waste from Los Alamos National Laboratory. Treatment was carried out in a series of eight batches using a 1 ft³ pilot-scale mixer, with a mercury loading of 33.3 weight percent in each batch. Although leach performance is not currently regulated for amalgamated elemental mercury (Hg) mixed waste, Toxicity Characteristic Leach Procedure (TCLP) testing of SPSS-treated elemental mercury waste indicates that leachability is readily reduced below the TCLP limit of 200 ppb (the regulatory requirement following treatment by retort for wastes containing > 260 ppm Hg) and, with process optimization, to levels below the more stringent Universal Treatment Standard (UTS) limit of 25 ppb that is applied to waste containing < 260 ppm Hg. In addition, mercury-contaminated debris, consisting primarily of glass and plastic containers as well as assorted mercury thermometers, switches, and labware, was first reacted with SPSS components to stabilize the mercury contamination and then macroencapsulated in the molten SPSS product. This treatment was done by vigorous agitation of the sulfur polymer powder with the comminuted debris. Larger plastic and metal containers were reacted to stabilize internal mercury contamination and then filled with molten sulfur polymer to encapsulate the treated product.
Date: November 1, 2001
Creator: ADAMS,J.W. & KALB,P.D.
Partner: UNT Libraries Government Documents Department

Final LDRD report : the physics of 1D and 2D electron gases in III-nitride heterostructure NWs.

Description: The proposed work seeks to demonstrate and understand new phenomena in novel, freestanding III-nitride core-shell nanowires, including 1D and 2D electron gas formation and properties, and to investigate the role of surfaces and heterointerfaces on the transport and optical properties of nanowires, using a combined experimental and theoretical approach. Obtaining an understanding of these phenomena will be a critical step that will allow development of novel, ultrafast and ultraefficient nanowire-based electronic and photonic devices.
Date: September 1, 2009
Creator: Armstrong, Andrew M.; Arslan, Ilke (Sandia National Laboratories, Livermore, CA); Upadhya, Prashanth C. (Los Alamos National Laboratory, Los Alamos, NM); Morales, Eugenia T. (Sandia National Laboratories, Livermore, CA); Leonard, Francois (Sandia National Laboratories, Livermore, CA); Li, Qiming et al.
Partner: UNT Libraries Government Documents Department

Unique Signal mathematical analysis task group FY03 status report.

Description: The Unique Signal is a key constituent of Enhanced Nuclear Detonation Safety (ENDS). Although the Unique Signal approach is well prescribed and mathematically assured, there are numerous unsolved mathematical problems that could help assess the risk of deviations from the ideal approach. Some of the mathematics-based results shown in this report are:
1. The risk that two patterns with poor characteristics (easily generated by inadvertent processes) could be combined through exclusive-or mixing to generate an actual Unique Signal pattern has been investigated and found to be minimal (not significant when compared to the incompatibility metric of actual Unique Signal patterns used in nuclear weapons).
2. The risk of generating actual Unique Signal patterns with linear feedback shift registers is minimal, but the patterns in use are not as invulnerable to inadvertent generation by dependent processes as previously thought.
3. New methods of testing pair-wise incompatibility threats have found no significant problems in the set of Unique Signal patterns currently used. Any new patterns introduced would have to be carefully assessed for compatibility with existing patterns, since some new patterns under consideration were found to be deficient when associated with other patterns in use.
4. Markov models were shown to correspond to some of the engineered properties of Unique Signal sequences, giving new support for the original design objectives.
5. Potential dependence among events (caused by a variety of communication protocols) has been studied. New evidence has been derived of the risk associated with combined communication of multiple events, and of the improvement in abnormal-environment safety that can be achieved through separate-event communication.
Date: December 1, 2003
Creator: Baty, Roy Stuart (Los Alamos National Laboratory, Los Alamos, NM); Johnston, Anna Marie; Hart, Elizabeth (Utah State University, Logan, UT); White, Allan (NASA, Langley Research Center, Hampton, VA) & Cooper, James Arlin
Partner: UNT Libraries Government Documents Department
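
Result 2 in the report abstract above concerns linear feedback shift registers as a possible inadvertent pattern source. Purely as an illustrative sketch of what such a generator is (the tap polynomial and seed below are standard textbook values for a maximal-length 16-bit Galois LFSR, not any Unique Signal parameter), a minimal implementation in Python:

```python
# Illustrative only: a minimal Galois LFSR, the class of inadvertent
# pattern generator whose risk the report's result 2 assesses.
# Taps 0xB400 correspond to the standard maximal-length polynomial
# x^16 + x^14 + x^13 + x^11 + 1; the seed is arbitrary.

def lfsr_stream(seed, taps=0xB400, width=16, n_bits=32):
    """Return the first n_bits output bits of a Galois LFSR
    with the given tap mask and register width."""
    state = seed & ((1 << width) - 1)
    out = []
    for _ in range(n_bits):
        bit = state & 1          # output the low bit
        out.append(bit)
        state >>= 1
        if bit:                  # feed back through the tap mask
            state ^= taps
    return out

print(lfsr_stream(0xACE1, n_bits=16))
```

Because the state update is linear over GF(2), the output sequence is deterministic given the seed, which is why such registers are analyzed as a structured (rather than random) threat process.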

A nucleon in a tiny box

Description: The authors use Chiral Perturbation Theory to compute the nucleon mass shift due to finite volume and temperature effects. The results are valid up to next-to-leading order in the "ε-regime" (mL ≈ mβ ≪ 1) as well as in the "p-regime" (mL ≈ mβ ≫ 1). Based on the two leading orders, they discuss the convergence of the expansion as a function of the lattice size and quark masses. This result can be used to extrapolate lattice results obtained from lattice sizes smaller than the pion cloud, avoiding numerical simulation of physics that is under theoretical control. An extraction of the low-energy coefficient c₃ of the chiral Lagrangian from lattice simulations at small volumes and a "magic" ratio β = 1.22262 L might be possible.
Date: July 1, 2004
Creator: Bedaque, Paulo F.; Griesshammer, Harald W. & Rupak, Gautam
Partner: UNT Libraries Government Documents Department

MEMS-based arrays of micro ion traps for quantum simulation scaling.

Description: In this late-start Tier I Seniors Council sponsored LDRD, we have designed, simulated, microfabricated, packaged, and tested ion traps to extend the current quantum simulation capabilities of macro-ion traps to tens of ions in one and two dimensions in monolithically microfabricated, micrometer-scale MEMS-based ion traps. Such traps are being microfabricated and packaged at Sandia's MESA facility in a unique tungsten MEMS process that has already produced arrays of millions of micron-sized cylindrical ion traps for mass spectroscopy applications. We define and discuss the motivation for quantum simulation using trapped ions; show the results of efforts in designing, simulating, and microfabricating tungsten-based MEMS ion traps at Sandia's MESA facility; and describe in some detail our development of a custom ion-trap chip-packaging technology that enables the implementation of these devices in quantum physics experiments.
Date: November 1, 2006
Creator: Berkeland, Dana J. (Los Alamos National Laboratory); Blain, Matthew Glenn & Jokiel, Bernhard, Jr.
Partner: UNT Libraries Government Documents Department

Accelerating Network Traffic Analytics Using Query-Driven Visualization

Description: Realizing operational analytics solutions, where large and complex data must be analyzed in a time-critical fashion, entails integrating many different types of technology. This paper focuses on an interdisciplinary combination of scientific data management and visualization/analysis technologies targeted at reducing the time required for data filtering, querying, hypothesis testing, and knowledge discovery in the domain of network connection data analysis. We show that compressed bitmap indexing can quickly answer queries in an interactive visual data analysis application, and compare its performance with two alternatives for serial and parallel filtering/querying on 2.5 billion records of network connection data collected over a period of 42 weeks. Our approach to visual network connection data exploration centers on two primary factors: interactive, ad hoc, multiresolution query formulation and execution over n dimensions, and visual display of the n-dimensional histogram results. This combination is applied in a case study to detect a distributed network scan and then identify the set of remote hosts participating in the attack. Our approach is sufficiently general to be applied to a diverse set of data understanding problems, as well as used in conjunction with a diverse set of analysis and visualization tools.
Date: July 29, 2006
Creator: Bethel, E. Wes; Campbell, Scott; Dart, Eli; Stockinger, Kurt & Wu, Kesheng
Partner: UNT Libraries Government Documents Department
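
The bitmap-indexing idea at the core of the abstract above can be illustrated in a few lines. This is a hedged sketch only: the paper's index is compressed and far more scalable, and the record fields (`dst_port`, `state`) and data here are hypothetical; Python integers stand in for bitsets.

```python
# Minimal uncompressed bitmap index over categorical fields of
# network-connection records, with conjunctive (AND) queries answered
# by bitwise operations instead of scanning every record.

from collections import defaultdict

def build_bitmap_index(records, field):
    """Map each distinct value of `field` to a bitmap of matching row ids."""
    index = defaultdict(int)
    for row_id, rec in enumerate(records):
        index[rec[field]] |= 1 << row_id   # set bit `row_id`
    return index

def query(index_a, value_a, index_b, value_b):
    """Rows where field_a == value_a AND field_b == value_b."""
    bitmap = index_a.get(value_a, 0) & index_b.get(value_b, 0)
    rows, i = [], 0
    while bitmap:                          # decode set bits into row ids
        if bitmap & 1:
            rows.append(i)
        bitmap >>= 1
        i += 1
    return rows

records = [
    {"dst_port": 22, "state": "rejected"},
    {"dst_port": 80, "state": "established"},
    {"dst_port": 22, "state": "rejected"},
]
by_port = build_bitmap_index(records, "dst_port")
by_state = build_bitmap_index(records, "state")
print(query(by_port, 22, by_state, "rejected"))  # -> [0, 2]
```

The appeal for interactive analysis is that each additional query predicate costs one bitwise AND over precomputed bitmaps, which is how histogram-style drill-downs over billions of records stay responsive.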

Uranium Hydrogeochemical and Stream Sediment Reconnaissance Data Release for the Cody NTMS Quadrangle, Wyoming, Including Concentrations of Forty-Two Additional Elements

Description: "This report contains data for samples collected during a geochemical survey for uranium in the Cody National Topographic Map Series (NTMS) quadrangle, Wyoming, by the Los Alamos Scientific Laboratory (LASL) as part of the nationwide Hydrogeochemical and Stream Sediment Reconnaissance (HSSR). [...] Totals of 627 water and 1482 sediment samples were collected from 1529 locations at a nominal density of one location per 10 km². Water samples were collected from streams, springs, and wells, and sediment samples were collected from streams and springs. Most samples were collected by two private contractors in the summers of 1976 and 1977" (p. 1).
Date: August 1980
Creator: Bolivar, Stephen L.
Partner: UNT Libraries Government Documents Department

Uranium Hydrogeochemical and Stream Sediment Reconnaissance of the Bozeman NTMS Quadrangle, Montana

Description: "This report describes work done in the Bozeman, Montana National Topographic Map Series (NTMS) quadrangle (1:250,000 scale) by the Hydrogeochemical and Stream Sediment Reconnaissance (HSSR). The HSSR...is designed to identify areas having higher than normal concentrations of uranium in ground waters, surface waters, and water-transported sediments" (p. 1). In this project, 1251 water and 1536 sediment samples were collected from 1586 locations to test for uranium levels.
Date: November 1978
Creator: Bolivar, Stephen L.
Partner: UNT Libraries Government Documents Department

Uranium Hydrogeochemical and Stream Sediment Reconnaissance of the Montrose NTMS Quadrangle, Colorado, Including Concentrations of Forty-Three Additional Elements

Description: Report of uranium findings from a reconnaissance of the Montrose NTMS quadrangle, based on water and sediment samples collected from streams, springs, and wells.
Date: July 1979
Creator: Broxton, David E.; Morris, Wayne A. & Bolivar, Stephen L.
Partner: UNT Libraries Government Documents Department

Low-Level Radioactive Waste Management at the Nevada Test Site - Year 2000 Current Status

Description: The performance objectives of the Department of Energy's low-level radioactive waste disposal facilities at the Nevada Test Site transcend those of any other radioactive waste disposal site in the United States. This expanded paper describes the technical attributes of the facilities and their present and future disposal capacities and capabilities, and includes a description of the process from waste approval to final disposition. The paper also summarizes the current status of the waste disposal operations.
Date: August 6, 1999
Creator: Bruce D. Becker, Bechtel Nevada; Bruce M. Crowe, Los Alamos National Laboratory; Carl P. Gertz, DOE Nevada & Wendy A. Clayton, DOE Nevada
Partner: UNT Libraries Government Documents Department

Low-Level Radioactive Waste Management at the Nevada Test Site - Current Status

Description: The performance objective of the Department of Energy's Low-Level Radioactive Waste disposal facility at the Nevada Test Site transcends those of any other radioactive waste disposal site in the United States. This paper describes the technical attributes of the facility, present and future capacities and capabilities, and provides a description of the process from waste approval to final disposition. The paper also summarizes the current status of the waste disposal operations.
Date: February 1, 1999
Creator: Bruce D. Becker, Bechtel Nevada; Bruce M. Crowe, Los Alamos National Laboratory; Carl P. Gertz, DOE Nevada Operations Office & Wendy A. Clayton, DOE Nevada Operations Office
Partner: UNT Libraries Government Documents Department

Building A Universal Nuclear Energy Density Functional (UNEDF)

Description: During the period of Dec. 1 2006 – Jun. 30, 2012, the UNEDF collaboration carried out a comprehensive study of all nuclei, based on the most accurate knowledge of the strong nuclear interaction, the most reliable theoretical approaches, the most advanced algorithms, and extensive computational resources, with a view towards scaling to the petaflop platforms and beyond. The long-term vision initiated with UNEDF is to arrive at a comprehensive, quantitative, and unified description of nuclei and their reactions, grounded in the fundamental interactions between the constituent nucleons. We seek to replace current phenomenological models of nuclear structure and reactions with a well-founded microscopic theory that delivers maximum predictive power with well-quantified uncertainties. Specifically, the mission of this project has been three-fold: first, to find an optimal energy density functional (EDF) using all our knowledge of the nucleonic Hamiltonian and basic nuclear properties; second, to apply the EDF theory and its extensions to validate the functional using all the available relevant nuclear structure and reaction data; third, to apply the validated theory to properties of interest that cannot be measured, in particular the properties needed for reaction theory. The main physics areas of UNEDF, defined at the beginning of the project, were: ab initio structure; ab initio functionals; DFT applications; DFT extensions; reactions.
Date: September 30, 2012
Creator: Carlson, Joe, Los Alamos National Laboratory, Los Alamos, NM; Furnstahl, Dick, Ohio State University, Columbus, OH; Horoi, Mihai, Central Michigan University, Mount Pleasant, MI; Lusk, Rusty, Argonne National Laboratory, Argonne, IL; Nazarewicz, Witek, University of Tennessee, Knoxville, TN; Ng, Esmond, Berkeley National Laboratory, Berkeley, CA et al.
Partner: UNT Libraries Government Documents Department

Status and Plans for the National Spherical Torus Experimental Research Facility

Description: An overview of the research capabilities and the future plans on the MA-class National Spherical Torus Experiment (NSTX) at Princeton is presented. NSTX research is exploring the scientific benefits of modifying the field line structure from that in more conventional aspect ratio devices, such as the tokamak. The relevant scientific issues pursued on NSTX include energy confinement, MHD stability at high beta, non-inductive sustainment, solenoid-free start-up, and power and particle handling. In support of the NSTX research goal, research tools are being developed by the NSTX team. In the context of the fusion energy development path being formulated in the US, an ST-based Component Test Facility (CTF) and, ultimately a high beta Demo device based on the ST, are being considered. For these, it is essential to develop high performance (high beta and high confinement), steady-state (non-inductively driven) ST operational scenarios and an efficient solenoid-free start-up concept. We will also briefly describe the Next-Step-ST (NSST) device being designed to address these issues in fusion-relevant plasma conditions.
Date: July 27, 2005
Creator: Columbia University
Partner: UNT Libraries Government Documents Department

Quantifying reliability uncertainty : a proof of concept.

Description: This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
Date: October 1, 2009
Creator: Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F. et al.
Partner: UNT Libraries Government Documents Department
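
A hedged sketch of the Bayesian side of the approach the abstract above describes, for a single go/no-go component with a conjugate Beta prior. The prior parameters and test counts below are illustrative assumptions, not values from the paper; the paper's observation about zero-failure components appears here as visible sensitivity of the answer to the prior choice.

```python
# Beta-binomial model for go/no-go reliability data: with a Beta(a, b)
# prior on reliability and s successes in n trials, the posterior is
# Beta(a + s, b + n - s). A lower credible bound is approximated by
# Monte Carlo sampling from that posterior (stdlib only).

import random

def posterior_reliability(successes, trials, a=1.0, b=1.0,
                          n_draws=100_000, seed=0):
    """Return (posterior mean, approximate 5th-percentile lower bound)."""
    failures = trials - successes
    a_post, b_post = a + successes, b + failures
    mean = a_post / (a_post + b_post)
    rng = random.Random(seed)
    draws = sorted(rng.betavariate(a_post, b_post) for _ in range(n_draws))
    lower_5th = draws[int(0.05 * n_draws)]
    return mean, lower_5th

# A component with zero failures in few tests: the result depends
# noticeably on the prior, the sensitivity the abstract highlights.
for a, b in [(1.0, 1.0), (0.5, 0.5)]:
    mean, lo = posterior_reliability(successes=10, trials=10, a=a, b=b)
    print(f"prior Beta({a},{b}): mean={mean:.3f}, 5% lower bound={lo:.3f}")
```

Extending this to a mixed series/parallel system, as in the paper, would combine per-component posterior draws through the system's structure function rather than analyzing components in isolation.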

Numerical tools for atomistic simulations.

Description: The final report for a Laboratory Directed Research and Development project entitled 'Parallel Atomistic Computing for Failure Analysis of Micromachines' is presented. In this project, atomistic algorithms for parallel computers were developed to assist in quantification of microstructure-property relations related to weapon micro-components. With these and other serial computing tools, we are performing atomistic simulations of various sizes, geometries, materials, and boundary conditions. These tools provide the capability to handle the different size-scale effects required to predict failure. Nonlocal continuum models have been proposed to address this problem; however, they are phenomenological in nature and are difficult to validate for micro-scale components. Our goal is to separately quantify damage nucleation, growth, and coalescence mechanisms to provide a basis for macro-scale continuum models that will be used for micromachine design. Because micro-component experiments are difficult, a systematic computational study that employs Monte Carlo methods, molecular statics, and molecular dynamics (EAM and MEAM) simulations to compute continuum quantities will provide mechanism-property relations associated with the following parameters: specimen size, number of grains, crystal orientation, strain rates, temperature, defect nearest neighbor distance, void/crack size, chemical state, and stress state. This study will quantify size-scale effects from nanometers to microns in terms of damage progression and thus potentially allow for optimized micro-machine designs that are more reliable and have higher fidelity in terms of strength. In order to accomplish this task, several atomistic methods needed to be developed and evaluated to cover the range of defects, strain rates, temperatures, and sizes that a material may see in micro-machines.
Therefore we are providing a complete set of tools for large scale atomistic simulations that include pre-processing of realistic material configurations, processing under different environments, and post-processing with appropriate continuum quantities. By running simulations with these tools, we are able to determine size scale effects that ...
Date: January 1, 2004
Creator: Fang, H. (Mississippi State University); Gullett, Philip Michael; Slepoy, Alexander (Sandia National Laboratories, Albuquerque, NM); Horstemeyer, Mark F. (Mississippi State University); Baskes, Michael I. (Los Alamos National Laboratory, Los Alamos, NM); Wagner, Gregory John et al.
Partner: UNT Libraries Government Documents Department

Exploring pulse shaping for Z using graded-density impactors on gas guns (final report for LDRD project 79879).

Description: While isentropic compression experiment (ICE) techniques have proved useful in deducing the high-pressure compressibility of a wide range of materials, they have encountered difficulties where large-volume phase transitions exist. The present study sought to apply graded-density impactor methods for producing isentropic loading in planar impact experiments to selected such problems. Cerium was chosen due to its 20% compression between 0.7 and 1.0 GPa. A model was constructed based on limited earlier dynamic data and applied to the design of a suite of experiments. A capability for handling this material was installed. Two experiments were executed using shock/reload techniques with available samples, loading initially to near the gamma-alpha transition, then reloading. In addition, two graded-density impactor experiments were conducted with alumina. A method for interpreting ICE data was developed and validated; this uses a wavelet construction for the ramp wave and includes corrections for the "diffraction" of wavelets by releases or reloads reflected from the sample/window interface. Alternate methods for constructing graded-density impactors are discussed.
Date: October 1, 2005
Creator: Furnish, Michael David; Reinhart, William Dodd; Anderson, William W. (Los Alamos National Laboratory, Los Alamos, NM); Vogler, Tracy John; Hixson, Rob (Los Alamos National Laboratory, Los Alamos, NM) & Kipp, Marlin E.
Partner: UNT Libraries Government Documents Department

Radionuclide inventories : ORIGEN2.2 isotopic depletion calculation for high burnup low-enriched uranium and weapons-grade mixed-oxide pressurized-water reactor fuel assemblies.

Description: The Oak Ridge National Laboratory computer code, ORIGEN2.2 (CCC-371, 2002), was used to obtain the elemental composition of irradiated low-enriched uranium (LEU)/mixed-oxide (MOX) pressurized-water reactor fuel assemblies. Described in this report are the input parameters for the ORIGEN2.2 calculations. The rationale for performing the ORIGEN2.2 calculation was to generate inventories to be used to populate MELCOR radionuclide classes. Therefore the ORIGEN2.2 output was subsequently manipulated. The procedures performed in this data reduction process are also described herein. A listing of the ORIGEN2.2 input deck for two-cycle MOX is provided in the appendix. The final output from this data reduction process was three tables containing the radionuclide inventories for LEU/MOX in elemental form. Masses, thermal powers, and activities were reported for each category.
Date: April 1, 2010
Creator: Gauntt, Randall O.; Ross, Kyle W. (Los Alamos National Laboratory, Los Alamos, NM); Smith, James Dean & Longmire, Pamela
Partner: UNT Libraries Government Documents Department

Multi-unit operations considerations.

Description: Several nuclear weapons programs have or are pursuing the implementation of multi-unit operations for tasks such as disassembly and inspection, and rebuild. A multi-unit operation is interpreted to mean the execution of nuclear explosive operating procedures in a single facility by two separate teams of technicians. The institution of a multi-unit operations program requires careful consideration of the tools, resources, and environment provided to the technicians carrying out the work. Therefore, a systematic approach is necessary to produce safe, secure, and reliable processes. In order to facilitate development of a more comprehensive multi-unit operations program, the current work details categorized issues that should be addressed prior to the implementation of multi-unit operations in a given weapons program. The issues have been organized into the following categories: local organizational conditions, work process flow/material handling/workplace configuration, ambient environmental conditions, documented safety analysis, and training.
Date: September 1, 2005
Creator: Gilmore, Walter E. (Los Alamos National Laboratory, Los Alamos, NM); Bennett, Thomas C. (Lawrence Livermore National Laboratory, Albuquerque, NM) & Brannon, Nathan Gregory
Partner: UNT Libraries Government Documents Department

Surveillance metrics sensitivity study.

Description: In September 2009, a Tri-Lab team was formed to develop a set of metrics relating to the NNSA nuclear weapon surveillance program. The purpose of the metrics was to provide more quantitative and/or qualitative measures of how realized or non-realized surveillance activities affect our confidence in reporting reliability and assessing the stockpile. As a part of this effort, a statistical sub-team investigated various techniques and developed a complementary set of statistical metrics that could serve as a foundation for characterizing aspects of meeting the surveillance program objectives. The metrics are a combination of tolerance limit calculations and power calculations, intended to answer level-of-confidence questions with respect to the ability to detect certain undesirable behaviors (catastrophic defects, margin insufficiency defects, and deviations from a model). Note that the metrics are not intended to gauge product performance but rather the adequacy of surveillance. This report gives a short description of the four metric types that were explored and the results of a sensitivity study conducted to investigate their behavior for various inputs. The results of the sensitivity study can be used to set the risk parameters that specify the level of stockpile problem that the surveillance program should be addressing.
Date: September 1, 2011
Creator: Hamada, Michael S. (Los Alamos National Laboratory); Bierbaum, Rene Lynn & Robertson, Alix A. (Lawrence Livermore Laboratory)
Partner: UNT Libraries Government Documents Department
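
The power-calculation metric type the abstract above mentions can be illustrated with the simplest possible attribute-sampling model. This is an assumption-laden sketch, not the report's method: the defect fractions and power targets below are hypothetical, and the report's actual metrics also involve tolerance limits and model-deviation tests.

```python
# Simple detection-power model: if a fraction p of a large stockpile has
# a catastrophic defect and n units are sampled at random, the chance of
# catching at least one defective unit is 1 - (1 - p)^n.

def detection_power(defect_fraction, sample_size):
    """Probability that a random sample catches at least one defect."""
    return 1.0 - (1.0 - defect_fraction) ** sample_size

def sample_size_for_power(defect_fraction, power):
    """Smallest sample size whose detection probability reaches `power`."""
    n = 1
    while detection_power(defect_fraction, n) < power:
        n += 1
    return n

print(detection_power(0.10, 11))          # ~0.686 for a 10% defect rate
print(sample_size_for_power(0.10, 0.90))  # 22 samples for 90% power
```

Inverting the calculation, as in `sample_size_for_power`, is how such metrics connect a chosen risk parameter (the smallest defect fraction the program must catch, at a stated confidence) to a required surveillance sample size.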