
Transducer Signal Noise Analysis for Sensor Authentication
International safeguards organizations charged with promoting the peaceful use of nuclear energy employ unattended and remote monitoring systems, supplemented with onsite inspections, to ensure nuclear materials are not diverted for weaponization. These systems are left unattended for periods of several months between inspections, during which physical security measures are the main deterrent against intentional monitoring system tampering. The information-gathering components are locked in secure and sealed rooms, but the sensor components (i.e., neutron and gamma detectors) are located throughout the plant in unsecured areas where sensor tampering could take place between inspections. Sensor tampering could allow the diversion of nuclear materials from the accepted and intended use to uses not consistent with the peaceful use of nuclear energy. A method and an apparatus are presented that address the detection of sensor tampering during the periods between inspections; they were developed at the Idaho National Laboratory (INL) for the Department of Energy (DOE) in support of the IAEA. The method is based on detailed analysis of the sensor noise floor after the sensor signal is removed. The apparatus consists of a 2.1” x 2.6” electronic circuit board containing all signal conditioning and processing components and a laptop computer running an application that acquires and stores the analysis results between inspection periods. The sensors require no modification and remain in their normal high-radiation zones. The apparatus interfaces with the sensor signal conductors through a simple pass-through connector at the normal sensor electronics interface package located in the already secure and sealed rooms; because of this location, the apparatus does not require hardening against the effects of radiation.
Presented is the apparatus design, the analysis method, and ...
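The noise-floor analysis described above can be sketched in a few lines. This is a hypothetical illustration, not the INL implementation: the function names, the plain-periodogram spectral estimate, the 50 Hz signal-band cutoff, and the log-ratio tamper score are all assumptions made for the example.

```python
import numpy as np

def noise_fingerprint(samples, fs, signal_cutoff_hz=50.0):
    """Power-spectral-density estimate of the residual noise floor,
    keeping only frequencies above the (assumed) sensor signal band."""
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(samples)) ** 2 / len(samples)
    keep = freqs > signal_cutoff_hz
    return freqs[keep], psd[keep]

def tamper_score(baseline_psd, current_psd):
    """Mean absolute log-ratio of current vs. baseline noise floor; a
    large score suggests the sensor or cabling has been altered."""
    return float(np.mean(np.abs(np.log10(current_psd / baseline_psd))))
```

In use, a baseline fingerprint recorded at inspection time would be compared against fingerprints acquired periodically between inspections, with an alarm threshold set from the observed scatter of untampered scores.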
Definition of Technology Readiness Levels for Transmutation Fuel Development
To quantitatively assess the maturity of a given technology, the Technology Readiness Level (TRL) process is used. The TRL process was developed and has been successfully used by the Department of Defense (DOD) for development and deployment of new technology and systems for defense applications. NASA has also successfully used the TRL process to develop and deploy new systems for space applications. Transmutation fuel development is a critical technology needed for closing the nuclear fuel cycle. Because the deployment of new nuclear fuel forms requires a lengthy and expensive research, development, and demonstration program, applying the TRL concept to the transmutation fuel development program is very useful as a management and tracking tool. This report provides a definition of the technology readiness level assessment process as defined for use in assessing nuclear fuel technology development for the Transuranic Fuel Development Campaign.
Review of Transmutation Fuel Studies
The technology demonstration element of the Global Nuclear Energy Partnership (GNEP) program is aimed at demonstrating the closure of the fuel cycle by destroying the transuranic (TRU) elements separated from spent nuclear fuel (SNF). Repeated recycling through fast reactors is used to burn the TRU initially separated from light-water reactor (LWR) spent nuclear fuel. For the initial technology demonstration, the preferred option to demonstrate the closed fuel cycle destruction of TRU materials is a sodium-cooled fast reactor (FR) used as a burner reactor. The sodium-cooled fast reactor represents the most mature fast reactor technology available today. This report provides a review of the current state of development of fuel systems relevant to the sodium-cooled fast reactor, as well as a review of research and development on TRU-metal alloy and TRU-oxide composition fuels. Experiments providing data supporting the understanding of minor actinide (MA)-bearing fuel systems are summarized and referenced.
Assessment of Startup Fuel Options for the GNEP Advanced Burner Reactor (ABR)
The Global Nuclear Energy Partnership (GNEP) includes a program element for the development and construction of an advanced sodium-cooled fast reactor to demonstrate the burning (transmutation) of significant quantities of minor actinides obtained from a separations process and fabricated into a transuranic-bearing fuel assembly. To demonstrate and qualify transuranic (TRU) fuel in a fast reactor, an Advanced Burner Reactor (ABR) prototype is needed. The ABR would necessarily be started up using conventional metal alloy or oxide (U or U, Pu) fuel. Startup fuel is needed for the first 2 to 4 core loads in the ABR. Following startup, a series of advanced TRU-bearing fuel assemblies will be irradiated in qualification lead test assemblies in the ABR. There are multiple options for this startup fuel. This report provides a description of the possible startup fuel options as well as possible fabrication alternatives available to the program in the current domestic and international facilities and infrastructure.
NOx emissions from a heavy-duty diesel engine were reduced by more than 90% and 80% using a full-scale ethanol-SCR system at space velocities of 21,000/h and 57,000/h, respectively. These results were achieved for catalyst temperatures between 360 and 400 °C and for C1:NOx ratios of 4-6. The SCR process appears to rapidly convert ethanol to acetaldehyde, which subsequently slipped past the catalyst at appreciable levels at a space velocity of 57,000/h. Ammonia and N2O were produced during conversion; the concentrations of each were higher for the low-space-velocity condition. However, the concentration of N2O did not exceed 10 ppm. In contrast to other catalyst technologies, NOx reduction appeared to be enhanced by initial catalyst aging, with the presumed mechanism being sulfate accumulation within the catalyst. A concept for utilizing ethanol (distilled from an E-diesel fuel) as the SCR reductant was demonstrated.
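The C1:NOx ratio quoted above counts reductant carbon atoms, so ethanol (C2H5OH) supplies two C1 units per molecule. A minimal sketch of the resulting reductant feed calculation (the function name and default ratio are illustrative, not taken from the study):

```python
def ethanol_flow_mol_per_h(nox_mol_per_h, c1_to_nox=5.0):
    """Ethanol feed (mol/h) needed to hold a target C1:NOx ratio.
    Each ethanol molecule contributes two C1 units, so the molar
    ethanol:NOx ratio is half the C1:NOx ratio."""
    return nox_mol_per_h * c1_to_nox / 2.0
```

For the reported C1:NOx range of 4-6, this corresponds to ethanol:NOx molar ratios of 2-3.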
Differences in the lung toxicity and bacterial mutagenicity of seven samples from gasoline and diesel vehicle emissions were reported previously [1]. Filter and vapor-phase semivolatile organic samples were collected from normal and high-emitter gasoline and diesel vehicles operated on chassis dynamometers on the Unified Driving Cycle, and the compositions of the samples were measured in detail. The two fractions of each sample were combined in their original mass collection ratios, and the toxicity of the seven samples was compared by measuring inflammation and tissue damage in rat lungs and mutagenicity in bacteria. There was good agreement among the toxicity response variables in ranking the samples and demonstrating a five-fold range of toxicity. The relationship between chemical composition and toxicity was analyzed by a combination of principal component analysis (PCA) and partial least squares regression (PLS, also known as projection to latent structures). The PCA/PLS analysis revealed the chemical constituents co-varying most strongly with toxicity and produced models predicting the relative toxicity of the samples with good accuracy. The results demonstrated the utility of the PCA/PLS approach, which is now being applied to additional samples, and provided a starting point for confirming the compounds that actually cause the effects.
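The reduce-then-regress idea behind the PCA/PLS analysis can be sketched numerically. For simplicity this hypothetical example uses principal-component regression (PCA via SVD, then least squares on the component scores) rather than full PLS, and all names and shapes are assumptions for illustration only.

```python
import numpy as np

def pcr_fit(X, y, n_components=2):
    """Principal-component regression: project composition data X
    (samples x constituents) onto its leading principal components,
    then regress toxicity y on the scores."""
    x_mean = X.mean(axis=0)
    Xc = X - x_mean
    # SVD rows of Vt are principal directions, ordered by variance.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # loadings (constituents x comps)
    T = Xc @ V                         # scores (samples x comps)
    coef, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    return x_mean, y.mean(), V, coef

def pcr_predict(model, Xnew):
    """Predict relative toxicity for new composition rows."""
    x_mean, y_mean, V, coef = model
    return (Xnew - x_mean) @ V @ coef + y_mean
```

Inspecting the loadings in `V` is the step that would reveal which constituents co-vary most strongly with the toxicity response.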
Bulletin of the Medical Department, Brookhaven National Laboratory (1960)
Bulletin of the Medical Department, Brookhaven National Laboratory (1961)
Building America Residential System Research Results: Achieving 30% Whole House Energy Savings Level in Marine Climates; January 2006 - December 2006
The Building America program conducts the system research required to reduce risks associated with the design and construction of homes that use an average of 30% to 90% less total energy for all residential energy uses than the Building America Research Benchmark, including research on homes that will use zero net energy on an annual basis. To measure the program's progress, annual research milestones have been established for five major climate regions in the United States. The system research activities required to reach each milestone take from 3 to 5 years to complete and include research in individual test houses, studies in pre-production prototypes, and research studies with lead builders that provide early examples that the specified energy savings level can be successfully achieved on a production basis. This report summarizes research results for the 30% energy savings level and demonstrates that lead builders can successfully provide 30% homes in the Marine Climate Region on a cost-neutral basis.
Building America Residential System Research Results: Achieving 30% Whole House Energy Savings Level in Mixed-Humid Climates; January 2006 - December 2006
The Building America program conducts the system research required to reduce risks associated with the design and construction of homes that use an average of 30% to 90% less total energy for all residential energy uses than the Building America Research Benchmark, including research on homes that will use zero net energy on an annual basis. To measure the program's progress, annual research milestones have been established for five major climate regions in the United States. The system research activities required to reach each milestone take from 3 to 5 years to complete and include research in individual test houses, studies in pre-production prototypes, and research studies with lead builders that provide early examples that the specified energy savings level can be successfully achieved on a production basis. This report summarizes research results for the 30% energy savings level and demonstrates that lead builders can successfully provide 30% homes in the Mixed-Humid Climate Region on a cost-neutral basis.
Building America Residential System Research Results: Achieving 30% Whole House Energy Savings Level in the Hot-Dry and Mixed-Dry Climates
The Building America program conducts the system research required to reduce risks associated with the design and construction of homes that use an average of 30% to 90% less total energy for all residential energy uses than the Building America Research Benchmark, including research on homes that will use zero net energy on an annual basis. To measure the program's progress, annual research milestones have been established for five major climate regions in the United States. The system research activities required to reach each milestone take from 3 to 5 years to complete and include research in individual test houses, studies in pre-production prototypes, and research studies with lead builders that provide early examples that the specified energy savings level can be successfully achieved on a production basis. This report summarizes research results for the 30% energy savings level and demonstrates that lead builders can successfully provide 30% homes in the Hot-Dry/Mixed-Dry Climate Region on a cost-neutral basis.
Building America Residential System Research Results: Achieving 30% Whole House Energy Savings Level in Cold Climates
The Building America program conducts the system research required to reduce risks associated with the design and construction of homes that use an average of 30% to 90% less total energy for all residential energy uses than the Building America Research Benchmark, including research on homes that will use zero net energy on an annual basis. To measure the program's progress, annual research milestones have been established for five major climate regions in the United States. The system research activities required to reach each milestone take from 3 to 5 years to complete and include research in individual test houses, studies in pre-production prototypes, and research studies with lead builders that provide early examples that the specified energy savings level can be successfully achieved on a production basis. This report summarizes research results for the 30% energy savings level and demonstrates that lead builders can successfully provide 30% homes in Cold Climates on a cost-neutral basis.
Strategies for the Commercialization and Deployment of Greenhouse Gas Intensity-Reducing Technologies and Practices
New technologies will be a critical component--perhaps the critical component--of our efforts to tackle the related challenges of energy security, climate change, and air pollution, all the while maintaining a strong economy. But just developing new technologies is not enough. Our ability to accelerate the market penetration of clean energy, enabling, and other climate-related technologies will have a determining impact on our ability to slow, stop, and reverse the growth in greenhouse gas (GHG) emissions. Title XVI, Subtitle A, of the Energy Policy Act of 2005 (EPAct 2005) directs the Administration to report on its strategy to promote the commercialization and deployment (C&D) of GHG intensity-reducing technologies and practices. The Act also requests the Administration to prepare an inventory of climate-friendly technologies suitable for deployment and to identify the barriers and commercial risks facing advanced technologies. Because these issues are related, they are integrated here within a single report that we, representing the Committee on Climate Change Science and Technology Integration (CCCSTI), are pleased to provide the President, the Congress, and the public. Over the past eight years, the Administration of President George W. Bush has pursued a series of policies and measures aimed at encouraging the development and deployment of advanced technologies to reduce GHG emissions. This report highlights these policies and measures, discusses the barriers to each, and integrates them within a larger body of other extant policy. Taken together, more than 300 policies and measures described in this document may be viewed in conjunction with the U.S. Climate Change Technology Program's (CCTP's) Strategic Plan, published in September 2006, which focuses primarily on the role of advanced technology and associated research and development (R&D) for mitigating GHG emissions. 
The CCTP, a multi-agency technology planning and coordination program initiated by President Bush and subsequently authorized in EPAct 2005, is responsible ...
Independent Assessment of the Savannah River Site High-Level Waste Salt Disposition Alternatives Evaluation
This report presents the results of the Independent Project Evaluation (IPE) Team assessment of the Westinghouse Savannah River Company High-Level Waste Salt Disposition Systems Engineering (SE) Team's deliberations, evaluations, and selections. The Westinghouse Savannah River Company concluded in early 1998 that production goals and safety requirements for processing SRS HLW salt to remove Cs-137 could not be met in the existing In-Tank Precipitation Facility as currently configured for precipitation of cesium tetraphenylborate. The SE Team was chartered to evaluate and recommend an alternative (or alternatives) for processing the existing HLW salt to remove Cs-137. To replace the In-Tank Precipitation process, the Savannah River Site HLW Salt Disposition SE Team down-selected (October 1998) from 140 candidate separation technologies to two alternatives: Small-Tank Tetraphenylborate (TPB) Precipitation (primary alternative) and Crystalline Silicotitanate (CST) Nonelutable Ion Exchange (backup alternative). The IPE Team, commissioned by the Department of Energy, concurs that both alternatives are technically feasible and should meet all salt disposition requirements. However, the IPE Team judges that the qualitative criteria and judgments used in the SE Team's down-selection to a primary and a backup alternative do not clearly discriminate between the two alternatives. To properly choose between Small-Tank TPB and CST Ion Exchange for the primary alternative, the IPE Team suggests the following path forward: complete all essential R&D activities for both alternatives and formulate an appropriate set of quantitative decision criteria to be rigorously applied at the end of the R&D activities. Concurrent conceptual design activities should be limited to elements common to both alternatives.
Corrective action investigation plan for Corrective Action Unit 143: Area 25 contaminated waste dumps, Nevada Test Site, Nevada, Revision 1 (with Record of Technical Change No. 1 and 2)
This plan contains the US Department of Energy, Nevada Operations Office's approach to collecting the data necessary to evaluate corrective action alternatives appropriate for the closure of Corrective Action Unit (CAU) 143 under the Federal Facility Agreement and Consent Order. Corrective Action Unit 143 consists of two waste dumps used for the disposal of solid radioactive wastes. Contaminated Waste Dump No. 1 (CAS 25-23-09) was used for wastes generated at the Reactor Maintenance Assembly and Disassembly (R-MAD) Facility, and Contaminated Waste Dump No. 2 (CAS 25-23-03) was used for wastes generated at the Engine Maintenance Assembly and Disassembly (E-MAD) Facility. Both the R-MAD and E-MAD facilities are located in Area 25 of the Nevada Test Site. Based on site history, radionuclides are the primary constituents of concern and are located in these disposal areas; vertical and lateral migration of the radionuclides is unlikely, and if migration has occurred it will be limited to the soil beneath the contaminated waste dumps. The proposed investigation will involve a combination of cone penetrometer testing within and near the solid waste disposal dumps, field analysis for radionuclides and volatile organic compounds, and sample collection from the waste dumps and surrounding areas for off-site chemical, radiological, and geotechnical analyses. The results of this field investigation will support a defensible evaluation of corrective action alternatives in the corrective action decision document.
Record of Technical Change No.1 for ``Corrective Action Decision Document for Corrective Action Unit 240: Area 25 Vehicle Washdown, Nevada Test Site, Nevada''
This Record of Technical Change provides updates to the technical information provided in ``Corrective Action Decision Document for Corrective Action Unit 240: Area 25 Vehicle Washdown, Nevada Test Site, Nevada.''
Savannah River Ecology Laboratory, Annual Technical Progress Report of Ecological Research, June 30, 2003
No abstract prepared.
2006 Toxic Chemical Release Inventory Report for the Emergency Planning and Community Right-to-Know Act of 1986, Title III, Section 313
For reporting year 2006, Los Alamos National Laboratory (LANL or the Laboratory) submitted Form R reports for lead as required under the Emergency Planning and Community Right-to-Know Act (EPCRA) Section 313. No other EPCRA Section 313 chemicals were used in 2006 above the reportable thresholds. This document was prepared to provide a description of the evaluation of EPCRA Section 313 chemical use and threshold determinations for LANL for calendar year 2006, as well as to provide background information about data included on the Form R reports. Section 313 of EPCRA specifically requires facilities to submit a Toxic Chemical Release Inventory Report (Form R) to the U.S. Environmental Protection Agency (EPA) and state agencies if the owners and operators manufacture, process, or otherwise use any of the listed toxic chemicals above listed threshold quantities. EPA compiles this data in the Toxic Release Inventory database. Form R reports for each chemical over threshold quantities must be submitted on or before July 1 each year and must cover activities that occurred at the facility during the previous year. In 1999, EPA promulgated a final rule on persistent bioaccumulative toxics (PBTs). This rule added several chemicals to the EPCRA Section 313 list of toxic chemicals and established lower reporting thresholds for these and other PBT chemicals that were already reportable. These lower thresholds became applicable in reporting year 2000. In 2001, EPA expanded the PBT rule to include a lower reporting threshold for lead and lead compounds. Facilities that manufacture, process, or otherwise use more than 100 lb of lead or lead compounds must submit a Form R.
Community Assessment Tool for Public Health Emergencies Including Pandemic Influenza
The Community Assessment Tool (CAT) for Public Health Emergencies Including Pandemic Influenza (hereafter referred to as the CAT) was developed as a result of feedback received from several communities that participated in workshops focused on influenza pandemic planning and response. The 2008 through 2011 workshops were sponsored by the Centers for Disease Control and Prevention (CDC). Feedback during those workshops indicated the need for a tool that a community can use to assess its readiness for a disaster - readiness from a total healthcare perspective, not just hospitals, but the whole healthcare system. The CAT is intended to do just that - help strengthen existing preparedness plans by allowing the healthcare system and other agencies to work together during an influenza pandemic. It helps reveal each core agency partner's (sector's) capabilities and resources, and highlights cases in which the same vendors are being used for resource supplies (e.g., personal protective equipment [PPE] and oxygen) by multiple partners (e.g., public health departments, clinics, or hospitals). The CAT also addresses gaps in the community's capabilities or potential shortages in resources. The tool has been reviewed by a variety of key subject matter experts from federal, state, and local agencies and organizations, and it has been piloted with communities of different population sizes, ranging from large urban to small rural.
Acid Pit Stabilization Project (Volume 1 - Cold Testing) and (Volume 2 - Hot Testing)
During the summer and fall of Fiscal Year 1997, a Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Treatability Study was performed at the Idaho National Engineering and Environmental Laboratory. The study involved subsurface stabilization of a mixed waste contaminated soil site called the Acid Pit. This study represents the culmination of a successful technology development effort that spanned Fiscal Years 1994-1996. Research and development of the in situ grout stabilization technique was conducted. Hardware and implementation techniques are currently documented in a patent pending with the United States Patent and Trademark Office. The stabilization technique involved using jet grouting of an innovative grouting material to form a monolith out of the contamination zone. The monolith simultaneously provides a barrier to further contaminant migration and closes voids in the soil structure against further subsidence. This is accomplished by chemical incorporation of contaminants into less soluble species and achieving a general reduction in hydraulic conductivity within the monolith. The grout used for this study was TECT-HG, a relatively dense iron oxide-based cementitious grout. The treatability study involved cold testing followed by in situ stabilization of the Acid Pit. Volume 1 of this report discusses cold testing, performed as part of a ''Management Readiness Assessment'' in preparation for going hot. Volume 2 discusses the results of the hot Acid Pit Stabilization phase of this project. Drilling equipment was specifically rigged to reduce the spread of contamination, and all grouting was performed under a concrete block containing void space to absorb any grout returns. Data evaluation included examination of implementability of the grouting process and an evaluation of the contaminant spread during grouting. 
Following curing of the stabilized pit, cores were obtained and evaluated using the Toxicity Characteristic Leaching Procedure (TCLP) protocol for the main contaminant of concern, which was mercury. In addition, the ...
Performance assessment analyses unique to Department of Energy spent nuclear fuel
This paper describes the iterative process of grouping and performance assessment that has led to the current grouping of the U.S. Department of Energy (DOE) spent nuclear fuel (SNF). The unique sensitivity analyses that form the basis for incorporating DOE fuel into the total system performance assessment (TSPA) base case model are described. In addition, the chemistry that results from dissolution of DOE fuel and high level waste (HLW) glass in a failed co-disposal package, and the effects of disposal of selected DOE SNF in high integrity cans are presented.
Field Operations Program Chevrolet S-10 (Lead-Acid) Accelerated Reliability Testing - Final Report
This report summarizes the Accelerated Reliability testing of five lead-acid battery-equipped Chevrolet S-10 electric vehicles by the US Department of Energy's Field Operations Program and the Program's testing partners, Electric Transportation Applications (ETA) and Southern California Edison (SCE). ETA and SCE operated the S-10s with the goal of placing 25,000 miles on each vehicle within 1 year, providing an accelerated life-cycle analysis. The testing was performed according to established and published test procedures. The S-10s' average ranges were highest during summer months; changes in ambient temperature from night to day and from season to season impacted range by as much as 10 miles. Drivers also noted that excessive use of power during acceleration had a dramatic effect on vehicle range. The spirited performance of the S-10s created a great temptation for inexperienced electric vehicle drivers to ''have a good time'' and fully utilize the S-10's acceleration capability. The price of injudicious use of power is greatly reduced range and a long-term reduction in battery life. The range using full-power accelerations followed by rapid deceleration in city driving has been 20 miles or less.
Gamma-Ray Spectrometric Characterization of Overpacked CC104/107 RH-TRU Wastes: Surrogate Tests
Development of the gamma-ray spectrometric technique termed GSAK (Gamma-Ray Spectrometry with Acceptable Knowledge) for the characterization of CC104/107 remote-handled transuranic (RH-TRU) wastes continued this year. Proof-of-principle measurements have been completed on the surrogate RH-TRU waste drums configured earlier this year. The GSAK technique uses conventional gamma-ray spectrometry to quantify the detectable fission product content of overpacked RH-TRU drums. These results are then coupled with the inventory report to characterize the waste drum content. The inventory report is based on process knowledge of the waste drum loading and calculations of the isotopic distribution in the spent fuel examined to generate the drummed wastes. Three RH-TRU surrogate drums were configured with encapsulated EBR-II driver fuel rod segments arranged in the surrogate drum assemblies. Segment-specific inventory calculations initially specified the radionuclide content of the fuel segments and thus the surrogate drums. Radiochemical assays performed on representative fuel element segments identified a problem in the accuracy of some of the fission and activation product inventory values and provided a basis for adjustment of the specified surrogate drum inventories. The three waste drum surrogates, contained within their 8.9 cm (3.5 inch) thick steel overpacks, were analyzed by gamma-ray spectrometry at the TREAT facility at Argonne National Laboratory-West. Seven fission and activation product radionuclides (Mn-54, Co-60, Sb-125, Cs-134, Cs-137, Ce-144/Pr-144, and Eu-154) were reliably detected. The gamma-ray spectral accuracy was very good. In all cases, a two-sigma error bar constructed about the measured value included the actual drum activity.
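The quantification and two-sigma consistency check described above can be sketched with the standard gamma-spectrometry relation between a net full-energy peak area and activity. This is a simplified, hypothetical illustration: it ignores attenuation through the thick steel overpack, decay correction, and summing effects, all of which a real GSAK analysis must handle, and the function names are assumptions.

```python
def peak_activity(net_counts, live_time_s, efficiency, gamma_yield):
    """Activity (Bq) inferred from a net full-energy peak area:
    A = N / (t_live * eps * y), where eps is the absolute detection
    efficiency at the peak energy and y the gamma emission probability."""
    return net_counts / (live_time_s * efficiency * gamma_yield)

def within_two_sigma(measured_bq, sigma_bq, actual_bq):
    """True if the actual drum activity lies inside the two-sigma
    error bar constructed about the measured value."""
    return abs(measured_bq - actual_bq) <= 2.0 * sigma_bq
```

The abstract's closing statement corresponds to `within_two_sigma` returning True for every radionuclide in every surrogate drum.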
Geothermal Electrical Production CO2 Emissions Study
Emission of "greenhouse gases" into the environment has become an increasing concern. Deregulation of the electrical market will allow consumers to select power suppliers that utilize "green power." Geothermal power is classed as "green power" and has lower emissions of carbon dioxide per kilowatt-hour of electricity than even the cleanest of fossil fuels, natural gas. However, previously published estimates of carbon dioxide emissions are relatively old and need revision. This study estimates the average carbon dioxide emissions from geothermal and fossil fuel power plants as follows: geothermal, 0.18; coal, 2.13; petroleum, 1.56; and natural gas, 1.03 pounds of carbon dioxide per kilowatt-hour.
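The emission factors quoted in the abstract make the comparison easy to work out. A small sketch (the function and the choice of short tons are illustrative, not from the study):

```python
# Emission factors quoted in the study (lb CO2 per kWh).
EMISSIONS_LB_PER_KWH = {"geothermal": 0.18, "coal": 2.13,
                        "petroleum": 1.56, "natural gas": 1.03}

def annual_co2_savings_tons(kwh_per_year, displaced="coal"):
    """Short tons of CO2 avoided per year by generating kwh_per_year
    geothermally instead of from the displaced fossil source."""
    delta_lb = (EMISSIONS_LB_PER_KWH[displaced]
                - EMISSIONS_LB_PER_KWH["geothermal"])
    return kwh_per_year * delta_lb / 2000.0  # 2000 lb per short ton
```

For example, one million kWh of geothermal generation displacing coal avoids roughly 975 short tons of CO2 per year under these factors.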
Waste Management Planned for the Advanced Fuel Cycle Facility
The U.S. Department of Energy (DOE) Global Nuclear Energy Partnership (GNEP) program has been proposed to develop and employ advanced technologies to increase the proliferation resistance of spent nuclear fuels, recover and reuse nuclear fuel resources, and reduce the amount of waste requiring permanent geological disposal. In the initial GNEP fuel cycle concept, spent nuclear fuel is to be reprocessed to separate re-useable transuranic elements and uranium from waste fission products, for fabricating new fuel for fast reactors. The separated wastes would be converted to robust waste forms for disposal. The Advanced Fuel Cycle Facility (AFCF) is proposed by DOE for developing and demonstrating spent nuclear fuel recycling technologies and systems. The AFCF will include capabilities for receiving and reprocessing spent fuel and fabricating new nuclear fuel from the reprocessed spent fuel. Reprocessing and fuel fabrication activities will generate a variety of radioactive and mixed waste streams, some of them unique and unprecedented, and the GNEP vision challenges traditional U.S. radioactive waste policies and regulations. Product and waste streams have been identified during conceptual design. Waste treatment technologies have been proposed based on the characteristics of the waste streams and the expected requirements for the final waste forms. Results of AFCF operations will advance new technologies that will contribute to the safe and economical commercial spent fuel reprocessing facilities needed to meet the GNEP vision. As conceptual design work and research and development continue, the waste management strategies for the AFCF are expected to evolve as well.
Risk Management on the National Compact Stellarator Project (NCSX)
In its simplest form, risk management is a continuous assessment, from project start to completion, that identifies what can impact your project (i.e., what the risks are), which of these risks are important, and what strategies to implement to deal with these risks (both threats and opportunities). The National Compact Stellarator Experiment (NCSX) Project was a "first-of-a-kind" fusion experiment that was technically very challenging, primarily because of its complex component geometries and tight tolerances. Initial risk quantification approaches proved inadequate and contributed to the escalation of costs as the design evolved and construction started. After the Project was well into construction, a new risk management plan was adopted, based on successful Department of Energy (DOE) and industrial risk management precepts. This paper addresses the importance of effective risk management processes and lessons learned. It is of note that a steady reduction of risk was observed in the last six months of the project.
Giving Back: Collaborations with Others in Ecological Studies on the Nevada National Security Site
Formerly named the Nevada Test Site, the Nevada National Security Site (NNSS) was the historical site for nuclear weapons testing from the 1950s to the early 1990s. The site was renamed in 2010 to reflect the diversity of nuclear, energy, and homeland security activities now conducted there. Biological and ecological programs and research have been conducted on the site for decades to address the impacts of radiation and to take advantage of the relatively undisturbed and isolated lands for gathering basic information on the occurrence and distribution of native plants and animals. Currently, the Office of the Assistant Manager for Environmental Management of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) oversees the radiological biota monitoring and ecological compliance programs on the NNSS. The top priority of these programs is compliance with federal and state regulations. They focus on performing radiological dose assessments for the public who reside near the NNSS and for populations of plants and animals on the NNSS, and on protecting important species and habitat from direct impacts of mission activities. The NNSS serves as an invaluable outdoor laboratory. The geographic and ecological diversity of the site offers researchers many opportunities to study human influences on ecosystems. NNSA/NSO has pursued collaborations with outside agencies and organizations to conduct programs and studies that enhance radiological biota monitoring and ecosystem preservation when budgets are restrictive, as well as to provide valuable scientific information to the human health and natural resource communities at large. NNSA/NSO is using one current collaborative study to better assess the potential dose to the off-site public from the ingestion of game animals, the most realistic pathway for off-site public exposure at this time from radionuclide contamination on the NNSS. 
A second collaborative study is ...
Consequence analysis to support documented safety analysis requires the use of one or more years of representative meteorological data for atmospheric transport and dispersion calculations. At a minimum, the needed meteorological data for most atmospheric transport and dispersion models consist of hourly samples of wind speed and atmospheric stability class. Atmospheric stability is inferred from measured and/or observed meteorological data. Several methods exist to convert measured and observed meteorological data into atmospheric stability class data. In this paper, one year of meteorological data from a western Department of Energy (DOE) site is processed to determine atmospheric stability class using two methods. The method that is prescribed by the U.S. Nuclear Regulatory Commission (NRC) for supporting licensing of nuclear power plants makes use of measurements of vertical temperature difference to determine atmospheric stability. Another method that is preferred by the U.S. Environmental Protection Agency (EPA) relies upon measurements of incoming solar radiation, vertical temperature gradient, and wind speed. Consequences are calculated and compared using the two sets of processed meteorological data from these two methods as input to the MELCOR Accident Consequence Code System 2 (MACCS2) code.
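As an illustration of the delta-T approach, the sketch below maps a vertical temperature difference onto a Pasquill stability class A-G. The thresholds follow the values commonly cited from NRC Regulatory Guide 1.23; the tower heights and sample readings are illustrative assumptions, not data from the study.

```python
# Hedged sketch of the NRC delta-T stability classification: the
# temperature difference between two tower levels is scaled to a
# 100 m lapse rate and mapped to a Pasquill class. Class boundaries
# follow the commonly cited NRC Regulatory Guide 1.23 values
# (degC per 100 m); heights and readings below are assumptions.

def stability_class(t_upper_c, t_lower_c, dz_m=50.0):
    """Return Pasquill stability class A-G from a vertical delta-T."""
    lapse = (t_upper_c - t_lower_c) / dz_m * 100.0  # degC per 100 m
    bounds = [(-1.9, "A"), (-1.7, "B"), (-1.5, "C"),
              (-0.5, "D"), (1.5, "E"), (4.0, "F")]
    for upper, cls in bounds:
        if lapse < upper:
            return cls
    return "G"  # extremely stable

# Strongly unstable afternoon hour vs. a stable nighttime inversion
print(stability_class(14.0, 15.2))  # lapse -2.4 degC/100 m -> "A"
print(stability_class(16.0, 15.0))  # lapse +2.0 degC/100 m -> "F"
```

The EPA-preferred method adds solar radiation and wind speed to the classification, which is why the two processed data sets, and hence the MACCS2 consequences, can differ.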
For the Salt Disposition Integration Project (SDIP), postulated events in the new Salt Waste Processing Facility (SWPF) can result in spilling liquids that contain Cs-137 and organics onto cell floors. The parameters of concern are the maximum temperature of the fluid following a spill and the time required for the maximum fluid temperature to be reached. Control volume models of the various process cells have been developed using standard conduction and natural convection relationships. The calculations are performed using the Mathcad modeling software. The results are being used in Consolidated Hazards Analysis Planning (CHAP) to determine the controls that may be needed to mitigate the potential impact of liquids containing Cs-137 and flammable organics that spill onto cell floors. Model development techniques and the ease of making model changes within the Mathcad environment are discussed. The results indicate that certain fluid spills result in overheating of the fluid, but the times to reach steady state are several hundred hours, allowing spill cleanup without the use of expensive mitigation controls.
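The essential energy balance behind such a control-volume calculation can be sketched with a lumped-capacitance model: constant decay heat in, natural-convection loss out. All numeric values below (decay heat, film coefficient, pool mass and area) are invented placeholders, not SWPF parameters; the actual analysis used detailed Mathcad models of each cell.

```python
# Minimal lumped-capacitance sketch of a heated liquid spill:
#   dT/dt = (Q_decay - h*A*(T - T_amb)) / (m * cp)
# marched with explicit Euler. All inputs are illustrative assumptions.

def spill_temperature(q_decay_w, h_w_m2k, area_m2, mass_kg, cp_j_kgk,
                      t_ambient_c=25.0, dt_s=60.0, hours=500):
    """Return the pool temperature history (degC) over `hours`."""
    temps = [t_ambient_c]
    t = t_ambient_c
    for _ in range(int(hours * 3600 / dt_s)):
        t += dt_s * (q_decay_w - h_w_m2k * area_m2 * (t - t_ambient_c)) \
             / (mass_kg * cp_j_kgk)
        temps.append(t)
    return temps

temps = spill_temperature(q_decay_w=500.0, h_w_m2k=5.0, area_m2=10.0,
                          mass_kg=2000.0, cp_j_kgk=4186.0)
# Steady state is T_amb + Q/(h*A) = 25 + 500/50 = 35 degC; the time
# constant m*cp/(h*A) is ~46 h, i.e. steady state takes hundreds of hours
print(round(temps[-1], 1))
```

The long approach to steady state is the point of the abstract's conclusion: even a spill that eventually overheats leaves ample time for cleanup.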
The Saltstone Facility at Savannah River Site (SRS) mixes low-level radiological liquid waste with grout for permanent disposal as cement in vault cells. The grout mixture is poured into each cell in approximately 17 batches (8 to 10 hours in duration). The grout mixture contains ten flammable gases of concern that are released from the mixture into the cell. Prior to operations, simple parametric transient calculations were performed to develop batch parameters (including the schedule of batch pours) to support operational efficiency while ensuring that a flammable gas mixture does not develop in the cell vapor space. The analysis demonstrated that a nonflammable vapor space environment can be achieved, with workable operational constraints, without crediting the ventilation flow as a safety system control. Isopar L was identified as the primary flammable gas of concern. The transient calculations balanced inflows of the flammable gases into the vapor space with credited outflows from diurnal breathing through vent holes and from displacement by new grout pours and generated gases. Other important features of the analyses included identifying conditions that inhibited a well-mixed vapor space, the expected frequency and duration of such conditions, and the estimated level of stratification that could develop.
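The shape of such a transient calculation can be sketched with a well-mixed vapor-space balance: a constant flammable gas release rate against vent outflow, tracked against a lower flammability limit (LFL). Every rate, volume, and limit below is a hypothetical placeholder, not a Saltstone design value.

```python
# Well-mixed vapor-space sketch: dC/dt = (G - Q*C) / V, where C is the
# flammable gas volume fraction, G the generation rate, Q the vent
# outflow, and V the vapor-space volume. All numbers are assumptions.

def vapor_space_fraction(gen_m3_hr, vent_m3_hr, volume_m3, hours, dt_hr=0.1):
    """Return the history of flammable gas volume fraction."""
    c = 0.0
    history = []
    for _ in range(int(hours / dt_hr)):
        c += dt_hr * (gen_m3_hr - vent_m3_hr * c) / volume_m3
        history.append(c)
    return history

lfl = 0.006  # assumed lower flammability limit (volume fraction)
hist = vapor_space_fraction(gen_m3_hr=0.05, vent_m3_hr=20.0,
                            volume_m3=5000.0, hours=1000.0)
# Equilibrium fraction is G/Q = 0.05/20 = 0.0025, below the assumed LFL
print(hist[-1] < lfl)
```

The real analysis layered batch-pour scheduling, diurnal breathing, and stratification effects on top of this basic balance, without crediting ventilation as a safety control.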
Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
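The "simple dose calculation" referred to above is typically of the multiplicative form dose = (chi/Q) x release x breathing rate x dose conversion factor, so a change to any one input scales the result directly. The sketch below uses that standard form with invented numbers; none are taken from the paper.

```python
# Back-of-the-envelope inhalation dose sketch:
#   dose (Sv) = chi/Q (s/m^3) * release (Bq) * BR (m^3/s) * DCF (Sv/Bq)
# Every numeric value is an illustrative assumption, chosen only to
# show how the breathing-rate choice shifts the result.

def inhalation_dose_sv(chi_over_q_s_m3, release_bq, br_m3_s, dcf_sv_bq):
    """Dose (Sv) to a downwind receptor from an acute airborne release."""
    return chi_over_q_s_m3 * release_bq * br_m3_s * dcf_sv_bq

release = 1.0e12   # Bq released (assumed)
chi_q = 1.0e-4     # s/m^3 plume dilution at the receptor (assumed)
dcf = 5.0e-8       # Sv/Bq inhalation dose conversion factor (assumed)

# Light-activity vs. a conservative heavy-work breathing rate (assumed)
d_light = inhalation_dose_sv(chi_q, release, 2.5e-4, dcf)
d_heavy = inhalation_dose_sv(chi_q, release, 3.3e-4, dcf)
print(round(d_heavy / d_light, 2))  # ratio driven purely by breathing rate
```

Because the terms multiply, updating the meteorological basis, breathing rate, and DCF each contribute a proportional factor to the final consequence estimate, which is what the paper's comparison quantifies.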
This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a regional specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15, and 0.17 for the 1-corner, 2-corner, hybrid, and Green's-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL ...
Time-Resolved Hard X-Ray Spectrometer
Wire-array studies are being conducted at the SNL Z accelerator to maximize x-ray generation for inertial confinement fusion targets and high energy density physics experiments. An integral component of these studies is the characterization of the time-resolved spectral content of the x-rays. Due to potential spatial anisotropy in the emitted radiation, it is also critical to diagnose the time-evolved spectral content in a space-resolved manner. To accomplish these two measurement goals, we developed an x-ray spectrometer using a set of high-speed detectors (silicon PIN diodes) with a collimated field-of-view that converged on a 1-cm-diameter spot at the pinch axis. Spectral discrimination is achieved by placing high-Z absorbers in front of these detectors. We built two spectrometers to permit simultaneous different angular views of the emitted radiation. Spectral data have been acquired from recent Z shots for the radial and polar views. UNSPEC has been adapted to analyze and unfold the measured data to reconstruct the x-ray spectrum. The unfold operator code, UFO, is being adapted for a more comprehensive spectral unfolding treatment.
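The unfolding task performed by codes such as UNSPEC and UFO can be illustrated with a toy linear model: each filtered diode channel records a response-weighted sum over photon-energy bins, so the measurement is d = R s and the unfold recovers s. The 3x3 response matrix and spectrum below are invented numbers, far simpler than a real filtered-detector response.

```python
# Toy spectral unfold: detector signals d = R @ s, where R holds the
# relative sensitivity of each filtered PIN diode to each energy bin.
# R and s_true are invented; real unfolds use measured, energy-
# continuous responses with regularization and positivity constraints.
import numpy as np

# Rows: detectors behind successively thicker high-Z filters;
# columns: low / mid / high photon-energy bins (relative sensitivity).
R = np.array([[1.0, 0.8, 0.5],
              [0.2, 0.9, 0.7],
              [0.0, 0.3, 1.0]])
s_true = np.array([4.0, 2.0, 1.0])  # "true" spectrum (arbitrary units)
d = R @ s_true                       # what the diodes would record

# Least-squares unfold of the noiseless toy problem
s_unfolded, *_ = np.linalg.lstsq(R, d, rcond=None)
print(np.allclose(s_unfolded, s_true))
```

With noisy data and an ill-conditioned response, a bare least-squares inversion becomes unstable, which is why dedicated unfold operators like UFO exist.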
Large-Format X-Ray Pinhole Camera
National Security Technologies, LLC, has successfully implemented many scientific and engineering innovations in the new Large-Format Pinhole Camera (LFPHC), which have dramatically increased the detection sensitivity and reliability of the camera in harsh environments such as the Sandia National Laboratories Z facility. Quality improvements of the LFPHC have been demonstrated in its fielding at Z, where high-quality images were recorded. A major improvement was the development of a new, user-friendly camera back for the LFPHC that tolerates high radiation, electromagnetic interference, and mechanical shock. Key modifications resulted in improved detection sensitivity, spatial resolution, uniformity along the microchannel plate strip, and stability of the interframe timing and delay. Design considerations and improvements will be discussed.
Understanding the dynamic performance of microchannel plates in pulsed mode
The dynamic performance of a microchannel plate (MCP) is highly dependent on the high-voltage waveforms applied to it. Impedance mismatches in MCP detectors can significantly alter the waveforms on the MCP relative to the input pulses. High-voltage pulse waveforms launched onto the surface coatings of MCPs have historically been difficult and expensive to measure. Over the past few years, we have developed and tested probe-based techniques to measure voltage propagation on the surface of MCPs. Square and Gaussian pulses with widths ranging from 200 ps to 2 ns have been applied. We have investigated the effects of coating thickness, microstrip width, and open-ended versus terminated strips. These data provide a wealth of knowledge that is enabling a better understanding of images recorded with these devices. This presentation discusses a method for measuring voltage profiles on the surface of the MCP and presents Monte Carlo simulations of the optical gate profiles based on the measured waveforms. Excellent agreement in the optical gate profiles has been achieved between the simulations and experimental measurements using a short-pulse ultraviolet laser.
Megagauss Magnetic Field Sensors Based on Ag2Te
Pulsed power machines capable of producing tremendous energy face various diagnostic and characterization challenges. Such devices, which may produce 10-100 MA, have traditionally relied on Faraday rotation and Rogowski coil technology for time-varying current measurements. Faraday rotation requires a host of costly optical components, including fibers, polarizers, retarders, lasers, and detectors, as well as setup, alignment, and time-consuming post-processing to unwrap the time-dependent current signal. Rogowski coils face potential problems such as physical distortion of the sensor itself under the tremendous strain caused by magnetically induced pressures, which are proportional to the square of the magnetic field (B{sup 2}). Electrical breakdown in the intense field region is also a major concern. Other related challenges include, but are not limited to, bandwidth and inductance limitations and susceptibility to electromagnetic interference (EMI).
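The Rogowski-coil post-processing alluded to above is a numerical integration: the coil output is v(t) = M dI/dt, so the current is recovered as I(t) = (1/M) integral of v dt. The sketch below demonstrates this on a synthetic waveform; the mutual inductance and drive frequency are invented values, not machine parameters.

```python
# Standard Rogowski-coil reconstruction: cumulative trapezoidal
# integration of v(t)/M recovers I(t). M and the waveform are assumed.
import math

def integrate_rogowski(voltages, dt_s, mutual_h):
    """Cumulative trapezoidal integral of v(t)/M -> current I(t) in A."""
    current = [0.0]
    for v0, v1 in zip(voltages, voltages[1:]):
        current.append(current[-1] + 0.5 * (v0 + v1) * dt_s / mutual_h)
    return current

# Synthetic check: I(t) = 1 MA * sin(w t) implies v(t) = M*w*1e6*cos(w t)
M = 1.0e-9             # H, assumed coil mutual inductance
w = 2 * math.pi * 1e6  # rad/s, assumed 1 MHz drive
dt = 1.0e-9
ts = [i * dt for i in range(1001)]
v = [M * w * 1.0e6 * math.cos(w * t) for t in ts]
i_rec = integrate_rogowski(v, dt, M)
# At a quarter period (t = 250 ns) the reconstructed current is ~1 MA
print(round(i_rec[250] / 1.0e6, 3))
```

In practice the integration also accumulates offset drift and noise, which, together with the strain and breakdown issues above, motivates alternative sensors such as the Ag2Te devices of the title.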
Comprehensive Epidemiologic Data Resource (CEDR) (Poster)
This poster introduces the Comprehensive Epidemiologic Data Resource (CEDR), an electronic database with demographic, health outcome, and exposure information for over a million DOE nuclear plant and laboratory workers.
U.S. Department of Energy Human Subjects Research Database (HSRD) A model for internal oversight and external transparency
This poster introduces the Department of Energy (DOE) Human Subjects Research Database (HSRD), which contains information on all DOE research projects involving human subjects that are funded by DOE, are conducted in DOE facilities, are performed by DOE personnel, or include current or former DOE or contractor personnel.
Reliability Results of NERSC Systems
In order to address the needs of future scientific applications for storing and accessing large amounts of data in an efficient way, one needs to understand the limitations of current technologies and how they may cause system instability or unavailability. A number of factors can impact system availability, ranging from a facility-wide power outage to a single point of failure such as a network switch or global file system. In addition, individual component failures in a system can degrade the performance of that system. This paper focuses on analyzing both of these factors and their impacts on the computational and storage systems at NERSC. Component failure data presented in this report primarily cover disk drive failures in one of the computational systems and tape drive failures in HPSS. NERSC collected available component failure data and system-wide outages for its computational and storage systems over a six-year period and made them available to the HPC community through the Petascale Data Storage Institute.
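Reliability studies of this kind typically reduce an outage log to a few summary statistics: mean time between failures (MTBF), mean time to repair (MTTR), and steady-state availability A = MTBF / (MTBF + MTTR) = uptime / period. A minimal sketch, using invented outage records rather than NERSC data:

```python
# Availability arithmetic from a simple outage log. The outage
# durations and observation period below are invented examples.

def availability(outage_hours, period_hours):
    """Return (mtbf, mttr, availability) from a list of outage durations."""
    n = len(outage_hours)
    downtime = sum(outage_hours)
    uptime = period_hours - downtime
    mtbf = uptime / n      # mean up-interval between the n outages
    mttr = downtime / n    # mean repair duration
    return mtbf, mttr, uptime / period_hours

# One year of operation with four hypothetical outages
mtbf, mttr, a = availability([8.0, 2.0, 24.0, 6.0], period_hours=8760.0)
print(round(a * 100, 2))  # percent availability
```

Component-level failure rates (disk, tape) feed the same arithmetic at a finer grain, which is how single-component degradation is separated from facility-wide outages in the analysis.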
A novel catalytic synthesis gas oxidation process using molten carbonate salts supported on compatible fluidized iron oxide particles (supported-liquid-phase-catalyst (SLPC) fluidized bed process) was investigated. This process combines the advantages of large scale fluidized bed processing with molten salt bath oxidation. Molten salt catalysts can be supported within porous fluidized particles in order to improve mass transfer rates between the liquid catalysts and the reactant gases. Synthesis gas can be oxidized at reduced temperatures resulting in low NO{sub x} formation while trace sulfides and halides are captured in-situ. Hence, catalytic oxidation of synthesis gas can be carried out simultaneously with hot gas cleanup. Such SLPC fluidized bed processes are affected by inter-particle liquid capillary forces that may lead to agglomeration and de-fluidization of the bed. An understanding of the origin and strength of these forces is needed so that they can be overcome in practice. Process design is based on thermodynamic free energy minimization calculations that indicate the suitability of eutectic Na{sub 2}CO{sub 3}/K{sub 2}CO{sub 3} mixtures for capturing trace impurities in-situ (< 1 ppm SO{sub x} released) while minimizing the formation of NO{sub x}(< 10 ppm). Iron oxide has been identified as a preferred support material since it is non-reactive with sodium, is inexpensive, has high density (i.e. inertia), and can be obtained in various particle sizes and porosities. Force balance modeling has been used to design a surrogate ambient temperature system that is hydrodynamically similar to the real system, thus allowing complementary investigation of the governing fluidization hydrodynamics. The primary objective of this research was to understand the origin of and to quantify the liquid capillary interparticle forces affecting the molten carbonate SLPC fluidized bed process. 
Substantial theoretical and experimental exploratory results indicate process feasibility. The potential environmental gain from success is enormous, impacting all areas of the ...
RTE1, A Novel Regulator of Ethylene Receptor Function
RTE1 is a novel conserved gene found in both plants and animals. The main aims of this project were to: 1) examine Arabidopsis RTE1 function using genetic and cell biological analyses, and 2) determine whether the Arabidopsis RTH gene plays a role similar to that of RTE1 in ethylene signaling.
Multiphase Flow in Complex Fracture Apertures under a Wide Range of Flow Conditions
A better understanding of multiphase flow through fractures requires knowledge of the detailed physics of interfacial flows at the microscopic pore scale. The objective of our project was to develop tools for the simulation of such phenomena. Complementary work was performed by a group led by Dr. Paul Meakin of the Idaho National Engineering and Environmental Laboratory. Our focus was on the lattice-Boltzmann (LB) method. In particular, we studied both the statics and dynamics of contact lines where two fluids (wetting and non-wetting) meet solid boundaries. Previous work had noted deficiencies in the way LB methods simulate such interfaces. Our work resulted in significant algorithmic improvements that alleviated these deficiencies. As a result, we were able to study in detail the behavior of the dynamic contact angle in flow through capillary tubes. Our simulations revealed that our LB method reproduces the correct scaling of the dynamic contact angle with respect to velocity, viscosity, and surface tension, without specification of an artificial slip length. Further study allowed us to identify the microscopic origin of the dynamic contact angle in LB methods. These results serve to delineate the range of applicability of multiphase LB methods to flows through complex geometries.
Particle and Blood Cell Dynamics in Oscillatory Flows Final Report
Our aim has been to uncover fundamental aspects of the suspension and dislodgement of particles in wall-bounded oscillatory flows, at Reynolds numbers encompassing the situations found in rivers and near shores (and perhaps in some industrial processes). Our research tools are computational and our coverage of parameter space fairly broad. Computational means circumvent many complications that make measuring the dynamics of particles in a laboratory setting impractical, especially over the broad range of parameter space we report upon. The impact of this work on the geophysical problem of sedimentation is boosted considerably by the fact that the calculations can be considered ab initio, in the sense that little to no modeling is done in generating the dynamics of the particles and of the moving fluid: we use a three-dimensional Navier-Stokes solver along with straightforward boundary conditions. Hence, to the extent that Navier-Stokes describes an ideal incompressible isotropic Newtonian fluid, the calculations yield benchmark values for quantities such as the drag, buoyancy, and lift of particles in a highly controlled environment. Our approach has been to measure the lift, drag, and buoyancy of particles while considering progressively more complex physical configurations and physics.
Continuous Severe Plastic Deformation Processing of Aluminum Alloys
Metals with grain sizes smaller than 1 micrometer have received much attention in the past decade. These materials are classified, depending on grain size, as ultra fine grain (UFG) materials (grain sizes of 100 to 1000 nm) or nano-materials (grain sizes < 100 nm). This report addresses the production of bulk UFG metals through severe plastic deformation processing, and their subsequent use as stock material for further thermomechanical processing, such as forging. A number of severe plastic deformation (SPD) methods for producing bulk UFG metals have been developed since the early 1990s. The most promising of these for producing large stock suitable for forging is the equal channel angular extrusion or pressing (ECAE/P) process. This process introduces large shear strain in the work-piece by pushing it through a die consisting of two channels of the same cross-sectional shape that meet at an angle. Because the cross-sections of the two channels are the same, the extruded product can be re-inserted into the entrance channel and pushed through the die again. Repeated extrusion through the ECAE/P die accumulates sufficient strain to break down the microstructure and produce an ultra fine grain size. It is well known that metals with very fine grain sizes (< 10 micrometers) have higher strain rate sensitivity and greater elongation to failure at elevated temperature, exhibiting superplastic behavior. However, this superplastic behavior is usually manifested at high temperature (> half the melting temperature on the absolute scale) and very low strain rates (< 0.0001/s). UFG metals have been shown to exhibit superplastic characteristics at lower temperatures and higher strain rates, making the phenomenon more practical for manufacturing. This enables part unitization and the forging of more complex, net-shape parts. Laboratory studies have shown that this is particularly true ...
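The strain accumulated per ECAE/P pass is commonly estimated with the textbook Iwahashi relation, where Phi is the channel angle and Psi the outer corner angle of the die. This is a standard sketch of that relation, not a formula taken from the report itself; the die angles below are illustrative.

```python
# Iwahashi estimate of equivalent strain accumulated in ECAE/P:
#   eps_per_pass = (2*cot(Phi/2 + Psi/2) + Psi*csc(Phi/2 + Psi/2)) / sqrt(3)
# summed over passes. Die angles below are assumptions for illustration.
import math

def ecae_strain(n_passes, phi_deg=90.0, psi_deg=0.0):
    """Equivalent strain after n_passes through an ECAE/P die."""
    phi = math.radians(phi_deg)
    psi = math.radians(psi_deg)
    half = phi / 2 + psi / 2
    per_pass = (2 / math.tan(half) + psi / math.sin(half)) / math.sqrt(3)
    return n_passes * per_pass

# A sharp-cornered 90-degree die imparts ~1.15 strain per pass, so the
# heavy strains needed for sub-micrometer grains require several passes
print(round(ecae_strain(1), 2))
print(round(ecae_strain(4), 2))
```

Because the channel cross-sections match, each re-insertion repeats the same shear increment, which is what makes the accumulated strain simply additive in this estimate.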
Collaborative Research: Metabolic Engineering of E. coli Sugar-Utilization Regulatory Systems for the Consumption of Plant Biomass Sugars.
The overall objective of this project is to metabolically engineer the E. coli sugar-utilization regulatory systems (SURS) to utilize sugar mixtures obtained from plant biomass. Of particular relevance is the implementation of a metabolic engineering cycle aided by functional genomics and systems biology tools. Our findings will help in the establishment of a platform for the efficient production of fuels and chemicals from lignocellulosic sugars. Our research has improved the understanding of the role of SURS in regulating sugar utilization and several other cellular functions. For example, we discovered that Mlc, a global regulatory protein, regulates the utilization of xylose and demonstrated the existence of an important link between catabolite repression and respiratory/fermentative metabolism. The study of SURS mutants also revealed a connection between flagellar biosynthesis and catabolite repression. Several tools were also developed as part of this project. A novel tool (Elementary Network Decomposition, END) to help elucidate the network topology of regulatory systems was developed and its utility as a discovery tool was demonstrated by applying it to the SURS in E. coli. A novel method (and software) to estimate metabolic fluxes that uses labeling experiments and eliminates reliance on extracellular fluxes was also developed. Although not initially considered in the scope of this project, we have developed a novel and superior method for optimization of HPLC separation and applied it to the simultaneous quantification of different functionalities (sugars, organic acids, ethanol, etc.) present in our fermentation samples. Currently under development is a genetic network driven metabolic flux analysis framework to integrate transcriptional and flux data.
DOE-ER-46139-Phase II-Final-Report-Tritt-2011
This proposal emphasizes investigations of the thermal and electrical transport properties of new and novel solid-state materials, with the specific goal of achieving higher efficiency solid-state thermoelectric materials. This program will continue to build a very strong collaborative research effort between researchers at Oak Ridge National Laboratory (ORNL) and Clemson University. We propose three new faculty hires and major equipment purchases in order to further enhance our level of national recognition. We will be positioned to compete for major non-EPSCoR DOE and DOD funding (e.g., an NSF Materials Research Center) and able to address many other areas of DOE and national importance. Graduate and undergraduate students will be extensively involved in this project, spending significant time at ORNL and thus gaining important training and educational opportunities. We will also include an outreach program to bring in outside students and faculty. An External Advisory Board of distinguished scientists will provide oversight to the program.
Evaluation of Summary of Temperature Data Collected to Improve Emergence Timing Estimates for Chum and Chinook Salmon in the Lower Columbia River, 2004-2005 Progress Report, Appendix C - Temperature Data Compendium.
No abstract prepared.
Evaluation of Summary of Temperature Data Collected to Improve Emergence Timing Estimates for Chum and Fall Chinook Salmon in the Lower Columbia River, 2004-2005, Appendix C - Temperature Data Compendium.
No abstract prepared.
Magnetic Adsorption Method for the Treatment of Metal Contaminated Aqueous Waste
There have been many recent developments in separation methods used for treating radioactive and non-radioactive metal-bearing liquid wastes. These methods have included adsorption, ion exchange, solvent extraction, and other chemical and physical techniques. To date very few, if any, of these processes can provide a low-cost and environmentally benign solution. Recent research into the use of magnetite for wastewater treatment indicates its potential to address both cost and environmental drivers. A brief review of recent work using magnetite as a sorbent is presented, along with recent work performed in our laboratory using supported magnetite in the presence of an external magnetic field. The application to groundwater and other aqueous waste streams is discussed. Recent research has focused on supporting magnetite in an economical (as compared to magnetic polyamine-epichlorohydrin resin) and inert (non-reactive, chemically or otherwise) environment that promotes both adsorption and satisfactory flow characteristics.
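Batch sorption results for a sorbent such as magnetite are often summarized with a Langmuir isotherm, q = q_max K C / (1 + K C), coupled to a mass balance on the treated liquid. The sketch below shows that coupling; the isotherm parameters, feed concentration, and sorbent dose are invented for illustration, not measured values from this work.

```python
# Langmuir isotherm plus batch mass balance C0 = C + dose*q(C),
# solved by bisection (the left side is monotone in C). All numeric
# parameters are hypothetical placeholders.

def langmuir_uptake(c_mg_l, q_max_mg_g, k_l_mg):
    """Equilibrium metal uptake (mg metal per g sorbent)."""
    return q_max_mg_g * k_l_mg * c_mg_l / (1.0 + k_l_mg * c_mg_l)

def batch_removal(c0_mg_l, dose_g_l, q_max_mg_g, k_l_mg, tol=1e-9):
    """Fraction of metal removed in a batch contact at equilibrium."""
    lo, hi = 0.0, c0_mg_l
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid + dose_g_l * langmuir_uptake(mid, q_max_mg_g, k_l_mg) > c0_mg_l:
            hi = mid
        else:
            lo = mid
    c_eq = 0.5 * (lo + hi)
    return 1.0 - c_eq / c0_mg_l

# 10 mg/L feed treated with 2 g/L of sorbent (all values assumed)
print(round(batch_removal(10.0, 2.0, q_max_mg_g=8.0, k_l_mg=0.5), 3))
```

Supporting the magnetite in a porous, inert matrix changes the effective dose and contact kinetics but not the form of this equilibrium balance.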
Evaluation of Natural Attenuation as One Component of Chloroethene-Contaminated Groundwater Remediation
Test Area North (TAN) at the Idaho National Engineering and Environmental Laboratory (INEEL) is the site of a large trichloroethene (TCE) plume resulting from the historical injection of wastewater into the Snake River Plain Aquifer. The TAN Record of Decision (ROD) selected pump and treat as the final remedy and included a contingency for post-ROD treatability studies of alternative technologies. The technologies still under consideration are in-situ bioremediation, in-situ chemical oxidation, and natural attenuation. Both anaerobic and aerobic laboratory microcosm studies indicate the presence of microorganisms capable of chloroethene degradation. Field data indicate that TCE concentrations decrease relative to tritium and tetrachloroethene, suggesting that an as-yet-unknown process is contributing to natural attenuation of TCE. Several methods for analyzing the field data have been evaluated and important limitations identified. Early results from the continued evaluation of the three alternative technologies suggest that the combined approach of active remediation of the source area (in-situ bioremediation and/or chemical oxidation replacing or augmenting pump and treat) and natural attenuation within the dissolved-phase plume may be more cost- and schedule-effective than the base case of pump and treat.
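The tracer-ratio reasoning described above can be sketched simply: if a co-released constituent such as tritium is treated as a conservative tracer (ignoring, for this sketch, its own radioactive decay), then a decline in the TCE/tracer ratio beyond what dilution explains implies TCE-specific attenuation with a first-order rate constant k = ln(R0/R)/t. The concentrations and travel time below are invented example values.

```python
# First-order attenuation rate from the decline of a contaminant/
# tracer concentration ratio along a flow path. The simplification
# that the tracer is perfectly conservative, and all numbers, are
# assumptions for illustration.
import math

def attenuation_rate(ratio_upgradient, ratio_downgradient, travel_time_yr):
    """First-order rate constant (1/yr) from the TCE/tracer ratio decline."""
    return math.log(ratio_upgradient / ratio_downgradient) / travel_time_yr

# TCE/tritium ratio halves over an assumed 5-year travel time
k = attenuation_rate(1.0, 0.5, 5.0)
half_life = math.log(2) / k
print(round(k, 3), round(half_life, 1))
```

The "important limitations" noted in the abstract include exactly the assumptions this sketch glosses over: tracer decay, differing retardation, and uncertain travel times.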