Search Results

Accelerator production of tritium authorization basis strategy

Description: The Accelerator Production of Tritium (APT) project has proposed a strategy for developing the APT authorization basis and safety case based on Department of Energy (DOE) orders and fundamental requirements for safe operation. The strategy is viable regardless of whether the APT is regulated by DOE or by an external regulatory body. Currently, the operation of DOE facilities is authorized by DOE and regulated by DOE orders and regulations, while meeting the environmental protection requirements of the Environmental Protection Agency (EPA) and the states. In the spring of 1994, Congress proposed legislation and held hearings on requiring all DOE operations to be subject to external regulation. On January 25, 1995, DOE, with the support of the White House Council on Environmental Quality, created the Advisory Committee on External Regulation of Department of Energy Nuclear Safety. This committee divided its recommendations into three areas: (1) facility safety, (2) worker safety, and (3) environmental protection. For facility safety, the committee recommended external regulation of DOE nuclear facilities by either the Nuclear Regulatory Commission (NRC) or a restructured Defense Nuclear Facilities Safety Board (DNFSB). For worker safety, it recommended that the Occupational Safety and Health Administration (OSHA) regulate DOE nuclear facilities. For environmental protection, it did not recommend any change to regulation by the EPA and the states. If these recommendations are accepted, all DOE nuclear facilities will be affected to some extent.
Date: May 1, 1996
Creator: Miller, L.A.; Edwards, J. & Rose, S.
Partner: UNT Libraries Government Documents Department

Accelerator Production of Tritium Programmatic Environmental Impact Statement Input Submittal

Description: The Programmatic Environmental Impact Statement (PEIS) for Tritium Supply and Recycling considers several methods for the production of tritium, one of which is Accelerator Production of Tritium (APT). This report summarizes the design characteristics of APT, including the accelerator, target/blanket, tritium extraction facility, and balance of plant. Two spallation targets are considered: (1) a tungsten neutron-source target and (2) a lead neutron-source target. In the tungsten target concept, the neutrons are captured by circulating He-3, producing tritium; in the lead target concept, tritium is produced by neutron capture in Li-6 in a surrounding lithium-aluminum blanket. The report also provides information to support the PEIS, including construction and operational resource needs, waste generation, and potential routine and accidental releases of radioactive material. The focus of the report is on the impacts of a facility that would produce three-eighths of the baseline tritium production goal; some information is also provided on the impacts of APT facilities that would produce smaller quantities.
Date: February 1, 1996
Creator: Miller, L.A.; Greene, G.A. & Boyack, B.E.
Partner: UNT Libraries Government Documents Department

Preliminary environmental assessments of known geothermal resource areas in the United States

Description: The basic purpose of the Geothermal Overview Project is to identify, summarize, and assess the environmental issues of the top-priority known geothermal resource areas (KGRAs) from among the approximately 40 KGRAs currently identified by the Division of Geothermal Energy, DOE, as having high possibilities for commercial development. The project addresses issues pertaining to air quality, ecosystem quality, noise effects, geological effects, water quality, socioeconomic effects, and health effects. For each KGRA the following functions are accomplished: identification of key issues; inventory of all available data; analysis and assessment of available data; and identification of what additional information is required for adequate assessments. Studies at the Geysers-Calistoga KGRA in Northern California are used as an example.
Date: July 1, 1978
Creator: Phelps, P.L.; Ermak, D.L.; Anspaugh, L.R.; Jackson, C.D. & Miller, L.A.
Partner: UNT Libraries Government Documents Department

Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V&V guideline packages and procedures. Volume 5

Description: This report is the fifth volume in a series describing the results of the Expert System Verification and Validation (V&V) project, jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute, toward the objective of formulating guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and a description of those guidelines. The guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V needed, judged from an assessment of the system's complexity and its required integrity, which together define three classes. A V&V guideline package is provided for each combination of these three variables. The package specifies the recommended V&V methods and the order in which they should be administered, the assurances each method provides, the qualifications the V&V team needs to employ each method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, detailed step-by-step procedures are provided for 11 of the more important methods to ensure that they can be implemented correctly. The guidelines apply to conventional procedural software systems as well as to all kinds of AI systems.
Date: March 1, 1995
Creator: Mirsky, S. M.; Hayes, J. E. & Miller, L. A.
Partner: UNT Libraries Government Documents Department
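
The three-factor package lookup this abstract describes can be sketched as a small selection function. The stage, component, and class labels follow the abstract; the method lists themselves are illustrative placeholders, not the actual recommendations from the report.

```python
# Sketch of the three-factor V&V guideline-package lookup. Stage, component,
# and class labels come from the abstract; the package contents are
# hypothetical stand-ins, not the recommendations in Volume 7.

STAGES = ("requirements", "design", "implementation")
COMPONENTS = ("knowledge_base", "inference_engine", "conventional")
CLASSES = (1, 2, 3)  # Class 1 = most stringent V&V

def select_package(stage, component, vv_class):
    """Return the (hypothetical) ordered method list for one combination."""
    if stage not in STAGES or component not in COMPONENTS or vv_class not in CLASSES:
        raise ValueError("unknown stage/component/class")
    base = {
        "knowledge_base": ["requirements tracing", "knowledge-base inspection"],
        "inference_engine": ["structural analysis", "regression testing"],
        "conventional": ["code review", "dynamic testing"],
    }[component]
    # More stringent classes get additional methods appended.
    extra = ["formal verification"] if vv_class == 1 else []
    return [f"{stage}: {m}" for m in base + extra]

print(select_package("design", "knowledge_base", 1))
```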

Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

Description: This report presents the results of the Knowledge Base Certification activity of the expert system verification and validation (V&V) guideline development project, jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was a behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for, or against, the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each expert system. All participants were trained in and then used a sequence of four different V&V methods, selected as the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group).
Date: March 1995
Creator: Miller, L. A.; Hayes, J. E. & Mirsky, S. M.
Partner: UNT Libraries Government Documents Department

Safety of spallation sources in the accelerator production of tritium

Description: The Accelerator Production of Tritium (APT) project will employ a high-power proton accelerator to generate neutrons in a spallation target for the production of tritium. This paper describes major attributes of the safety of this facility.
Date: October 1, 1996
Creator: Edwards, J.; Lowrie, B.; Miller, L.A.; Rose, S.; Schweitzer, E. & Darby, J.
Partner: UNT Libraries Government Documents Department

Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

Description: This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.
Date: March 1, 1995
Creator: Mirsky, S. M.; Hayes, J. E. & Miller, L. A.
Partner: UNT Libraries Government Documents Department
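
The category preference ordering this abstract describes (PLANT first, then TEST, BASICS, CODE, LICENSING) can be sketched as a simple priority-driven selection. The `pool` contents and the slot-count interface are illustrative, not taken from the report's actual procedure.

```python
# Sketch of the validation-scenario preference ordering: categories are
# tried in the stated priority order, falling through to later categories
# only when earlier ones cannot fill the remaining slots. The pool below
# is a hypothetical example.

PRIORITY = ["PLANT", "TEST", "BASICS", "CODE", "LICENSING"]

def pick_scenarios(needed, available):
    """Fill up to `needed` scenario slots, preferring earlier categories."""
    chosen = []
    for cat in PRIORITY:
        for scenario in available.get(cat, []):
            if len(chosen) == needed:
                return chosen
            chosen.append((cat, scenario))
    return chosen

pool = {"TEST": ["startup transient"], "PLANT": ["recorded trip"],
        "LICENSING": ["FSAR chapter 15 event"]}
print(pick_scenarios(2, pool))
```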

Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

Description: This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.
Date: March 1, 1995
Creator: Miller, L. A.; Hayes, J. E. & Mirsky, S. M.
Partner: UNT Libraries Government Documents Department

Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

Description: This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems, because many of the components of AI systems are implemented in conventional procedural programming languages, leaving no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system except the knowledge base, which requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Some of these methods were evaluated both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.
Date: March 1, 1995
Creator: Mirsky, S. M.; Hayes, J. E. & Miller, L. A.
Partner: UNT Libraries Government Documents Department

Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

Description: This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.
Date: March 1, 1995
Creator: Mirsky, S. M.; Hayes, J. E. & Miller, L. A.
Partner: UNT Libraries Government Documents Department

RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

Description: This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
Date: April 1, 1998
Creator: Humphreys, S.L.; Miller, L.A.; Monroe, D.K. & Heames, T.J.
Partner: UNT Libraries Government Documents Department
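
The kind of calculation the abstract describes (time-dependent activity reduced by removal processes, converted to dose at a receptor) can be sketched for a single compartment with first-order removal. All parameter values are illustrative, not RADTRAD defaults or actual code models.

```python
import math

# Single-compartment sketch: airborne activity decays by a lumped
# first-order removal rate (sprays, natural deposition, filtered leakage),
# and inhalation dose is the time-integrated concentration times breathing
# rate and a dose conversion factor. Numbers below are hypothetical.

def dose(a0_bq, removal_per_h, volume_m3, hours, breathing_m3_per_h, dcf_sv_per_bq):
    """Integrated inhalation dose for activity A(t) = A0 * exp(-lambda*t)."""
    lam = removal_per_h
    # Time-integrated concentration (Bq*h/m^3) over [0, hours].
    tic = a0_bq / volume_m3 * (1.0 - math.exp(-lam * hours)) / lam
    return tic * breathing_m3_per_h * dcf_sv_per_bq

d = dose(a0_bq=1e12, removal_per_h=1.0, volume_m3=5e4, hours=8.0,
         breathing_m3_per_h=3.5e-4 * 3600,  # 3.5e-4 m^3/s, a typical rate
         dcf_sv_per_bq=3e-8)
print(f"{d:.3g} Sv")
```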

Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

Description: This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project, jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage, after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak, sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with rule-based expert systems in particular. Four such static-testing methods were identified that had not been discovered in a comprehensive review of conventional V&V methods in an earlier task.
Date: March 1, 1995
Creator: Groundwater, E. H.; Miller, L. A. & Mirsky, S. M.
Partner: UNT Libraries Government Documents Department

Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

Description: By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for the phases of a development life cycle (requirements, design, and implementation), with the last category subdivided into static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease of use and four concerning the methods' power to detect defects. Based on these factors, two measures were developed to permit quantitative comparisons among methods: a Cost-Benefit metric and an Effectiveness metric. The Effectiveness metric was further refined to provide three different estimates for each method, depending on three classes of needed V&V stringency (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefit and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.
Date: March 1, 1995
Creator: Mirsky, S. M.; Groundwater, E. H.; Hayes, J. E. & Miller, L. A.
Partner: UNT Libraries Government Documents Department
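
The comparison scheme this abstract describes (eight ratings per method combined into two scores, then rank-ordering) can be sketched as follows. The combining formulas and ratings here are simple illustrative choices; the report's actual metrics are not reproduced.

```python
# Sketch of rating-factor comparison: each method gets four ease-of-use
# ratings and four defect-detection-power ratings (1-5 scales here),
# combined into a Cost-Benefit score and an Effectiveness score.
# Formulas and ratings are hypothetical.

def cost_benefit(ease, power):
    """Detection power per unit cost, where low ease-of-use means high cost."""
    assert len(ease) == 4 and len(power) == 4
    return sum(power) / sum(6 - e for e in ease)   # 6 - e: cost of low ease

def effectiveness(power, stringency_weight):
    """Mean power scaled by a class-dependent stringency weight."""
    return stringency_weight * sum(power) / 4.0

methods = {
    "code review":     ((4, 4, 3, 4), (3, 3, 2, 3)),   # (ease, power)
    "dynamic testing": ((3, 2, 3, 3), (4, 4, 3, 4)),
}
ranked = sorted(methods, key=lambda m: cost_benefit(*methods[m]), reverse=True)
print(ranked)
```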

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

Description: The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
Date: January 1, 1995
Creator: Harper, F. T.; Young, M. L. & Miller, L. A.
Partner: UNT Libraries Government Documents Department

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

Description: The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.
Date: January 1, 1995
Creator: Harper, F. T.; Young, M. L.; Miller, L.A.; Hora, S. C.; Lui, C. H.; Goossens, L. H. J. et al.
Partner: UNT Libraries Government Documents Department

Accident progression event tree analysis for postulated severe accidents at N Reactor

Description: A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be considered explicitly. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
Date: June 1, 1990
Creator: Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. (Sandia National Labs., Albuquerque, NM (USA)) & Medford, G.T. (Science Applications International Corp., Albuquerque, NM (USA))
Partner: UNT Libraries Government Documents Department
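
Latin Hypercube sampling, the uncertainty-sampling technique this abstract names, can be sketched in a few lines: each of n samples falls in a distinct one-of-n stratum of every variable, giving better coverage of the input space than simple random sampling for the same sample count. This is a generic sketch, not code from the study.

```python
import random

# Minimal Latin Hypercube sampling sketch: for each variable, shuffle the
# n strata independently, then draw one uniform point inside each assigned
# stratum.

def latin_hypercube(n_samples, n_vars, rng=None):
    """Return an n_samples x n_vars list of points in [0, 1)."""
    rng = rng or random.Random(0)
    strata = [list(range(n_samples)) for _ in range(n_vars)]
    for col in strata:
        rng.shuffle(col)              # independent stratum order per variable
    return [[(strata[v][i] + rng.random()) / n_samples
             for v in range(n_vars)]
            for i in range(n_samples)]

pts = latin_hypercube(5, 2)
# Each variable's 5 values occupy the 5 distinct fifths of [0, 1).
for v in range(2):
    assert sorted(int(p[v] * 5) for p in pts) == [0, 1, 2, 3, 4]
```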

Anisotropic displacement threshold energies in silicon by molecular dynamics simulations

Description: A combination of molecular dynamics simulations and theoretical modeling was used to examine the orientation-dependent threshold energies for displacement of silicon atoms from their lattice sites due to energetic particle collisions. These results are important for a detailed understanding of both radiation effects in silicon devices and beam-enhanced stimulation of molecular beam epitaxial growth. The molecular dynamics code developed for this study, which employs a Tersoff interaction potential, is described, as is the theoretical model that incorporates the symmetry of the crystal. Bulk displacement threshold energies were determined by the molecular dynamics code for four directions through the open face in the ⟨111⟩ direction. These values were then incorporated into the theoretical model for the average bulk displacement threshold energy. The average bulk displacement threshold energy was found to be 14.8 eV within 30° about ⟨111⟩ and 11.1 eV within 20° about ⟨100⟩.
Date: January 1, 1990
Creator: Miller, L.A.; Brice, D.K.; Picraux, S.T. (Sandia National Labs., Albuquerque, NM (USA)) & Prinja, A.K. (New Mexico Univ., Albuquerque, NM (USA). Dept. of Chemical and Nuclear Engineering)
Partner: UNT Libraries Government Documents Department
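
The quoted averages "within 30° about ⟨111⟩" are averages over a cone of directions. The standard ingredient for such an average, drawing directions uniformly in solid angle within a cone, can be sketched as below; rotating the cone onto ⟨111⟩ or ⟨100⟩ and averaging an angle-dependent threshold over the draws is the assumed procedure, not code from the paper.

```python
import math
import random

# Draw unit vectors uniformly in solid angle within a cone of half-angle
# theta_max about the +z axis: uniform in cos(theta) between cos(theta_max)
# and 1, uniform in azimuth phi.

def sample_cone_direction(theta_max_deg, rng):
    """Unit vector uniform in solid angle within a cone about +z."""
    cos_max = math.cos(math.radians(theta_max_deg))
    cos_t = 1.0 - rng.random() * (1.0 - cos_max)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = 2.0 * math.pi * rng.random()
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)

rng = random.Random(1)
dirs = [sample_cone_direction(30.0, rng) for _ in range(1000)]
# Every sampled direction lies within 30 degrees of +z.
assert min(v[2] for v in dirs) >= math.cos(math.radians(30.0)) - 1e-12
```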

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

Description: The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
Date: January 1, 1995
Creator: Harper, F. T.; Young, M. L.; Miller, L. A.; Hora, S. C.; Lui, C. H.; Goossens, L. H. J. et al.
Partner: UNT Libraries Government Documents Department

Summary of uncertainty analysis of dispersion and deposition modules of the MACCS and COSYMA consequence codes: A joint USNRC/CEC study

Description: This paper briefly describes an ongoing project designed to assess the uncertainty in offsite radiological consequence calculations of hypothetical accidents in commercial nuclear power plants. This project is supported jointly by the Commission of the European Communities (CEC) and the US Nuclear Regulatory Commission (USNRC). Both commissions have expressed an interest in assessing the uncertainty in consequence calculations used for risk assessments and regulatory purposes.
Date: October 1, 1993
Creator: Harper, F. T.; Miller, L. A.; Young, M. L.; Goossens, L. H. J.; Cooke, R. M.; Hora, S. C. et al.
Partner: UNT Libraries Government Documents Department