Search Results

Improving Thermal Model Prediction Through Statistical Analysis of Irradiation and Post-Irradiation Data from AGR Experiments

Description: As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated Advanced Gas-cooled Reactor (AGR) tests, has been defined to support development and qualification of the fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TCs) embedded in graphite blocks, enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, neutron fast fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in the model calibration effort to reduce the inherent uncertainty of the simulation results. This paper focuses on the fuel temperature predicted by the ABAQUS code’s finite element-based thermal models. The work follows up on a previous study in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends observed in the measured data through statistical analysis may be caused either by deterioration of the measuring instruments or by physical mechanisms in the capsules that shifted the system thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5 revealed by the capsule metrology measurements in PIE helps attribute the reduction in TC readings to a physical cause rather than to TC drift. This in turn prompts modification of the thermal model to better fit the experimental data, thus helping to increase confidence in, in other words to reduce the uncertainties of, the thermal simulation results of the AGR-1 test.
Date: October 1, 2012
Creator: Pham, Dr. Binh T.; Hawkes, Grant L. & Einerson, Jeffrey J.
Partner: UNT Libraries Government Documents Department
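
The calibration idea described in the preceding entry (adjusting the thermal model so that predicted temperatures track thermocouple readings) can be sketched in a few lines. The snippet below is a hypothetical illustration only: the model function, TC values, and the single "gas gap" parameter are invented stand-ins, not the ABAQUS/NDMAS workflow of the report.

```python
# Hypothetical sketch: calibrate a single thermal-model parameter (an
# effective control-gas-gap width) so that predicted temperatures better
# match thermocouple (TC) readings. The model function and data below are
# placeholders, not the report's ABAQUS/NDMAS workflow.
import numpy as np
from scipy.optimize import minimize_scalar

tc_measured = np.array([1042.0, 1038.5, 1031.2, 1024.8])  # degC, made-up TC data

def predicted_temperature(gap_mm: float) -> np.ndarray:
    """Stand-in for a finite-element thermal model evaluated at the TC locations."""
    base = np.array([1050.0, 1046.0, 1039.0, 1033.0])      # nominal prediction, degC
    return base - 8.0 * (gap_mm - 0.5)                      # toy sensitivity to gap width

def misfit(gap_mm: float) -> float:
    residual = predicted_temperature(gap_mm) - tc_measured
    return float(np.sum(residual ** 2))                     # sum of squared residuals

result = minimize_scalar(misfit, bounds=(0.1, 2.0), method="bounded")
print(f"calibrated gap width: {result.x:.3f} mm, misfit: {result.fun:.2f}")
```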

A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

Description: Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
Date: October 1, 2006
Creator: Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ) & Storlie, Curtis B. (North Carolina State University, Raleigh, NC)
Partner: UNT Libraries Government Documents Department
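
As a hedged illustration of the kind of sampling-based propagation discussed in the entry above, the sketch below pushes each focal element of an evidence-theory input specification through a toy model by sampling and then accumulates belief and plausibility for an output interval. The focal elements, basic probability assignments, and model are assumptions for demonstration; real applications replace the interior sampling with optimization or more sophisticated sampling over each focal element.

```python
# Minimal sketch (not the authors' algorithm): propagate an evidence-theory
# input specification through a model by sampling within each focal element
# and summarize the output with belief/plausibility for a target interval.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical focal elements for a single uncertain input: (interval, BPA m)
focal_elements = [((0.0, 0.4), 0.3), ((0.2, 0.8), 0.5), ((0.6, 1.0), 0.2)]

def model(x: np.ndarray) -> np.ndarray:
    return 3.0 * x ** 2 + 1.0            # placeholder for an expensive simulation

def output_range(interval, n_samples=200):
    lo, hi = interval
    y = model(rng.uniform(lo, hi, n_samples))   # sampling stands in for optimization
    return y.min(), y.max()

target = (1.0, 2.5)                      # output interval of interest
belief = plausibility = 0.0
for interval, m in focal_elements:
    y_lo, y_hi = output_range(interval)
    if target[0] <= y_lo and y_hi <= target[1]:   # image contained in target
        belief += m
    if y_hi >= target[0] and y_lo <= target[1]:   # image intersects target
        plausibility += m

print(f"Bel = {belief:.2f}, Pl = {plausibility:.2f}")
```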

Incorporating uncertainty in RADTRAN 6.0 input files.

Description: Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the parameter shape and minimum and maximum of the distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provides installation instructions as well as a description and user guide for the uncertainty engine.
Date: February 1, 2010
Creator: Dennis, Matthew L.; Weiner, Ruth F. & Heames, Terence John (Alion Science and Technology)
Partner: UNT Libraries Government Documents Department
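
A generic sketch of the sampling step described above (drawing parameter values from assigned distributions and writing one input file per sample) is shown below. It is not the MELCOR Uncertainty Engine and does not reproduce RADTRAN's batch format; the parameter names, distributions, and template are hypothetical.

```python
# Generic illustration only: sample distributed input parameters and emit one
# input file per sample from a text template. The real workflow uses the
# MELCOR Uncertainty Engine to build RADTRAN batch files; the names and
# template below are hypothetical.
import numpy as np
from pathlib import Path

rng = np.random.default_rng(42)
n_samples = 10

samples = {
    "RELEASE_FRACTION": rng.triangular(1e-4, 5e-4, 1e-3, n_samples),
    "DEPOSITION_VELOCITY": rng.uniform(0.001, 0.01, n_samples),   # m/s
}

template = ("release_fraction = {RELEASE_FRACTION:.3e}\n"
            "deposition_velocity = {DEPOSITION_VELOCITY:.4f}\n")

out_dir = Path("radtran_cases")
out_dir.mkdir(exist_ok=True)
for i in range(n_samples):
    values = {name: col[i] for name, col in samples.items()}
    (out_dir / f"case_{i:03d}.inp").write_text(template.format(**values))
```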

Uncertainty and sensitivity analyses of ballast life-cycle cost and payback period

Description: The paper introduces an innovative methodology for evaluating the relative significance of energy-efficient technologies applied to fluorescent lamp ballasts. The method replaces point estimates of the life-cycle cost of the ballasts with uncertainty distributions reflecting the whole spectrum of possible costs and the assessed probability associated with each value. The results of the uncertainty and sensitivity analyses will help analysts reduce the effort spent on data collection and carry out the analysis more efficiently. These methods also enable policy makers to gain an insightful understanding of which efficient technology alternatives benefit or cost which fraction of consumers, given the explicit assumptions of the analysis.
Date: June 1, 2000
Creator: McMahon, James E.; Liu, Xiaomin; Turiel, Ike; Hakim, Sajid & Fisher, Diane
Partner: UNT Libraries Government Documents Department
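
The idea of replacing point estimates of life-cycle cost with distributions can be illustrated with a short Monte Carlo sketch. The cost components, distributions, and discounting choices below are assumptions for illustration, not the values or structure used in the paper.

```python
# Illustrative Monte Carlo sketch of replacing point estimates of ballast
# life-cycle cost (LCC) with distributions; component distributions and the
# discounting convention are assumed here, not taken from the report.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

price = rng.normal(25.0, 3.0, n)                  # ballast purchase price, $
annual_kwh = rng.uniform(250.0, 350.0, n)         # annual energy use per ballast
elec_price = rng.triangular(0.06, 0.08, 0.12, n)  # $/kWh
lifetime_years, discount_rate = 15, 0.05

# Present-value factor for a uniform annual series
pv_factor = (1 - (1 + discount_rate) ** -lifetime_years) / discount_rate
lcc = price + annual_kwh * elec_price * pv_factor

print(f"mean LCC ${lcc.mean():.0f}, 5th-95th percentile "
      f"${np.percentile(lcc, 5):.0f}-${np.percentile(lcc, 95):.0f}")
```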

Uncertainty Analysis of RELAP5-3D

Description: As world-wide energy consumption continues to increase, so does the demand for alternative energy sources such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter: it involves the training of operators and the design of the reactor, as well as equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied on best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break loss-of-coolant accident, as well as an analysis of a large-break loss-of-coolant accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL’s massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
Date: July 1, 2012
Creator: Gertman, Alexandra E & Mesina, Dr. George L
Partner: UNT Libraries Government Documents Department
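
The statistical step described above, correlating sampled input parameters with a computed figure of merit, can be sketched as follows. The parameter names, ranges, and the stand-in response function are hypothetical; in the actual study each sample corresponds to a generated RELAP5-3D input deck run on INL's cluster.

```python
# Hedged sketch of the statistical analysis step: correlate sampled input
# parameters with a computed figure of merit. "run_relap" is a placeholder
# for executing a generated RELAP5-3D deck and extracting a result
# (e.g., peak cladding temperature).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_runs = 50

inputs = {
    "break_area_mult": rng.uniform(0.9, 1.1, n_runs),
    "decay_heat_mult": rng.uniform(0.95, 1.05, n_runs),
    "chf_mult": rng.uniform(0.8, 1.2, n_runs),
}

def run_relap(break_area, decay_heat, chf):
    """Placeholder response standing in for a code run and data extraction."""
    return (900.0 + 120.0 * break_area + 250.0 * decay_heat
            - 80.0 * chf + rng.normal(0.0, 5.0))

pct = np.array([run_relap(a, d, c) for a, d, c in
                zip(inputs["break_area_mult"], inputs["decay_heat_mult"],
                    inputs["chf_mult"])])

for name, values in inputs.items():
    pearson = stats.pearsonr(values, pct)[0]
    spearman = stats.spearmanr(values, pct)[0]
    print(f"{name:16s}  Pearson {pearson:+.2f}  Spearman {spearman:+.2f}")
```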

Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

Description: This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.
Date: October 1, 2006
Creator: Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M. et al.
Partner: UNT Libraries Government Documents Department

Representation of analysis results involving aleatory and epistemic uncertainty.

Description: Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
Date: August 1, 2008
Creator: Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis & Sallaberry, Cedric J.
Partner: UNT Libraries Government Documents Department
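
A minimal double-loop sketch of the "family of CDFs" idea described above: each epistemic sample fixes the poorly known quantities, aleatory sampling under that fixed choice produces one empirical CDF, and repeating over epistemic samples produces the family. The toy model and distributions are assumptions for illustration only.

```python
# Double-loop sampling sketch: outer loop over epistemic values, inner loop
# over aleatory variability; each outer iteration yields one member of the
# family of CDFs (its CCDF is 1 - cdf).
import numpy as np

rng = np.random.default_rng(3)
n_epistemic, n_aleatory = 20, 1000
family = []

for _ in range(n_epistemic):
    threshold = rng.uniform(4.0, 6.0)            # epistemic: fixed but poorly known
    load = rng.lognormal(mean=1.0, sigma=0.3, size=n_aleatory)  # aleatory variability
    response = np.sort(load / threshold)
    cdf = np.arange(1, n_aleatory + 1) / n_aleatory
    family.append((response, cdf))               # one member of the family

exceed = [np.mean(resp > 1.0) for resp, _ in family]
print(f"{len(family)} CDFs in the family; "
      f"P(response > 1) spans [{min(exceed):.3f}, {max(exceed):.3f}]")
```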

Uncertainty Analysis for RELAP5-3D

Description: In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems during transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. When the inputs were varied over assumed probability density functions, the output parameters were shown to vary as well. Using such statistical tools as means, variances, and tolerance intervals, a picture of how uncertain the results are, based on the uncertainty of the inputs, has been obtained.
Date: August 1, 2011
Creator: Pawel, Aaron J. & Mesina, Dr. George L.
Partner: UNT Libraries Government Documents Department
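
The summary statistics mentioned above (means, variances, and tolerance intervals) can be sketched for a set of sampled code outputs as follows. The nonparametric 95/95 bound shown (the sample maximum with n = 59 runs) is a common Wilks-type choice in thermal-hydraulic uncertainty studies and is used here only as an assumed example, not necessarily the report's method.

```python
# Hedged illustration: mean, sample variance, and a nonparametric upper
# tolerance bound from sampled code outputs. With n = 59 runs the sample
# maximum covers the 95th percentile with at least 95% confidence
# (1 - 0.95**59 > 0.95), a common Wilks-type criterion.
import numpy as np

rng = np.random.default_rng(11)
n = 59
outputs = rng.normal(650.0, 20.0, n)     # stand-in for a code output, e.g. peak temp (K)

mean, variance = outputs.mean(), outputs.var(ddof=1)
tolerance_bound = outputs.max()          # 95/95 one-sided upper bound

print(f"mean {mean:.1f}, sample variance {variance:.1f}, "
      f"95/95 upper bound {tolerance_bound:.1f}")
print(f"coverage confidence: {1 - 0.95 ** n:.4f}")
```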

Experimental uncertainty estimation and statistics for data having interval uncertainty.

Description: This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
Date: May 1, 2007
Creator: Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York) & Hajagos, Janos (Applied Biomathematics, Setauket, New York)
Partner: UNT Libraries Government Documents Department
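
A small sketch of descriptive statistics for interval-valued data, in the spirit of the report (whose algorithms are more general and address computability for larger samples): the bounds on the mean are exact and cheap, while the variance envelope below is obtained by brute force over endpoint choices (upper bound, valid because variance is convex) and bounded convex minimization (lower bound), which is practical only for small samples.

```python
# Interval statistics sketch: exact interval mean, plus a variance envelope
# computed by vertex enumeration (maximum of a convex function over a box is
# attained at a vertex) and bounded convex minimization (minimum).
import itertools
import numpy as np
from scipy.optimize import minimize

data = [(1.0, 1.4), (2.1, 2.3), (1.8, 2.6), (0.9, 1.1)]   # measured intervals
lo = np.array([a for a, _ in data])
hi = np.array([b for _, b in data])

interval_mean = (lo.mean(), hi.mean())                     # exact bounds on the mean

def variance(x):
    return np.var(x, ddof=1)

var_max = max(variance(np.array(v)) for v in itertools.product(*data))
var_min = minimize(variance, x0=(lo + hi) / 2, bounds=data).fun

print(f"mean in [{interval_mean[0]:.3f}, {interval_mean[1]:.3f}]")
print(f"variance in [{var_min:.4f}, {var_max:.4f}]")
```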

Extended Forward Sensitivity Analysis for Uncertainty Quantification

Description: This report presents the forward sensitivity analysis method as a means for quantifying uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. In contrast to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivities are solved for as variables in the same simulation. This “glass box” method can generate sensitivity information similar to that of the “black box” approach with only a handful of runs to cover a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the solution of which is the sensitivity. The sensitivity of any output variable can then be obtained directly from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include time and spatial steps as special parameters so that the numerical errors can be quantified against other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty analysis. By knowing the relative sensitivity ...
Date: September 1, 2008
Creator: Zhao, Haihua & Mousseau, Vincent A.
Partner: UNT Libraries Government Documents Department
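
The forward ("glass box") sensitivity construction described above, differentiating the model with respect to a parameter to obtain an additional system of the same size, can be shown on a toy ODE. The example below is illustrative only and unrelated to the report's applications.

```python
# Minimal worked example of forward sensitivity analysis on a toy ODE:
#   du/dt = -p*u,  u(0) = u0;  the sensitivity s = du/dp satisfies
#   ds/dt = -u - p*s,  s(0) = 0,
# i.e., an additional system of the same size solved alongside the original.
import numpy as np
from scipy.integrate import solve_ivp

p, u0, t_end = 0.8, 2.0, 3.0

def augmented(t, y):
    u, s = y
    return [-p * u, -u - p * s]

sol = solve_ivp(augmented, (0.0, t_end), [u0, 0.0], rtol=1e-10, atol=1e-12)
u_T, s_T = sol.y[:, -1]

# Analytic check: u(T) = u0*exp(-p*T)  =>  du/dp = -T*u0*exp(-p*T)
print(f"forward sensitivity  du/dp = {s_T:+.6f}")
print(f"analytic             du/dp = {-t_end * u0 * np.exp(-p * t_end):+.6f}")
```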

Ideas underlying quantification of margins and uncertainties(QMU): a white paper.

Description: This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.
Date: September 1, 2006
Creator: Helton, Jon Craig; Trucano, Timothy Guy & Pilch, Martin M.
Partner: UNT Libraries Government Documents Department

Survey of sampling-based methods for uncertainty and sensitivity analysis.

Description: Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
Date: June 1, 2006
Creator: Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. & Storlie, Curt B.
Partner: UNT Libraries Government Documents Department
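
Two of the surveyed steps, sample generation by Latin hypercube sampling and a rank-based sensitivity measure, are sketched below. The test function and input ranges are placeholders; the survey itself covers many additional procedures (partial correlation, nonparametric regression, grid-based tests, variance decomposition, and others).

```python
# Sketch of Latin hypercube sample generation and a rank-based sensitivity
# measure (Spearman rank correlation). The test function is a placeholder
# for a real analysis.
import numpy as np
from scipy import stats
from scipy.stats import qmc

rng = np.random.default_rng(5)
n, names = 200, ["x1", "x2", "x3"]

sampler = qmc.LatinHypercube(d=len(names), seed=rng)
unit = sampler.random(n)                               # samples on [0, 1)^d
x = qmc.scale(unit, l_bounds=[0, 0, 0], u_bounds=[1, 2, 5])

y = x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.05 * x[:, 2] + rng.normal(0, 0.05, n)

for j, name in enumerate(names):
    rho = stats.spearmanr(x[:, j], y)[0]
    print(f"{name}: rank correlation {rho:+.2f}")
```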

ASSESSMENT OF UNCERTAINTY IN THE RADIATION DOSES FOR THE TECHA RIVER DOSIMETRY SYSTEM

Description: In order to provide more accurate and precise estimates of individual dose (and thus more precise estimates of radiation risk) for the members of the Extended Techa River Cohort (ETRC), a new dosimetric calculation system, the Techa River Dosimetry System-2009 (TRDS-2009), has been prepared. The deterministic version of the improved dosimetry system, TRDS-2009D, was essentially completed in April 2009. Recent developments in the evaluation of dose-response models in light of uncertain dose have highlighted the importance of different types of uncertainties in the development of individual dose estimates. These include uncertain parameters that may be either shared or unshared within the dosimetric cohort, as well as the nature of each uncertainty as aleatory or epistemic and either classical or Berkson. This report identifies the nature of the various input parameters and calculational methods incorporated in the Techa River Dosimetry System (based on the TRDS-2009D implementation), with the intention of preparing a stochastic version to estimate the uncertainties in the dose estimates. This report reviews the equations, databases, and input parameters, and then identifies the authors' interpretations of their general nature. It presents the approach selected so that the stochastic, Monte Carlo implementation of the dosimetry system, TRDS-2009MC, will provide useful information regarding the uncertainties of the doses.
Date: October 23, 2009
Creator: Napier, Bruce A.; Degteva, M. O.; Anspaugh, L. R. & Shagina, N. B.
Partner: UNT Libraries Government Documents Department

An example uncertainty and sensitivity analysis at the Horonobe site for performance assessment calculations.

Description: Given pre-existing Groundwater Modeling System (GMS) models of the Horonobe Underground Research Laboratory (URL) at both the regional and site scales, this work performs an example uncertainty analysis for performance assessment (PA) applications. After a general overview of uncertainty and sensitivity analysis techniques, the existing GMS site-scale model is converted to a PA model of the steady-state conditions expected after URL closure. This is done to examine the impact of uncertainty in site-specific data in conjunction with conceptual model uncertainty regarding the location of the Oomagari Fault. In addition, a quantitative analysis of the ratio of dispersive to advective forces, the F-ratio, is performed for stochastic realizations of each conceptual model. All analyses indicate that accurate characterization of the Oomagari Fault with respect to both location and hydraulic conductivity is critical to PA calculations. This work defines and outlines typical uncertainty and sensitivity analysis procedures and demonstrates them with example PA calculations relevant to the Horonobe URL.
Date: August 1, 2004
Creator: James, Scott Carlton
Partner: UNT Libraries Government Documents Department

Development of Performance Acceptance Test Guidelines for Large Commercial Parabolic Trough Solar Fields: Preprint

Description: Prior to commercial operation, large solar systems in utility-size power plants need to pass a performance acceptance test conducted by the EPC contractor or owners. Given the present absence of an engineering code developed for this purpose, NREL has undertaken the development of interim guidelines to provide recommendations for test procedures that can yield results of a high level of accuracy consistent with good engineering knowledge and practice. The fundamental differences between acceptance of a solar power plant and a conventional fossil-fired plant are the transient nature of the energy source and the necessity of utilizing an analytical performance model in the acceptance process. These factors bring into play the need to establish methods to measure steady-state performance, the potential impacts of transient processes, comparison to performance model results, and the possible requirement to test, or model, multi-day performance within the scope of the acceptance test procedure. The power block and balance of plant (BOP) are not within the boundaries of this guideline. The current guideline is restricted to the solar thermal performance of parabolic trough systems and has been critiqued by a broad range of stakeholders in CSP development and technology.
Date: December 1, 2010
Creator: Kearney, D. & Mehos, M.
Partner: UNT Libraries Government Documents Department

Acceptance Performance Test Guideline for Utility Scale Parabolic Trough and Other CSP Solar Thermal Systems: Preprint

Description: Prior to commercial operation, large solar systems in utility-size power plants need to pass a performance acceptance test conducted by the engineering, procurement, and construction (EPC) contractor or owners. Given the present absence of ASME or other international test codes developed for this purpose, the National Renewable Energy Laboratory has undertaken the development of interim guidelines to provide recommendations for test procedures that can yield results of a high level of accuracy consistent with good engineering knowledge and practice. Progress on the interim guidelines was presented at SolarPACES 2010. Significant additions and modifications have been made to the guidelines since that time, resulting in a final report published by NREL in April 2011. This paper summarizes those changes, which emphasize criteria for assuring thermal equilibrium and steady-state conditions within the solar field.
Date: August 1, 2011
Creator: Mehos, M. S.; Wagner, M. J. & Kearney, D. W.
Partner: UNT Libraries Government Documents Department

Sensitivity technologies for large scale simulation.

Description: Sensitivity analysis is critically important to numerous analysis algorithms, including large-scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms, and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing codes and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms, and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady-state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady-state internal flows subject to convection diffusion. Real-time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct the initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first-order approximation of the Euler equations and used as a preconditioner. In comparison to other methods, the AD preconditioner showed better convergence behavior. Our ultimate ...
Date: January 1, 2005
Creator: Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias (Rice University, Houston, TX); Wilcox, Lucas C. (Brown University, Providence, RI); Hill, Judith C. (Carnegie Mellon University, Pittsburgh, PA) et al.
Partner: UNT Libraries Government Documents Department
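
For reference, a standard formulation of the direct and adjoint sensitivity approaches surveyed above, written here in generic notation rather than the report's: for a steady-state residual R(u, p) = 0 and objective J(u, p),

```latex
\text{direct:}\quad
\frac{\partial R}{\partial u}\,\frac{du}{dp_i} = -\frac{\partial R}{\partial p_i},
\qquad
\frac{dJ}{dp_i} = \frac{\partial J}{\partial p_i}
  + \frac{\partial J}{\partial u}\,\frac{du}{dp_i}
  \quad\text{(one linear solve per parameter)}

\text{adjoint:}\quad
\left(\frac{\partial R}{\partial u}\right)^{\!T}\lambda
  = \left(\frac{\partial J}{\partial u}\right)^{\!T},
\qquad
\frac{dJ}{dp_i} = \frac{\partial J}{\partial p_i}
  - \lambda^{T}\frac{\partial R}{\partial p_i}
  \quad\text{(one linear solve per objective)}
```

The direct form scales with the number of parameters and the adjoint form with the number of objectives, which is why adjoint methods suit problems such as the contaminant-source reconstruction case mentioned above.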

Visual Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis (Milestone Report)

Description: In 1997, an uncertainty analysis was conducted of the Video Scanning Hartmann Optical Tester (VSHOT). In 2010, we completed a new analysis, based primarily on the geometric optics of the system, which shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough mirror panel test. These help to guide the operator in proper setup and help end-users to understand the data they are provided. We include both the systematic (bias) and random (precision) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors considered in this study are: target tilt; target face to laser output distance; instrument vertical offset; laser output angle; distance between the tool and the test piece; camera calibration; and laser scanner. These contributing factors were applied to the calculated slope error, focal length, and test article tilt generated by the VSHOT data processing. Results show the estimated 2-sigma uncertainty in slope error for a parabolic trough line scan test to be +/-0.2 milliradians; the uncertainty in the focal length is +/-0.1 mm, and the uncertainty in test article tilt is +/-0.04 milliradians.
Date: October 1, 2010
Creator: Gray, A.; Lewandowski, A. & Wendelin, T.
Partner: UNT Libraries Government Documents Department
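
A hedged sketch of how independent bias and precision contributions are typically combined by root sum of squares into a 2-sigma uncertainty of the kind quoted above. The contributor magnitudes below are placeholders, not the VSHOT study's values.

```python
# Root-sum-of-squares (RSS) combination of independent 1-sigma error
# contributions into a 2-sigma uncertainty; contributor values are
# hypothetical placeholders.
import math

# 1-sigma slope-error contributions, milliradians (hypothetical)
contributors = {
    "target tilt": 0.03,
    "target-to-laser distance": 0.02,
    "instrument vertical offset": 0.02,
    "laser output angle": 0.04,
    "tool-to-test-piece distance": 0.05,
    "camera calibration": 0.04,
    "laser scanner": 0.03,
}

one_sigma = math.sqrt(sum(v ** 2 for v in contributors.values()))
print(f"combined slope-error uncertainty: +/-{2 * one_sigma:.2f} mrad (2-sigma)")
```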

A methodology for selecting an optimal experimental design for the computer analysis of a complex system

Description: Investigation and evaluation of a complex system is often accomplished through the use of performance measures based on system response models. The response models are constructed using computer-generated responses supported where possible by physical test results. The general problem considered is one where resources and system complexity together restrict the number of simulations that can be performed. The levels of the input variables used in defining environmental scenarios and initial and boundary conditions, and for setting system parameters, must be selected in an efficient way. This report describes an algorithmic approach for performing this selection.
Date: February 3, 2000
Creator: Rutherford, Brian M.
Partner: UNT Libraries Government Documents Department

Extension and Enhancement Methods for Setting Data Quality Objectives

Description: The project developed statistical tools for the application of decision theory and operations research methods, including cost-benefit analysis, to the DQO process for environmental clean up. A pilot study was conducted, using these methods at the Hanford site, to estimate vadose zone contamination plumes under the SX tank farm, and to help plan further sampling.
Date: March 1, 2000
Creator: Goodman, D.
Partner: UNT Libraries Government Documents Department

A Resampling Based Approach to Optimal Experimental Design for Computer Analysis of a Complex System

Description: The investigation of a complex system is often performed using computer-generated response data supplemented by system and component test results where possible. Analysts rely on an efficient use of limited experimental resources to test the physical system, evaluate the models, and assure (to the extent possible) that the models accurately simulate the system under investigation. The general problem considered here is one where only a restricted number of system simulations (or physical tests) can be performed to provide additional data necessary to accomplish the project objectives. The levels of variables used for defining input scenarios, for setting system parameters, and for initializing other experimental options must be selected in an efficient way. The use of computer algorithms to support experimental design in complex problems has been a topic of recent research in the areas of statistics and engineering. This paper describes a resampling-based approach to formulating this design. An example is provided illustrating in two dimensions how the algorithm works and indicating its potential on larger problems. The results show that the proposed approach has characteristics desirable of an algorithmic approach on these simple examples. Further experimentation is needed to evaluate its performance on larger problems.
Date: August 4, 1999
Creator: Rutherford, Brian
Partner: UNT Libraries Government Documents Department

Conceptual and computational basis for the quantification of margins and uncertainty.

Description: In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainty (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. This presentation discusses and illustrates the conceptual and computational basis of QMU in analyses that use computational models to predict the behavior of complex systems. Topics considered include (1) the role of aleatory and epistemic uncertainty in QMU, (2) the representation of uncertainty with probability, (3) the probabilistic representation of uncertainty in QMU analyses involving only epistemic uncertainty, (4) the probabilistic representation of uncertainty in QMU analyses involving aleatory and epistemic uncertainty, (5) procedures for sampling-based uncertainty and sensitivity analysis, (6) the representation of uncertainty with alternatives to probability such as interval analysis, possibility theory and evidence theory, (7) the representation of uncertainty with alternatives to probability in QMU analyses involving only epistemic uncertainty, and (8) the representation of uncertainty with alternatives to probability in QMU analyses involving aleatory and epistemic uncertainty. Concepts and computational procedures are illustrated with both notional examples and examples from reactor safety and radioactive waste disposal.
Date: June 1, 2009
Creator: Helton, Jon Craig (Arizona State University, Tempe, AZ)
Partner: UNT Libraries Government Documents Department
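
One common way to summarize the QMU quantities discussed above, stated here in generic notation and offered only as a hedged gloss (the report's treatment of aleatory and epistemic uncertainty is considerably richer): for a performance gate with threshold T and best-estimate metric ŷ,

```latex
M = \lvert\, T - \hat{y} \,\rvert \quad\text{(margin)}, \qquad
U = \text{quantified uncertainty in } \hat{y}, \qquad
\mathrm{CR} = \frac{M}{U}
```

with CR > 1 read as margin exceeding the assessed uncertainty; the QMU literature generally cautions that this ratio alone is not a sufficient decision criterion.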

Illustration of sampling-based approaches to the calculation of expected dose in performance assessments for the proposed high level radioactive waste repository at Yucca Mountain, Nevada.

Description: A deep geologic repository for high level radioactive waste is under development by the U.S. Department of Energy at Yucca Mountain (YM), Nevada. As mandated in the Energy Policy Act of 1992, the U.S. Environmental Protection Agency (EPA) has promulgated public health and safety standards (i.e., 40 CFR Part 197) for the YM repository, and the U.S. Nuclear Regulatory Commission has promulgated licensing standards (i.e., 10 CFR Parts 2, 19, 20, etc.) consistent with 40 CFR Part 197 that the DOE must establish are met in order for the YM repository to be licensed for operation. Important requirements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. relate to the determination of expected (i.e., mean) dose to a reasonably maximally exposed individual (RMEI) and the incorporation of uncertainty into this determination. This presentation describes and illustrates how general and typically nonquantitative statements in 40 CFR Part 197 and 10 CFR Parts 2, 19, 20, etc. can be given a formal mathematical structure that facilitates both the calculation of expected dose to the RMEI and the appropriate separation in this calculation of aleatory uncertainty (i.e., randomness in the properties of future occurrences such as igneous and seismic events) and epistemic uncertainty (i.e., lack of knowledge about quantities that are poorly known but assumed to have constant values in the calculation of expected dose to the RMEI).
Date: April 1, 2007
Creator: Helton, Jon Craig & Sallaberry, Cedric J.
Partner: UNT Libraries Government Documents Department
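
The separation described above can be written compactly in the notation commonly used in this line of work (a hedged paraphrase, not a quotation of the regulations or the paper): with a vector a of aleatory quantities and a vector e of epistemic quantities,

```latex
\overline{D}(\tau \mid \mathbf{e})
  = \int_{\mathcal{A}} D(\tau \mid \mathbf{a}, \mathbf{e})\,
      d_{A}(\mathbf{a} \mid \mathbf{e})\, d\mathbf{a}
  \;\approx\; \frac{1}{n_A} \sum_{i=1}^{n_A} D(\tau \mid \mathbf{a}_i, \mathbf{e})
```

so that expected dose at time τ is an average over sampled aleatory futures for fixed epistemic values, and sampling e_1, ..., e_{n_E} from the epistemic characterization yields a distribution of expected-dose curves whose summaries can then be compared with the regulatory requirements.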