Search Results

PDV Uncertainty Estimation & Methods Comparison

Description: Several methods are presented for estimating the rapidly changing instantaneous frequency of a time-varying signal that is contaminated by measurement noise. Useful a posteriori error estimates for several methods are verified numerically through Monte Carlo simulation. Moreover, given the sampling rates of modern digitizers, sub-nanosecond variations in velocity are shown to be reliably measurable in most (but not all) cases. Results support the hypothesis that in many PDV regimes of interest, sub-nanosecond resolution can be achieved.
Date: November 1, 2011
Creator: Machorro, E.
Partner: UNT Libraries Government Documents Department
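
To make the abstract's verification idea concrete, here is a minimal Monte Carlo sketch in the spirit of the paper, though not its actual method: it repeatedly estimates the frequency of a noisy constant-frequency tone (the paper treats time-varying signals) from an FFT peak and compares the empirical scatter against the Cramer-Rao lower bound, a swapped-in reference rather than the paper's a posteriori estimates. All signal parameters are invented for illustration.

```python
# Illustrative sketch only: Monte Carlo check of frequency-estimate uncertainty.
import numpy as np

rng = np.random.default_rng(0)
fs, n = 1e9, 256                        # assumed 1 GS/s digitizer, 256-sample window
f0, amp, sigma = 123.4e6, 1.0, 1.0      # invented tone frequency, amplitude, noise std
t = np.arange(n) / fs

def estimate_freq(x):
    """Estimate tone frequency from the peak of a zero-padded, windowed FFT."""
    spec = np.abs(np.fft.rfft(x * np.hanning(n), 32 * n))
    return np.argmax(spec) * fs / (32 * n)

trials = [estimate_freq(amp * np.sin(2 * np.pi * f0 * t)
                        + sigma * rng.standard_normal(n))
          for _ in range(2000)]
empirical_std = np.std(trials)

# Cramer-Rao lower bound for a single tone in white Gaussian noise (std in Hz)
crlb_std = np.sqrt(12 * sigma**2 / (amp**2 * n * (n**2 - 1))) * fs / (2 * np.pi)
print(f"Monte Carlo std: {empirical_std:.3e} Hz   CRLB: {crlb_std:.3e} Hz")
```

The empirical scatter should land near the bound (plus a small FFT-grid quantization contribution), which is the kind of numerical verification the abstract describes.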

Improving Thermal Model Prediction Through Statistical Analysis of Irradiation and Post-Irradiation Data from AGR Experiments

Description: As part of the Research and Development program for Next Generation High Temperature Reactors (HTR), a series of irradiation tests, designated as Advanced Gas-cooled Reactor (AGR), have been defined to support development and qualification of fuel design, fabrication process, and fuel performance under normal operation and accident conditions. The AGR tests employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule and instrumented with thermocouples (TC) embedded in graphite blocks, enabling temperature control. The data representing the crucial test fuel conditions (e.g., temperature, fast neutron fluence, and burnup), which are impossible to obtain from direct measurements, are calculated by physics and thermal models. The irradiation and post-irradiation examination (PIE) experimental data are used in the model calibration effort to reduce the inherent uncertainty of simulation results. This paper focuses on fuel temperature predicted by the ABAQUS code's finite-element-based thermal models. The work follows up on a previous study, in which several statistical analysis methods were adapted, implemented in the NGNP Data Management and Analysis System (NDMAS), and applied to improve qualification of AGR-1 thermocouple data. The present work exercises the idea that the abnormal trends observed in measured data through statistical analysis may be caused either by measuring-instrument deterioration or by physical mechanisms in the capsules that shifted the system's thermal response. As an example, the uneven reduction of the control gas gap in Capsule 5, revealed by the capsule metrology measurements in PIE, helps attribute the reduction in TC readings to a physical cause rather than to TC drift. This in turn prompts modification of the thermal model to better fit the experimental data, thereby increasing confidence in, in other words reducing the uncertainties of, the thermal simulation results of the AGR-1 test.
Date: October 1, 2012
Creator: Pham, Dr. Binh T.; Hawkes, Grant L. & Einerson, Jeffrey J.
Partner: UNT Libraries Government Documents Department

Sensitivity and Uncertainty Analysis of Occupancy-related Parameters in Energy Modeling of UNT Zero Energy Lab

Description: The study focuses on the sensitivity and uncertainty analysis of occupancy-related parameters using the EnergyPlus modeling method. The model is based on a real building, the Zero Energy Lab in Discovery Park at the University of North Texas. Four categories of parameters are analyzed: heating/cooling setpoint, lighting, equipment, and occupancy. The influence coefficient (IC) is applied in the sensitivity study in order to compare the impact of individual parameters on overall building energy consumption. The study is conducted with both a Texas weather file and a North Dakota weather file in order to determine weather's influence on sensitivity. The probabilistic collocation method (PCM) is utilized for uncertainty analysis, with the aim of predicting future energy consumption from historical or reference data sets. The study finds that the cooling setpoint has the largest influence on overall energy consumption in both Texas and North Dakota, and that occupancy number has the least. The analysis also indicates the schedule's influence on energy consumption. PCM is able to accurately predict future energy consumption with limited computation, and has a clear advantage over the Monte Carlo method. Polynomial equations are generated at both third and sixth order; the sixth-order equation proves more accurate, coming within about 0.1% of the real value.
Date: August 2013
Creator: Xiong, Guangyuan
Partner: UNT Libraries
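
The influence-coefficient calculation the abstract describes can be sketched as a one-at-a-time perturbation around a base case. The toy annual_energy_kwh function below is a made-up stand-in for an EnergyPlus run; the parameter names and coefficients are illustrative assumptions, not values from the thesis.

```python
# Hypothetical sketch of the influence coefficient: IC = (% change in output) / (% change in input).
def annual_energy_kwh(cooling_setpoint_c, lighting_w_m2, equipment_w_m2, occupants):
    """Stand-in for a building simulation: invented linear-ish response."""
    return (50_000 - 1_800 * (cooling_setpoint_c - 24.0)
            + 900 * lighting_w_m2 + 750 * equipment_w_m2 + 120 * occupants)

base = dict(cooling_setpoint_c=24.0, lighting_w_m2=10.0, equipment_w_m2=8.0, occupants=12)
e_base = annual_energy_kwh(**base)

for name, value in base.items():
    perturbed = dict(base, **{name: value * 1.10})   # +10% perturbation of one input
    e_new = annual_energy_kwh(**perturbed)
    ic = ((e_new - e_base) / e_base) / 0.10          # influence coefficient
    print(f"{name:20s} IC = {ic:+.3f}")
```

Ranking the printed ICs by magnitude reproduces the kind of comparison the study draws between setpoint, lighting, equipment, and occupancy parameters.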

Planning for Quality Data

Description: The assurance of data quality can be a complex process requiring careful planning. The planning process described in this paper uses Data Quality Objectives as the foundation. The process comprises three steps: identifying project requirements, defining the information necessary to answer the questions, and collecting and managing the data. Because sufficient documentation is required at every level, uncertainty analysis, traceability and custody, data maintenance, and data evaluation and review are also discussed.
Date: May 1, 2005
Creator: Evans, Robert P.
Partner: UNT Libraries Government Documents Department

De Broglie wavelets versus Schroedinger wave functions: A ribbon model approach to quantum theory and the mechanisms of quantum interference

Description: As an alternative approach offering better physical explanations of the mechanisms of quantum interference and the origins of uncertainty broadening, a linear hopping model is proposed with "color-varying" dynamics to reflect fast exchange between time-reversed states. Intricate relations between this model, particle-wave dualism, and relativity are discussed. The wave function is shown to possess the dual characteristics of a stable, localized "soliton-like" de Broglie wavelet and a delocalized, interfering Schroedinger carrier wave function.
Date: February 1, 1996
Creator: Tang, Jau
Partner: UNT Libraries Government Documents Department

Uncertainty Quantification of Calculated Temperatures for the AGR-1 Experiment

Description: This report documents an effort to quantify the uncertainty of the calculated temperature data for the first Advanced Gas Reactor (AGR-1) fuel irradiation experiment conducted in the INL's Advanced Test Reactor (ATR) in support of the Next Generation Nuclear Plant (NGNP) R&D program. Recognizing the uncertainties inherent in physics and thermal simulations of the AGR-1 test, the results of the numerical simulations can be used in combination with statistical analysis methods to improve qualification of measured data. Additionally, the temperature simulation data for AGR tests can be used for validation of the fuel transport and fuel performance simulation models. The crucial roles of the calculated fuel temperatures in ensuring achievement of the AGR experimental program objectives require accurate determination of the model temperature uncertainties. The report is organized into three chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program and provides overviews of AGR-1 measured data, the AGR-1 test configuration and test procedure, and the thermal simulation. Chapter 2 describes the uncertainty quantification procedure for temperature simulation data of the AGR-1 experiment, namely: (i) identify and quantify uncertainty sources; (ii) perform sensitivity analysis for several thermal test conditions; and (iii) use uncertainty propagation to quantify the overall response-temperature uncertainty. A set of issues associated with modeling uncertainties resulting from the expert assessments is identified. This also includes the experimental design to estimate the main effects and interactions of the important thermal model parameters. Chapter 3 presents the overall uncertainty results for the six AGR-1 capsules. This includes uncertainties for the daily volume-average and peak fuel temperatures, daily average temperatures at TC locations, and time-average volume-average and time-average peak fuel temperatures.
Date: March 1, 2013
Creator: Pham, Binh T.; Einerson, Jeffrey J. & Hawkes, Grant L.
Partner: UNT Libraries Government Documents Department
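
A generic sampling-based version of steps (i)-(iii) looks roughly like the sketch below: sample the uncertainty sources, evaluate the thermal response, and summarize the spread. The closed-form "model" and all parameter values are invented placeholders, not the report's ABAQUS model.

```python
# Illustrative uncertainty propagation for a calculated fuel temperature.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
gap_mm = rng.normal(0.50, 0.05, n)      # assumed control gas-gap width distribution
k_graf = rng.normal(30.0, 2.0, n)       # assumed graphite conductivity, W/m-K
q_w    = rng.normal(2500.0, 100.0, n)   # assumed capsule heat rate, W

# Toy 1-D conduction response (made-up thermal resistances, illustrative only)
t_fuel = 320.0 + q_w * (gap_mm * 1e-3 / 0.002 + 1.0 / k_graf)
lo, hi = np.percentile(t_fuel, [2.5, 97.5])
print(f"mean {t_fuel.mean():.0f} C, 95% interval [{lo:.0f}, {hi:.0f}] C")
```

With a real thermal model in place of the toy response, the same loop yields the daily and time-average temperature uncertainties the report tabulates.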

R7 VU Born-Assessed Demo Plan

Description: This is an initial draft of a born-assessed VU plan for the RELAP7 (R7) code development effort. The plan will continue to evolve as code capability grows; this growth will be reflected as additional testing is identified and performed, and later versions of this document will capture it.
Date: February 1, 2010
Creator: Nourgaliev, Robert
Partner: UNT Libraries Government Documents Department

Effect of Epistemic Uncertainty Modeling Approach on Decision-Making: Example using Equipment Performance Indicator

Description: Quantitative risk assessments are an integral part of risk-informed regulation of current and future nuclear plants in the U.S. The Bayesian approach to uncertainty, in which both stochastic and epistemic uncertainties are represented with precise probability distributions, is the standard approach to modeling uncertainties in such quantitative risk assessments. However, there are long-standing criticisms of the Bayesian approach to epistemic uncertainty from many perspectives, and a number of alternative approaches have been proposed. Among these alternatives, the most promising (and most rapidly developing) would appear to be the concept of imprecise probability. In this paper, we employ a performance indicator example to focus the discussion. We first give a short overview of the traditional Bayesian paradigm and review some of its controversial aspects, for example, issues with so-called noninformative prior distributions. We then discuss how the imprecise probability approach treats these issues and compare it with two other approaches: sensitivity analysis and hierarchical Bayes modeling. We conclude with some practical implications for risk-informed decision making.
Date: June 1, 2012
Creator: Kelly, Dana & Youngblood, Robert
Partner: UNT Libraries Government Documents Department
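
As a hedged illustration of the paper's central contrast (not its worked example), the snippet below computes a precise Bayesian interval for a failure-on-demand indicator under a Jeffreys prior, then an imprecise-probability-style envelope obtained by sweeping a set of Beta priors. The counts and prior set are invented.

```python
# Precise Bayes vs. a simple imprecise-probability treatment of a demand-failure rate.
from scipy.stats import beta

failures, demands = 2, 150   # invented performance-indicator data

# Precise Bayesian answer: Jeffreys prior Beta(0.5, 0.5)
post = beta(0.5 + failures, 0.5 + demands - failures)
print(f"Jeffreys 95% interval: ({post.ppf(0.025):.4f}, {post.ppf(0.975):.4f})")

# Imprecise answer: envelope over a set of priors Beta(a, s - a) with s fixed
s = 2.0
prior_as = (0.1, 0.5, 1.0, 1.9)
lowers = [beta(a + failures, (s - a) + demands - failures).ppf(0.025) for a in prior_as]
uppers = [beta(a + failures, (s - a) + demands - failures).ppf(0.975) for a in prior_as]
print(f"imprecise 95% envelope: ({min(lowers):.4f}, {max(uppers):.4f})")
```

The envelope is wider than any single posterior interval, which is exactly the behavior that makes the imprecise approach attractive when the prior itself is contested.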

A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

Description: Evidence theory provides an alternative to probability theory for representing epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs; here the descriptor "epistemic" indicates uncertainty arising from a lack of knowledge about the appropriate values to use for various model inputs. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, propagating an evidence theory representation of uncertainty through a model is more computationally demanding than propagating a probabilistic representation, and this difficulty constitutes a serious obstacle to the use of evidence theory for representing uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
Date: October 1, 2006
Creator: Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ) & Storlie, Curtis B. (North Carolina State University, Raleigh, NC)
Partner: UNT Libraries Government Documents Department
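
The following sketch shows the basic objects involved, under assumed inputs rather than the presentation's algorithm: focal intervals with basic probability assignments (BPAs) are pushed through a cheap stand-in model, and belief/plausibility of an event are estimated by sampling within each focal element instead of optimizing over it exactly.

```python
# Sampling-based belief/plausibility for an evidence-theory input description.
import numpy as np

rng = np.random.default_rng(2)
focal = [((0.0, 0.4), 0.3), ((0.2, 0.8), 0.5), ((0.6, 1.0), 0.2)]  # (interval, BPA)
f = lambda x: np.exp(x) * np.sin(3 * x)   # invented stand-in for an expensive model
threshold = 0.5

belief = plaus = 0.0
for (lo, hi), m in focal:
    ys = f(rng.uniform(lo, hi, 200))      # samples approximate min/max of f on the element
    if ys.min() > threshold:              # entire focal element implies the event
        belief += m
    if ys.max() > threshold:              # some of the focal element allows the event
        plaus += m
print(f"Bel(y > {threshold}) = {belief:.2f}, Pl(y > {threshold}) = {plaus:.2f}")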

Data-Driven Decision-Making Framework for Large-Scale Dynamical Systems under Uncertainty

Description: Managing large-scale dynamical systems (e.g., transportation systems, complex information systems, and power networks) in real time is very challenging given their complicated system dynamics, intricate network interactions, large scale, and especially the existence of various uncertainties. To address this issue, intelligent techniques that can quickly design decision-making strategies robust to uncertainties are needed. This dissertation aims to conquer these challenges by exploring a data-driven decision-making framework, which leverages big-data techniques and scalable uncertainty evaluation approaches to quickly solve optimal control problems. In particular, the following techniques have been developed along this direction: 1) system modeling approaches that simplify the system analysis and design procedures for multiple applications; 2) effective simulation-based and analytical approaches that efficiently evaluate system performance and design control strategies under uncertainty; and 3) big-data techniques that allow some computations of control strategies to be completed offline. These techniques and tools for analysis, design, and control contribute to a wide range of applications, including air traffic flow management, complex information systems, and airborne networks.
Access: This item is restricted to UNT Community Members.
Date: August 2016
Creator: Xie, Junfei
Partner: UNT Libraries

Report on INL Activities for Uncertainty Reduction Analysis of FY12

Description: The scope of this project, under the work packages of “Uncertainty Reduction Analyses” aimed at reducing nuclear data uncertainties, is to produce a set of improved nuclear data to be used both for a wide range of validated advanced fast reactor design calculations and for providing guidelines for further improvements of the ENDF/B files (i.e., ENDF/B-VII and future releases). Recent extensive sensitivity/uncertainty studies, performed within an international OECD-NEA initiative, have quantified for the first time the impact of current nuclear data uncertainties on design parameters of the major FCR&D and GEN-IV systems, and in particular on Na-cooled fast reactors with different fuels (oxide or metal), fuel compositions (e.g., different Pu/TRU ratios), and different conversion ratios. These studies have pointed out that present uncertainties in the nuclear data should be significantly reduced in order to get full benefit from the advanced modeling and simulation initiatives. Nuclear data play a fundamental role in performance calculations of advanced reactor concepts. Uncertainties in the nuclear data propagate into uncertainties in calculated integral quantities, driving margins and costs in advanced system design, operation, and safeguards. This package contributes to the resolution of technical, cost, safety, security, and proliferation concerns in a multi-pronged, systematic, science-based R&D approach. The Nuclear Data effort identifies and develops small-scale, phenomenon-specific experiments, informed by theory and engineering, to reduce the number of large, expensive integral experiments. The Nuclear Data activities are leveraged by effective collaborations between experiment and theory and between DOE programs and offices, at national laboratories and universities, both domestic and international. The primary objective is to develop reactor core sensitivity and uncertainty analyses that identify the improvement needs of key nuclear data which would facilitate fast-spectrum system optimization and assure safety performance. The inclusion of fast-spectrum integral experiment data is key to ...
Date: September 1, 2012
Creator: Palmiotti, G. & Salvatores, M.
Partner: UNT Libraries Government Documents Department

Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

Description: This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating (1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and (2) the confidence in making characterization and clearance decisions for a specified number of samples, for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches discussed in this report are (1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and (2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: (1) qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; (2) qualitative data when the FNR > 0 but statistical sampling methods are used that assume ...
Date: April 27, 2013
Creator: Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H. & Amidan, Brett G.
Partner: UNT Libraries Government Documents Department
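
For the simplest of the situations above, with random sampling, qualitative data, and an effective detection probability degraded by the FNR, the number-of-samples calculation can be sketched as below. This is a generic hotspot/clearance formula consistent with the report's framing, not a transcription of its formulas.

```python
# Smallest n such that 1 - (1 - p_eff)^n >= confidence, with p_eff = p * (1 - FNR).
import math

def n_samples(confidence: float, hot_fraction: float, fnr: float = 0.0) -> int:
    """Random samples needed so at least one detects contamination covering
    a fraction `hot_fraction` of the decision area, at the given confidence."""
    p_eff = hot_fraction * (1.0 - fnr)
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_eff))

print(n_samples(0.95, 0.01))             # 299 samples when FNR = 0
print(n_samples(0.95, 0.01, fnr=0.10))   # more samples once FNR > 0
```

The second call shows the report's key point: a nonzero false negative rate directly inflates the number of samples needed to support a clearance decision.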

Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

Description: This report describes the status of ongoing research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.
Date: September 1, 2013
Creator: Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Braatz, Brett G.; Coble, Jamie B. et al.
Partner: UNT Libraries Government Documents Department
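
One simple data-driven UQ idea consistent with this description, though purely illustrative and not the project's algorithm, is to track the residual between a monitored sensor and a redundant estimate and re-estimate its uncertainty over a rolling window, so the band adapts during transients and flags calibration drift. All signals and thresholds below are invented.

```python
# Rolling, data-driven drift test for an online sensor (illustrative sketch).
import numpy as np

rng = np.random.default_rng(3)
truth = np.concatenate([np.full(300, 500.0), np.linspace(500.0, 520.0, 200)])  # transient
redundant = truth + rng.normal(0.0, 0.8, truth.size)   # estimate from redundant channels
sensor = truth + rng.normal(0.0, 0.5, truth.size)      # channel being monitored
sensor[350:] += 1.5                                    # injected calibration drift

residual = sensor - redundant
window = 50
z = np.zeros(residual.size)
for i in range(window, residual.size):
    seg = residual[i - window:i]
    z[i] = seg.mean() / (seg.std(ddof=1) / np.sqrt(window))  # rolling z-score

alarms = np.abs(z) > 4.0
print("first drift alarm at sample:", int(np.argmax(alarms)) if alarms.any() else "none")
```

Because the uncertainty estimate is recomputed from recent residuals rather than fixed at commissioning, the test keeps working as plant conditions change, which is the property the report emphasizes.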

Data Decision Analysis: Project Shoal

Description: The purpose of this study was to determine the most appropriate field activities in terms of reducing the uncertainty in the groundwater flow and transport model at the Project Shoal area. The data decision analysis relied on well-known tools of statistics and uncertainty analysis. This procedure identified nine parameters that were deemed uncertain. These included effective porosity, hydraulic head, surface recharge, hydraulic conductivity, fracture correlation scale, fracture orientation, dip angle, dissolution rate of radionuclides from the puddle glass, and the retardation coefficient, which describes the sorption characteristics. The parameter uncertainty was described by assigning prior distributions for each of these parameters. Next, the various field activities were identified that would provide additional information on these parameters. Each of the field activities was evaluated by an expert panel to estimate the posterior distribution of the parameters assuming the field activity was performed. The posterior distributions describe the ability of the field activity to estimate the true value of the nine parameters. Monte Carlo techniques were used to determine the current uncertainty, the reduction of uncertainty if a single parameter were known with certainty, and the reduction of uncertainty expected from each field activity on the model predictions. The mean breakthrough time to the downgradient land-withdrawal boundary and the peak concentration at the control boundary were used to evaluate the uncertainty reduction. The radionuclide Cs-137 was used as the reference solute, as its migration depends on all of the parameters. The results indicate that the current uncertainty of the model yields a 95 percent confidence interval between 42 and 1,412 years for the mean breakthrough time and an 18-order-of-magnitude range in peak concentration. The uncertainty in effective porosity and recharge dominates the uncertainty in the model predictions, while the other parameters are less important. A two-stage process was used to ...
Date: January 1, 1999
Creator: Forsgren, Frank; Pohll, Greg & Tracy, John
Partner: UNT Libraries Government Documents Department
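
The "what if a single parameter were known with certainty" comparison can be sketched with a toy Monte Carlo, replacing the flow and transport model with an arbitrary function and using invented prior distributions:

```python
# Toy data-worth exercise: spread reduction from fixing one uncertain parameter.
import numpy as np

rng = np.random.default_rng(4)
n = 20_000

def breakthrough_years(porosity, recharge, conductivity):
    """Stand-in for the flow/transport model -- illustrative algebra only."""
    return 400.0 * porosity / (0.002 * recharge + 0.01 * conductivity)

priors = dict(porosity=rng.uniform(0.005, 0.05, n),     # assumed prior distributions
              recharge=rng.lognormal(1.0, 0.5, n),
              conductivity=rng.lognormal(0.0, 0.8, n))

base_spread = breakthrough_years(**priors).std()
for name in priors:
    fixed = dict(priors, **{name: np.full(n, priors[name].mean())})
    reduced = breakthrough_years(**fixed).std()
    print(f"if {name} were known exactly, spread drops {100 * (1 - reduced / base_spread):.0f}%")
```

Ranking the printed reductions identifies the dominant parameters, mirroring the study's finding that effective porosity and recharge drive the prediction uncertainty.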

"The Long Goodbye": Uncertainty Management in Alzheimer's Caregivers

Description: Caregivers for individuals diagnosed with Alzheimer's disease (AD) shoulder a remarkably complex burden as compared to other caregivers of elderly individuals. For long distance caregivers, geographical separation further compounds the problems experienced by AD caregivers, as they are isolated from family members and support networks. Both on-site and long-distance AD caregivers experience uncertainty; the findings from this study illustrate how AD caregivers manage the uncertainty of the disease and primary care, as well as how uncertainty differs between on-site and long-distance caregivers. AD caregiver (N = 13) interviews were transcribed and qualitatively analyzed using uncertainty management theory as a thematic lens. The analysis revealed that AD caregivers experience overwhelming feelings of burden, guilt, and doubt; however, these feelings manifest differently depending on caregiver type. The findings of this study demonstrate that sources for obtaining information regarding AD and caregiving were useful for on-site caregivers; however, the sources did not account for the needs of long-distance caregivers or the psychosocial needs of on-site caregivers. Furthermore, AD caregivers did not seek support or information about AD and caregiving from health care professionals. Implications for future research regarding long-distance and on-site AD caregiving are discussed.
Date: May 2011
Creator: Shaunfield, Sara
Partner: UNT Libraries

The origins of quantum interference and uncertainty broadening. A linear ribbon model approach

Description: As an alternative to the orthodox Schroedinger wave mechanics or Heisenberg matrix mechanics approach, a simple linear ribbon model for quantum theory is presented. A different perspective and better physical insights into the origins of quantum interference and the mechanisms for uncertainty broadening are offered. Quantum interference at the atomic scale and superconducting behaviour at the macroscopic scale are compared.
Date: February 1, 1996
Creator: Tang, J.
Partner: UNT Libraries Government Documents Department

Navier-Stokes Solvers and Generalizations for Reacting Flow Problems

Description: This is an overview of our accomplishments during the final term of this grant (1 September 2008 to 30 June 2012). These fall mainly into three categories: fast algorithms for linear eigenvalue problems; solution algorithms and modeling methods for partial differential equations with uncertain coefficients; and preconditioning methods and solvers for models of computational fluid dynamics (CFD).
Date: January 27, 2013
Creator: Elman, Howard C
Partner: UNT Libraries Government Documents Department

Benchmarking Exercises To Validate The Updated ELLWF GoldSim Slit Trench Model

Description: The Savannah River National Laboratory (SRNL) results of the 2008 Performance Assessment (PA) (WSRC, 2008) sensitivity/uncertainty analyses conducted for the trenches located in the E-Area Low-Level Waste Facility (ELLWF) were subject to review by the United States Department of Energy (U.S. DOE) Low-Level Waste Disposal Facility Federal Review Group (LFRG) (LFRG, 2008). LFRG comments were generally approving of the use of probabilistic modeling in GoldSim to support the quantitative sensitivity analysis. A recommendation was made, however, that the probabilistic models be revised and updated to bolster their defensibility. SRS committed to addressing those comments and, in response, contracted with Neptune and Company to rewrite the three GoldSim models. The initial portion of this work, development of the Slit Trench (ST), Engineered Trench (ET), and Components-in-Grout (CIG) trench GoldSim models, has been completed. The work described in this report utilizes these revised models to test and evaluate the results against the 2008 PORFLOW model results. This was accomplished by first performing a rigorous code-to-code comparison of the PORFLOW and GoldSim codes and then performing a deterministic comparison of the two-dimensional (2D) unsaturated-zone and three-dimensional (3D) saturated-zone PORFLOW Slit Trench models against results from the one-dimensional (1D) GoldSim Slit Trench model. The results of the code-to-code comparison indicate that when the mechanisms of radioactive decay, partitioning of contaminants between solid and fluid, implementation of specific boundary conditions, and the imposition of solubility controls were all tested using identical flow fields, GoldSim and PORFLOW produce nearly identical results. It is also noted that GoldSim has an advantage over PORFLOW in that it simulates all radionuclides simultaneously, thus avoiding a potential problem as demonstrated in the Case Study (see Section 2.6). Hence, it was concluded that the follow-on work using GoldSim to develop 1D equivalent models of the PORFLOW multi-dimensional models ...
Date: November 12, 2013
Creator: Taylor, G. A. & Hiergesell, R. A.
Partner: UNT Libraries Government Documents Department

Uncertainty Evaluation in Large-scale Dynamical Systems: Theory and Applications

Description: Significant research efforts have been devoted to large-scale dynamical systems, with the aim of understanding their complicated behaviors and managing their responses in real time. One pivotal technological obstacle in this process is the existence of uncertainty. Although many of these large-scale dynamical systems function well in the design stage, they may easily fail when operating in realistic environments, where environmental uncertainties modulate system dynamics and complicate real-time prediction and management tasks. This dissertation aims to develop systematic methodologies to evaluate the performance of large-scale dynamical systems under uncertainty, as a step toward real-time decision support. Two uncertainty evaluation approaches are pursued: the analytical approach and the effective simulation approach. The analytical approach abstracts the dynamics of the original stochastic systems and develops tractable analysis (e.g., jump-linear analysis) for the approximated systems. Despite the potential bias introduced in the approximation process, the analytical approach provides rich insights valuable for evaluating and managing the performance of large-scale dynamical systems under uncertainty. When a system's complexity and scale are beyond tractable analysis, the effective simulation approach becomes very useful. The effective simulation approach aims to use a few smartly selected simulations to quickly evaluate a complex system's statistical performance. This approach was originally developed to evaluate a single uncertain variable. This dissertation extends the approach to be scalable and effective for evaluating large-scale systems under a large number of uncertain variables. While a large portion of this dissertation focuses on the development of generic methods and theoretical analysis applicable to broad classes of large-scale dynamical systems, many results are illustrated through a representative application, strategic air traffic management, which is concerned with designing management plans that are robust to a wide range of weather possibilities at 2-15 hours look-ahead time.
Date: December 2014
Creator: Zhou, Yi (Software engineer)
Partner: UNT Libraries

New Method and Reporting of Uncertainty in LBNL National Energy Modeling System Runs

Description: This report describes LBNL's approach for assessing uncertainty in any National Energy Modeling System (NEMS)-related analysis. Based on years of experience using LBNL-NEMS for various analyses, LBNL developed an alternative approach that aims to provide a simple yet comprehensive perspective on how the results behave under a set of issues we believe to be important to large-scale energy modeling. This project has established a standard set of eight sensitivity cases that can be run overnight and are highly likely to produce stable and interesting results. The goal was to establish a limited number of interesting sensitivity cases that would routinely produce adjunct results to LBNL-NEMS reporting that will be of value to our readers. These cases will be routinely reported together with future LBNL-NEMS results in the form of a standard output table. As an example, this work uses a Government Performance and Results Act (GPRA) analysis run as the baseline, but the goal is to establish a standardized set of cases that would change little over time and be applicable to other analyses in addition to GPRA. The approach developed here cannot serve as a substitute for a sensitivity analysis tailored to the question at hand, but it can provide a fast review of some areas that have proven to be of interest in the past.
Date: October 1, 2002
Creator: Gumerman, Etan Z.; LaCommare, Kristina Hamachi & Marnay, Chris
Partner: UNT Libraries Government Documents Department

Multi-Scale Assessment of Prediction Uncertainty in Coupled Reactive Transport Models Conducted at the Florida State University

Description: This report summarizes the research activities at Florida State University for quantifying parametric and model uncertainty in groundwater reactive transport modeling. Mathematical and computational research was conducted to investigate the following five questions: (1) How does uncertainty behave in and affect groundwater reactive transport models? (2) What causes the uncertainty in groundwater reactive transport modeling? (3) How can parametric uncertainty of groundwater reactive transport modeling be quantified? (4) How can model uncertainty of groundwater reactive transport modeling be quantified? and (5) How can predictive uncertainty be reduced by collecting data of maximum value of information, or data worth? The questions were addressed using interdisciplinary methods, including computational statistics, Bayesian uncertainty analysis, and groundwater modeling. Both synthetic and real-world data were used to evaluate and demonstrate the developed methods. The research results revealed special challenges for uncertainty quantification in groundwater reactive transport models. For example, competitive reactions and substitution effects of reactions also cause parametric uncertainty. Model uncertainty is more important than parametric uncertainty, and model-averaging methods are a vital tool for improving model predictions. Bayesian methods are more accurate than regression methods for uncertainty quantification. However, when Bayesian uncertainty analysis is computationally impractical, uncertainty analysis using regression methods still provides useful insight. The research results of this study are useful for science-informed decision-making and for reducing uncertainty by collecting data with greater value of information.
Date: November 9, 2013
Creator: Ye, Ming
Partner: UNT Libraries Government Documents Department
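
A minimal sketch of the model-averaging idea the summary credits, with invented model evidences and predictions (the study's actual Bayesian machinery is more elaborate):

```python
# Bayesian model averaging: weight competing models by posterior model probability.
import numpy as np

# Hypothetical evidence (marginal likelihoods) and predictions for three
# competing reactive-transport models at one location -- all numbers invented.
log_evidence = np.array([-120.3, -118.9, -125.7])
predictions  = np.array([4.2, 5.1, 3.0])       # e.g., concentration, mg/L
pred_var     = np.array([0.20, 0.35, 0.15])    # each model's own predictive variance

w = np.exp(log_evidence - log_evidence.max())
w /= w.sum()                                   # posterior model weights

bma_mean = np.sum(w * predictions)
# Total variance = within-model variance + between-model variance
bma_var = np.sum(w * pred_var) + np.sum(w * (predictions - bma_mean) ** 2)
print(f"weights {np.round(w, 3)}, BMA mean {bma_mean:.2f}, variance {bma_var:.2f}")
```

The between-model term in the variance is what makes model uncertainty visible in the prediction, matching the report's conclusion that model uncertainty can dominate parametric uncertainty.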

Incorporating uncertainty in RADTRAN 6.0 input files.

Description: Uncertainty may be introduced into RADTRAN analyses by assigning distributions to input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the shape, minimum, and maximum of each parameter's distribution, to sample from the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provide installation instructions as well as a description and user guide for the uncertainty engine.
Date: February 1, 2010
Creator: Dennis, Matthew L.; Weiner, Ruth F. & Heames, Terence John (Alion Science and Technology)
Partner: UNT Libraries Government Documents Department
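
Conceptually, the workflow the abstract describes reduces to sampling each distributed input independently and writing one input deck per sample for a batch run. The sketch below is hypothetical: the real module drives RADTRAN through the MELCOR Uncertainty Engine, and the parameter names and file format here are invented.

```python
# Generate a batch of input decks by sampling uncorrelated input distributions.
import numpy as np

rng = np.random.default_rng(5)
n_runs = 10
params = {   # hypothetical parameter names, each paired with a distribution sampler
    "WIND_SPEED":   lambda: rng.triangular(0.5, 4.0, 12.0),
    "DEPOSITION_V": lambda: rng.lognormal(-6.0, 0.7),
    "SHIELD_FRAC":  lambda: rng.uniform(0.85, 1.0),
}

for i in range(n_runs):
    lines = [f"{name} {sampler():.6g}" for name, sampler in params.items()]
    with open(f"radtran_case_{i:03d}.inp", "w") as fh:
        fh.write("\n".join(lines) + "\n")   # one hypothetical deck per sample
```

Sampling each parameter independently mirrors the abstract's note that coupling input parameters is not possible in this initial application.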