14 Matching Results

Elements of a function analytic approach to probability.

Description: We first provide a detailed motivation for using probability theory as a mathematical context in which to analyze engineering and scientific systems that possess uncertainties. We then present introductory notes on the function analytic approach to probabilistic analysis, emphasizing the connections to various classical deterministic mathematical analysis elements. Lastly, we describe how to use the approach as a means to augment deterministic analysis methods in a particular Hilbert space context, and thus enable a rigorous framework for commingling deterministic and probabilistic analysis tools in an application setting.
Date: February 1, 2008
Creator: Ghanem, Roger Georges (University of Southern California, Los Angeles, CA) & Red-Horse, John Robert
Partner: UNT Libraries Government Documents Department

Scalable analysis tools for sensitivity analysis and UQ (3160) results.

Description: The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains the feature recognition capabilities required by the user community for certain verification and validation tasks focused on sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met, including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.
Date: September 1, 2009
Creator: Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan et al.
Partner: UNT Libraries Government Documents Department

Sensitivity in risk analyses with uncertain numbers.

Description: Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
Date: June 1, 2006
Creator: Tucker, W. Troy & Ferson, Scott
Partner: UNT Libraries Government Documents Department
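
The "pinching" idea above can be illustrated with a small sketch, assuming a toy load-versus-resistance model, interval-valued epistemic inputs, and brute-force sampling in place of the report's Dempster-Shafer or probability-bounds propagation; the model and interval values are hypothetical.

# Toy "pinching" sensitivity study: epistemic inputs are intervals, and the
# effect of an input is judged by how much the output range shrinks when
# that input is pinched to a single (midpoint) value.
# NOTE: brute-force sampling stands in for rigorous interval/p-box propagation.
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # hypothetical dike-like response: load times a factor, minus resistance
    return x[0] * x[1] - x[2]

intervals = {"load": (0.8, 1.2), "factor": (1.5, 2.5), "resistance": (1.0, 3.0)}

def output_range(ivals, n=20_000):
    names = list(ivals)
    lo = np.array([ivals[k][0] for k in names])
    hi = np.array([ivals[k][1] for k in names])
    samples = rng.uniform(lo, hi, size=(n, len(names)))
    y = np.array([model(row) for row in samples])
    return y.max() - y.min()

base = output_range(intervals)
for name, (lo, hi) in intervals.items():
    pinched = dict(intervals)
    pinched[name] = ((lo + hi) / 2, (lo + hi) / 2)   # remove this input's epistemic spread
    reduction = 1.0 - output_range(pinched) / base
    print(f"pinching {name}: ~{100 * reduction:.0f}% reduction in output range")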

Measures of agreement between computation and experiment: validation metrics.

Description: With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
Date: August 1, 2005
Creator: Barone, Matthew Franklin & Oberkampf, William Louis
Partner: UNT Libraries Government Documents Department
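
A minimal sketch of the flavor of a confidence-interval-based validation metric, assuming replicate experimental measurements at a single setting of the control variable and a t-based interval on the estimated model error (a simplification of the interpolation and regression metrics developed in the report; all numbers are hypothetical):

# Sketch of a confidence-interval-based validation metric at one setting of the
# control variable: estimated model error = model prediction - experimental mean,
# with a t-based confidence interval reflecting measurement uncertainty.
import numpy as np
from scipy import stats

experiments = np.array([101.2, 98.7, 100.4, 99.1, 102.0])   # hypothetical replicate measurements
model_prediction = 97.5                                      # hypothetical simulation result

n = experiments.size
mean_exp = experiments.mean()
sem = experiments.std(ddof=1) / np.sqrt(n)                   # standard error of the mean
half_width = stats.t.ppf(0.95, df=n - 1) * sem               # 90% two-sided CI half-width

error_estimate = model_prediction - mean_exp
print(f"estimated error: {error_estimate:+.2f} "
      f"(90% CI: [{error_estimate - half_width:.2f}, {error_estimate + half_width:.2f}])")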

Quantifying reliability uncertainty: a proof of concept.

Description: This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
Date: October 1, 2009
Creator: Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna (Los Alamos National Laboratory, Los Alamos, NM); Lorio, John F. et al.
Partner: UNT Libraries Government Documents Department
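
A minimal sketch of the Bayesian side of the idea, assuming go/no-go data only, Beta posteriors with uniform priors for each component reliability, and a hypothetical system of one component in series with a parallel pair (not the report's actual system, priors, or data):

# Sketch: quantify uncertainty in system reliability from component go/no-go data.
# Each component reliability gets a Beta(1 + successes, 1 + failures) posterior
# (uniform prior); posterior draws are combined through the series/parallel logic.
import numpy as np

rng = np.random.default_rng(1)

# hypothetical test data: component -> (successes, trials)
data = {"A": (48, 50), "B1": (19, 20), "B2": (19, 20)}

draws = 100_000
post = {name: rng.beta(1 + s, 1 + (n - s), draws) for name, (s, n) in data.items()}

# system = component A in series with a parallel pair (B1, B2)
parallel_B = 1.0 - (1.0 - post["B1"]) * (1.0 - post["B2"])
system = post["A"] * parallel_B

lo, med, hi = np.percentile(system, [5, 50, 95])
print(f"system reliability: median {med:.3f}, 90% interval [{lo:.3f}, {hi:.3f}]")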

Uncertainty Evaluation in Large-scale Dynamical Systems: Theory and Applications

Description: Significant research efforts have been devoted to large-scale dynamical systems, with the aim of understanding their complicated behaviors and managing their responses in real time. One pivotal technological obstacle in this process is the existence of uncertainty. Although many of these large-scale dynamical systems function well in the design stage, they may easily fail when operating in realistic environments, where environmental uncertainties modulate system dynamics and complicate real-time prediction and management tasks. This dissertation aims to develop systematic methodologies to evaluate the performance of large-scale dynamical systems under uncertainty, as a step toward real-time decision support. Two uncertainty evaluation approaches are pursued: the analytical approach and the effective simulation approach. The analytical approach abstracts the dynamics of the original stochastic systems and develops tractable analysis (e.g., jump-linear analysis) for the approximated systems. Despite the potential bias introduced in the approximation process, the analytical approach provides rich insights valuable for evaluating and managing the performance of large-scale dynamical systems under uncertainty. When a system's complexity and scale are beyond tractable analysis, the effective simulation approach becomes very useful. The effective simulation approach aims to use a few smartly selected simulations to quickly evaluate a complex system's statistical performance. This approach was originally developed to evaluate a single uncertain variable; this dissertation extends it to be scalable and effective for evaluating large-scale systems with a large number of uncertain variables. While a large portion of this dissertation focuses on the development of generic methods and theoretical analysis applicable to broad classes of large-scale dynamical systems, many results are illustrated through a representative large-scale application to strategic air traffic management, which is concerned with designing robust management plans subject to a wide range of weather possibilities at 2-15 hours of look-ahead time.
Date: December 2014
Creator: Zhou, Yi (Software engineer)
Partner: UNT Libraries
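
A minimal sketch of the kind of abstraction named above, a Markov jump-linear system in which a Markov chain selects the active linear dynamics, with the state mean propagated by coupled linear recursions and checked by Monte Carlo; the matrices and transition probabilities are hypothetical, not the dissertation's air-traffic model:

# Sketch of jump-linear (Markov jump-linear system) mean analysis: a Markov chain
# selects which linear dynamics acts at each step, and the state mean can be
# propagated exactly through coupled linear recursions, checked here by Monte Carlo.
import numpy as np

rng = np.random.default_rng(2)

A = [np.array([[0.95, 0.10], [0.00, 0.90]]),   # mode-0 dynamics (hypothetical)
     np.array([[0.80, 0.00], [0.10, 0.85]])]   # mode-1 dynamics (hypothetical)
P = np.array([[0.9, 0.1],                      # mode transition probabilities
              [0.3, 0.7]])
x0, steps = np.array([1.0, 1.0]), 30

# analytic mean: q[i] = E[x_k * 1{mode_k = i}], with q_i(k+1) = sum_j P[j, i] A_j q_j(k)
q = [x0.copy(), np.zeros(2)]                   # start deterministically in mode 0
for _ in range(steps):
    q = [sum(P[j, i] * (A[j] @ q[j]) for j in range(2)) for i in range(2)]
analytic_mean = q[0] + q[1]

# Monte Carlo check
def simulate():
    x, mode = x0.copy(), 0
    for _ in range(steps):
        x = A[mode] @ x
        mode = rng.choice(2, p=P[mode])
    return x

mc_mean = np.mean([simulate() for _ in range(20_000)], axis=0)
print("analytic mean:", np.round(analytic_mean, 4), " Monte Carlo:", np.round(mc_mean, 4))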

Experimental uncertainty estimation and statistics for data having interval uncertainty.

Description: This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
Date: May 1, 2007
Creator: Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York) & Hajagos, Janos (Applied Biomathematics, Setauket, New York)
Partner: UNT Libraries Government Documents Department
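
A minimal sketch of the simplest interval statistics described above: bounds on the sample mean and median follow directly from the interval endpoints because both statistics are monotone in each observation (bounds on the variance and other statistics require optimization over the intervals and are not shown; the data are hypothetical):

# Sketch: bounds on the sample mean and median for interval-valued data.
# Each measurement is an interval [lo, hi]; the statistic evaluated at all lower
# endpoints and at all upper endpoints brackets every value consistent with the data.
import numpy as np

data = np.array([(9.8, 10.4), (10.1, 10.1), (9.5, 10.9), (10.2, 10.6)])  # hypothetical intervals

mean_lo = data[:, 0].mean()
mean_hi = data[:, 1].mean()
print(f"sample mean lies in [{mean_lo:.3f}, {mean_hi:.3f}]")

# The median is also a monotone function of each observation, so its bounds come
# from the lower and upper endpoints as well.
median_lo = np.median(data[:, 0])
median_hi = np.median(data[:, 1])
print(f"sample median lies in [{median_lo:.3f}, {median_hi:.3f}]")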

A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

Description: Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
Date: October 1, 2006
Creator: Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ) & Storlie, Curtis B. (North Carolina State University, Raleigh, NC)
Partner: UNT Libraries Government Documents Department
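
A minimal sketch of propagating an evidence-theory input specification, assuming one uncertain input with a basic probability assignment over a few intervals and a naive grid search within each focal element; the model, focal elements, and masses are hypothetical, and the report's contribution is precisely how to avoid this naive cost for computationally intensive models:

# Sketch: belief and plausibility of {f(x) <= threshold} for one input described
# by a basic probability assignment (BPA) over intervals (focal elements).
# Belief sums the mass of focal elements whose *entire* image satisfies the event;
# plausibility sums the mass of focal elements whose image *can* satisfy it.
import numpy as np

def f(x):
    return (x - 2.0) ** 2      # hypothetical model

# focal elements and their BPA masses (hypothetical)
bpa = [((0.0, 1.0), 0.3), ((0.5, 2.5), 0.5), ((3.8, 4.5), 0.2)]
threshold = 2.5

belief = plausibility = 0.0
for (lo, hi), mass in bpa:
    xs = np.linspace(lo, hi, 1001)            # naive search over the focal element
    ys = f(xs)
    if ys.max() <= threshold:                 # whole image lies inside the event
        belief += mass
    if ys.min() <= threshold:                 # image intersects the event
        plausibility += mass

print(f"Bel(f(x) <= {threshold}) = {belief:.2f},  Pl = {plausibility:.2f}")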

LDRD final report: robust analysis of large-scale combinatorial applications.

Description: Discrete models of large, complex systems like national infrastructures and complex logistics frameworks naturally incorporate many modeling uncertainties. Consequently, there is a clear need for optimization techniques that can robustly account for risks associated with modeling uncertainties. This report summarizes the progress of the Late-Start LDRD 'Robust Analysis of Large-Scale Combinatorial Applications'. This project developed new heuristics for solving robust optimization models, and developed new robust optimization models for describing uncertainty scenarios.
Date: September 1, 2007
Creator: Carr, Robert D.; Morrison, Todd (University of Colorado, Denver, CO); Hart, William Eugene; Benavides, Nicolas L. (Santa Clara University, Santa Clara, CA); Greenberg, Harvey J. (University of Colorado, Denver, CO); Watson, Jean-Paul et al.
Partner: UNT Libraries Government Documents Department

Extension of Latin hypercube samples with correlated variables.

Description: A procedure for extending the size of a Latin hypercube sample (LHS) with rank correlated variables is described and illustrated. The extension procedure starts with an LHS of size m and associated rank correlation matrix C and constructs a new LHS of size 2m that contains the elements of the original LHS and has a rank correlation matrix that is close to the original rank correlation matrix C. The procedure is intended for use in conjunction with uncertainty and sensitivity analysis of computationally demanding models in which it is important to make efficient use of a necessarily limited number of model evaluations.
Date: November 1, 2006
Creator: Hora, Stephen Curtis; Helton, Jon Craig & Sallaberry, Cedric J.
Partner: UNT Libraries Government Documents Department
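
A minimal sketch of the doubling step for a single variable, where each original stratum of width 1/m splits into two strata of width 1/(2m) and the companion point is placed in the half-stratum the original point did not occupy; preserving the rank correlation structure across variables, which is the heart of the report, is not attempted here:

# Sketch: extend a 1-D Latin hypercube sample of size m to size 2m.
# Each original stratum of width 1/m splits into two strata of width 1/(2m);
# the new point for that stratum goes in the half the original point missed.
import numpy as np

rng = np.random.default_rng(3)

def lhs_1d(m):
    # standard 1-D LHS on [0, 1): one point per stratum [i/m, (i+1)/m)
    return (np.arange(m) + rng.random(m)) / m

def extend_1d(sample):
    m = len(sample)
    new_points = np.empty(m)
    for i, x in enumerate(np.sort(sample)):
        left, mid = i / m, (2 * i + 1) / (2 * m)
        if x < mid:                     # original point sits in the left half-stratum
            new_points[i] = rng.uniform(mid, (i + 1) / m)
        else:                           # original point sits in the right half-stratum
            new_points[i] = rng.uniform(left, mid)
    return np.concatenate([sample, new_points])   # size 2m, still an LHS

x = lhs_1d(8)
x2 = extend_1d(x)
# every stratum of width 1/16 now holds exactly one point
counts, _ = np.histogram(x2, bins=16, range=(0.0, 1.0))
print(counts)   # expect all ones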

Survey of sampling-based methods for uncertainty and sensitivity analysis.

Description: Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
Date: June 1, 2006
Creator: Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. & Storlie, Curt B.
Partner: UNT Libraries Government Documents Department
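
A minimal sketch of two of the sensitivity procedures listed above (rank correlation and standardized regression coefficients), applied to a hypothetical test model; this is generic sampling-based practice, not code from the survey:

# Sketch: sampling-based sensitivity indicators for a toy model, using
# Spearman rank correlations and standardized regression coefficients (SRCs).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 500

# steps (1)-(3): characterize inputs, sample them, propagate through the model
X = np.column_stack([rng.uniform(0, 1, n),        # x1
                     rng.normal(0, 1, n),         # x2
                     rng.uniform(-1, 1, n)])      # x3
y = 4.0 * X[:, 0] + 1.0 * X[:, 1] ** 3 + 0.1 * X[:, 2] + rng.normal(0, 0.1, n)

# step (5): rank (Spearman) correlation of each input with the output
for j in range(3):
    rho, _ = stats.spearmanr(X[:, j], y)
    print(f"x{j + 1}: Spearman rho = {rho:+.2f}")

# step (5): standardized regression coefficients from a linear fit on standardized data
Z = (X - X.mean(axis=0)) / X.std(axis=0)
z_y = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Z, z_y, rcond=None)
print("SRCs:", np.round(src, 2))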

Toward a more rigorous application of margins and uncertainties within the nuclear weapons life cycle: a Sandia perspective.

Description: This paper presents the conceptual framework that is being used to define quantification of margins and uncertainties (QMU) for application in the nuclear weapons (NW) work conducted at Sandia National Laboratories. The conceptual framework addresses the margins and uncertainties throughout the NW life cycle and includes the definition of terms related to QMU and to figures of merit. Potential applications of QMU consist of analyses based on physical data and on modeling and simulation. Appendix A provides general guidelines for addressing cases in which significant and relevant physical data are available for QMU analysis. Appendix B gives the specific guidance that was used to conduct QMU analyses in cycle 12 of the annual assessment process. Appendix C offers general guidelines for addressing cases in which appropriate models are available for use in QMU analysis. Appendix D contains an example that highlights the consequences of different treatments of uncertainty in model-based QMU analyses.
Date: December 1, 2007
Creator: Klenke, Scott Edward; Novotny, George Charles; Paulsen, Robert A., Jr.; Diegert, Kathleen V.; Trucano, Timothy Guy & Pilch, Martin M.
Partner: UNT Libraries Government Documents Department
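
The framework above is conceptual and its figures of merit are not spelled out here; the quantity most commonly associated with QMU in the open literature is a margin-to-uncertainty ratio, and the sketch below assumes that form with hypothetical numbers (it is not a statement of the figures of merit used at Sandia):

# Sketch: a common QMU-style figure of merit, the margin-to-uncertainty ratio.
# margin M = distance from the best-estimate performance to the requirement;
# uncertainty U = combined uncertainty in that margin; confidence grows with M/U.
best_estimate = 12.0      # hypothetical predicted performance
requirement = 9.0         # hypothetical performance threshold (must stay above it)
u_experiment = 1.2        # hypothetical experimental contribution to uncertainty
u_model = 0.8             # hypothetical model-form contribution to uncertainty

margin = best_estimate - requirement
uncertainty = (u_experiment ** 2 + u_model ** 2) ** 0.5   # one possible aggregation
print(f"M = {margin:.2f}, U = {uncertainty:.2f}, M/U = {margin / uncertainty:.2f}")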

Verification test problems for the calculation of probability of loss of assured safety in temperature-dependent systems with multiple weak and strong links.

Description: Four verification test problems are presented for checking the conceptual development and computational implementation of calculations to determine the probability of loss of assured safety (PLOAS) in temperature-dependent systems with multiple weak links (WLs) and strong links (SLs). The problems are designed to test results obtained with the following definitions of loss of assured safety: (1) Failure of all SLs before failure of any WL, (2) Failure of any SL before failure of any WL, (3) Failure of all SLs before failure of all WLs, and (4) Failure of any SL before failure of all WLs. The test problems are based on assuming the same failure properties for all links, which results in problems that have the desirable properties of fully exercising the numerical integration procedures required in the evaluation of PLOAS and also possessing simple algebraic representations for PLOAS that can be used for verification of the analysis.
Date: June 1, 2006
Creator: Johnson, Jay Dean (ProStat, Mesa, AZ); Oberkampf, William Louis & Helton, Jon Craig (Arizona State University, Tempe, AZ)
Partner: UNT Libraries Government Documents Department
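
A minimal sketch of a sampling estimate of PLOAS under definition (2), "failure of any SL before failure of any WL," assuming independent, identically distributed failure temperatures for all links; under that assumption symmetry gives the simple algebraic value n_SL/(n_SL + n_WL), which plays the role of the verification check the report describes (the distribution and link counts are hypothetical):

# Sketch: Monte Carlo estimate of PLOAS under definition (2),
# "failure of any strong link (SL) before failure of any weak link (WL)".
# With identical, independent failure distributions for all links, symmetry gives
# the algebraic answer n_SL / (n_SL + n_WL), which serves as a verification check.
import numpy as np

rng = np.random.default_rng(5)
n_WL, n_SL, trials = 2, 3, 200_000

# hypothetical identical failure-temperature distribution for every link
wl_fail = rng.normal(600.0, 40.0, size=(trials, n_WL))
sl_fail = rng.normal(600.0, 40.0, size=(trials, n_SL))

ploas_mc = np.mean(sl_fail.min(axis=1) < wl_fail.min(axis=1))
ploas_exact = n_SL / (n_SL + n_WL)
print(f"Monte Carlo PLOAS = {ploas_mc:.3f}, algebraic value = {ploas_exact:.3f}")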

Incorporating uncertainty in RADTRAN 6.0 input files.

Description: Uncertainty may be introduced into RADTRAN analyses by assigning distributions to input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the shape, minimum, and maximum of each parameter's distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provide installation instructions as well as a description and user guide for the uncertainty engine.
Date: February 1, 2010
Creator: Dennis, Matthew L.; Weiner, Ruth F. & Heames, Terence John (Alion Science and Technology)
Partner: UNT Libraries Government Documents Department
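
A minimal sketch of the general workflow outlined above (assign a distribution to each uncertain parameter, sample, and write one input file per sample for batch execution), using hypothetical parameter names and a placeholder file syntax; the actual RADTRAN 6.0 keywords and the MELCOR Uncertainty Engine's file handling are not reproduced here:

# Sketch of the general workflow: sample distributed input parameters and emit one
# input file per sample for batch execution. Parameter names and the file format
# below are placeholders, not actual RADTRAN 6.0 keywords.
import numpy as np
from pathlib import Path

rng = np.random.default_rng(6)
n_samples = 10

# hypothetical uncertain parameters: name -> (distribution, arguments)
parameters = {
    "release_fraction": ("uniform", 0.05, 0.15),
    "shielding_factor": ("triangular", 0.5, 0.8, 1.0),   # (min, mode, max)
}

out_dir = Path("radtran_batch")
out_dir.mkdir(exist_ok=True)

for i in range(n_samples):
    lines = []
    for name, (dist, *args) in parameters.items():
        value = rng.uniform(*args) if dist == "uniform" else rng.triangular(*args)
        lines.append(f"{name} = {value:.4f}")             # placeholder syntax
    (out_dir / f"case_{i:03d}.inp").write_text("\n".join(lines) + "\n")

print(f"wrote {n_samples} input files to {out_dir}/")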