19 Matching Results


Oscillator Strengths and Transition Probabilities for 3288 Lines of Fe I

Description: Report of a homogeneous set of intensity-related data calculated for 3288 spectral lines of Fe I in the region from 2100 to 9900 angstroms. The quantities tabulated in the present monograph include log(gA), log(gf), gf, f, gA, and A. Since recent investigations do not support excitation corrections in the case of Fe I and certain other spectra, the present tabulation incorporates a removal of that normalization function. This recalculation affects the values for all lines whose upper energy levels lie above 46000 cm^-1 and should significantly improve the internal consistency of the present data.
Date: March 1968
Creator: Corliss, C. H. & Tech, J. L.
Partner: UNT Libraries Government Documents Department

A short course on measure and probability theories.

Description: This brief introduction to measure theory and its applications to probability corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore during the spring of 2003. The goal of these seminars was to provide a minimal background to computational combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade and have provided increasingly successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in those applications, showed the need for a better understanding of the theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts that were discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
Date: February 1, 2004
Creator: Pébay, Philippe Pierre
Partner: UNT Libraries Government Documents Department
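
Since the notes above build toward Wiener's polynomial chaos, a small sketch may make the idea concrete. The Python below is an illustration of mine, not code from the report (the function name pce_coefficients and the test function are assumptions); it projects a function of a standard Gaussian variable onto probabilists' Hermite polynomials with Gauss-Hermite quadrature:

    import math
    import numpy as np
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    def pce_coefficients(f, order):
        # Coefficients c_k of f(xi) ~ sum_k c_k He_k(xi) with xi ~ N(0, 1),
        # using c_k = E[f(xi) He_k(xi)] / k! since E[He_k(xi)^2] = k!.
        nodes, weights = hermegauss(2 * (order + 1))
        weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the N(0,1) density
        coeffs = np.empty(order + 1)
        for k in range(order + 1):
            basis = np.zeros(order + 1)
            basis[k] = 1.0                         # select He_k
            coeffs[k] = np.sum(weights * f(nodes) * hermeval(nodes, basis)) / math.factorial(k)
        return coeffs

    # f(xi) = exp(xi) has the known expansion c_k = sqrt(e) / k!.
    c = pce_coefficients(np.exp, 6)
    print(c[0], np.sqrt(np.e))   # both approximately 1.6487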

Human Reliability Analysis in the U.S. Nuclear Power Industry: A Comparison of Atomistic and Holistic Methods

Description: A variety of methods have been developed to generate human error probabilities for use in the U.S. nuclear power industry. When actual operations data are not available, it is necessary for an analyst to estimate these probabilities. Most approaches, including THERP, ASEP, SLIM-MAUD, and SPAR-H, feature an atomistic approach to characterizing and estimating error. The atomistic approach is based on the notion that events and their causes can be decomposed and individually quantified. In contrast, in a holistic approach, such as that found in ATHEANA, the analysis centers on the entire event, which is typically quantified as an indivisible whole. The distinction between atomistic and holistic approaches is important in understanding the nature of human reliability analysis quantification and the utility and shortcomings associated with each approach.
Date: September 1, 2005
Creator: Boring, Ronald L.; Gertman, David I.; Joe, Jeffrey C. & Marble, Julie L.
Partner: UNT Libraries Government Documents Department
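
A purely numerical contrast between the two quantification styles (every number here is a hypothetical placeholder, not a value from THERP or ATHEANA) might look like this:

    # Atomistic style: decompose the task, quantify each step's human error
    # probability (HEP), and combine assuming independent steps.
    step_heps = [0.003, 0.01, 0.001]            # hypothetical per-step HEPs

    p_success = 1.0
    for p in step_heps:
        p_success *= 1.0 - p
    atomistic_hep = 1.0 - p_success             # P(at least one step fails)

    # Holistic style: the event is quantified as an indivisible whole, so a
    # single HEP is elicited for the entire task directly.
    holistic_hep = 0.02                         # hypothetical expert estimate

    print(f"atomistic: {atomistic_hep:.4f}  holistic: {holistic_hep:.4f}")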

Survey of sampling-based methods for uncertainty and sensitivity analysis.

Description: Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) generation of samples from uncertain analysis inputs, (3) propagation of sampled inputs through an analysis, (4) presentation of uncertainty analysis results, and (5) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
Date: June 1, 2006
Creator: Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. & Storlie, Curt B.
Partner: UNT Libraries Government Documents Department
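
A minimal sketch of the surveyed workflow, under assumptions of mine (the three input distributions and the stand-in model are invented): draw a Latin hypercube sample, propagate it, and rank the inputs by rank (Spearman) correlation with the output.

    import numpy as np
    from scipy import stats
    from scipy.stats import qmc

    n = 1000
    sampler = qmc.LatinHypercube(d=3, seed=42)
    u = sampler.random(n)                        # stratified uniforms on (0, 1)^3

    # (1)-(2) Characterize input uncertainty and sample it by inverting CDFs.
    x1 = u[:, 0]                                 # uniform on (0, 1)
    x2 = stats.norm(5.0, 1.0).ppf(u[:, 1])
    x3 = stats.lognorm(s=0.5).ppf(u[:, 2])

    # (3) Propagate the sample through the analysis (hypothetical model).
    y = x1 ** 2 + 0.1 * x2 + np.sin(x3)

    # (5) Sensitivity: rank correlations are robust to nonlinear monotone effects.
    for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
        rho, _ = stats.spearmanr(x, y)
        print(f"{name}: Spearman rho = {rho:+.3f}")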

PROSA-1: A Probabilistic Response-Surface Analysis Code

Description: Techniques for probabilistic response-surface analysis have been developed to obtain the probability distributions of the consequences of postulated nuclear-reactor accidents. The uncertainties of the consequences are caused by the variability of the system and model input parameters used in the accident analysis. Probability distributions are assigned to the input parameters, and parameter values are systematically chosen from these distributions. These input parameters are then used in deterministic consequence analyses performed by mechanistic accident-analysis codes. The results of these deterministic consequence analyses are used to generate the coefficients for analytical functions that approximate the consequences in terms of the selected input parameters. These approximating functions are used to generate the probability distributions of the consequences, with random sampling being used to obtain values for the accident parameters from their distributions. A computer code, PROSA, has been developed for implementing the probabilistic response-surface technique. Special features of the code generate or treat sensitivities, statistical moments of the input and output variables, region-wise response surfaces, correlated input parameters, and conditional distributions. The code can also be used for calculating important distributions of the input parameters. The use of the code is illustrated in conjunction with the fast-running accident-analysis code SACO to provide probability studies of LMFBR hypothetical core-disruptive accidents. However, the methods and the programming are general and not limited to such applications.
Date: June 1978
Creator: Vaurio, J. K.; Mueller, C.; Kyser, J. M. & Sciaudone, D.
Partner: UNT Libraries Government Documents Department
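
A minimal sketch of the response-surface technique this record describes (the two-parameter stand-in code, the knot-point count, and the sampling distributions are all assumptions of mine): run the deterministic code at selected knot points, fit a quadratic surface by least squares, then random-sample the cheap surface to build the consequence distribution.

    import numpy as np

    def accident_code(p1, p2):
        # Hypothetical stand-in for a deterministic accident-analysis code.
        return 100.0 + 20.0 * p1 - 5.0 * p2 + 3.0 * p1 * p2

    rng = np.random.default_rng(1)

    # Deterministic runs at selected input values (the knot points).
    knots = rng.uniform(-1.0, 1.0, size=(30, 2))
    runs = np.array([accident_code(a, b) for a, b in knots])

    # Fit a second-degree response surface by least squares.
    p1, p2 = knots[:, 0], knots[:, 1]
    design = np.column_stack([np.ones_like(p1), p1, p2, p1 * p2, p1**2, p2**2])
    coef, *_ = np.linalg.lstsq(design, runs, rcond=None)

    # Random-sample the inexpensive surface to approximate the consequence
    # distribution without further runs of the deterministic code.
    s1, s2 = rng.normal(0.0, 0.3, (2, 100_000))
    y = (coef[0] + coef[1] * s1 + coef[2] * s2
         + coef[3] * s1 * s2 + coef[4] * s1**2 + coef[5] * s2**2)
    print("mean:", y.mean(), " 95th percentile:", np.percentile(y, 95))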

Statistical Identification of Effective Input Variables

Description: A statistical sensitivity analysis procedure has been developed for ranking the input data of large computer codes in the order of sensitivity-importance. The method is economical for large codes with many input variables, since it uses a relatively small number of computer runs. No prior judgmental elimination of input variables is needed. The screening method is based on stage-wise correlation and extensive regression analysis of output values calculated with selected input value combinations. The regression process deals with multivariate nonlinear functions, and statistical tests are also available for identifying input variables that contribute to threshold effects, i.e., discontinuities in the output variables. A computer code, SCREEN, has been developed for implementing the screening techniques. The method's efficiency has been demonstrated by several examples, and it has been applied to a fast-reactor safety-analysis code (VENUS-II). However, the methods and the coding are general and not limited to such applications.
Date: September 1982
Creator: Vaurio, J. K.
Partner: UNT Libraries Government Documents Department
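
To make the screening economics concrete, here is a sketch of my own (not the SCREEN code itself): with 100 runs and 20 candidate inputs, a stage-wise pass ranks inputs by correlation with the output, and a follow-up regression on the leading candidates estimates their effects.

    import numpy as np

    rng = np.random.default_rng(7)
    n_runs, n_inputs = 100, 20                 # far fewer runs than a full grid

    X = rng.uniform(-1.0, 1.0, (n_runs, n_inputs))
    # Hypothetical code output: only inputs 0, 3, and 7 actually matter.
    y = 4.0 * X[:, 0] - 2.5 * X[:, 3] + 2.0 * X[:, 7] + rng.normal(0.0, 0.1, n_runs)

    # Stage 1: rank all inputs by absolute correlation with the output.
    corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_inputs)])
    ranked = np.argsort(-np.abs(corr))
    print("leading candidates:", ranked[:5])   # inputs 0, 3, and 7 should surface

    # Stage 2: regression on the leading candidates to estimate their effects.
    top = ranked[:3]
    design = np.column_stack([np.ones(n_runs), X[:, top]])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    print("fitted effects for inputs", top, ":", coef[1:])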

Quantization Dimension for Probability Distributions

Description: The term quantization refers to the process of estimating a given probability by a discrete probability supported on a finite set. The quantization dimension D_r of a probability is related to the asymptotic rate at which the expected distance (raised to the rth power) to the support of the quantized version of the probability goes to zero as the size of the support is allowed to go to infinity. This assumes that the quantized versions are in some sense "optimal" in that the expected distances have been minimized. In this dissertation we give a short history of quantization as well as some basic facts. We develop a generalized framework for the quantization dimension which extends the current theory to include a wider range of probability measures. This framework uses the theory of thermodynamic formalism and the multifractal spectrum. It is shown that at least in certain cases the quantization dimension function D(r) = D_r is a transform of the temperature function b(q), which is already known to be the Legendre transform of the multifractal spectrum f(a). Hence, these ideas are all closely related, and it would be expected that progress in one area could lead to new results in another. It would also be expected that the results in this dissertation would extend to all probabilities for which a quantization dimension function exists. The cases considered here include probabilities generated by conformal iterated function systems (and include self-similar probabilities) and also probabilities generated by graph directed systems, which further generalize the idea of an iterated function system.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: December 2001
Creator: Lindsay, Larry J.
Partner: UNT Libraries
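
A toy numerical illustration of the definition (mine, not the dissertation's): for the uniform distribution on [0, 1], the optimal n-point quantizer consists of the n cell midpoints, the minimized expected r-th power distance V_{n,r} scales like n^{-r}, and the quantization dimension D_r = lim log n / (-log V_{n,r}^{1/r}) is therefore 1. The sketch below estimates D_r from the log-log slope.

    import numpy as np

    def quantization_error(n, r, samples):
        # E[min_i |X - a_i|^r] for the n-midpoint quantizer of U(0, 1).
        centers = (2.0 * np.arange(1, n + 1) - 1.0) / (2.0 * n)
        dists = np.abs(samples[:, None] - centers[None, :]).min(axis=1)
        return np.mean(dists ** r)

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, 20_000)
    r = 2.0

    ns = np.array([4, 8, 16, 32, 64, 128, 256])
    es = np.array([quantization_error(n, r, X) ** (1.0 / r) for n in ns])

    # e_{n,r} ~ C * n^(-1/D_r), so D_r is minus the reciprocal log-log slope.
    slope = np.polyfit(np.log(ns), np.log(es), 1)[0]
    print("estimated D_r:", -1.0 / slope)      # approximately 1 for U(0, 1)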

A Comparison of Some Continuity Corrections for the Chi-Squared Test in 3 x 3, 3 x 4, and 3 x 5 Tables

Description: This study was designed to determine whether chi-squared based tests for independence give reliable estimates (as compared to the exact values provided by Fisher's exact probabilities test) of the probability of a relationship between the variables in 3 x 3, 3 x 4, and 3 x 5 contingency tables when the sample size is 10, 20, or 30. In addition to the classical (uncorrected) chi-squared test, four methods for continuity correction were compared to Fisher's exact probabilities test. The four methods were Yates' correction, two corrections attributed to Cochran, and Mantel's correction. The study was modeled after a similar comparison conducted on 2 x 2 contingency tables and published by Michael Haber.
Date: May 1987
Creator: Mullen, Jerry D. (Jerry Davis)
Partner: UNT Libraries
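
For the 2 x 2 case that Haber's original comparison addressed, the uncorrected test, Yates' correction, and Fisher's exact test are all available in scipy, so a sketch of the comparison is short (the table is made up; Cochran's and Mantel's corrections would need to be coded by hand):

    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    table = np.array([[3, 7],      # hypothetical small-sample 2 x 2 table
                      [9, 1]])

    chi2_u, p_uncorrected, _, _ = chi2_contingency(table, correction=False)
    chi2_y, p_yates, _, _ = chi2_contingency(table, correction=True)
    _, p_exact = fisher_exact(table)

    print(f"uncorrected chi-squared p = {p_uncorrected:.4f}")
    print(f"Yates-corrected p        = {p_yates:.4f}")
    print(f"Fisher exact p           = {p_exact:.4f}")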

R&D for computational cognitive and social models: foundations for model evaluation through verification and validation (final LDRD report).

Description: Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high-consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high ...
Date: September 1, 2008
Creator: Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A. & Trucano, Timothy Guy
Partner: UNT Libraries Government Documents Department

Probability of loss of assured safety in temperature-dependent systems with multiple weak and strong links.

Description: Relationships to determine the probability that a weak link (WL)/strong link (SL) safety system will fail to function as intended in a fire environment are investigated. In the systems under study, failure of the WL system before failure of the SL system is intended to render the overall system inoperational and thus prevent the possible occurrence of accidents with potentially serious consequences. Formal developments of the probability that the WL system fails to deactivate the overall system before failure of the SL system (i.e., the probability of loss of assured safety, PLOAS) are presented for several WL/SL configurations: (i) one WL, one SL; (ii) multiple WLs, multiple SLs with failure of any SL before any WL constituting failure of the safety system; (iii) multiple WLs, multiple SLs with failure of all SLs before any WL constituting failure of the safety system; and (iv) multiple WLs, multiple SLs and multiple sublinks in each SL with failure of any sublink constituting failure of the associated SL and failure of all SLs before failure of any WL constituting failure of the safety system. The indicated probabilities derive from time-dependent temperatures in the WL/SL system and variability (i.e., aleatory uncertainty) in the temperatures at which the individual components of this system fail and are formally defined as multidimensional integrals. Numerical procedures based on quadrature (i.e., trapezoidal rule, Simpson's rule) and also on Monte Carlo techniques (i.e., simple random sampling, importance sampling) are described and illustrated for the evaluation of these integrals. Example uncertainty and sensitivity analyses for PLOAS involving the representation of uncertainty (i.e., epistemic uncertainty) with probability theory and also with evidence theory are presented.
Date: December 1, 2004
Creator: Johnson, Jay Dean (ProStat, Mesa, AZ); Oberkampf, William Louis & Helton, Jon Craig (Arizona State University, Tempe, AZ)
Partner: UNT Libraries Government Documents Department
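
A minimal simple-random-sampling sketch for configuration (ii), failure of any SL before any WL. It assumes, for illustration only, that every link sees one common, monotonically rising temperature, so the event reduces to comparing minimum failure temperatures; the normal distributions are hypothetical stand-ins for the aleatory variability.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000

    # Aleatory variability in failure temperatures (hypothetical normals, deg C).
    # With a single monotonically rising temperature, "any SL fails before any
    # WL" reduces to comparing the minimum failure temperatures of the two sets.
    t_wl = rng.normal(300.0, 20.0, (n, 2)).min(axis=1)   # two weak links
    t_sl = rng.normal(450.0, 40.0, (n, 3)).min(axis=1)   # three strong links

    ploas = np.mean(t_sl < t_wl)           # configuration (ii)
    se = np.sqrt(ploas * (1.0 - ploas) / n)
    print(f"PLOAS = {ploas:.2e} +/- {se:.1e}")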

Sensitivity in risk analyses with uncertain numbers.

Description: Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a 'pinching' strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
Date: June 1, 2006
Creator: Tucker, W. Troy & Ferson, Scott
Partner: UNT Libraries Government Documents Department
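
The 'pinching' strategy reduces to a few lines in a sketch (the model and input distributions are invented): pinch one input to a point value, rerun the analysis, and score the input by how much the output's uncertainty range shrinks.

    import numpy as np

    rng = np.random.default_rng(11)
    n = 100_000

    def model(a, b, c):
        # Hypothetical stand-in for a risk model.
        return a * np.exp(0.2 * b) + c

    base = {"a": rng.uniform(1.0, 3.0, n),
            "b": rng.normal(0.0, 1.0, n),
            "c": rng.triangular(0.0, 1.0, 4.0, n)}

    y0 = model(**base)
    base_width = np.percentile(y0, 97.5) - np.percentile(y0, 2.5)

    # Pinch each input to its median and measure the reduction in output spread.
    for name in base:
        pinched = dict(base)
        pinched[name] = np.full(n, np.median(base[name]))
        y = model(**pinched)
        width = np.percentile(y, 97.5) - np.percentile(y, 2.5)
        print(f"pinching {name}: uncertainty reduced by {1 - width / base_width:.1%}")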

PROSA-2: A Probabilistic Response-Surface Analysis and Simulation Code

Description: Response-surface techniques have been developed for obtaining probability distributions of the consequences of postulated nuclear reactor accidents. In these techniques, probability distributions are assigned to the system and model parameters of the accident analysis. A limited number of parameter values (called knot points) are selected and input to a deterministic accident-analysis code. The results of the deterministic analyses are used to generate analytical functions (called response surfaces) that approximate the accident consequences in terms of selected system and model parameters. The response-surface methodology of this report includes both systematic and random knot-point selection schemes, second- and third-degree response surfaces, functional transformations of both input parameters and consequence variables, smooth synthesis of region-wise response surfaces, and the treatment of random conditions for conditional distributions. The computer code PROSA-2 developed for implementing these techniques is independent of the deterministic accident-analysis codes.
Date: May 1981
Creator: Vaurio, J. K. & Fletcher, J.
Partner: UNT Libraries Government Documents Department

Ternary Fission of ²⁴⁹Cf(n,f) and ²⁵⁰Cf(SF)

Description: In recent years, several Cm and Cf isotopes have been studied by our research group in the framework of a systematic investigation of gas emission characteristics in ternary fission. Here we report new results on the energy distribution and the emission probability of ³H, ⁴He, and ⁶He particles emitted in the spontaneous ternary fission of ²⁵⁰Cf (E_exc = 0 MeV) and in the neutron-induced ternary fission of ²⁴⁹Cf (E_exc = 6.625 MeV). Both measurements were performed using suitable, well-calibrated ΔE-E telescope detectors, at the IRMM (Geel, Belgium) for the spontaneous fission measurement and at the very intense neutron beam PF1b of the Institut Laue-Langevin (Grenoble, France) for the neutron-induced fission measurement. In this way, the existing database can be enlarged with new results for Z = 98 isotopes, which is important for the systematic investigation. Moreover, the investigation of the 'isotope couple' ²⁴⁹Cf(n,f) - ²⁵⁰Cf(SF), together with corresponding data for other isotopes, will yield valuable information on the influence of the excitation energy on the particle emission probabilities.
Date: September 1, 2011
Creator: University of Gent, B-9000 Gent, Belgium; CEA Cadarache, F-13108 Saint-Paul-lez-Durance, France; Institut Laue-Langevin, F-38042 Grenoble, France; EC-JRC Institute for Reference Materials and Measurements, B-2440 Geel, Belgium; Wadsworth Center, New York State Department of Health, Albany NY 12201, USA; Lawrence Berkeley National Laboratory, Berkeley, CA 94720, USA et al.
Partner: UNT Libraries Government Documents Department

Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

Description: This report summarizes methods to incorporate information (or lack of information) about intervariable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
Date: October 1, 2004
Creator: Oberkampf, William Louis; Tucker, W. Troy (Applied Biomathematics, Setauket, NY); Zhang, Jianzhong (Iowa State University, Ames, IA); Ginzburg, Lev (Applied Biomathematics, Setauket, NY); Berleant, Daniel J. (Iowa State University, Ames, IA); Ferson, Scott (Applied Biomathematics, Setauket, NY) et al.
Partner: UNT Libraries Government Documents Department
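
As a sketch of the first technique listed, simulating correlated variates for a given correlation measure and dependence model, here is a normal-copula construction (one common dependence model; the marginals and target correlation are assumptions of mine):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 50_000
    target_rho = 0.7                       # desired Spearman rank correlation

    # Adjust the normal correlation so the induced rank correlation matches:
    # for a Gaussian copula, rho_S = (6/pi) * arcsin(rho_N / 2).
    rho_n = 2.0 * np.sin(np.pi * target_rho / 6.0)

    cov = [[1.0, rho_n], [rho_n, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                  # correlated uniforms (the copula)

    # Map to arbitrary marginals by inverting their CDFs.
    x = stats.lognorm(s=0.5).ppf(u[:, 0])
    y = stats.beta(2.0, 5.0).ppf(u[:, 1])

    rho_hat, _ = stats.spearmanr(x, y)
    print(f"achieved Spearman rho = {rho_hat:.3f}")   # close to 0.7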