Search Results

The power of sensitivity analysis and thoughts on models with large numbers of parameters

Description: The regulatory systems that allow cells to adapt to their environments are exceedingly complex, and although we know a great deal about the intricate mechanistic details of many of these systems, our ability to make accurate predictions about their system-level behaviors is severely limited. We would like to make such predictions for a number of reasons. How can we reverse dysfunctional molecular changes of these systems that cause disease? More generally, how can we harness and direct cellular activities for beneficial purposes? Our ability to make accurate predictions about a system is also a measure of our fundamental understanding of that system. As evidenced by our mastery of technological systems, a useful understanding of a complex system can often be obtained through the development and analysis of a mathematical model, but predictive modeling of cellular regulatory systems, which necessarily relies on quantitative experimentation, is still in its infancy. There is much that we need to learn before modeling for practical applications becomes routine. In particular, we need to address a number of issues surrounding the large number of parameters that are typically found in a model for a cellular regulatory system.
Date: January 1, 2008
Creator: Hlavacek, William
Partner: UNT Libraries Government Documents Department

iTOUGH2 Command Reference

Description: iTOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media. This report contains a detailed description of all iTOUGH2 commands.
Date: June 18, 2002
Creator: Finsterle, Stefan
Partner: UNT Libraries Government Documents Department

iTOUGH2 Sample Problems

Description: iTOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media. This report contains a collection of iTOUGH2 sample problems.
Date: June 18, 2002
Creator: Finsterle, Stefan
Partner: UNT Libraries Government Documents Department

Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

Description: The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized as needing improvement: (1) subjective judgment in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence, and use of the same numerical grids for both scaled experiments and real plant applications; (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) Quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA. (2) Quantitative PIRT (Q-PIRT) to reduce subjective judgment and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties; only parameters whose uncertainties have large effects on the design criteria are considered. (3) Greatly reducing the computational cost of uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information (sensitivity results) to reduce the number of samples. (4) Allowing grid independence for scaled integral effect test (IET) simulations and real plant applications: (a) ...
Date: October 1, 2010
Creator: Zhao, Haihua; Mousseau, Vincent A. & Dinh, Nam T.
Partner: UNT Libraries Government Documents Department
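
Forward sensitivity analysis, as described above, augments the governing equations with equations for the parameter sensitivities and solves both together. A minimal sketch for a single ODE, assuming a simple decay model and a made-up parameter value (a generic illustration, not the report's implementation):

    # Forward sensitivity for du/dt = f(u, p): the sensitivity s = du/dp obeys
    # ds/dt = (df/du) s + df/dp and is integrated alongside the solution.
    # Hypothetical decay model f = -p*u; not the report's implementation.
    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y, p):
        u, s = y                        # state and its sensitivity du/dp
        f = -p * u
        dfdu, dfdp = -p, -u
        return [f, dfdu * s + dfdp]     # augmented system

    p = 0.5
    sol = solve_ivp(rhs, (0.0, 4.0), [1.0, 0.0], args=(p,), rtol=1e-8)
    u_end, s_end = sol.y[:, -1]
    # analytic check: u = exp(-p t), du/dp = -t exp(-p t)
    print(u_end, np.exp(-2.0))
    print(s_end, -4.0 * np.exp(-2.0))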

Climate Change Vulnerability and Resilience: Current Status and Trends for Mexico

Description: Climate change alters different localities on the planet in different ways. The impact on each region depends mainly on the degree of vulnerability that natural ecosystems and human-made infrastructure have to changes in climate and extreme meteorological events, as well as on the coping and adaptation capacity towards new environmental conditions. This study assesses the current resilience of Mexico and the Mexican states to such changes, as well as how this resilience will look in the future. In recent studies (Moss et al. 2000, Brenkert and Malone 2005, Malone and Brenkert 2008, Ibarrarán et al. 2007), the Vulnerability-Resilience Indicators Model (VRIM) is used to integrate a set of proxy variables that determine the resilience of a region to climate change. Resilience, or the ability of a region to respond to climate variations and natural events that result from climate change, is given by its adaptation and coping capacity and its sensitivity. On the one hand, the sensitivity of a region to climate change is assessed, emphasizing its infrastructure, food security, water resources, and the health of the population and regional ecosystems. On the other hand, coping and adaptation capacity is based on the availability of human resources, economic capacity and environmental capacity.
Date: December 30, 2008
Creator: Ibarraran, Maria E.; Malone, Elizabeth L. & Brenkert, Antoinette L.
Partner: UNT Libraries Government Documents Department

Assessment of Tidal Energy Removal Impacts on Physical Systems: Development of MHK Module and Analysis of Effects on Hydrodynamics

Description: In this report we describe (1) the development, test, and validation of the marine hydrokinetic energy scheme in a three-dimensional coastal ocean model (FVCOM); and (2) the sensitivity analysis of effects of marine hydrokinetic energy configurations on power extraction and volume flux in a coastal bay. Submittal of this report completes the work on Task 2.1.2, Effects of Physical Systems, Subtask 2.1.2.1, Hydrodynamics and Subtask 2.1.2.3, Screening Analysis, for fiscal year 2011 of the Environmental Effects of Marine and Hydrokinetic Energy project.
Date: September 1, 2011
Creator: Yang, Zhaoqing & Wang, Taiping
Partner: UNT Libraries Government Documents Department

Self-validated Variance-based Methods for Sensitivity Analysis of Model Outputs

Description: Global sensitivity analysis (GSA) has the advantage over local sensitivity analysis in that GSA does not require strong model assumptions such as linearity or monotonicity. As a result, GSA methods such as those based on variance decomposition are well-suited to multi-physics models, which are often plagued by large nonlinearities. However, as with many other sampling-based methods, an inadequate sample size can badly degrade the accuracy of the results. A natural remedy is to adaptively increase the sample size until sufficient accuracy is obtained. This paper proposes an iterative methodology comprising mechanisms for guiding sample-size selection and self-assessing result accuracy. The key features of the proposed methodology are the adaptive refinement strategies for stratified designs. We first apply this iterative methodology to the design of a self-validated first-order sensitivity analysis algorithm. We also extend this methodology to design a self-validated second-order sensitivity analysis algorithm based on refining replicated orthogonal array designs. Several numerical experiments are given to demonstrate the effectiveness of these methods.
Date: April 20, 2009
Creator: Tong, C
Partner: UNT Libraries Government Documents Department
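
The adaptive idea described above, increasing the sample size until the sensitivity estimates stop changing, can be illustrated with a crude binned main-effect estimator wrapped in a sample-doubling loop. This is a generic stand-in, not the paper's refined stratified or replicated orthogonal array designs; the test model is hypothetical:

    # Crude binned estimator of a first-order (main-effect) index,
    # S_i ~ Var_x(E[Y|X_i]) / Var(Y), with adaptive sample-size doubling.
    import numpy as np

    def main_effect(xs, ys, bins=16):
        edges = np.quantile(xs, np.linspace(0.0, 1.0, bins + 1))
        idx = np.clip(np.searchsorted(edges, xs, side="right") - 1, 0, bins - 1)
        cond_means = np.array([ys[idx == b].mean() for b in range(bins)])
        return cond_means.var() / ys.var()

    def model(x):                        # hypothetical 3-input test model
        return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

    rng = np.random.default_rng(0)
    n, prev = 1024, None
    while True:
        x = rng.uniform(-1.0, 1.0, size=(n, 3))
        y = model(x)
        s = np.array([main_effect(x[:, i], y) for i in range(3)])
        if prev is not None and np.max(np.abs(s - prev)) < 0.01:
            break                        # self-assessed accuracy acceptable
        prev, n = s, 2 * n               # otherwise refine with more samples
    print(n, s)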

Solving iTOUGH2 simulation and optimization problems using the PEST protocol

Description: The PEST protocol has been implemented into the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.
Date: February 1, 2011
Creator: Finsterle, S.A. & Zhang, Y.
Partner: UNT Libraries Government Documents Department
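
The template/instruction mechanism described above can be sketched as a thin wrapper: substitute parameter values into a model input file, run the external simulator, and read back the observable values for the optimizer. The marker syntax, file names, executable, and parsing rule below are simplified stand-ins, not the actual PEST template/instruction grammar:

    # Substitute parameters into a templated input file, run an external
    # simulator, and read back observations.  All names here are hypothetical.
    import subprocess

    def run_model(params, template="model.in.tpl", infile="model.in",
                  outfile="model.out"):
        text = open(template).read()
        for name, value in params.items():
            text = text.replace(f"#{name}#", f"{value:.6e}")
        with open(infile, "w") as f:
            f.write(text)
        subprocess.run(["./my_simulator", infile], check=True)
        obs = []
        for line in open(outfile):       # assume lines like "OBS name value"
            if line.startswith("OBS"):
                obs.append(float(line.split()[-1]))
        return obs

    # e.g. simulated = run_model({"permeability": 1.2e-14, "porosity": 0.35})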

iTOUGH2: From parameter estimation to model structure identification

Description: iTOUGH2 provides inverse modeling capabilities for the TOUGH2 family of nonisothermal multiphase flow simulators. It can be used for a formalized sensitivity analysis, parameter estimation by automatic model calibration, and uncertainty propagation analyses. While iTOUGH2 has been successfully applied for the estimation of a variety of parameters based on different data types, it is recognized that errors in the conceptual model have a great impact on both the estimated parameters and the subsequent model predictions. Identification of the most suitable model structure is therefore one of the most important and most difficult tasks. Within the iTOUGH2 framework, model identification can be partly addressed through appropriate parameterization of alternative conceptual-model elements. In addition, statistical measures are provided that help rank the performance of different conceptual models. We present a number of features added to the code that allow for a better parameterization of conceptual model elements, specifically heterogeneity. We discuss how these new features can be used to support the identification of key model structure elements and their impact on model predictions.
Date: May 12, 2003
Creator: Finsterle, Stefan
Partner: UNT Libraries Government Documents Department

Sensitivity Assessment of Ozone Models

Description: The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. Once the expansion functions have been calculated, they are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.
Date: January 24, 2000
Creator: Shorter, Jeffrey A.; Rabitz, Herschel A. & Armstrong, Russell A.
Partner: UNT Libraries Government Documents Department
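
The "hierarchical, correlated function expansion" mentioned above is presumably the high dimensional model representation (HDMR) form developed in Rabitz's group, in which the model output is expanded as

    f(x_1,\dots,x_n) = f_0 + \sum_{i=1}^{n} f_i(x_i)
                           + \sum_{1 \le i < j \le n} f_{ij}(x_i, x_j)
                           + \cdots + f_{12\cdots n}(x_1,\dots,x_n),

where f_0 is the mean response, the f_i capture the independent action of each input, the f_{ij} capture pair correlations, and so on; truncating the series after low-order terms gives the finite expansion that is assembled into the FEOM.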

Proposed Methodology for Establishing Area of Applicability

Description: This paper presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the data validation tasks of a criticality safety computational study. The S/U methods presented are designed to provide a formal means of establishing the area (or range) of applicability for criticality safety data validation studies. The development of parameters analogous to the standard trending parameters forms the key to the technique. These parameters are the so-called D parameters, which represent the differences by energy group between S/U-generated sensitivity profiles, and the c parameters, which are the k correlation coefficients; each gives information on the similarity between pairs of selected systems. The use of a Generalized Linear Least-Squares Methodology (GLLSM) tool is also described in this paper. These methods and guidelines are also applied to a sample validation for uranium systems with enrichments greater than 5 wt %.
Date: September 20, 1999
Creator: Broadhead, B. L.; Hopper, C. M. & Parks, C. V.
Partner: UNT Libraries Government Documents Department
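
The abstract does not define the trending parameters precisely; under one plausible reading (D as a groupwise difference between two sensitivity profiles, c as a correlation coefficient between systems), a toy calculation with made-up sensitivity profiles looks like:

    # Hedged illustration only; the report's exact S/U definitions differ in detail.
    import numpy as np

    s_application = np.array([0.02, 0.05, 0.11, 0.08, 0.03])  # by energy group
    s_benchmark   = np.array([0.01, 0.06, 0.09, 0.07, 0.04])

    D = np.sum(np.abs(s_application - s_benchmark))   # difference-by-group parameter
    c = np.corrcoef(s_application, s_benchmark)[0, 1]  # similarity parameter
    print(f"D = {D:.3f}, c = {c:.3f}")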

Costs of Oil Dependence: A 2000 Update

Description: Oil dependence remains a potentially serious economic and strategic problem for the United States. This report updates previous estimates of the costs of oil dependence to the U.S. economy and introduces several methodological enhancements. Estimates of the costs to the U.S. economy of the oil market upheavals of the last 30 years are in the vicinity of $7 trillion, present value 1998 dollars, about as large as the sum total of payments on the national debt over the same period. Simply adding up historical costs in 1998 dollars without converting to present value results in a Base Case cost estimate of $3.4 trillion. Sensitivity analysis indicates that cost estimates are sensitive to key parameters. A lower bound estimate of $1.7 trillion and an upper bound of $7.1 trillion (not present value) indicate that the costs of oil dependence have been large under almost any plausible set of assumptions. These cost estimates do not include military, strategic or political costs associated with U.S. and world dependence on oil imports.
Date: May 17, 2000
Creator: Greene, D.L.
Partner: UNT Libraries Government Documents Department
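
The contrast drawn above between a present-value total and a simple sum in constant dollars is just compounding; a toy calculation with hypothetical annual costs and an assumed real rate of return (not the report's data or discounting assumptions):

    # Hypothetical: $120 billion/yr of costs for 30 years, 4% assumed real rate.
    annual_cost = [120.0] * 30
    rate = 0.04

    simple_sum = sum(annual_cost)
    # compound each year's cost forward to the final (reference) year
    present_value = sum(c * (1.0 + rate) ** (len(annual_cost) - 1 - t)
                        for t, c in enumerate(annual_cost))
    print(simple_sum, present_value)     # compounding nearly doubles the total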

Possible Effects of Sampling Biases on Reproduction Rate Estimates for Dolphins in the Eastern Tropical Pacific

Description: From introduction: The purpose of this report is to present a sensitivity analysis on the effects that biased sampling of young calves and lactating females would have on the various estimated life history parameters. This sensitivity analysis provides one measure of reliability of the various estimates. Also, where inconsistencies in the estimates can be identified, this analysis provides one means for evaluating which estimates are likely in error.
Date: March 1983
Creator: Polacheck, Tom
Partner: UNT Libraries Government Documents Department

Analysis of boiling experiment using inverse modeling

Description: Numerical predictions of geothermal reservoir behavior strongly depend on the assumed steam-water relative permeabilities, which are difficult and time-consuming to measure in the laboratory. This paper describes the estimation of the parameters of the relative permeability and capillary pressure functions by automatically matching simulation results to data from a transient boiling experiment performed on a Berea sandstone. A sensitivity analysis reveals the strong dependence of the observed system behavior on effects such as heat transfer from the heater to the core, as well as heat losses through the insulation. Parameters of three conceptual models were estimated by inverse modeling. Each calibration yields consistent effective steam permeabilities, but the shape of the liquid relative permeability remains ambiguous.
Date: May 1, 1998
Creator: Finsterle, S.; Guerrero, M. & Satik, C.
Partner: UNT Libraries Government Documents Department
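
Automatic model calibration of the kind described above amounts to nonlinear least squares: adjust parameters until simulated observations match the measured data. A generic sketch with a stand-in analytic "simulator" and synthetic data (not the TOUGH2 boiling-experiment model):

    import numpy as np
    from scipy.optimize import least_squares

    t_obs = np.linspace(0.0, 10.0, 25)
    rng = np.random.default_rng(1)
    data = 1.0 - np.exp(-0.3 * t_obs) + 0.01 * rng.normal(size=t_obs.size)

    def simulate(params, t):            # stand-in for the forward model
        amplitude, rate = params
        return amplitude * (1.0 - np.exp(-rate * t))

    fit = least_squares(lambda p: simulate(p, t_obs) - data, x0=[0.5, 0.1])
    print(fit.x)                        # estimated (amplitude, rate) ~ (1.0, 0.3)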

Extended Forward Sensitivity Analysis for Uncertainty Quantification

Description: Verification and validation (V&V) are playing increasingly important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients that occur in a complex nuclear reactor system. Traditional V&V in reactor system analysis has focused more on the validation part, or has not differentiated verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also inefficient for sensitivity analysis. In contrast to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to aid uncertainty quantification. By including the time step and, potentially, the spatial step as special sensitivity parameters, the forward sensitivity method is extended into a method for quantifying numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time-step and spatial-step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for aiding uncertainty quantification. By knowing the relative sensitivity of the time and space steps compared with other physical parameters of interest, the simulation is allowed to run at optimized time and space steps without affecting the confidence of ...
Date: September 1, 2011
Creator: Zhao, Haihua & Mousseau, Vincent A.
Partner: UNT Libraries Government Documents Department
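
The "black box" approach criticized above is, in its simplest form, Monte Carlo propagation of assumed input distributions through the code; the model and distributions in this sketch are hypothetical stand-ins:

    # Sample inputs from assumed PDFs, run the code, inspect the output
    # distribution.  Numerical error is folded invisibly into the output
    # spread, which is exactly the limitation the forward method addresses.
    import numpy as np

    rng = np.random.default_rng(42)

    def simulation_code(k, q):            # stand-in for the system code
        return q / k + 0.02 * k

    k = rng.normal(5.0, 0.5, size=2000)   # assumed input distributions
    q = rng.normal(100.0, 10.0, size=2000)
    out = simulation_code(k, q)

    print(out.mean(), out.std())
    print(np.percentile(out, [5, 95]))    # a 90% uncertainty band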

Sensitivity technologies for large scale simulation.

Description: Sensitivity analysis is critically important to numerous analysis algorithms, including large-scale optimization, uncertainty quantification, reduced-order modeling, and error estimation. Our research focused on developing tools, algorithms, and standard interfaces to facilitate the implementation of sensitivity-type analysis in existing codes and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time-domain decomposition algorithms, and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady-state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady-state internal flows subject to convection-diffusion. Real-time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time-domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first-order approximation of the Euler equations and used as a preconditioner. In comparison to other methods, the AD preconditioner showed better convergence behavior. Our ultimate ...
Date: January 1, 2005
Creator: Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias (Rice University, Houston, TX); Wilcox, Lucas C. (Brown University, Providence, RI); Hill, Judith C. (Carnegie Mellon University, Pittsburgh, PA) et al.
Partner: UNT Libraries Government Documents Department
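
For a steady discrete system, the direct and adjoint routes described above yield the same derivative at different costs (one solve per parameter versus one solve per output). A minimal linear-algebra sketch on a stand-in problem, not the report's convection-diffusion or Euler applications:

    # A(p) u = b with scalar output J = g.u:
    #   direct:  A (du/dp) = -(dA/dp) u,  dJ/dp = g . (du/dp)
    #   adjoint: A^T lam = g,             dJ/dp = -lam . ((dA/dp) u)
    import numpy as np

    n = 5
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    dA_dp = np.eye(n)                        # assume p scales the diagonal
    b = np.ones(n)
    g = np.linspace(1.0, 2.0, n)

    u = np.linalg.solve(A, b)
    du_dp = np.linalg.solve(A, -dA_dp @ u)   # direct: one solve per parameter
    lam = np.linalg.solve(A.T, g)            # adjoint: one solve per output
    print(g @ du_dp, -lam @ (dA_dp @ u))     # identical dJ/dp either way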

Methods in Use for Sensitivity Analysis, Uncertainty Evaluation, and Target Accuracy Assessment

Description: Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. In this paper the theory, based on the adjoint approach, that is implemented in the ERANOS fast reactor code system is presented, along with some unique tools and features related to specific types of problems, such as nuclide transmutation, reactivity loss during the cycle, decay heat, the neutron source associated with fuel fabrication, and experiment representativity.
Date: October 1, 2007
Creator: Palmiotti, G.; Salvatores, M. & Aliberti, G.
Partner: UNT Libraries Government Documents Department
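
The standard way sensitivity coefficients feed uncertainty estimates is the "sandwich rule", variance = S^T C S, where C is a covariance matrix for the input data; a sketch with made-up numbers (not ERANOS output):

    import numpy as np

    S = np.array([0.10, -0.05, 0.30])        # relative sensitivities, dR/R per dp/p
    C = np.array([[4.0, 0.5, 0.0],           # relative covariance matrix (%^2)
                  [0.5, 1.0, 0.2],
                  [0.0, 0.2, 9.0]])

    variance = S @ C @ S
    print(np.sqrt(variance), "% relative uncertainty on the response")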

Strategies for cost-effective carbon reductions: A sensitivity analysis of alternative scenarios

Description: Analyses of alternative futures often present results for a limited set of scenarios, with little if any sensitivity analysis to identify the factors affecting the scenario results. This approach creates an artificial impression of certainty associated with the scenarios considered, and inhibits understanding of the underlying forces. This paper summarizes the economic and carbon savings sensitivity analysis completed for the Scenarios for a Clean Energy Future study (IWG, 2000). Its 19 sensitivity cases provide insight into the costs and carbon-reduction impacts of a carbon permit trading system, demand-side efficiency programs, and supply-side policies. Impacts under different natural gas and oil price trajectories are also examined. The results provide compelling evidence that policy opportunities exist to reduce carbon emissions and save society money.
Date: July 11, 2001
Creator: Gumerman, Etan; Koomey, Jonathan G. & Brown, Marilyn
Partner: UNT Libraries Government Documents Department

Uncertainty and sensitivity analyses of ballast life-cycle cost and payback period

Description: The paper introduces an innovative methodology for evaluating the relative significance of energy-efficient technologies applied to fluorescent lamp ballasts. The method involves replacing the point estimates of the life-cycle cost of the ballasts with uncertainty distributions reflecting the whole spectrum of possible costs and the assessed probability associated with each value. The results of the uncertainty and sensitivity analyses will help analysts reduce the effort spent on data collection and carry out the analysis more efficiently. These methods also enable policy makers to gain an insightful understanding of which efficient technology alternatives benefit or cost what fraction of consumers, given the explicit assumptions of the analysis.
Date: June 1, 2000
Creator: McMahon, James E.; Liu, Xiaomin; Turiel, Ike; Hakim, Sajid & Fisher, Diane
Partner: UNT Libraries Government Documents Department
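
Replacing point estimates with distributions, as described above, can be sketched as a toy Monte Carlo life-cycle-cost comparison; all prices, wattages, and operating assumptions are hypothetical, not the report's data:

    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    price_kwh = rng.triangular(0.06, 0.08, 0.12, n)   # $/kWh
    hours_yr = rng.normal(3000.0, 500.0, n)           # operating hours per year
    years = 10

    def lcc(first_cost, watts):
        energy_kwh = watts / 1000.0 * hours_yr * years
        return first_cost + energy_kwh * price_kwh    # undiscounted, for brevity

    savings = lcc(12.0, 72.0) - lcc(20.0, 60.0)       # standard minus efficient
    print((savings > 0).mean())   # fraction of cases where the efficient option pays off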

Studies in Chemical Dynamics

Description: This final report draws together the research carried out from February 1986 through January 2003 on a series of topics in chemical dynamics. The specific areas of study include molecular collisions, chemical kinetics, data inversion to extract potential energy surfaces, and model reduction of complex kinetic systems.
Date: June 27, 2003
Creator: Rabitz, Herschel & Ho, Tak-San
Partner: UNT Libraries Government Documents Department

Sensitivity Analysis of the DARHT-II 2.5MV/2kA Diode

Description: This report summarizes the study of the tolerance limits on the assembly of the cathode and the Pierce electrode for the DARHT-II diode (2.5 MV, 2 kA case), performed through a series of computer simulations using the PIC code WARP [1]. We have considered sources of beam quality degradation such as errors in axial and transverse positioning and the size of the radial gap between the cathode and the Pierce electrode (shroud). The figure of merit was chosen to be the RMS beam (edge) emittance at a distance of 1 meter from the cathode, defined as $\varepsilon_x = 4\beta\gamma\sqrt{\langle x^2\rangle\langle x'^2\rangle - \langle x x'\rangle^2}$. The analysis shows that positioning the cathode at the correct axial and transverse location is more important than the size of the radial gap.
Date: December 22, 2006
Creator: Henestroza, Enrique
Partner: UNT Libraries Government Documents Department
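
The figure of merit above is the normalized RMS edge emittance; assuming the angle brackets denote central averages over the simulated particle ensemble, it can be computed from particle coordinates as follows (the particle data and relativistic factor are made up, not WARP output):

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(0.0, 1.0e-3, 100_000)              # transverse positions (m)
    xp = 0.5 * x + rng.normal(0.0, 5.0e-3, 100_000)   # divergences x' (rad)
    beta_gamma = 5.9                                   # hypothetical beam energy

    x0, xp0 = x - x.mean(), xp - xp.mean()
    eps_x = 4.0 * beta_gamma * np.sqrt((x0 ** 2).mean() * (xp0 ** 2).mean()
                                       - (x0 * xp0).mean() ** 2)
    print(eps_x, "m-rad (edge emittance)")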

A Global Sensitivity Analysis Methodology for Multi-physics Applications

Description: Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess, qualitatively and/or quantitatively, how the variability of the simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once the most sensitive parameters have been identified, research effort should be directed toward reducing their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be applied recursively to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details of each step are given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
Date: February 2, 2007
Creator: Tong, C. H. & Graziani, F. R.
Partner: UNT Libraries Government Documents Department
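
The screen-then-quantify flow described above can be illustrated generically (this is not PSUADE's algorithm): a crude one-at-a-time screening discards insensitive inputs, then a binned variance-based main effect is estimated for the survivors; the model and input ranges are hypothetical:

    import numpy as np

    def model(x):                                      # hypothetical 4-input model
        return 3.0 * x[:, 0] + x[:, 1] ** 2 + 0.01 * x[:, 2] + 0.0 * x[:, 3]

    lo, hi = np.zeros(4), np.ones(4)                   # step 1: input ranges
    base = 0.5 * (lo + hi)

    spans = []                                         # step 2: OAT screening
    for i in range(4):
        pts = np.tile(base, (5, 1))
        pts[:, i] = np.linspace(lo[i], hi[i], 5)
        y_i = model(pts)
        spans.append(y_i.max() - y_i.min())
    keep = [i for i, s in enumerate(spans) if s > 0.05 * max(spans)]

    rng = np.random.default_rng(11)                    # step 3: main effects
    x = rng.uniform(lo, hi, size=(4000, 4))
    y = model(x)
    for i in keep:
        idx = np.digitize(x[:, i], np.linspace(lo[i], hi[i], 11)) - 1
        cond = np.array([y[idx == b].mean() for b in range(10)])
        print(f"input {i}: main effect ~ {cond.var() / y.var():.2f}")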