
Search Results


An Analysis of Salinity in Streams of the Green River Basin

Description: Abstract: Dissolved-solids concentrations and loads can be estimated from streamflow records using a regression model derived from chemical analyses of monthly samples. The model takes seasonal effects into account by the inclusion of simple-harmonic time functions. Monthly mean dissolved-solids loads simulated for a 6-year period at U.S. Geological Survey water-quality stations in the Green River Basin of Wyoming agree closely with corresponding loads estimated from daily specific-conductance records. In a demonstration of uses of the model, an average gain of 114,000 tons of dissolved solids per year was estimated for a 6-year period in a 70-mile reach of the Green River from Fontenelle Reservoir to the town of Green River, including the lower 30-mile reach of the Big Sandy River.
Date: October 1977
Creator: DeLong, Lewis L.
Partner: UNT Libraries Government Documents Department
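
A minimal Python sketch of the harmonic-regression idea described in the entry above: log load is regressed on log streamflow plus annual sine/cosine terms. The data, the single annual harmonic, and the power-law flow dependence are illustrative assumptions, not details taken from the report.

import numpy as np

# Synthetic monthly record over 6 years (illustrative only).
rng = np.random.default_rng(0)
t = np.arange(72) / 12.0                        # time in years
q = np.exp(rng.normal(5.0, 0.4, size=t.size))   # monthly mean streamflow
# Assumed "true" load: power law in flow plus an annual seasonal cycle.
load = 10.0 * q**0.8 * np.exp(0.2 * np.sin(2 * np.pi * t)) \
       * rng.lognormal(0.0, 0.05, t.size)

# Regression of log(load) on log(flow) and simple-harmonic time functions.
X = np.column_stack([np.ones_like(t), np.log(q),
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
beta, *_ = np.linalg.lstsq(X, np.log(load), rcond=None)
print("fitted coefficients (intercept, flow, sin, cos):", beta.round(3))
print("mean simulated monthly load (tons):", np.exp(X @ beta).mean().round(0))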

Structural damage detection using the Holder exponent.

Description: This paper implements a damage detection strategy that identifies damage-sensitive features associated with nonlinearities. Some non-linearities result from discontinuities introduced into the data by certain types of damage. These discontinuities may also result from noise in the measured dynamic response data or can be caused by random excitation of the system. The Holder Exponent, which is a measure of the degree to which a signal is differentiable, is used to detect the discontinuities. By studying the Holder Exponent as a function of time, a statistical model is developed that classifies changes in the Holder Exponent that are associated with damage-induced discontinuities. The results show that for certain cases, the Holder Exponent is an effective technique to detect damage.
Date: January 1, 2002
Creator: Farrar, C. R. (Charles R.); Do, N. B. (Nguyen B.); Green, S. R. (Scott R.) & Schwartz, T. A. (Timothy A.)
Partner: UNT Libraries Government Documents Department
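
A crude Python illustration of the Holder-exponent idea from the abstract above: the exponent is estimated as the log-log slope of signal increments versus scale, and it drops near a discontinuity. The paper's estimator is wavelet-based; this increment-slope stand-in, the test signal, and the injected jump are all assumptions for illustration.

import numpy as np

def holder_exponent(signal, i, scales=(1, 2, 4, 8, 16)):
    # Pointwise estimate: slope of log|f(i+s) - f(i)| versus log s.
    s = np.asarray(scales)
    incr = np.abs(signal[i + s] - signal[i]) + 1e-12
    slope, _ = np.polyfit(np.log(s), np.log(incr), 1)
    return slope

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 2048)
x = np.sin(2 * np.pi * 5 * t) + 0.001 * rng.normal(size=t.size)
x[1024:] += 0.5   # jump discontinuity, the kind of feature damage can introduce

print(f"Holder estimate in a smooth region:   {holder_exponent(x, 500):.2f}")
print(f"Holder estimate at the discontinuity: {holder_exponent(x, 1023):.2f}")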

AutomaDeD: Automata-Based Debugging for Dissimilar Parallel Tasks

Description: Today's largest systems have over 100,000 cores, with million-core systems expected over the next few years. This growing scale makes debugging the applications that run on them a daunting challenge. Few debugging tools perform well at this scale and most provide an overload of information about the entire job. Developers need tools that quickly direct them to the root cause of the problem. This paper presents AutomaDeD, a tool that identifies which tasks of a large-scale application first manifest a bug at a specific code region at a specific point during program execution. AutomaDeD creates a statistical model of the application's control-flow and timing behavior that organizes tasks into groups and identifies deviations from normal execution, thus significantly reducing debugging effort. In addition to a case study in which AutomaDeD locates a bug that occurred during development of MVAPICH, we evaluate AutomaDeD on a range of bugs injected into the NAS parallel benchmarks. Our results demonstrate that AutomaDeD detects the time period when a bug first manifested itself with 90% accuracy for stalls and hangs and 70% accuracy for interference faults. It identifies the subset of processes first affected by the fault with 80% and 70% accuracy, respectively, and the code region where the fault first manifested with 90% and 50% accuracy, respectively.
Date: March 23, 2010
Creator: Bronevetsky, G; Laguna, I; Bagchi, S; de Supinski, B R; Ahn, D & Schulz, M
Partner: UNT Libraries Government Documents Department
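
A toy Python sketch in the spirit of the abstract above: represent each task by its time spent in each code region, fit a simple statistical model over tasks, and flag the task/region pair that deviates most. AutomaDeD itself uses semi-Markov models of control flow and timing; this Gaussian z-score stand-in and the synthetic timings are assumptions.

import numpy as np

rng = np.random.default_rng(2)
n_tasks, n_regions = 64, 5
times = rng.normal(1.0, 0.05, size=(n_tasks, n_regions))  # synthetic timings
times[17, 3] += 2.0   # task 17 stalls in code region 3

# Per-region Gaussian model over tasks; score tasks by their worst z-score.
z = np.abs((times - times.mean(axis=0)) / times.std(axis=0))
suspect = int(np.argmax(z.max(axis=1)))
print(f"most anomalous task: {suspect}, region: {int(np.argmax(z[suspect]))}")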

Characterizing the Response of Commercial and Industrial Facilities to Dynamic Pricing Signals from the Utility

Description: We describe a method to generate statistical models of electricity demand from Commercial and Industrial (C&I) facilities including their response to dynamic pricing signals. Models are built with historical electricity demand data. A facility model is the sum of a baseline demand model and a residual demand model; the latter quantifies deviations from the baseline model due to dynamic pricing signals from the utility. Three regression-based baseline computation methods were developed and analyzed. All methods performed similarly. To understand the diversity of facility responses to dynamic pricing signals, we have characterized the response of 44 C&I facilities participating in a Demand Response (DR) program using dynamic pricing in California (Pacific Gas and Electric's Critical Peak Pricing Program). In most cases, facilities shed load during DR events but there is significant heterogeneity in facility responses. Modeling facility response to dynamic price signals is beneficial to the Independent System Operator for scheduling supply to meet demand, to the utility for improving dynamic pricing programs, and to the customer for minimizing energy costs.
Date: July 1, 2010
Creator: Mathieu, Johanna L.; Gadgil, Ashok J.; Callaway, Duncan S.; Price, Phillip N. & Kiliccote, Sila
Partner: UNT Libraries Government Documents Department
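
A small Python sketch of the baseline-plus-residual decomposition described above: the baseline is the mean demand by hour of day over non-event intervals, and the residual during Demand Response events measures the shed. The load shape, event schedule, and 12 kW shed are invented for illustration; the report's regression baselines are richer.

import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(24 * 60)                     # 60 days of hourly demand data
hod = hours % 24
demand = 50 + 30 * np.exp(-((hod - 14) ** 2) / 20.0) \
         + rng.normal(0, 2, hours.size)        # kW, afternoon-peaked facility
is_event = np.isin(hours // 24, [10, 25, 40]) & (hod >= 14) & (hod < 18)
demand[is_event] -= 12.0                       # facility sheds load on DR days

# Baseline model: mean demand by hour of day over non-event intervals.
baseline = np.array([demand[(hod == h) & ~is_event].mean() for h in range(24)])
residual = demand - baseline[hod]
print("estimated mean shed during DR events (kW):",
      round(-residual[is_event].mean(), 1))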

Extracting a Whisper from the DIN: A Bayesian-Inductive Approach to Learning an Anticipatory Model of Cavitation

Description: For several reasons, Bayesian parameter estimation is superior to other methods for inductively learning a model for an anticipatory system. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be removed from the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit of perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, Bayesian methods approach this ideal limit of performance more closely than other methods. These capabilities provide a strategy for addressing a major unsolved problem in pump operation: the identification of precursors of cavitation. Cavitation causes immediate degradation of pump performance and ultimate destruction of the pump. However, the most efficient point to operate a pump is just below the threshold of cavitation. It might be hoped that a straightforward method to minimize pump cavitation damage would be to simply adjust the operating point until the inception of cavitation is detected and then to slightly readjust the operating point to let the cavitation vanish. However, due to the continuously evolving state of the fluid moving through the pump, the threshold of cavitation tends to wander. What is needed is to anticipate cavitation, and this requires the detection and identification of precursor features that occur just before cavitation starts.
Date: November 7, 1999
Creator: Kercel, S.W.
Partner: UNT Libraries Government Documents Department
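
A quick Python check of the square-root-of-samples claim in the abstract above, using a conjugate normal update for a weak constant signal in Gaussian noise. The amplitude, noise level, and vague prior are assumptions chosen only to show the scaling.

import numpy as np

rng = np.random.default_rng(4)
true_amp, sigma = 0.1, 1.0            # weak "precursor" buried in noise
for n in (100, 10_000):
    data = true_amp + sigma * rng.normal(size=n)
    prior_var = 10.0 ** 2             # vague prior: mean 0, std 10
    post_var = 1.0 / (1.0 / prior_var + n / sigma ** 2)
    post_mean = post_var * data.sum() / sigma ** 2
    print(f"n={n:6d}: posterior mean={post_mean:+.3f}, "
          f"posterior std={post_var ** 0.5:.4f}")
# The posterior std falls roughly as sigma/sqrt(n), the ideal limit cited.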

UNCERTAINTY IN PHASE ARRIVAL TIME PICKS FOR REGIONAL SEISMIC EVENTS: AN EXPERIMENTAL DESIGN

Description: The detection and timing of seismic arrivals play a critical role in the ability to locate seismic events, especially at low magnitude. Errors can occur with the determination of the timing of the arrivals, whether these errors are made by automated processing or by an analyst. One of the major obstacles encountered in properly estimating travel-time picking error is the lack of a clear and comprehensive discussion of all of the factors that influence phase picks. This report discusses possible factors that need to be modeled to properly study phase arrival time picking errors. We have developed a multivariate statistical model, experimental design, and analysis strategy that can be used in this study. We have embedded a general form of the International Data Center (IDC)/U.S. National Data Center (USNDC) phase pick measurement error model into our statistical model. We can use this statistical model to optimally calibrate a picking error model to regional data. A follow-on report will present the results of this analysis plan applied to an implementation of an experiment/data-gathering task.
Date: February 1, 2001
Creator: Velasco, A. et al.
Partner: UNT Libraries Government Documents Department

A new global hydrogen equation of state model

Description: Simple statistical mechanics models have been assembled into a wide-range equation of state for the hydrogen isotopes. The solid is represented by an Einstein-Grüneisen model delimited by a Lindemann melting curve. The fluid is represented by an ideal gas plus a soft-sphere fluid configurational term. Dissociation and ionization are approximated by modifying the ideal gas chemical-equilibrium formulation. The T = 0 isotherm and dissociation models have been fitted to new diamond-anvil isotherm and laser-generated shock data. The main limitation of the model is in ionization at high compression.
Date: June 25, 1999
Creator: Young, D
Partner: UNT Libraries Government Documents Department
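
As one piece of the assembled free energy described above, the Einstein-solid vibrational term can be written down directly; a Python sketch follows. The characteristic temperature used here is a placeholder, and the T = 0 cold curve, configurational, and fluid terms of the full model are omitted.

import numpy as np

kB = 8.617333e-5   # Boltzmann constant, eV/K

def einstein_solid_F(T, theta_E):
    # Vibrational free energy per atom, zero-point term included;
    # cold-curve and anharmonic contributions are omitted here.
    return 3 * kB * T * np.log(1 - np.exp(-theta_E / T)) + 1.5 * kB * theta_E

for T in (10.0, 50.0, 100.0):
    print(f"T = {T:5.1f} K: F_vib = {einstein_solid_F(T, 100.0):+.4f} eV/atom")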

Population dynamics of minimally cognitive individuals. Part I: Introducing knowledge into the dynamics

Description: The author presents a new approach for modeling the dynamics of collections of objects with internal structure. Based on the fact that the behavior of an individual in a population is modified by its knowledge of other individuals, a procedure for accounting for knowledge in a population of interacting objects is presented. It is assumed that each object has partial (or complete) knowledge of some (or all) other objects in the population. The dynamical equations for the objects are then modified to include the effects of this pairwise knowledge. This procedure has the effect of projecting out what the population will do from the much larger space of what it could do, i.e., filtering or smoothing the dynamics by replacing the complex detailed physical model with an effective model that produces the behavior of interest. The procedure therefore provides a minimalist approach for obtaining emergent collective behavior. The use of knowledge as a dynamical quantity, and its relationship to statistical mechanics, thermodynamics, information theory, and cognition microstructure are discussed.
Date: July 1, 1995
Creator: Schmieder, R.W.
Partner: UNT Libraries Government Documents Department
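
A minimal Python sketch of the procedure described above: a pairwise knowledge matrix K, with K[i, j] giving object i's knowledge of object j, weights an alignment term added to otherwise independent dynamics. The alignment form and random knowledge values are my assumptions; the point is only that pairwise knowledge produces emergent collective behavior (here, converging headings).

import numpy as np

rng = np.random.default_rng(5)
n = 20
pos = rng.uniform(-1, 1, size=(n, 2))
vel = rng.normal(0, 0.1, size=(n, 2))
K = rng.uniform(0, 1, size=(n, n))   # K[i, j]: i's knowledge of j (assumed)
np.fill_diagonal(K, 0.0)

dt = 0.05
w = K / K.sum(axis=1, keepdims=True)
for _ in range(200):
    vel += dt * (w @ vel - vel)      # relax toward known neighbors' velocities
    pos += dt * vel

spread = np.linalg.norm(vel - vel.mean(axis=0), axis=1).mean()
print(f"final heading spread: {spread:.4f}")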

Development of a Mathematical Air-Leakage Model from Measured Data

Description: A statistical model was developed to relate residential building shell leakage to building characteristics such as building height, floor area, floor leakage, duct leakage, and year built or the age of the house. Statistical regression techniques were used to determine which of the potential building characteristics best described the data. Seven preliminary regressions were performed to investigate the influence of each variable. The results of the eighth and last multivariable linear regression form the predictive model. The major factors that influence the tightness of a residential building are participation in an energy efficiency program (40% tighter than ordinary homes), having low-income occupants (145% leakier than ordinary), and the age of the house (a 1% increase in Normalized Leakage per year). This predictive model may be applied to data within the range of the data used to develop the model.
Date: May 1, 2006
Creator: McWilliams, Jennifer & Jung, Melanie
Partner: UNT Libraries Government Documents Department
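
A Python sketch of the multivariable regression described above, fit to synthetic homes generated so the true effects match the abstract's figures (40% tighter with program participation, 145% leakier with low-income occupants, 1% per year of age). Sample size, noise level, and baseline leakage are invented.

import numpy as np

rng = np.random.default_rng(6)
n = 500
program = rng.integers(0, 2, n)      # 1 = energy-efficiency program home
low_income = rng.integers(0, 2, n)   # 1 = low-income occupants
age = rng.uniform(0, 80, n)          # house age, years

log_nl = (np.log(0.5) + np.log(1 - 0.40) * program
          + np.log(1 + 1.45) * low_income + np.log(1.01) * age
          + rng.normal(0, 0.3, n))   # log Normalized Leakage, synthetic

X = np.column_stack([np.ones(n), program, low_income, age])
beta, *_ = np.linalg.lstsq(X, log_nl, rcond=None)
print("program effect:    %.0f%% tighter" % (100 * (1 - np.exp(beta[1]))))
print("low-income effect: %.0f%% leakier" % (100 * (np.exp(beta[2]) - 1)))
print("age effect:        %.1f%% per year" % (100 * (np.exp(beta[3]) - 1)))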

An improved statistical model for linear antenna input impedance in an electrically large cavity.

Description: This report presents a modification of a previous model for the statistical distribution of linear antenna impedance. With this modification a simple formula is determined which yields accurate results for all ratios of modal spectral width to spacing. It is shown that the reactance formula approaches the known unit Lorentzian in the lossless limit.
Date: March 1, 2005
Creator: Johnson, William Arthur; Warne, Larry Kevin; Jorgenson, Roy Eberhardt & Lee, Kelvin S. H. (ITT Industries/AES, Los Angeles, CA)
Partner: UNT Libraries Government Documents Department
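
The lossless-limit statement above can be checked numerically: a unit Lorentzian for the normalized reactance is the standard Cauchy law with density 1/(pi (1 + x^2)). A short Python check of one of its quantiles (the sampling framing is mine, not the report's):

import numpy as np
from scipy import stats

x = stats.cauchy.rvs(size=200_000, random_state=0)
print(f"P(|X| < 1) sampled: {np.mean(np.abs(x) < 1):.3f}, exact: 0.500")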

Monitoring and Evaluation of Smolt Migration in the Columbia Basin Volume VIII : Comparison of the RPA Testing Rules, Technical Report 2002.

Description: The 2000 FCRPS Biological Opinion (BO) suggested two statistical hypothesis tests to assess the RPA compliance by the years 2005 and 2008. With the decision rules proposed in the BO, Skalski and Ngouenet (2001) developed a compliance framework based on classical t-tests and used Monte-Carlo simulations to calculate power curves. Unfortunately, the two-sample t-tests proposed in the BO have only moderate-to-low probability of correctly assessing the true status of the smolt survival recovery. We have developed a superior two-phase regression statistical model for testing the RPA compliance. The two-phase regression model improves the statistical power over the standard two-sample t-tests and has a higher probability of correctly assessing the true status of the smolt survival recovery. These classical statistical power curve approaches do not incorporate prior knowledge into the decision process. Therefore, we propose to examine Bayesian methods that complement classical statistics in situations where uncertainty must be taken into account. The Bayesian analysis will incorporate scientific/biological knowledge and expertise to thoroughly assess the RPA compliance in 2005 and 2008.
Date: August 1, 2002
Creator: Skalski, John & Ngouenet, Roger
Partner: UNT Libraries Government Documents Department
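
A Python sketch of the Monte Carlo power machinery referenced above: simulate survival summaries under a range of true improvements and count how often a two-sample t-test rejects. Group sizes, variability, and effect sizes are invented; the report's two-phase regression test is not reproduced here.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def power_two_sample_t(effect, n=10, sims=2000, alpha=0.05):
    # Fraction of simulated studies in which the t-test detects the effect.
    hits = 0
    for _ in range(sims):
        base = rng.normal(0.50, 0.05, n)           # baseline survival values
        test = rng.normal(0.50 + effect, 0.05, n)  # post-RPA survival values
        hits += stats.ttest_ind(test, base).pvalue < alpha
    return hits / sims

for effect in (0.0, 0.02, 0.05, 0.08):
    print(f"true improvement {effect:.2f}: "
          f"power = {power_two_sample_t(effect):.2f}")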

Advances in Bayesian Model Based Clustering Using Particle Learning

Description: Recent work by Carvalho, Johannes, Lopes and Polson and by Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g., MCMC and EM) for Bayesian inference for a large class of dynamic models. The basis of SMC techniques involves representing the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al. was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al. point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited for the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data is arriving, allowing at any instant during the observation process direct quantification of uncertainty surrounding underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original ...
Date: November 19, 2009
Creator: Merl, D M
Partner: UNT Libraries Government Documents Department
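
A toy Python version of the resample-then-propagate pattern described above, for a deliberately simple conjugate model: a two-component normal mixture with known components and an unknown weight, so each particle's sufficient statistics are Beta counts. The Dirichlet process mixture of multivariate normals in the report is far richer; everything here is a stand-in to show the mechanics.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
true_w = 0.7
data = np.where(rng.uniform(size=500) < true_w,
                rng.normal(-2, 1, 500), rng.normal(2, 1, 500))

P = 1000
a = np.ones(P)   # per-particle Beta(a, b) sufficient statistics for the weight
b = np.ones(P)
for y in data:
    # Resample particles by the posterior predictive of the new observation...
    w = a / (a + b)
    pred = w * norm.pdf(y, -2, 1) + (1 - w) * norm.pdf(y, 2, 1)
    idx = rng.choice(P, size=P, p=pred / pred.sum())
    a, b = a[idx], b[idx]
    # ...then propagate: draw the allocation and update sufficient statistics.
    w = a / (a + b)
    p1 = w * norm.pdf(y, -2, 1)
    z = rng.uniform(size=P) < p1 / (p1 + (1 - w) * norm.pdf(y, 2, 1))
    a += z
    b += ~z

print(f"posterior mean of the weight: {(a / (a + b)).mean():.3f} "
      f"(true {true_w})")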

Topology for statistical modeling of petascale data.

Description: This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
Date: July 1, 2011
Creator: Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A&M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre et al.
Partner: UNT Libraries Government Documents Department

Stars in Photographic Emulsions Initiated by Deuterons Part II. Theoretical

Description: The theory of high energy nuclear stars depends on a theory of nuclear transparency and on a theory of nuclear evaporation. The transparency can be computed on the basis of a model proposed by R. Serber as soon as the interactions between the nucleons and the incident particle are known. The evaporation can be computed on the basis of the statistical model of the nucleus as soon as the nuclear entropy and binding energies of the evaporated particles are known. The calculations have been formulated with approximate values for the above interactions, entropies, and binding energies; and by means of various mathematical methods: a method of averages, a method of reaction integrals, and one using diffusion equations. Probability distributions have been obtained for the number of prongs per star, and distributions are being computed for the energy and angle of a prong. The results are in qualitative agreement with the observations on photographic emulsions described in Part I.
Date: January 1, 1948
Creator: Horning, Wendell & Baumhoff, L.
Partner: UNT Libraries Government Documents Department

Stars in Photographic Emulsions Initiated by Deuterons Part II. Theoretical

Description: The theory of high energy nuclear stars depends on a theory of nuclear transparency and on a theory of nuclear evaporation. The transparency can be computed on the basis of a model proposed by R. Serber as soon as the interactions between the nucleons and the incident particle are known. The evaporation can be computed on the basis of the statistical model of the nucleus as soon as the nuclear entropy and binding energies of the evaporated particles are known. With approximate values for the above interactions, entropies, and binding energies, a probability distribution has been computed for the number of prongs per star. The results are in qualitative agreement with the observations on photographic emulsions described in Part 1.
Date: September 7, 1948
Creator: Horning, W. & Baumhoff, L.
Partner: UNT Libraries Government Documents Department

ESTIMATING THE STRENGTH OF SINGLE-ENDED DISLOCATION SOURCES IN MICROMETER-SIZED SINGLE CRYSTALS

Description: A recent study indicated that the behavior of single-ended dislocation sources contributes to the flow strength of micrometer-scale crystals. In this study, 3D discrete dislocation dynamics simulations (DDS) of micrometer-sized volumes are used to calculate the effects of anisotropy of dislocation line tension (increasing Poisson's ratio, {nu}) on the strength of single-ended dislocation sources and to compare them with the strength of double-ended sources of equal length. This is done by directly modeling their plastic response within a 1 micron cubed FCC Ni single crystal using DDS. In general, double-ended sources are stronger than single-ended sources of equal length and exhibit no significant effects from truncating the long-range elastic fields at this scale. The double-ended source strength increases with Poisson ratio ({nu}), exhibiting an increase of about 50% at {nu} = 0.38 (the value for Ni) as compared to the value at {nu} = 0. Independent of dislocation line direction, for {nu} greater than 0.20, the strengths of single-ended sources depend upon the sense of the stress applied. The value for {alpha} in the expression for strength, {tau} = {alpha}(L){micro}b/L, is shown to vary from 0.4 to 0.84 depending upon the character of the dislocation and the direction of operation of the source, at {nu} = 0.38 (the value for Ni) and a length of 933b. By varying the lengths of the sources from 933b to 233b, it was shown that the strengths of both single-ended and double-ended sources scale with length as ln(L/b)/(L/b). Surface image stresses are shown to have little effect on the critical stress of single-ended sources at lengths of {approx}250b or greater. The relationship between these findings and a recent statistical model for the hardening of small volumes is also discussed.
Date: May 3, 2007
Creator: Rao, S I; Dimiduk, D M; Tang, M; Parthasarathy, T A; Uchic, M D & Woodward, C
Partner: UNT Libraries Government Documents Department
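
The quoted strength expression is easy to evaluate; the Python sketch below uses the abstract's {alpha} range and source lengths, with representative Ni values for {micro} and b that are my assumptions, and prints the ln(L/b)/(L/b) scaling ratio.

import numpy as np

mu = 76e9       # shear modulus of Ni, Pa (assumed representative value)
b = 2.49e-10    # Burgers vector of Ni, m (assumed)

def source_strength(L_over_b, alpha):
    # tau = alpha(L) * mu * b / L, the form quoted in the abstract.
    return alpha * mu / L_over_b

for L_over_b in (233, 933):
    for alpha in (0.4, 0.84):
        tau = source_strength(L_over_b, alpha) / 1e6
        print(f"L = {L_over_b:3d}b, alpha = {alpha:.2f}: tau = {tau:6.1f} MPa")

L = np.array([233.0, 933.0])
print("ln(L/b)/(L/b) ratio, 233b vs 933b:",
      round((np.log(L[0]) / L[0]) / (np.log(L[1]) / L[1]), 2))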

Survival Estimates for the Passage of Spring-Migrating Juvenile Salmonids through Snake and Columbia River Dams and Reservoirs, 2008.

Description: In 2008, the National Marine Fisheries Service completed the sixteenth year of a study to estimate survival and travel time of juvenile salmonids Oncorhynchus spp. passing through dams and reservoirs on the Snake and Columbia Rivers. All estimates were derived from detections of fish tagged with passive integrated transponder (PIT) tags. We PIT tagged and released a total of 18,565 hatchery steelhead O. mykiss, 15,991 wild steelhead, and 9,714 wild yearling Chinook salmon O. tshawytscha at Lower Granite Dam in the Snake River. In addition, we utilized fish PIT tagged by other agencies at traps and hatcheries upstream from the hydropower system and at sites within the hydropower system in both the Snake and Columbia Rivers. These included 122,061 yearling Chinook salmon tagged at Lower Granite Dam for evaluation of latent mortality related to passage through Snake River dams. PIT-tagged smolts were detected at interrogation facilities at Lower Granite, Little Goose, Lower Monumental, Ice Harbor, McNary, John Day, and Bonneville Dams and in the PIT-tag detector trawl operated in the Columbia River estuary. Survival estimates were calculated using a statistical model for tag-recapture data from single release groups (the single-release model). Primary research objectives in 2008 were to: (1) estimate reach survival and travel time in the Snake and Columbia Rivers throughout the migration period of yearling Chinook salmon and steelhead, (2) evaluate relationships between survival estimates and migration conditions, and (3) evaluate the survival estimation models under prevailing conditions. This report provides reach survival and travel time estimates for 2008 for PIT-tagged yearling Chinook salmon (hatchery and wild), hatchery sockeye salmon O. nerka, hatchery coho salmon O. kisutch, and steelhead (hatchery and wild) in the Snake and Columbia Rivers. Additional details on the methodology and statistical models used are provided in previous reports cited here. Survival and detection probabilities ...
Date: June 23, 2009
Creator: Faulkner, James R.; Smith, Steven G. & Muir, William D.
Partner: UNT Libraries Government Documents Department
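
A stripped-down Python illustration of the single-release idea used above: detections at a downstream site separate survival from detection probability at the upstream dam, while the final reach remains a confounded survival-detection product. One reach and made-up parameters; the report's model covers many reaches and travel times.

import numpy as np

rng = np.random.default_rng(9)
N = 20000                # PIT-tagged smolts released
phi1, p1 = 0.90, 0.45    # survival to dam 1; detection probability there
lam2 = 0.40              # downstream survival-and-detection (confounded)

alive1 = rng.uniform(size=N) < phi1
det1 = alive1 & (rng.uniform(size=N) < p1)
det2 = alive1 & (rng.uniform(size=N) < lam2)

n11 = np.sum(det1 & det2)    # detected at dam 1 and downstream
n10 = np.sum(det1 & ~det2)   # detected at dam 1 only
n01 = np.sum(~det1 & det2)   # missed at dam 1, detected downstream

p1_hat = n11 / (n11 + n01)
phi1_hat = (n11 + n10) / (N * p1_hat)
print(f"estimated detection at dam 1: {p1_hat:.3f} (true {p1})")
print(f"estimated survival to dam 1:  {phi1_hat:.3f} (true {phi1})")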

Updated Evaluations for Americium Isotopes

Description: Here we describe evaluations for Am isotopes that will be included in the next release of ENDL. Current ENDL99 evaluations for these isotopes are quite outdated and almost entirely undocumented. Because Am is important for several DNT applications, and because quality evaluations are either readily available or easily calculated, the effort to update ENDL seems warranted. Results from good existing evaluations are adopted whenever possible. To this end we devote the next section of this report to a consideration of the availability of evaluations. The quality of different evaluations, as well as comparisons against experiments, is also presented and used to motivate our choice of adopted data sets. Plans for modifying and improving adopted evaluations are also discussed. For {sup 240}Am there are no existing evaluations. To fill this gap, we are providing a new {sup 240}Am evaluation based on calculations with the statistical model reaction codes TALYS and EMPIRE. This evaluation is described below. The ENDF/B-VI formatted file containing this evaluation is given in the appendix.
Date: September 22, 2005
Creator: Brown, D A & Pruet, J
Partner: UNT Libraries Government Documents Department

USER 2.1; User Specified Estimation Routine, Technical Manual 2003.

Description: This document is primarily a description of the user interface for USER2.1; it is not a description of the statistical theory and calculations behind USER. This project is funded by the Bonneville Power Administration, U.S. Department of Energy, under Contract No. 004126, Project No. 198910700 as part of the BPA's program to protect, mitigate, and enhance fish and wildlife affected by the development and operation of hydroelectric facilities on the Columbia River and its tributaries. The analysis of fish and wildlife data requires investigators to have the ability to develop statistical models tailored to their study requirements. Hence, a flexible platform to develop statistical likelihood models to estimate demographic parameters is necessary. To this end, Program USER (User Specified Estimation Routine) was developed to provide a convenient platform for investigators to develop statistical models and analyze tagging and count data. The program is capable of developing models and analyzing any count data that can be described by multinomial or product multinomial distributions. Such data include release-recapture studies using PIT-tags, radio-tags, balloon-tags, and acoustic-tags to estimate survival, movement, and demographic data on the age and/or sex structure of wild populations. The user of the program can specify the parameters and model structure at will to tailor the analyses to the specific requirements of the field sampling program, the data, and populations under investigation. All of this is available without the need for the user to know advanced programming languages or numerical analysis techniques, and without involving cumbersome software developed for extraneous purposes. Program USER represents a powerful statistical modeling routine that can be readily used by investigators with a wide range of interests and quantitative skills.
Date: July 1, 2003
Creator: Lady, James; Westhagen, Peter & Skalski, John
Partner: UNT Libraries Government Documents Department
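
A Python sketch of the kind of user-specified multinomial likelihood USER is built around: detection-history cells whose probabilities are functions of survival S, detection p, and a confounded final product lam, fit by maximum likelihood. The cell structure and counts are hypothetical.

import numpy as np
from scipy.optimize import minimize

def cell_probs(theta):
    S, p, lam = theta
    return np.array([S * p * lam,          # detected at site 1 and later
                     S * p * (1 - lam),    # detected at site 1 only
                     S * (1 - p) * lam,    # detected later only
                     1 - S * p - S * (1 - p) * lam])  # never detected

def nll(theta, counts):
    if np.any(np.asarray(theta) <= 0) or np.any(np.asarray(theta) >= 1):
        return np.inf
    return -np.sum(counts * np.log(cell_probs(theta)))

counts = np.array([3240, 4860, 3960, 7940])   # hypothetical tag histories
res = minimize(nll, x0=[0.5, 0.5, 0.5], args=(counts,), method="Nelder-Mead")
S, p, lam = res.x
print(f"S = {S:.3f}, p = {p:.3f}, lam = {lam:.3f}")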

Characterization of Floodflows Along the Arkansas River Without Regulation by Pueblo Reservoir, Portland to John Martin Reservoir, Southeastern Colorado

Description: This report describes "a method for estimating flow characteristics of flood hydrographs between Portland, Colorado, and John Martin Reservoir." It contains maps, graphs, and tables.
Date: 1980
Creator: Little, John R. & Bauer, Daniel P.
Partner: UNT Libraries Government Documents Department

Modeling and Measurement Constraints in Fault Diagnostics for HVAC Systems

Description: Many studies have shown that energy savings of five to fifteen percent are achievable in commercial buildings by detecting and correcting building faults, and optimizing building control systems. However, in spite of good progress in developing tools for determining HVAC diagnostics, methods to detect faults in HVAC systems are still generally undeveloped. Most approaches use numerical filtering or parameter estimation methods to compare data from energy meters and building sensors to predictions from mathematical or statistical models. They are effective when models are relatively accurate and data contain few errors. In this paper, we address the case where models are imperfect and data are variable, uncertain, and can contain error. We apply a Bayesian updating approach that is systematic in managing and accounting for most forms of model and data errors. The proposed method uses both knowledge of first principle modeling and empirical results to analyze the system performance within the boundaries defined by practical constraints. We demonstrate the approach by detecting faults in commercial building air handling units. We find that the limitations that exist in air handling unit diagnostics due to practical constraints can generally be effectively addressed through the proposed approach.
Date: May 30, 2010
Creator: Najafi, Massieh; Auslander, David M.; Bartlett, Peter L.; Haves, Philip & Sohn, Michael D.
Partner: UNT Libraries Government Documents Department
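
A toy Python version of the Bayesian updating idea described above: an imperfect first-principles prediction and noisy sensor data are combined, with model error and sensor error both widening the likelihood, to update the probability of a fault hypothesis. All numbers (prediction, error scales, fault offset, prior) are assumptions.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)
model_pred = 13.0     # deg C, first-principles supply-air prediction (assumed)
model_sd = 1.0        # acknowledged model error (assumed)
sensor_sd = 0.5       # sensor noise (assumed)
fault_offset = 3.0    # temperature rise under the fault hypothesis (assumed)

obs = 16.0 + sensor_sd * rng.normal(size=24)   # a day of hourly readings
sd = np.hypot(model_sd, sensor_sd)             # combined model + data error
log_post = np.log(np.array([0.9, 0.1]))        # prior: faults are uncommon
for y in obs:
    log_post += np.array([norm.logpdf(y, model_pred, sd),
                          norm.logpdf(y, model_pred + fault_offset, sd)])
    log_post -= np.logaddexp(*log_post)        # renormalize
print(f"posterior probability of fault: {np.exp(log_post[1]):.3f}")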

Robust real-time change detection in high jitter.

Description: A new method is introduced for real-time detection of transient change in scenes observed by staring sensors that are subject to platform jitter, pixel defects, variable focus, and other real-world challenges. The approach uses flexible statistical models for the scene background and its variability, which are continually updated to track gradual drift in the sensor's performance and the scene under observation. Two separate models represent temporal and spatial variations in pixel intensity. For the temporal model, each new frame is projected into a low-dimensional subspace designed to capture the behavior of the frame data over a recent observation window. Per-pixel temporal standard deviation estimates are based on projection residuals. The second model employs a simple representation of jitter to generate pixelwise moment estimates from a single frame. These estimates rely on spatial characteristics of the scene and are used to gauge each pixel's susceptibility to jitter. The temporal model handles pixels that are naturally variable due to sensor noise or moving scene elements, along with jitter displacements comparable to those observed in the recent past. The spatial model captures jitter-induced changes that may not have been seen previously. Change is declared in pixels whose current values are inconsistent with both models.
Date: August 1, 2009
Creator: Simonson, Katherine Mary & Ma, Tian J.
Partner: UNT Libraries Government Documents Department
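
A compact Python sketch of the temporal model described above: recent frames define a low-dimensional subspace, per-pixel residual standard deviations are estimated from projection residuals, and a new frame's residuals are thresholded. Frame size, subspace rank, and the threshold are invented.

import numpy as np

rng = np.random.default_rng(11)
h = w = 32
scene = rng.uniform(0, 1, size=h * w)
frames = scene + 0.02 * rng.normal(size=(50, h * w))   # recent window

mean = frames.mean(axis=0)
_, _, Vt = np.linalg.svd(frames - mean, full_matrices=False)
basis = Vt[:5]                                         # top-5 subspace
resid_hist = (frames - mean) - ((frames - mean) @ basis.T) @ basis
pix_sd = resid_hist.std(axis=0) + 1e-6                 # per-pixel residual std

new = scene + 0.02 * rng.normal(size=h * w)
new[100] += 0.5                                        # transient change
r = (new - mean) - basis.T @ (basis @ (new - mean))
print("pixels flagged as changed:", np.where(np.abs(r) > 6 * pix_sd)[0])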

Statistical validation of physical system models

Description: It is common practice in applied mechanics to develop mathematical models for mechanical system behavior. Frequently, the actual physical system being modeled is also available for testing, and sometimes the test data are used to help identify the parameters of the mathematical model. However, no general-purpose technique exists for formally, statistically judging the quality of a model. This paper suggests a formal statistical procedure for the validation of mathematical models of physical systems when data taken during operation of the physical system are available. The statistical validation procedure is based on the bootstrap, and it seeks to build a framework where a statistical test of hypothesis can be run to determine whether or not a mathematical model is an acceptable model of a physical system with regard to user-specified measures of system behavior. The approach to model validation developed in this study uses experimental data to estimate the marginal and joint confidence intervals of statistics of interest of the physical system. These same measures of behavior are estimated for the mathematical model. The statistics of interest from the mathematical model are located relative to the confidence intervals for the statistics obtained from the experimental data. These relative locations are used to judge the accuracy of the mathematical model. A numerical example is presented to demonstrate the application of the technique.
Date: October 1, 1996
Creator: Paez, T. L.; Barney, P.; Hunter, N. F.; Ferregut, C. & Perez, L.E.
Partner: UNT Libraries Government Documents Department
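
A minimal Python sketch of the bootstrap step in the procedure above: resample the measured response data to get a confidence interval for a statistic of interest (RMS response here), then see whether the model's value falls inside it. The data, statistic, and pass/fail framing are simplified stand-ins for the paper's joint-interval procedure.

import numpy as np

rng = np.random.default_rng(12)
measured = rng.normal(0.0, 1.05, size=200)   # stand-in for test data
model_rms = 1.00                             # model-predicted RMS response

boot = np.array([
    np.sqrt(np.mean(rng.choice(measured, size=measured.size) ** 2))
    for _ in range(5000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI for measured RMS: ({lo:.3f}, {hi:.3f})")
verdict = "consistent with" if lo <= model_rms <= hi else "inconsistent with"
print(f"model RMS {model_rms:.2f} is {verdict} the experimental data")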