Search Results

Computing the correlation and other things directly from the raw pairs

Description: We want a faster and more robust way to compute the correlation, expanded in spherical (or Cartesian) harmonics. We also want to include the cross-(ℓ,m) data covariances that are present but currently ignored. We don't want to get bogged down in fancy binning in x-y-z or r-θ-φ, just r. We want to look at the C_{ℓm} to decide how many terms to keep, or better yet at the pair distributions directly.
Date: August 16, 2007
Creator: Brown, D A
Partner: UNT Libraries Government Documents Department
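
A minimal sketch of the kind of direct estimator the entry above describes: accumulate spherical-harmonic moments of the raw pair separation vectors in radial bins. The function name, the binning scheme, and the normalization are illustrative assumptions, not code from the report.

    import numpy as np
    from scipy.special import sph_harm

    def pair_clm(separations, n_rbins, r_max, ell_max):
        """Accumulate spherical-harmonic moments of raw pair separations.

        separations : (N, 3) array of pair separation vectors (dx, dy, dz).
        Returns C[ell, m + ell_max, rbin], a complex array of binned moments.
        """
        dx, dy, dz = np.asarray(separations, dtype=float).T
        r = np.sqrt(dx**2 + dy**2 + dz**2)
        theta = np.arccos(np.clip(dz / r, -1.0, 1.0))   # polar angle
        phi = np.arctan2(dy, dx) % (2 * np.pi)          # azimuthal angle
        rbin = np.minimum((r / r_max * n_rbins).astype(int), n_rbins - 1)

        C = np.zeros((ell_max + 1, 2 * ell_max + 1, n_rbins), dtype=complex)
        for ell in range(ell_max + 1):
            for m in range(-ell, ell + 1):
                # SciPy's sph_harm takes (m, ell, azimuthal angle, polar angle)
                y = np.conj(sph_harm(m, ell, phi, theta))
                np.add.at(C[ell, m + ell_max], rbin, y)
        return C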

Uncertainty Propagation in Calibration of Parallel Kinematic Machines

Description: Over the last decade, multi-axis machine tools and robots based on parallel kinematic mechanisms (PKMs) have been developed and marketed worldwide. Positional accuracy in these machines is controlled by accurate knowledge of the kinematic parameters, which consist of the joint center locations and distances between joint pairs. Since these machines tend to be rather large in size, the kinematic parameters (joint center locations and initial strut lengths) are difficult to determine when these machines are in their fully assembled state. Work recently completed by the University of Florida and Sandia National Laboratories has yielded a method for determining all of the kinematic parameters of an assembled parallel kinematic device. This paper contains a brief synopsis of the calibration method created, an error budget, an uncertainty analysis for the recovered kinematic parameters, and the propagation of these uncertainties to the tool tip.
Date: November 2, 1999
Creator: Jokiel Jr., Bernhard & Ziergert, John C.
Partner: UNT Libraries Government Documents Department
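
The paper's specific error budget is not reproduced here, but the first-order (Jacobian) propagation of parameter covariance to the tool tip that the entry above alludes to can be sketched as follows; the forward-kinematics function and all numbers are placeholders, not the calibration model from the paper.

    import numpy as np

    def tool_tip(params):
        # Placeholder forward kinematics: maps kinematic parameters
        # (joint centers, strut lengths, ...) to a tool-tip position.
        x, y, z = params[0], params[1], params[2]
        return np.array([x + 0.1 * z, y - 0.05 * z, z])

    def propagate(params, cov_params, h=1e-6):
        """First-order propagation: cov_tip = J cov_params J^T."""
        p0 = np.asarray(params, dtype=float)
        f0 = tool_tip(p0)
        J = np.zeros((f0.size, p0.size))
        for i in range(p0.size):
            dp = np.zeros_like(p0)
            dp[i] = h
            J[:, i] = (tool_tip(p0 + dp) - f0) / h   # finite-difference Jacobian column
        return J @ cov_params @ J.T

    cov_params = np.diag([1e-4, 1e-4, 2.5e-5])       # assumed parameter variances (mm^2)
    cov_tip = propagate([100.0, 200.0, 300.0], cov_params)
    print(np.sqrt(np.diag(cov_tip)))                  # 1-sigma tool-tip uncertainties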

Analytical chemistry measurement assurance programs: More than just measurement control programs

Description: Assurance of measurement accuracy and precision is required and/or recommended by regulations and guides for good laboratory practices for analytical chemistry laboratories. Measurement Control Programs (MCPs) and/or Measurement Assurance Programs (MAPs) are means for determining and controlling the accuracy and precision of a laboratory's measurements. Regulations and guides often allow for interpretation of what is necessary to assure measurement quality and how it is done. Consequently, a great diversity exists between laboratories' measurement quality control programs. This paper will describe various levels of measurement control (MC) and the differences between a comprehensive MAP and various levels of MCPs. It will explain the benefits of establishing a comprehensive MAP based on a set of basic principles. MCPs range from go/no-go testing of a check standard's measurement against control limits to a comprehensive MAP. Features of the latter include: an independent verisimilitude (matrix matched) standard having known uncertainties; customer tolerance limits as well as control limits; statistical tests for bias and precision testing; and estimating the total measurement process uncertainty based upon the combination of both the measurement system and standard's uncertainties. A commercial measurement assurance program (JTIPMAP™) was evaluated by the author's laboratories and compared to locally developed as well as other commercial software packages. Results of the evaluation, comparisons, conclusions and recommendations are presented.
Date: January 1, 1997
Creator: Clark, J.P. & Shull, A.H.
Partner: UNT Libraries Government Documents Department
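
As a hedged illustration of the last MAP feature listed in the entry above (combining measurement-system and standard uncertainties into a total process uncertainty), a simple root-sum-of-squares combination is sketched below; the values and the quadrature rule are illustrative assumptions, not the JTIPMAP algorithm.

    import math

    def total_uncertainty(u_measurement, u_standard):
        """Root-sum-of-squares combination of independent uncertainty components."""
        return math.sqrt(u_measurement**2 + u_standard**2)

    u_meas = 0.25   # relative standard uncertainty of the measurement system (%)
    u_std = 0.10    # stated relative uncertainty of the verisimilitude standard (%)
    print(total_uncertainty(u_meas, u_std))   # total measurement process uncertainty (%)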

Study of the uncertainty of the gluon distribution

Description: The uncertainty in the calculation of many important new processes at the Tevatron and LHC is dominated by that concerning the gluon distribution function. We investigate the uncertainty in the gluon distribution of the proton by systematically varying the gluon parameters in the global QCD analysis of parton distributions. The results depend critically on the parton momentum fraction x and the QCD scale Q². The uncertainties are presented for integrated gluon-gluon and gluon-quark luminosities for both the Tevatron and LHC as a function of √τ = √(x₁x₂) = √(ŝ/s), the most relevant quantity for new particle production. The uncertainties are reasonably small, except for large x.
Date: July 1, 1998
Creator: Huston, J., FERMI
Partner: UNT Libraries Government Documents Department

Risk communication: Uncertainties and the numbers game

Description: The science of risk assessment seeks to characterize the potential risk in situations that may pose hazards to human health or the environment. However, the conclusions reached by the scientists and engineers are not an end in themselves - they are passed on to the involved companies, government agencies, legislators, and the public. All interested parties must then decide what to do with the information. Risk communication is a type of technical communication that involves some unique challenges. This paper first defines the relationships between risk assessment, risk management, and risk communication and then explores two issues in risk communication: addressing uncertainty and putting risk numbers into perspective.
Date: August 30, 1995
Creator: Ortigara, M.
Partner: UNT Libraries Government Documents Department

ISO/GUM Uncertainties and CIAAW (Uncertainty Treatment for Recommended Atomic Weights and Isotopic Abundances)

Description: The International Organization for Standardization (ISO) has published a Guide to the Expression of Uncertainty in Measurement (GUM). The IUPAC Commission on Isotopic Abundances and Atomic Weights (CIAAW) began attaching uncertainty limits to their recommended values about forty years ago. CIAAW's method for determining and assigning uncertainties has evolved over time. We trace this evolution to their present method and their effort to incorporate the basic ISO/GUM procedures into evaluations of these uncertainties. We discuss some dilemmas the CIAAW faces in their present method and whether it is consistent with the application of the ISO/GUM rules. We discuss the attempt to incorporate variations in measured isotope ratios, due to natural fractionation, into the ISO/GUM system. We make some observations about the inconsistent treatment of natural variations in recommended data and uncertainties. A recommendation for expressing atomic weight values using a tabulated range of values for various chemical elements is discussed.
Date: July 23, 2007
Creator: Holden, N. E.
Partner: UNT Libraries Government Documents Department
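
For readers unfamiliar with the ISO/GUM machinery the entry above refers to, a minimal sketch of a GUM-style combined and expanded uncertainty calculation follows; the measurement model, sensitivity coefficients, and input uncertainties are invented for illustration and are not CIAAW data.

    import math

    def combined_uncertainty(sensitivities, input_uncertainties):
        """GUM law of propagation for uncorrelated inputs:
        u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2."""
        return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, input_uncertainties)))

    u_c = combined_uncertainty([1.0, 0.5], [0.002, 0.004])
    U = 2.0 * u_c   # expanded uncertainty with coverage factor k = 2
    print(u_c, U)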

Using the Mount Pinatubo Volcanic Eruption to Determine Climate Sensitivity: Comments on "Climate Forcing by the Volcanic Eruption of Mount Pinatubo" by David H. Douglass and Robert S. Knox

Description: Douglass and Knox [2005], hereafter referred to as DK, present an analysis of the observed cooling following the 1991 Mt. Pinatubo eruption and claim that these data imply a very low value for the climate sensitivity (equivalent to 0.6 °C equilibrium warming for a CO₂ doubling). We show here that their analysis is flawed and their results are incorrect.
Date: April 22, 2005
Creator: Wigley, T L; Ammann, C M; Santer, B D & Taylor, K E
Partner: UNT Libraries Government Documents Department

Constraints on parton density functions from D0

Description: Five recent results from D0 which either impact or have the potential to impact uncertainties in parton density functions are presented. Many analyses at D0 are sensitive to the modeling of the partonic structure of the proton. When theoretical and experimental uncertainties are well controlled, there exists the possibility for additional constraints on parton density functions (PDF). Five measurements are presented which either have already been included in global parton fits or have the potential to contribute in the future.
Date: April 1, 2008
Creator: Hays, Jonathan M. (Imperial Coll., London)
Partner: UNT Libraries Government Documents Department

Sorting out Q values, threshold energies and level excitations in ENDL and ENDF

Description: While attempting to convert data to/from ENDF and ENDL, I discovered inconsistencies in the treatment of Q values in ENDF. These are my notes documenting these inconsistencies. The most interesting section is the last section, where I compare Q values in JENDL-3.3 and ENDF/B-VII β1 for various Americium isotopes.
Date: November 2, 2005
Creator: Brown, D A
Partner: UNT Libraries Government Documents Department

Uncertainty quantification in reacting flow modeling.

Description: Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
Date: October 1, 2003
Creator: Le Maître, Olivier P. (Université d'Evry Val d'Essonne, Evry, France); Reagan, Matthew T.; Knio, Omar M. (Johns Hopkins University, Baltimore, MD); Ghanem, Roger Georges (Johns Hopkins University, Baltimore, MD) & Najm, Habib N.
Partner: UNT Libraries Government Documents Department
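
A compact sketch of the non-intrusive spectral projection idea summarized in the entry above, using a scalar toy model and Gauss-Hermite quadrature; the model function, truncation order, and quadrature level are illustrative assumptions, not the reacting-flow equations from the report.

    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial, sqrt, pi

    def model(xi):
        # Toy deterministic model evaluated at a sample of the uncertain input;
        # stands in for a full reacting-flow solve.
        return np.exp(0.3 * xi)

    order = 4                                   # PC truncation order
    nodes, weights = He.hermegauss(order + 5)   # Gauss-Hermite (probabilists') quadrature
    weights = weights / sqrt(2.0 * pi)          # normalize to the standard normal density

    # Project sampled model outputs onto probabilists' Hermite polynomials He_k:
    # c_k = E[model(xi) He_k(xi)] / E[He_k(xi)^2], with E[He_k^2] = k!.
    coeffs = []
    for k in range(order + 1):
        basis = He.hermeval(nodes, [0] * k + [1])   # He_k evaluated at the quadrature nodes
        ck = np.sum(weights * model(nodes) * basis) / factorial(k)
        coeffs.append(ck)

    mean = coeffs[0]
    variance = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
    print(mean, variance)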

Confidence Calculation with AMV+

Description: The iterative advanced mean value algorithm (AMV+), introduced nearly ten years ago, is now widely used as a cost-effective probabilistic structural analysis tool when the use of sampling methods is cost prohibitive (Wu et al., 1990). The need to establish confidence bounds on calculated probabilities arises because of the presence of uncertainties in measured means and variances of input random variables. In this paper an algorithm is proposed that makes use of the AMV+ procedure and analytically derived probability sensitivities to determine confidence bounds on calculated probabilities.
Date: February 19, 1999
Creator: Fossum, A.F.
Partner: UNT Libraries Government Documents Department

Progress Report for UNLV High Pressure Science and Engineering Center

Description: In this report we present results of an in-depth analysis of the SP error densities for 29 satellites. These satellites were divided into three groups: Low Earth Orbit (LEO), Near Circular Orbit (NCO), and Highly Eccentric Orbit (HEO). Included in the first group were those satellites with eccentricities of less than 0.2 and perigees below 450 km. The second group included satellites in near circular orbits (eccentricities of less than 0.015) and perigees from 700 km to 1500 km. The third group consisted of those satellites that were in highly eccentric orbits, namely those with eccentricities greater than 0.2. These satellites have perigees far into the thermosphere. Table 1 contains a summary of the orbit characteristics for the 29 satellites. In our study we attempted to unravel and elucidate the networks of relationships above. The satellite groupings and the report are organized in a way that reflects these efforts. We begin in Section 2 with a summary of the methods used in our analysis. One objective in this study was to establish a baseline for future work in satellite orbit propagators. Section 2 contains descriptions of the SP, truth orbits, and the satellite observation data used to establish this baseline. In the report we show how satellite error densities evolve in time up to thirty-six hours. We present error profiles, error histograms, rms errors and 95/9970 confidence limits for the along-track, cross-track, and radial axes of motion for satellites in each of the three groupings. We present results of a regression analysis that establishes a physical model of the error densities. We also link the errors in the various regimes to the quality and quantity of the observational data.
Date: November 20, 1998
Creator: Mailhiot, C.; Pepper, D.; Lindle, D. & Nicol, M.
Partner: UNT Libraries Government Documents Department

Aspects of the quality of data from SGP cart site broadband radiation sensors

Description: This report presents details of the performance of broadband radiometers at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, in order to estimate the uncertainties of irradiance observations. Net radiation is observed with a net radiometer at the energy balance Bowen ratio station at the central facility and compared with the net radiation computed as the sum of component irradiances recorded by nearby pyranometers and pyrgeometers. The paper also examines the uncertainties of readings from net radiometers, which are known to be substantial.
Date: June 1, 1995
Creator: Splitt, M.E. & Wesely, M.L.
Partner: UNT Libraries Government Documents Department
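
A minimal sketch of the component-sum comparison described in the entry above: net radiation reconstructed from pyranometer and pyrgeometer irradiances and compared against the net radiometer reading. The variable names and all values are illustrative, not the SGP processing code.

    def net_from_components(sw_down, sw_up, lw_down, lw_up):
        """Net radiation (W/m^2) as the sum of component irradiances:
        (downwelling - upwelling shortwave) + (downwelling - upwelling longwave)."""
        return (sw_down - sw_up) + (lw_down - lw_up)

    # Example comparison against a net radiometer reading (values in W/m^2, invented).
    rn_components = net_from_components(650.0, 130.0, 340.0, 420.0)
    rn_net_radiometer = 455.0
    print(rn_components, rn_components - rn_net_radiometer)   # residual flags sensor disagreement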

Applicability of International and DOE Target Values to ALD Destructive Measurement Applications

Description: International Target Values (ITVs) and target value applicability are a function of the nuclear material processing campaign or application for which the accountability measurement method is being applied. Safeguarding significant quantities of nuclear-grade materials requires that accountability measurements be as accurate, precise, and representative as practically possible. In general, the ITV provides a benchmark for determining generic acceptability of the performance of the various accountability measurement methods, since it represents a performance level that is accepted as highly reliable. There are cases where it is acceptable to select alternative accountability methods not specifically referenced by the ITVs, or to use the recognized measurement method even though the uncertainties are greater than the target values.
Date: December 19, 2002
Creator: Holland, M.K.
Partner: UNT Libraries Government Documents Department

ORIGEN-S Decay Data Library and Half-Life Uncertainties

Description: The results of an extensive update of the decay data of the ORIGEN-S library are presented in this report. The updated decay data were provided for both the ORIGEN-S and ORIGEN2 libraries in the same project. A complete edit of the decay data plus the available half-life uncertainties are included in Appendix A. A detailed description of the types of data contained in the library, the format of the library, and the data sources are also presented. Approximately 24% of the library nuclides are stable, 66% were updated from ENDF/B-VI, about 8% were updated from ENSDF, and the remaining 2% were not updated. Appendix B presents a listing of percentage changes in decay heat from the old to the updated library for all nuclides containing a difference exceeding 1% in any parameter.
Date: January 1, 1998
Creator: Hermann, O.W.
Partner: UNT Libraries Government Documents Department

Spent Fuel Dissolution Rates as a Function of Burnup and Water Chemistry

Description: Several months ago, a report called PNNL-11895, ''Spent Fuel Dissolution Rates as a Function of Burnup and Water Chemistry'', by W. J. Gray dated June 1998 was mailed out. Unfortunately, an error was discovered in this document. The technetium (Tc) data in Figures 5 to 8, pages 21 to 24, are incorrect. Replacement figures, which show the corrected Tc data, are presented. No other data in the report was affected by this error.
Date: March 5, 1999
Creator: Gray, W.J.
Partner: UNT Libraries Government Documents Department

Certification testing at the National Wind Technology Center

Description: The International Electrotechnical Commission is developing a new standard that defines power performance measurement techniques. The standard will provide the basis for international recognition of a wind turbine's performance primarily for certification, but also for qualification for tax and investment incentives, and for contracts. According to the standard, the power performance characteristics are defined by a measured power curve and by projections of annual energy production for a range of wind conditions. The National Wind Technology Center (NWTC) has adopted these power performance measurement techniques. This paper reviews the results of the NWTC's first test conducted under the new protocol on the Atlantic Orient Corporation's AOC 15/50 wind turbine at the NWTC. The test required collecting sufficient data to establish a statistically significant database over a range of wind speeds and conditions. From the data, the power curve was calculated. Then the results from a site calibration procedure determined the flow distortion between winds measured at the turbine location and those measured at the meteorological tower. Finally, this paper discusses the uncertainty analysis that was performed in accordance with the standard. Use of these procedures resulted in the definition of the AOC 15/50's power curve within about 3 kW.
Date: November 1, 1996
Creator: Huskey, A. & Link, H.
Partner: UNT Libraries Government Documents Department
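
A hedged sketch of the method-of-bins power curve and annual-energy estimate that measurements like those in the entry above typically feed; the binning choices, the Rayleigh wind model, and the sample data are illustrative assumptions, not the NWTC's certified procedure.

    import numpy as np

    def binned_power_curve(wind_speed, power, bin_width=0.5):
        """Method of bins: average measured power in wind-speed bins."""
        edges = np.arange(0.0, wind_speed.max() + bin_width, bin_width)
        centers = 0.5 * (edges[:-1] + edges[1:])
        idx = np.digitize(wind_speed, edges) - 1
        mean_power = np.array([power[idx == i].mean() if np.any(idx == i) else np.nan
                               for i in range(len(centers))])
        return centers, mean_power

    def annual_energy(centers, mean_power, mean_wind=6.0, hours=8760.0):
        """AEP under a Rayleigh wind-speed distribution with the given annual mean speed."""
        scale = mean_wind * np.sqrt(2.0 / np.pi)
        cdf = 1.0 - np.exp(-(centers / scale) ** 2 / 2.0)
        prob = np.diff(cdf, prepend=0.0)                  # probability of each wind-speed bin
        ok = ~np.isnan(mean_power)
        return hours * np.sum(prob[ok] * mean_power[ok])  # kWh if power is in kW

    # Invented ten-minute averages: wind speed (m/s) and turbine power (kW).
    ws = np.array([4.1, 5.2, 6.3, 7.4, 8.6, 9.8, 11.0, 12.1])
    p = np.array([2.0, 8.0, 15.0, 24.0, 33.0, 41.0, 47.0, 50.0])
    centers, curve = binned_power_curve(ws, p)
    print(annual_energy(centers, curve))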

Towards a formal taxonomy of hybrid uncertainty representations

Description: Recent years have seen a proliferation of methods in addition to probability theory to represent information and uncertainty, including fuzzy sets and systems, fuzzy measures, rough sets, random sets, possibility distributions, imprecise probabilities, etc. We can identify these fields collectively as General Information Theory (GIT). The components of GIT represent information according to different axiomatic bases, and are thus capable of capturing different semantic aspects of uncertainty. Traditionally, these semantic criteria include such categories as fuzziness, vagueness, nonspecificity, conflict, and randomness. So it is clear that there is a pressing need for the GIT community to synthesize these methods, searching out larger formal frameworks within which to place these various components with respect to each other. Ideally, syntactic (mathematical) generalization can both aid and be aided by the semantic analysis available in terms of the conceptual categories outlined above. In this paper we present some preliminary ideas about how to formally relate various uncertainty representations together in a taxonomic lattice, capturing both syntactic and semantic generalization. Some partial and provisional results are shown. Assume a simple finite universe of discourse Ω = {a, b, c}. We want to describe a situation in which we ask a question of the sort "what is the value of a variable x which takes values in Ω?". When there is no uncertainty, we have a single alternative, say x = a. In logical terms, we would say that the proposition p: "the value of x is a" is TRUE. Our approach begins with two primitive concepts which can change our knowledge of x, each of which represents a different form of uncertainty: nonspecificity and fuzziness.
Date: February 1, 1997
Creator: Joslyn, C. & Rocha, L.
Partner: UNT Libraries Government Documents Department
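
To make the closing example in the entry above concrete, here is a tiny illustration, offered as a reader's paraphrase rather than the authors' formalism, of how knowledge about x over Ω = {a, b, c} might be encoded when it is certain, nonspecific, or fuzzy.

    # Universe of discourse for the variable x.
    omega = ("a", "b", "c")

    # Certain knowledge: a single alternative, x = a.
    certain = {"a"}

    # Nonspecific knowledge: x lies somewhere in a set of alternatives,
    # with no preference among them.
    nonspecific = {"a", "b"}

    # Fuzzy (graded) knowledge: each alternative has a degree of membership
    # or possibility in [0, 1].
    fuzzy = {"a": 1.0, "b": 0.6, "c": 0.1}

    def possible(value, knowledge):
        """Degree to which a value is possible under the given representation."""
        if isinstance(knowledge, dict):
            return knowledge.get(value, 0.0)
        return 1.0 if value in knowledge else 0.0

    print(possible("b", certain), possible("b", nonspecific), possible("b", fuzzy))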

Prediction: Design of experiments based on approximating covariance kernels

Description: Using Mercer's expansion to approximate the covariance kernel of an observed random function, the authors transform the prediction problem to the regression problem with random parameters. The latter one is considered in the framework of convex design theory. First they formulate results in terms of the regression model with random parameters, then present the same results in terms of the original problem.
Date: November 1, 1998
Creator: Fedorov, V.
Partner: UNT Libraries Government Documents Department
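
A brief numerical sketch of the idea in the entry above: approximate a covariance kernel by its leading eigenpairs (a discrete analogue of Mercer's expansion) so that the observed random function becomes a regression on those eigenfunctions with random coefficients. The squared-exponential kernel and grid are assumptions for illustration, not the authors' example.

    import numpy as np

    # Discretize the covariance kernel K(s, t) = exp(-(s - t)^2 / (2 * 0.2**2)) on a grid.
    t = np.linspace(0.0, 1.0, 200)
    K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * 0.2 ** 2))

    # The eigendecomposition plays the role of Mercer's expansion:
    # K(s, t) ~ sum_i lambda_i phi_i(s) phi_i(t).
    eigvals, eigvecs = np.linalg.eigh(K)
    order = np.argsort(eigvals)[::-1]
    lam, phi = eigvals[order], eigvecs[:, order]

    # Truncate: the random function X(t) is approximated by a regression model
    # X(t) ~ sum_i theta_i phi_i(t) with random parameters theta_i, Var(theta_i) = lambda_i.
    n_terms = 5
    theta = np.random.default_rng(0).normal(size=n_terms) * np.sqrt(lam[:n_terms])
    x_sample = phi[:, :n_terms] @ theta          # one realization of the truncated expansion
    print(lam[:n_terms] / lam.sum())             # fraction of variance captured by each term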

Neutron total and capture cross section measurements and resonance parameter analysis of tungsten from 0.01 eV to 200 eV

Description: Natural tungsten metal was measured using neutron time-of-flight spectroscopy at the Rensselaer Polytechnic Institute (RPI) Gaerttner Laboratory linear accelerator to determine the tungsten resonance parameters. Three separate measurements were performed: transmission, capture, and self-indication. Previous measurements did not employ all three experiment types and used less sophisticated methods. The current work improves on the published tungsten data base and reduces resonance parameter uncertainties.
Date: June 15, 1998
Creator: Werner, C.J.; Block, R.C.; Slovacek, R.E.; Overberg, M.E.; Moretti, B.E.; Burke, J.A. et al.
Partner: UNT Libraries Government Documents Department

Systematic error revisited

Description: The American National Standards Institute (ANSI) defines systematic error as "an error which remains constant over replicative measurements." It would seem from the ANSI definition that a systematic error is not really an error at all; it is merely a failure to calibrate the measurement system properly, because if the error is constant, why not simply correct for it? Yet systematic errors undoubtedly exist, and they differ in some fundamental way from the kind of errors we call random. Early papers by Eisenhart and by Youden discussed systematic versus random error with regard to measurements in the physical sciences, but not in a fundamental way, and the distinction remains clouded by controversy. The lack of a general agreement on definitions has led to a plethora of different and often confusing methods on how to quantify the total uncertainty of a measurement that incorporates both its systematic and random errors. Some assert that systematic error should be treated by non-statistical methods. We disagree with this approach, and we provide basic definitions based on entropy concepts, and a statistical methodology for combining errors and making statements of total measurement uncertainty. We illustrate our methods with radiometric assay data.
Date: August 5, 1996
Creator: Glosup, J.G. & Axelrod, M.C.
Partner: UNT Libraries Government Documents Department
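
The abstract above does not spell out its entropy-based methodology, so as a neutral illustration of the general problem it addresses, here is the common quadrature combination of a random and a systematic standard uncertainty; it is a generic sketch, not the authors' method, and the values are invented.

    import math

    def total_uncertainty(u_random, u_systematic, k=2.0):
        """Combine independent random and systematic standard uncertainties in
        quadrature and expand with coverage factor k (a generic convention)."""
        u_c = math.sqrt(u_random ** 2 + u_systematic ** 2)
        return u_c, k * u_c

    # Illustrative radiometric assay uncertainties (relative, %).
    print(total_uncertainty(0.8, 0.5))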

Thermal analysis of thermo-gravimetric measurements of spent nuclear fuel oxidation rates

Description: A detailed thermal analysis was completed of the sample temperatures in the Thermo-Gravimetric Analysis (TGA) system used to measure irradiated N Reactor fuel oxidation rates. Sample temperatures during the oxidation process did not show the increase which was postulated as a result of the exothermic reactions. The analysis shows that the axial conduction of heat in the sample holder effectively removes the added heat and that only a very small increase in temperature, i.e., <10 °C, is calculated. A room temperature evaporation test with water showed the sample thermocouple sensitivity to be more than adequate to account for a temperature change of approximately 5 °C. Therefore, measured temperatures in the TGA are within approximately 10 °C of the actual sample temperatures, and no adjustments to reported data to account for the heat input from the oxidation process are necessary.
Date: October 9, 1997
Creator: Cramer, E.R.
Partner: UNT Libraries Government Documents Department