31 Matching Results

Search Results


Assessing Sources of Stress to Aquatic Ecosystems: Using Biomarkers and Bioindicators to Characterize Exposure-Response Profiles of Anthropogenic Activities

Description: Establishing causal relationships between sources of environmental stressors and aquatic ecosystem health is difficult because of the many biotic and abiotic factors that can influence or modify responses of biological systems to stress, the orders of magnitude involved in extrapolation over both spatial and temporal scales, and compensatory mechanisms, such as density-dependent responses, that operate in populations. To address the problem of establishing causality between stressors and effects on aquatic systems, a diagnostic approach, based on exposure-response profiles for various anthropogenic activities, was developed to help identify sources of stress responsible for effects on aquatic systems at ecologically significant levels of biological organization (individual, population, community). To generate these exposure-effects profiles, biomarkers of exposure were plotted against bioindicators of corresponding effects for several major anthropogenic activities, including petrochemical, pulp and paper, domestic sewage, mining, land-development, and agricultural activities. Biomarkers of exposure to environmental stressors varied depending on the type of anthropogenic activity involved. Bioindicators of effects, however, including histopathological lesions, bioenergetic status, individual growth, reproductive impairment, and community-level responses, were similar among many of the major anthropogenic activities. This approach is valuable for identifying and diagnosing sources of stressors in environments impacted by multiple stressors. By identifying the types and sources of environmental stressors, aquatic ecosystems can be more effectively protected and managed to maintain acceptable levels of environmental quality and ecosystem fitness.
Date: March 29, 1999
Creator: Adams, S.M.
Partner: UNT Libraries Government Documents Department

The Dynamics of Fields of Higher Spin

Description: Report presenting a relativistic theory of motion that is free of many of the difficulties common in relativistic equations of motion. This Lagrangian theory describes fields and particles with arbitrary mass and charge and having any discrete spin, integer or half integer.
Date: August 1976
Creator: Hayward, Raymond W.
Partner: UNT Libraries Government Documents Department

Causal inheritance in plane wave quotients

Description: We investigate the appearance of closed timelike curves in quotients of plane waves along spacelike isometries. First we formulate a necessary and sufficient condition for a quotient of a general spacetime to preserve stable causality. We explicitly show that the plane waves are stably causal; in passing, we observe that some pp-waves are not even distinguishing. We then consider the classification of all quotients of the maximally supersymmetric ten-dimensional plane wave under a spacelike isometry, and show that the quotient will lead to closed timelike curves iff the isometry involves a translation along the u direction. The appearance of these closed timelike curves is thus connected to the special properties of the light cones in plane wave spacetimes. We show that all other quotients preserve stable causality.
Date: November 24, 2003
Creator: Hubeny, Veronika E.; Rangamani, Mukund & Ross, Simon F.
Partner: UNT Libraries Government Documents Department

Domestic Influences for Interstate Cooperation: Do Domestic Conditions Affect the Occurrence of Cooperative Events in Democratic Regimes?

Description: This research addressed two main issues that have become evident in studies of interstate cooperation. The first issue concerns the relationship between cooperation and conflict. Can they be represented on a single, uni-dimensional continuum, or are they better represented by two theoretically and empirically separable dimensions? Granger causality tests were able to clarify the nature of cooperative events. The second issue is related to factors that might facilitate or discourage cooperation with other countries as a foreign policy tool. Factors used to explain cooperation and conflict include domestic variables, which have not been fully accounted for in previous empirical analyses. It is hypothesized that economic variables, such as inflation rates, GDP, and manufacturing production indices, affect the likelihood of cooperative event occurrences. Political dynamics, such as electoral cycles, support rates, and national capability status, can also affect the possibility of cooperative foreign policies. The domestic factors in the panel data were tested with Feasible Generalized Least Squares (FGLS) to account for heteroscedasticity and autocorrelation in the residuals; the individual case analyses used linear time-series analysis.
Date: August 2004
Creator: Yi, Seong-Woo
Partner: UNT Libraries
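As an illustration of the Granger causality tests mentioned in this abstract (not the author's actual model or data), a minimal bivariate, single-lag Granger-style F-test can be sketched as follows; the series, lag choice, and coefficients are hypothetical:

```python
import numpy as np

def granger_f_stat(x, y, lag=1):
    """F-statistic: do lagged values of x improve prediction of y
    beyond y's own lagged values? (bivariate, single-lag sketch)"""
    n = len(y) - lag
    Y = y[lag:]
    # Restricted model: y_t ~ const + y_{t-1}
    Xr = np.column_stack([np.ones(n), y[:-lag]])
    # Unrestricted model: y_t ~ const + y_{t-1} + x_{t-1}
    Xu = np.column_stack([Xr, x[:-lag]])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    q = Xu.shape[1] - Xr.shape[1]          # number of restrictions
    return ((rss_r - rss_u) / q) / (rss_u / (n - Xu.shape[1]))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):                    # y is driven by lagged x
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

# The F-statistic for x -> y should dwarf the one for y -> x
print(granger_f_stat(x, y) > granger_f_stat(y, x))
```

In practice one would compare the statistic against the appropriate F-distribution quantile and test multiple lags; library implementations (e.g. in statsmodels) handle that bookkeeping.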

New Particle-in-Cell Code for Numerical Simulation of Coherent Synchrotron Radiation

Description: We present a first look at the new code for self-consistent, 2D simulations of beam dynamics affected by the coherent synchrotron radiation. The code is of the particle-in-cell variety: the beam bunch is sampled by point-charge particles, which are deposited on the grid; the corresponding forces on the grid are then computed using retarded potentials according to causality, and interpolated so as to advance the particles in time. The retarded potentials are evaluated by integrating over the 2D path history of the bunch, with the charge and current density at the retarded time obtained from interpolation of the particle distributions recorded at discrete timesteps. The code is benchmarked against analytical results obtained for a rigid-line bunch. We also outline the features and applications which are currently being developed.
Date: May 1, 2010
Creator: Terzic, Balsa & Li, Rui
Partner: UNT Libraries Government Documents Department
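The charge-deposition step described in this abstract can be illustrated with a minimal one-dimensional cloud-in-cell sketch (the abstract's code is 2D and includes retarded potentials; this toy version shows only the linear-weighting deposition, with hypothetical particle data):

```python
import numpy as np

def deposit_charge(positions, charges, grid_x):
    """Linear (cloud-in-cell) deposition of point charges onto a 1D grid:
    each particle's charge is split between its two neighboring nodes.
    Assumes all particles lie strictly inside the grid domain."""
    dx = grid_x[1] - grid_x[0]
    rho = np.zeros_like(grid_x)
    for x, q in zip(positions, charges):
        i = int((x - grid_x[0]) // dx)      # index of the left grid node
        w = (x - grid_x[i]) / dx            # fractional distance to node i
        rho[i] += q * (1.0 - w)
        rho[i + 1] += q * w
    return rho

grid = np.linspace(0.0, 1.0, 11)
pos = np.array([0.14, 0.5, 0.77])
q = np.array([1.0, 2.0, -1.0])
rho = deposit_charge(pos, q, grid)
print(rho.sum())   # total grid charge equals total particle charge (~2.0)
```

Linear weighting conserves total charge by construction, which is why the grid sum matches the particle sum regardless of where the particles sit.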

Numerical Tests and Properties of Waves in Radiating Fluids

Description: We discuss the properties of an analytical solution for waves in radiating fluids, with a view towards its implementation as a quantitative test of radiation hydrodynamics codes. A homogeneous radiating fluid in local thermodynamic equilibrium is periodically driven at the boundary of a one-dimensional domain, and the solution describes the propagation of the waves thus excited. Two modes are excited for a given driving frequency, generally referred to as a radiative acoustic wave and a radiative diffusion wave. While the analytical solution is well known, several features are highlighted here that require care during its numerical implementation. We compare the solution in a wide range of parameter space to a numerical integration with a Lagrangian radiation hydrodynamics code. Our most significant observation is that flux-limited diffusion does not preserve causality for waves on a homogeneous background.
Date: September 3, 2009
Creator: Johnson, B M & Klein, R I
Partner: UNT Libraries Government Documents Department

ScalaTrace: Tracing, Analysis and Modeling of HPC Codes at Scale

Description: Characterizing the communication behavior of large-scale applications is a difficult and costly task due to code/system complexity and long execution times. An alternative to running actual codes is to gather their communication traces and then replay them, which facilitates application tuning and future procurements. While past approaches lacked lossless scalable trace collection, we contribute an approach that provides orders of magnitude smaller, if not near constant-size, communication traces regardless of the number of nodes while preserving structural information. We introduce intra- and inter-node compression techniques of MPI events, we develop a scheme to preserve time and causality of communication events, and we present results of our implementation for BlueGene/L. Given this novel capability, we discuss its impact on communication tuning and on trace extrapolation. To the best of our knowledge, such a concise representation of MPI traces in a scalable manner, combined with time-preserving deterministic MPI call replay, is without precedent.
Date: March 31, 2010
Creator: Mueller, F; Wu, X; Schulz, M; de Supinski, B & Gamblin, T
Partner: UNT Libraries Government Documents Department
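The core idea behind structure-preserving trace compression can be illustrated with the simplest possible ingredient, run-length encoding of repeated events (ScalaTrace itself uses far richer intra- and inter-node compression; this sketch and its event names are purely illustrative):

```python
def compress(events):
    """Run-length encode a trace: repeated adjacent events collapse into
    [event, count] pairs, so size reflects structure, not iteration count."""
    out = []
    for e in events:
        if out and out[-1][0] == e:
            out[-1][1] += 1
        else:
            out.append([e, 1])
    return out

def replay(compressed):
    """Deterministically reconstruct the original event sequence."""
    return [e for e, n in compressed for _ in range(n)]

trace = ["MPI_Send"] * 1000 + ["MPI_Recv"] + ["MPI_Send"] * 1000
c = compress(trace)
print(len(c), replay(c) == trace)   # 3 True
```

A 2001-event trace collapses to three entries, and replay is lossless — the same property the abstract claims at scale for full MPI traces with timing information.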

Measurement accuracy, bit-strings, Manthey's quaternions, and RRQM

Description: The author continues the discussion started last year. By now three potentially divergent research programs have surfaced in ANPA: (1) the Bastin-Kilmister understanding of the combinatorial hierarchy (Clive's "Menshevik" position); (2) the author's bit-string "Theory of Everything" (which Clive has dubbed "Bolshevik"); (3) Manthey's cycle hierarchy based on co-occurrence and mutual exclusion, which Clive helped him map onto quaternions (an as yet unnamed heresy?). Unless a common objective can be found, these three points of view will continue to diverge. The author suggests the reconstruction of relativistic quantum mechanics (RRQM) as a reasonable, and attainable, goal that might aid convergence rather than divergence.
Date: February 1, 1995
Creator: Noyes, H. P.
Partner: UNT Libraries Government Documents Department

Predicting the Cosmological Constant from the Causal Entropic Principle

Description: We compute the expected value of the cosmological constant in our universe from the Causal Entropic Principle. Since observers must obey the laws of thermodynamics and causality, the principle asserts that physical parameters are most likely to be found in the range of values for which the total entropy production within a causally connected region is maximized. Despite the absence of more explicit anthropic criteria, the resulting probability distribution turns out to be in excellent agreement with observation. In particular, we find that dust heated by stars dominates the entropy production, demonstrating the remarkable power of this thermodynamic selection criterion. The alternative approach, weighting by the number of "observers per baryon", is less well-defined, requires problematic assumptions about the nature of observers, and yet prefers values larger than present experimental bounds.
Date: May 1, 2007
Creator: Bousso, Raphael; Harnik, Roni; Kribs, Graham D. & Perez, Gilad
Partner: UNT Libraries Government Documents Department

Hyper-fast interstellar travel via a modification of spacetime geometry

Description: We analyze difficulties with proposals for hyper-fast interstellar travel via modifying the spacetime geometry, using as illustrations the Alcubierre warp drive and the Krasnikov tube. As it is easy to see, no violations of local causality or any other known physical principles are involved as far as motion of the spacecraft is concerned. However, the generation and support of the appropriate spacetime geometry configurations does create problems, the most significant of which are a violation of the weak energy condition, a violation of local causality, and a violation of chronology protection. The violation of chronology protection is the most serious of them, as it opens the possibility of time travel. We trace the origin of the difficulties to the classical nature of the gravity field. This strongly indicates that hyper-fast interstellar travel should be transferred to the realm of a fully quantized gravitational theory. We outline an approach to further the research in this direction.
Date: August 1, 1997
Creator: Kheyfets, A. & Miller, W.A.
Partner: UNT Libraries Government Documents Department

Wakefields in a Dielectric Tube with Frequency Dependent Dielectric Constant

Description: Laser driven dielectric accelerators could operate at a fundamental mode frequency where consideration must be given to the frequency dependence of the dielectric constant when calculating wakefields. Wakefields are calculated for a frequency dependence that arises from a single atomic resonance. Causality is considered, and the effects on the short range wakefields are calculated.
Date: May 27, 2005
Creator: Siemann, R. H. & Chao, A. W.
Partner: UNT Libraries Government Documents Department

Existence of Weakly Damped Kinetic Alfven Eigenmodes in Reversed Shear Tokamak

Description: A kinetic theory of weakly damped Alfven Eigenmode (AE) solutions strongly interacting with the continuum is developed for tokamak plasmas with reversed magnetic shear. We show that the ideal MHD model is not sufficient for the eigenmode solutions if the standard causality condition bypass rule is applied. Finite Larmor radius effects are required, which introduce multiple kinetic subeigenmodes and collisionless radiative damping. The theory explains the existence of experimentally observed Alfvenic instabilities with frequencies sweeping down and reaching their minimum (bottom).
Date: August 12, 2008
Creator: Gorelenkov, N. N.
Partner: UNT Libraries Government Documents Department

The Use of Object-Oriented Analysis Methods in Surety Analysis

Description: Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
Date: May 1, 1999
Creator: Craft, Richard L.; Funkhouser, Donald R. & Wyss, Gregory D.
Partner: UNT Libraries Government Documents Department

FFTF Event Fact Sheet root cause analysis calendar year 1985 through 1988

Description: The Event Fact Sheets written from January 1985 through mid August 1988 were reviewed to determine their root causes. The review group represented many of the technical disciplines present in plant operation. The review was initiated as an internal critique aimed at maximizing the "lessons learned" from the event reporting system. The root causes were subjected to a Pareto analysis to determine the significant causal factor groups. Recommendations for correction of the high frequency causal factors were then developed and presented to the FFTF Plant management. In general, the distributions of the causal factors were found to closely follow the industry averages. The impacts of the events were also studied and it was determined that we generally report events of a level of severity below that of the available studies. Therefore it is concluded that the recommendations for corrective action are ones to improve the overall quality of operations and not to correct significant operational deficiencies. 17 figs.
Date: December 1, 1988
Creator: Griffin, G.B.
Partner: UNT Libraries Government Documents Department
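The Pareto analysis described in this abstract, ranking causal factor groups and isolating the "vital few" that account for most events, can be sketched as follows; the cause categories and counts are hypothetical, not from the FFTF review:

```python
from collections import Counter

def pareto(causes, cutoff=0.8):
    """Rank causal factors by frequency and return the smallest set
    accounting for at least `cutoff` of all events (the 'vital few')."""
    counts = Counter(causes).most_common()
    total = sum(n for _, n in counts)
    vital, running = [], 0
    for cause, n in counts:
        vital.append(cause)
        running += n
        if running / total >= cutoff:
            break
    return vital

# Hypothetical event records, one cause label per event
events = (["procedure not followed"] * 40 + ["training gap"] * 25 +
          ["design deficiency"] * 20 + ["equipment failure"] * 10 +
          ["documentation error"] * 5)
print(pareto(events))
```

Here three of five categories cover 85% of events, so corrective-action recommendations would target those first — the same logic the review applied to its high-frequency causal factors.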

Adaptive Dynamic Bayesian Networks

Description: A discrete-time Markov process can be compactly modeled as a dynamic Bayesian network (DBN), a graphical model with nodes representing random variables and directed edges indicating causality between variables. Each node has a probability distribution, conditional on the variables represented by the parent nodes. A DBN's graphical structure encodes fixed conditional dependencies between variables. But in real-world systems, conditional dependencies between variables may be unknown a priori or may vary over time. Model errors can result if the DBN fails to capture all possible interactions between variables. Thus, we explore the representational framework of adaptive DBNs, whose structure and parameters can change from one time step to the next: a distribution's parameters and its set of conditional variables are dynamic. This work builds on recent work in nonparametric Bayesian modeling, such as hierarchical Dirichlet processes, infinite-state hidden Markov networks and structured priors for Bayes net learning. In this paper, we will explain the motivation for our interest in adaptive DBNs, show how popular nonparametric methods are combined to formulate the foundations for adaptive DBNs, and present preliminary results.
Date: October 26, 2007
Creator: Ng, B M
Partner: UNT Libraries Government Documents Department
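A fixed-structure DBN of the kind this abstract generalizes can be illustrated with the textbook rain/umbrella model (my example, not the report's): one transition CPT between time slices, one observation CPT within a slice, and exact forward filtering over a single binary state.

```python
# States: index 0 = rain, index 1 = no rain
T = [[0.7, 0.3],   # P(X_t | X_{t-1} = rain)
     [0.3, 0.7]]   # P(X_t | X_{t-1} = no rain)
E = [[0.9, 0.1],   # P(umbrella | rain), P(no umbrella | rain)
     [0.2, 0.8]]   # P(umbrella | no rain), P(no umbrella | no rain)

def filter_step(belief, observed_umbrella):
    """One DBN forward-filtering step: predict via the transition CPT,
    weight by the observation CPT, then renormalize."""
    predicted = [sum(belief[j] * T[j][i] for j in range(2)) for i in range(2)]
    obs = 0 if observed_umbrella else 1
    unnorm = [predicted[i] * E[i][obs] for i in range(2)]
    z = sum(unnorm)
    return [p / z for p in unnorm]

belief = [0.5, 0.5]                       # uniform prior over rain/no rain
for saw_umbrella in [True, True, False]:
    belief = filter_step(belief, saw_umbrella)
print(belief[0])                          # posterior P(rain) after 3 observations
```

In the adaptive setting the abstract describes, the CPTs `T` and `E` (and even the set of parent variables) would themselves change between time steps rather than staying fixed.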

Predicting the Cosmological Constant from the Causal Entropic Principle

Description: We compute the expected value of the cosmological constant in our universe from the Causal Entropic Principle. Since observers must obey the laws of thermodynamics and causality, it asserts that physical parameters are most likely to be found in the range of values for which the total entropy production within a causally connected region is maximized. Despite the absence of more explicit anthropic criteria, the resulting probability distribution turns out to be in excellent agreement with observation. In particular, we find that dust heated by stars dominates the entropy production, demonstrating the remarkable power of this thermodynamic selection criterion. The alternative approach, weighting by the number of "observers per baryon", is less well-defined, requires problematic assumptions about the nature of observers, and yet prefers values larger than present experimental bounds.
Date: February 20, 2007
Creator: Bousso, Raphael; Harnik, Roni; Kribs, Graham D. & Perez, Gilad
Partner: UNT Libraries Government Documents Department

Relativistic Nuclear Many-Body Theory

Description: Nonrelativistic models of nuclear systems have provided important insight into nuclear physics. In future experiments, nuclear systems will be examined under extreme conditions of density and temperature, and their response will be probed at momentum and energy transfers larger than the nucleon mass. It is therefore essential to develop reliable models that go beyond the traditional nonrelativistic many-body framework. General properties of physics, such as quantum mechanics, Lorentz covariance, and microscopic causality, motivate the use of quantum field theories to describe the interacting, relativistic nuclear many-body system. Renormalizable models based on hadronic degrees of freedom (quantum hadrodynamics) are presented, and the assumptions underlying this framework are discussed. Some applications and successes of quantum hadrodynamics are described, with an emphasis on the new features arising from relativity. Examples include the nuclear equation of state ...
Date: August 1, 1991
Creator: Serot, Brian & Walecka, J.
Partner: UNT Libraries Government Documents Department

Feature-Based Statistical Analysis of Combustion Simulation Data

Description: We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. 
We highlight the utility of this new framework for combustion science; however, it is applicable to many other science domains.
Date: November 18, 2011
Creator: Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J et al.
Partner: UNT Libraries Government Documents Department
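The per-feature statistical moments that this framework pre-computes in a streaming pass can be accumulated with Welford's online algorithm, sketched below; the temperature values are hypothetical, and the real system attaches such accumulators to merge-tree features rather than a flat list:

```python
class StreamingMoments:
    """Welford's online algorithm: accumulate the mean and variance of a
    scalar field in one pass, without storing the samples themselves."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / self.n if self.n else 0.0

temps = [1200.0, 1350.0, 1100.0, 1425.0]   # hypothetical per-cell temperatures
acc = StreamingMoments()
for t in temps:
    acc.push(t)
print(acc.mean, acc.variance)
```

Because only three numbers per feature are kept (count, mean, M2), the resulting meta-data stays orders of magnitude smaller than the simulation output, which is what makes the interactive post-hoc analysis feasible.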

Investigations of paleoclimate variations using accelerator mass spectrometry

Description: This project has used Accelerator Mass Spectrometry (AMS) ¹⁴C measurements to study climate and carbon cycle variations on time scales from decades to millennia over the past 30,000 years, primarily in the western US and the North Pacific. ¹⁴C dates provide a temporal framework for records of climate change, and natural radiocarbon acts as a carbon cycle tracer in independently dated records. The overall basis for the study is the observation that attempts to model future climate and carbon cycle changes cannot be taken seriously if the models have not been adequately tested. Paleoclimate studies are unique because they provide realistic test data under climate conditions significantly different from those of the present, whereas instrumental results can only sample the system as it is today. The aim of this project has been to better establish the extent, timing, and causes of past climate perturbations, and the carbon cycle changes with which they are linked. This provides real-world data for model testing, both for the development of individual models and also for inter-model diagnosis and comparison activities such as those of LLNL's PCMDI program; it helps us achieve a better basic understanding of how the climate system works so that models can be improved; and it gives an indication of the natural variability in the climate system underlying any anthropogenically-driven changes. The research has involved four projects which test hypotheses concerning the overall behavior of the North Pacific climate system. All are aspects of an overall theme that climate linkages are strong and direct, so that regional climate records are correlated, details of fine structure are important, and accurate and precise dating is critical for establishing correlations and even causality. An important requirement for such studies is the requirement for an accurate and precise radiocarbon calibration, to allow better ...
Date: August 24, 2000
Creator: Southon, J R; Kashgarian, M & Brown, T A
Partner: UNT Libraries Government Documents Department

A Case for Including Atmospheric Thermodynamic Variables in Wind Turbine Fatigue Loading Parameter Identification

Description: This paper makes the case for establishing efficient predictor variables for atmospheric thermodynamics that can be used to statistically correlate the fatigue accumulation seen on wind turbines. Recently, two approaches to this issue have been reported. One uses multiple linear-regression analysis to establish the relative causality between a number of predictors related to the turbulent inflow and turbine loads. The other approach, using many of the same predictors, applies the technique of principal component analysis. An examination of the ensemble of predictor variables revealed that they were all kinematic in nature; i.e., they were only related to the description of the velocity field. Boundary-layer turbulence dynamics depends upon a description of the thermal field and its interaction with the velocity distribution. We used a series of measurements taken within a multi-row wind farm to demonstrate the need to include atmospheric thermodynamic variables as well as velocity-related ones in the search for efficient turbulence loading predictors in various turbine-operating environments. Our results show that a combination of vertical stability and hub-height mean shearing stress variables meet this need over a period of 10 minutes.
Date: August 2, 1999
Creator: Kelley, N. D.
Partner: UNT Libraries Government Documents Department
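The multiple linear-regression approach this abstract contrasts with PCA can be sketched with synthetic data (the variable names, coefficients, and noise level below are hypothetical, not the paper's measurements): fit fatigue load against candidate predictors and inspect which coefficients carry the signal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
shear_stress = rng.normal(size=n)   # hypothetical hub-height mean shearing stress
stability = rng.normal(size=n)      # hypothetical vertical stability measure
turbulence = rng.normal(size=n)     # hypothetical kinematic-only predictor

# Hypothetical fatigue-load response driven by stress and stability
load = 2.0 * shear_stress + 1.0 * stability + 0.1 * rng.normal(size=n)

# Ordinary least squares via lstsq; column of ones gives the intercept
X = np.column_stack([np.ones(n), shear_stress, stability, turbulence])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
for name, c in zip(["const", "shear_stress", "stability", "turbulence"], coef):
    print(f"{name:>13}: {c:+.3f}")
```

The fitted coefficients recover the generating weights, and the predictor that contributes nothing comes out near zero, which is the kind of relative-importance evidence the paper uses to argue for including thermodynamic variables.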

Electromagnetic Wave Propagation in Two-Dimensional Photonic Crystals

Description: In this dissertation, they have undertaken the challenge to understand the unusual propagation properties of the photonic crystal (PC). The photonic crystal is a medium where the dielectric function is periodically modulated. These types of structures are characterized by bands and gaps. In other words, they are characterized by frequency regions where propagation is prohibited (gaps) and regions where propagation is allowed (bands). In this study they focus on two-dimensional photonic crystals, i.e., structures with periodic dielectric patterns on a plane and translational symmetry in the perpendicular direction. They start by studying a two-dimensional photonic crystal system for frequencies inside the band gap. The inclusion of a line defect introduces allowed states in the otherwise prohibited frequency spectrum. The dependence of the defect resonance state on different parameters such as size of the structure, profile of incoming source, etc., is investigated in detail. For this study, they used two popular computational methods in photonic crystal research, the Finite Difference Time Domain method (FDTD) and the Transfer Matrix Method (TMM). The results for the one-dimensional defect system are analyzed, and the two methods, FDTD and TMM, are compared. Then, they shift their attention only to periodic two-dimensional crystals, concentrate on their band properties, and study their unusual refractive behavior. Anomalous refractive phenomena in photonic crystals include cases where the beam refracts on the "wrong" side of the surface normal. The latter phenomenon is known as negative refraction and was previously observed in materials where the wave vector, the electric field, and the magnetic field form a left-handed set of vectors. These materials are generally called left-handed materials (LHM) or negative index materials (NIM).
They investigated the possibility that the photonic crystal behaves as a LHM, and how this behavior relates with the observed negatively refractive phenomena. They found that in the ...
Date: December 12, 2003
Creator: Foteinopoulou, Stavroula
Partner: UNT Libraries Government Documents Department