7,363 Matching Results

Search Results



Description: This is a brief description of a computer program written by Oleh Weres to generate discrete grids for IFD* type computer programs. The output of the program includes data which can be used directly as input to the program SHAFT78. The program is specifically intended for large-scale two- or three-dimensional reservoir simulation. The program requires, as input, the x, y, z coordinates of the discrete element locations being used to specify a particular reservoir's geological system. From the list of element locations, the program finds the midpoints of the lines joining adjacent elements. At each midpoint the program constructs a perpendicular plane. The intersections of the planes in three-space define an irregular (in general) n-sided polyhedron around each element center. In two dimensions the program produces a unique 'tiling' of polygons whose faces are all perpendicular to the lines joining adjacent elements. The areas between adjoining elements and the volume of each element are calculated. The end result, in general, is a three-dimensional grid of n-sided polyhedra for which the element locations, the connecting (flow) areas, and the element volumes are all known. Since the grids are finite, the program must have information about the boundary of the grid. This is supplied as a set of 'dummy' elements which are used only to limit the extent of the grid and are not intended for use in the reservoir simulation.
Date: June 1, 1978
Creator: Weres, O. & Schroeder, R.C.
Partner: UNT Libraries Government Documents Department
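The geometric kernel described above (midpoints of the lines joining adjacent elements, with a perpendicular plane constructed at each midpoint) can be sketched in 2D; the element coordinates below are hypothetical, not taken from the report:

```python
import math

def bisector_plane(p, q):
    """Midpoint of the segment joining two element centers p and q,
    and the unit normal of the perpendicular plane constructed there
    (the normal points along the line joining the two centers)."""
    mid = tuple((a + b) / 2 for a, b in zip(p, q))
    diff = [b - a for a, b in zip(p, q)]
    norm = math.sqrt(sum(d * d for d in diff))
    normal = tuple(d / norm for d in diff)
    return mid, normal

# Two hypothetical element centers
mid, n = bisector_plane((0.0, 0.0), (2.0, 2.0))
print(mid)  # (1.0, 1.0)
```

Intersecting such bisector planes around each center is what yields the n-sided polyhedra (the construction is Voronoi-like in character).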

Joint inversion of geophysical data for site characterization and restoration monitoring

Description: The purpose of this project is to develop a computer code for joint inversion of seismic and electrical data, to improve underground imaging for site characterization and remediation monitoring. The computer code developed in this project will invert geophysical data to obtain direct estimates of porosity and saturation underground, rather than inverting for seismic velocity and electrical resistivity or other geophysical properties. This is intended to be a significant improvement in the state-of-the-art of underground imaging, since interpretation of data collected at a contaminated site would become much less subjective. Potential users include DOE scientists and engineers responsible for characterizing contaminated sites and monitoring remediation of contaminated sites. In this three-year project, we use a multi-phase approach consisting of theoretical and numerical code development, laboratory investigations, testing on available laboratory and borehole geophysics data sets, and a controlled field experiment, to develop practical tools for joint electrical and seismic data interpretation.
Date: May 28, 1998
Creator: Berge, P. A.
Partner: UNT Libraries Government Documents Department

GYRO Simulations of Core Momentum Transport in DIII-D and JET Plasmas

Description: Momentum, energy, and particle transport in DIII-D and JET ELMy H-mode plasmas are simulated with GYRO and compared with measurements analyzed using TRANSP. The simulated transport depends sensitively on the nabla(T(sub)i) turbulence drive and the nabla(E(sub)r) turbulence suppression inputs. With their nominal values indicated by measurements, the simulations over-predict the momentum and energy transport in the DIII-D plasmas, and under-predict in the JET plasmas. Reducing |nabla(T(sub)i)| and increasing |nabla(E(sub)r)| by up to 15% leads to approximate agreement (within a factor of two) for the DIII-D cases. For the JET cases, increasing |nabla(T(sub)i)| or reducing |nabla(E(sub)r)| results in approximate agreement for the energy flow, but the ratio of the simulated energy and momentum flows remains higher than measured by a factor of 2-4.
Date: June 27, 2005
Creator: Budny, R. V.; Candy, J. & Waltz, R. E.
Partner: UNT Libraries Government Documents Department

On the secure obfuscation of deterministic finite automata.

Description: In this paper, we show how to construct secure obfuscation for Deterministic Finite Automata, assuming non-uniformly strong one-way functions exist. We revisit the software protection approaches originally proposed by [5, 10, 12, 17] and revise them to the current obfuscation setting of Barak et al. [2]. Under this model, we introduce an efficient oracle that retains some 'small' secret about the original program. Using this secret, we can construct an obfuscator and two-party protocol that securely obfuscates Deterministic Finite Automata against malicious adversaries. The security of this model retains the strong 'virtual black box' property originally proposed in [2] while incorporating the stronger condition of dependent auxiliary inputs in [15]. Additionally, we show that our techniques remain secure under concurrent self-composition with adaptive inputs and that Turing machines are obfuscatable under this model.
Date: June 1, 2008
Creator: Anderson, William Erik
Partner: UNT Libraries Government Documents Department
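The object being protected in such constructions is, at bottom, just a transition table. As background, a minimal (unobfuscated) DFA evaluator, with a toy machine invented here for illustration:

```python
def run_dfa(states, accept, delta, start, word):
    """Evaluate a deterministic finite automaton on a word.
    delta maps (state, symbol) -> next state."""
    state = start
    for sym in word:
        state = delta[(state, sym)]
    return state in accept

# Toy DFA accepting binary strings with an even number of 1s
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}
print(run_dfa({"even", "odd"}, {"even"}, delta, "even", "1101"))  # False
```

A secure obfuscation, in the sense of the abstract, would let a user evaluate such a machine on inputs without learning anything about the table beyond the input/output behavior.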

The BBP Algorithm for Pi

Description: The 'Bailey-Borwein-Plouffe' (BBP) algorithm for {pi} is based on the BBP formula for {pi}, which was discovered in 1995 and published in 1996 [3]: {pi} = {summation}{sub k=0}{sup {infinity}} 1/16{sup k} (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)). This formula as it stands permits {pi} to be computed fairly rapidly to any given precision (although it is not as efficient for that purpose as some other formulas that are now known [4, pg. 108-112]). But its remarkable property is that it permits one to calculate (after a fairly simple manipulation) hexadecimal or binary digits of {pi} beginning at an arbitrary starting position. For example, ten hexadecimal digits of {pi} beginning at position one million can be computed in only five seconds on a 2006-era personal computer. The formula itself was found by a computer program, and almost certainly constitutes the first instance of a computer program finding a significant new formula for {pi}. It turns out that the existence of this formula has implications for the long-standing unsolved question of whether {pi} is normal to commonly used number bases (a real number x is said to be b-normal if every m-long string of digits in the base-b expansion appears, in the limit, with frequency b{sup -m}). Extending this line of reasoning recently yielded a proof of normality for a class of explicit real numbers (although not yet including {pi}) [4, pg. 148-156].
Date: September 17, 2006
Creator: Bailey, David H.
Partner: UNT Libraries Government Documents Department
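The digit-extraction manipulation mentioned above is standard and easily sketched: multiply the series by 16^d, keep everything mod 1 (modular exponentiation makes the head of the sum exact), and read off hex digits. This is a textbook-style implementation, not the author's code; in double precision it is reliable only for a handful of digits at modest positions:

```python
def bbp_series(j, d):
    """Fractional part of sum_{k>=0} 16^(d-k) / (8k+j)."""
    # Head (k = 0..d): 16^(d-k) mod (8k+j) keeps each term exact.
    s = 0.0
    for k in range(d + 1):
        s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
    # Tail (k > d): terms shrink geometrically; stop below precision.
    k = d + 1
    while True:
        t = 16.0 ** (d - k) / (8 * k + j)
        if t < 1e-17:
            break
        s = (s + t) % 1.0
        k += 1
    return s

def pi_hex_digits(d, n=8):
    """n hexadecimal digits of pi starting d digits after the point."""
    x = (4 * bbp_series(1, d) - 2 * bbp_series(4, d)
         - bbp_series(5, d) - bbp_series(6, d)) % 1.0
    out = ""
    for _ in range(n):
        x *= 16
        out += "0123456789ABCDEF"[int(x)]
        x -= int(x)
    return out

print(pi_hex_digits(0))  # 243F6A88
```

Since hex pi is 3.243F6A8885A308D3..., calling `pi_hex_digits(0)` reproduces the first eight fractional hex digits; computing digits near position one million works the same way, just with a longer head sum.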

Illustrating the future prediction of performance based on computer code, physical experiments, and critical performance parameter samples

Description: In this paper, we present a generic example to illustrate various points about making future predictions of population performance using a biased performance computer code, physical performance data, and critical performance parameter data sampled from the population at various times. We show how the actual performance data help to correct the biased computer code, and we illustrate the impact of uncertainty, especially when the prediction is made far from where the available data are taken. We also demonstrate how a Bayesian approach allows both inferences about the unknown parameters and predictions to be made in a consistent framework.
Date: January 1, 2009
Creator: Hamada, Michael S & Higdon, David M
Partner: UNT Libraries Government Documents Department
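As a toy illustration of the bias-correction idea (not the authors' model, which is richer), one can correct a hypothetical simulator with a constant additive bias via a conjugate normal update; every name and number below is invented:

```python
import random

random.seed(0)

def code(x):
    return 2.0 * x          # hypothetical simulator, biased low

def truth(x):
    return 2.0 * x + 0.5    # actual system: code plus a 0.5 bias

# Physical observations: truth plus measurement noise
xs = [0.5 * i for i in range(10)]
ys = [truth(x) + random.gauss(0, 0.1) for x in xs]

# Conjugate normal update for a constant bias delta:
# prior delta ~ N(0, tau^2); residuals r_i = y_i - code(x_i) ~ N(delta, sigma^2)
tau2, sigma2 = 1.0, 0.01
r = [y - code(x) for x, y in zip(xs, ys)]
post_var = 1.0 / (1.0 / tau2 + len(r) / sigma2)
post_mean = post_var * (sum(r) / sigma2)
print(round(post_mean, 2))  # close to the true bias of 0.5
```

Predictions then use `code(x) + post_mean`, with the posterior variance feeding the prediction uncertainty; far from the data, a realistic (non-constant) bias model would make that uncertainty grow, which is exactly the effect the paper highlights.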

A C++ Infrastructure for Automatic Introduction and Translation of OpenMP Directives

Description: In this paper we describe a C++ infrastructure for source-to-source translation. We demonstrate the translation of a serial program with high-level abstractions to a lower-level parallel program in two separate phases. In the first phase OpenMP directives are introduced, driven by the semantics of high-level abstractions. Then the OpenMP directives are translated to a C++ program that explicitly creates and manages parallelism according to the specified directives. Both phases are implemented using the same mechanisms in our infrastructure.
Date: July 28, 2003
Creator: Quinlan, D J; Scordan, M; Yi, Q & de Supinski, B R
Partner: UNT Libraries Government Documents Department
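A crude flavor of the first phase (introducing OpenMP directives) can be given with a toy, line-based pass over C++ source text; the real infrastructure operates on an abstract syntax tree and is driven by abstraction semantics, so this sketch is purely illustrative:

```python
import re

def introduce_openmp(src):
    """Toy source-to-source pass: insert '#pragma omp parallel for'
    before each C++ 'for' loop found by a line-based match.
    (A real translator would work on an AST, not on raw text.)"""
    out = []
    for line in src.splitlines():
        if re.match(r"\s*for\s*\(", line):
            indent = line[: len(line) - len(line.lstrip())]
            out.append(indent + "#pragma omp parallel for")
        out.append(line)
    return "\n".join(out)

serial = "for (int i = 0; i < n; ++i)\n    a[i] = b[i] + c[i];"
print(introduce_openmp(serial))
```

The second phase of the paper then lowers such directives to explicit thread creation and management, which is beyond what a text-level toy can show.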

USERDA computer program summaries. Numbers 177--239

Description: Since 1960 the Argonne Code Center has served as a U. S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U. S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Program Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space-time kinetics, coupled neutronics-hydrodynamics-thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data. (RWR)
Date: October 1, 1975
Partner: UNT Libraries Government Documents Department

Damage Detection and Identification of Finite Element Models Using State-Space Based Signal Processing a Summation of Work Completed at the Lawrence Livermore National Laboratory February 1999 to April 2000

Description: Until recently, attempts to update Finite Element Models (FEM) of large structures based upon recorded structural motions were mostly ad hoc, requiring a large amount of engineering experience and skill. Studies have been undertaken at LLNL to use state-space based signal processing techniques to locate the existence and type of model mismatches common in FEM. Two different methods (Gauss-Newton gradient search and extended Kalman filter) have been explored, and the progress made in each type of algorithm as well as the results from several simulated and one actual building model will be discussed. The algorithms will be examined in detail, and the computer programs written to implement the algorithms will be documented.
Date: April 28, 2000
Creator: Burnett, G.C.
Partner: UNT Libraries Government Documents Department
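As background for the extended-Kalman-filter approach, here is a minimal scalar Kalman filter estimating a constant parameter from noisy measurements; the parameter is a toy stand-in for, say, a stiffness ratio, and this is not the LLNL algorithm itself:

```python
import random

random.seed(1)

true_k = 3.0       # hypothetical parameter to recover
R = 0.25           # measurement noise variance
x, P = 0.0, 10.0   # prior mean and variance

for _ in range(50):
    z = true_k + random.gauss(0, R ** 0.5)
    # Static state: the predict step leaves (x, P) unchanged.
    K = P / (P + R)        # Kalman gain
    x = x + K * (z - x)    # measurement update
    P = (1 - K) * P        # posterior variance shrinks
print(round(x, 1))  # converges toward true_k = 3.0
```

The FEM-updating problem replaces this scalar state with a vector of model parameters and a nonlinear measurement map through the structural model, which is what makes the extended (linearized) filter necessary.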

Anisotropic scattering in the variational nodal simplified spherical harmonics formulation

Description: Under the assumption of isotropic scattering, the simplified spherical harmonics method (SP{sub N}) was recently formulated in variational nodal form and implemented successfully as an option of the VARIANT code. The authors here remove the isotropic scattering restriction. The variational nodal form of the SP{sub N} approximation is formulated and implemented with both within-group and group-to-group anisotropic scattering. Results are presented for a model problem previously utilized with the standard P{sub N} variational nodal method.
Date: May 1, 1996
Creator: Lewis, E.E. & Palmiotti, G.
Partner: UNT Libraries Government Documents Department

Visual Sample Plan (VSP) Models and Code Verification

Description: VSP is an easy-to-use, visual and graphical software tool being developed to select the right number and location of environmental samples so that the statistical tests performed to support environmental decisions have the required confidence and performance. It is a significant help for implementing the 6th and 7th steps of the Data Quality Objectives (DQO) planning process (''Specify Tolerable Limits on Decision Errors'' and ''Optimize the Design for Obtaining Data,'' respectively).
Date: March 6, 2001
Creator: Gilbert, Richard O; Davidson, James R & Pulsipher, Brent A
Partner: UNT Libraries Government Documents Department
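A flavor of the sample-size side of this calculation: the textbook formula below is a standard one for a one-sample test of a mean under the DQO framework, not necessarily the exact formula VSP implements; all inputs are hypothetical:

```python
from math import ceil
from statistics import NormalDist

def n_samples(sigma, delta, alpha=0.05, beta=0.10):
    """Standard sample-size formula for a one-sample test of a mean:
    n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2,
    where sigma is the population standard deviation and delta is the
    smallest difference that matters (the 'gray region' width in DQO
    terms); alpha and beta are the tolerable decision error rates."""
    z = NormalDist().inv_cdf
    return ceil(((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2)

print(n_samples(sigma=1.0, delta=0.5))  # 35
```

Tightening the tolerable error rates or narrowing the gray region drives the required number of samples up rapidly, which is exactly the trade-off steps 6 and 7 of the DQO process force planners to confront.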

Application of software quality assurance to a specific scientific code development task

Description: This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development.
Date: March 1, 1986
Creator: Dronkers, J.J.
Partner: UNT Libraries Government Documents Department

Uranium and plutonium isotopic analysis using MGA++

Description: The Lawrence Livermore National Laboratory develops sophisticated gamma-ray analysis codes for the isotopic analysis of nuclear materials based on the principles used in the original MultiGroup Analysis (MGA) code. Over the years, the MGA methodology has been upgraded and expanded far beyond its original capabilities and is now comprised of a suite of codes known as MGA++. The early MGA code analyzed Pu gamma-ray data collected with high-purity germanium (HPGe) detectors to yield Pu isotopic ratios. While the original MGA code relied solely on the lower-energy gamma rays (around 100 keV), the most recent addition to the MGA++ code suite, MGAHI, analyzes Pu data using higher-energy gamma rays (200 keV and higher) and is particularly useful for Pu samples that are enclosed in thick-walled containers. The MGA++ suite also includes capabilities to perform U isotopic analysis on data collected with either HPGe or cadmium-zinc-telluride (CZT) detectors. These codes are commercially available and are known as U235 and CZTU, respectively. A graphical user interface has also been developed for viewing the data and the fitting procedure. In addition, we are developing new codes that will integrate into the MGA++ suite. These will include Pu isotopic analysis capabilities for data collected with CZT detectors, U isotopic analysis with HPGe detectors which utilizes only higher energy gamma rays, and isotopic analyses on mixtures of Pu and U.
Date: July 1, 1998
Creator: Buckley, W; Clark, D; Friensehner, A; Parker, W; Raschke, K; Romine, W et al.
Partner: UNT Libraries Government Documents Department

Software interoperability for energy simulation

Description: This paper provides an overview of software interoperability as it relates to the energy simulation of buildings. The paper begins with a discussion of the difficulties in using sophisticated analysis tools like energy simulation at various stages in the building life cycle, and the potential for interoperability to help overcome these difficulties. An overview of the Industry Foundation Classes (IFC), a common data model for supporting interoperability under continuing development by the International Alliance for Interoperability (IAI) is then given. The process of creating interoperable software is described next, followed by specific details for energy simulation tools. The paper closes with the current status of, and future plans for, the ongoing efforts to achieve software interoperability.
Date: July 31, 2002
Creator: Hitchcock, Robert J.
Partner: UNT Libraries Government Documents Department