DOCUMENTATION FOR PROGRAM OGRE

Description: This is a brief description of a computer program, written by Oleh Weres, that generates discrete grids for IFD*-type computer programs. The output includes data that can be used directly as input to the program SHAFT78. The program is specifically intended for large-scale two- or three-dimensional reservoir simulation. It requires, as input, the x, y, z coordinates of the discrete element locations used to specify a particular reservoir's geological system. From the list of element locations, the program finds the midpoints of the lines joining adjacent elements. At each midpoint it constructs a perpendicular plane. The intersections of these planes in three-space define an irregular (in general) n-sided polyhedron around each element center. In two dimensions the program produces a unique 'tiling' of polygons whose faces are all perpendicular to the lines joining adjacent elements. The areas between adjoining elements and the volume of each element are calculated. The end result, in general, is a three-dimensional grid of n-sided polyhedra for which the element locations, the connecting (flow) areas, and the element volumes are all known. Since the grids are finite, the program must have information about the boundary of the grid. This is supplied as a set of 'dummy' elements, which are used only to limit the extent of the grid and are not intended for use in the reservoir simulation.
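
In two dimensions, the perpendicular-bisector construction described above is the Voronoi tessellation of the element centers. The following minimal 2-D sketch uses SciPy's Voronoi routine in place of OGRE's own plane-intersection code; the ring of "dummy" points bounding the grid mirrors OGRE's dummy elements, and all names and values are illustrative.

```python
# Minimal 2-D sketch of the element-grid construction (assumed, not OGRE's code).
import numpy as np
from scipy.spatial import Voronoi

def element_cells(centers, dummies):
    """Return (polygon vertices, area) for each interior element center."""
    vor = Voronoi(np.vstack([centers, dummies]))
    cells = []
    for i in range(len(centers)):              # interior elements only
        region = vor.regions[vor.point_region[i]]
        if not region or -1 in region:         # unbounded: dummies too sparse
            cells.append((None, np.nan))
            continue
        poly = vor.vertices[region]
        x, y = poly[:, 0], poly[:, 1]
        # Shoelace formula gives the element "volume" (area in 2-D)
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        cells.append((poly, area))
    return cells

centers = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.9]])
theta = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
dummies = np.c_[np.cos(theta), np.sin(theta)] * 3.0 + [0.5, 0.3]
for poly, area in element_cells(centers, dummies):
    print(area)
```
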
Date: June 1, 1978
Creator: Weres, O. & Schroeder, R.C.
Partner: UNT Libraries Government Documents Department

Joint inversion of geophysical data for site characterization and restoration monitoring

Description: The purpose of this project is to develop a computer code for joint inversion of seismic and electrical data, to improve underground imaging for site characterization and remediation monitoring. The code will invert geophysical data to obtain direct estimates of porosity and saturation underground, rather than inverting for seismic velocity, electrical resistivity, or other geophysical properties. This is intended to be a significant improvement in the state of the art of underground imaging, since interpretation of data collected at a contaminated site would become much less subjective. Potential users include DOE scientists and engineers responsible for characterizing contaminated sites and monitoring their remediation. In this three-year project, we use a multi-phase approach consisting of theoretical and numerical code development, laboratory investigations, testing on available laboratory and borehole geophysics data sets, and a controlled field experiment, to develop practical tools for joint electrical and seismic data interpretation.
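
A hedged sketch of the joint-inversion idea: fit porosity and water saturation directly to seismic and electrical observations at once, through rock-physics forward models. The relations and coefficients below (a linear velocity-porosity trend and Archie's law) are illustrative assumptions, not the project's actual models.

```python
# Toy joint inversion for (porosity, saturation); forward models are assumed.
import numpy as np
from scipy.optimize import least_squares

def forward(m):
    phi, sw = m                              # porosity, water saturation
    vel = 5500.0 - 4000.0 * phi              # toy velocity-porosity trend (m/s)
    res = phi ** -2.0 * sw ** -2.0           # Archie's law, a=1, m=n=2 (ohm-m)
    return np.array([vel, res])

def residuals(m, d_obs, sigma):
    return (forward(m) - d_obs) / sigma      # misfit scaled by data errors

d_obs = np.array([4100.0, 35.0])             # observed velocity, resistivity
sigma = np.array([50.0, 2.0])                # assumed data uncertainties
fit = least_squares(residuals, [0.2, 0.5], args=(d_obs, sigma),
                    bounds=([0.01, 0.01], [0.4, 1.0]))
print("porosity, saturation:", fit.x)
```
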
Date: May 28, 1998
Creator: Berge, P. A.
Partner: UNT Libraries Government Documents Department

GYRO Simulations of Core Momentum Transport in DIII-D and JET Plasmas

Description: Momentum, energy, and particle transport in DIII-D and JET ELMy H-mode plasmas are simulated with GYRO and compared with measurements analyzed using TRANSP. The simulated transport depends sensitively on the ∇T_i turbulence drive and the ∇E_r turbulence suppression inputs. With their nominal values indicated by measurements, the simulations over-predict the momentum and energy transport in the DIII-D plasmas and under-predict them in the JET plasmas. Reducing |∇T_i| and increasing |∇E_r| by up to 15% leads to approximate agreement (within a factor of two) for the DIII-D cases. For the JET cases, increasing |∇T_i| or reducing |∇E_r| results in approximate agreement for the energy flow, but the ratio of the simulated energy and momentum flows remains higher than measured by a factor of 2-4.
Date: June 27, 2005
Creator: Budny, R. V.; Candy, J. & Waltz, R. E.
Partner: UNT Libraries Government Documents Department

On the secure obfuscation of deterministic finite automata.

Description: In this paper, we show how to construct secure obfuscation for Deterministic Finite Automata, assuming non-uniformly strong one-way functions exist. We revisit the software protection approaches originally proposed in [5, 10, 12, 17] and revise them to the current obfuscation setting of Barak et al. [2]. Under this model, we introduce an efficient oracle that retains some 'small' secret about the original program. Using this secret, we can construct an obfuscator and a two-party protocol that securely obfuscate Deterministic Finite Automata against malicious adversaries. The security of this model retains the strong 'virtual black box' property originally proposed in [2] while incorporating the stronger condition of dependent auxiliary inputs from [15]. Additionally, we show that our techniques remain secure under concurrent self-composition with adaptive inputs and that Turing machines are obfuscatable under this model.
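
For readers unfamiliar with the object being obfuscated, here is a minimal sketch of a deterministic finite automaton as a transition table. It illustrates the functionality only, not the paper's cryptographic construction (the oracle, one-way functions, and two-party protocol are not reproduced).

```python
# Minimal DFA: a transition table plus accepting states (illustrative only).
def make_dfa(transitions, start, accepting):
    """transitions: dict mapping (state, symbol) -> next state."""
    def run(word):
        state = start
        for symbol in word:
            state = transitions[(state, symbol)]
        return state in accepting
    return run

# Example: accepts binary strings with an even number of 1s.
even_ones = make_dfa(
    transitions={(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0},
    start=0,
    accepting={0},
)
print(even_ones("1011"), even_ones("11"))  # False, True
```
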
Date: June 1, 2008
Creator: Anderson, William Erik
Partner: UNT Libraries Government Documents Department

Illustrating the future prediction of performance based on computer code, physical experiments, and critical performance parameter samples

Description: In this paper, we present a generic example to illustrate various points about making future predictions of population performance using a biased performance computer code, physical performance data, and critical performance parameter data sampled from the population at various times. We show how the actual performance data help to correct the biased computer code, and we illustrate the impact of uncertainty, especially when predictions are made far from where the available data were taken. We also demonstrate how a Bayesian approach allows both inferences about the unknown parameters and predictions to be made in a consistent framework.
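
A minimal sketch of the Bayesian idea above: combine a biased toy simulator with a few physical observations by inferring a discrepancy (bias) term on a grid, then predict with full posterior uncertainty. The simulator, the constant-bias model, and the flat prior are illustrative assumptions, not the paper's example.

```python
# Toy Bayesian bias correction of a computer code (assumed models throughout).
import numpy as np

rng = np.random.default_rng(0)
def simulator(x): return 2.0 * x + 1.0       # biased computer code (toy)
def truth(x): return 2.0 * x + 0.3           # unknown real process (toy)

x_obs = np.array([0.5, 1.0, 1.5])
y_obs = truth(x_obs) + rng.normal(0.0, 0.05, 3)   # physical experiments

# Grid posterior over a constant bias b; model: y = simulator(x) + b + noise
b_grid = np.linspace(-2.0, 0.5, 501)
loglik = np.array([-0.5 * np.sum((y_obs - simulator(x_obs) - b) ** 2) / 0.05**2
                   for b in b_grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()                           # flat prior on b

# Posterior-predictive mean and sd at an extrapolation point far from the data
x_new = 4.0
pred = simulator(x_new) + b_grid
mean = np.sum(post * pred)
sd = np.sqrt(np.sum(post * (pred - mean) ** 2) + 0.05**2)
print(f"prediction at x={x_new}: {mean:.2f} +/- {sd:.2f}")
```
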
Date: January 1, 2009
Creator: Hamada, Michael S & Higdon, David M
Partner: UNT Libraries Government Documents Department

The BBP Algorithm for Pi

Description: The 'Bailey-Borwein-Plouffe' (BBP) algorithm for π is based on the BBP formula for π, which was discovered in 1995 and published in 1996 [3]: π = Σ_{k=0}^∞ (1/16^k) [4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)]. This formula as it stands permits π to be computed fairly rapidly to any given precision (although it is not as efficient for that purpose as some other formulas that are now known [4, pg. 108-112]). But its remarkable property is that it permits one to calculate (after a fairly simple manipulation) hexadecimal or binary digits of π beginning at an arbitrary starting position. For example, ten hexadecimal digits of π beginning at position one million can be computed in only five seconds on a 2006-era personal computer. The formula itself was found by a computer program, and almost certainly constitutes the first instance of a computer program finding a significant new formula for π. It turns out that the existence of this formula has implications for the long-standing unsolved question of whether π is normal to commonly used number bases (a real number x is said to be b-normal if every m-long string of digits in the base-b expansion appears, in the limit, with frequency b^{-m}). Extending this line of reasoning recently yielded a proof of normality for a class of explicit real numbers (although not yet including π) [4, pg. 148-156].
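
A sketch of the digit-extraction manipulation mentioned above, following the standard published technique: evaluate each series modulo (8k+j) with fast modular exponentiation, so hexadecimal digits of π beginning at an arbitrary position can be computed without the earlier digits. This is a generic implementation, not the author's code.

```python
# BBP hex-digit extraction for pi (standard technique, generic implementation).
def series(j, d):
    """Fractional part of the sum over k of 16^(d-k) / (8k+j)."""
    s = 0.0
    for k in range(d + 1):                    # head: modular exponentiation
        s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
    t, k = 0.0, d + 1
    while True:                               # tail: terms shrink as 16^(d-k)
        term = 16.0 ** (d - k) / (8 * k + j)
        if term < 1e-17:
            break
        t, k = t + term, k + 1
    return (s + t) % 1.0

def pi_hex_digits(d, n=10):
    """n hexadecimal digits of pi beginning just after position d."""
    x = (4 * series(1, d) - 2 * series(4, d)
         - series(5, d) - series(6, d)) % 1.0
    out = []
    for _ in range(n):
        x *= 16.0
        out.append("0123456789ABCDEF"[int(x)])
        x -= int(x)
    return "".join(out)

print(pi_hex_digits(0))   # first hex digits after the point: 243F6A8885
```
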
Date: September 17, 2006
Creator: Bailey, David H.
Partner: UNT Libraries Government Documents Department

USERDA computer program summaries. Numbers 177-239

Description: Since 1960 the Argonne Code Center has served as a U.S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U.S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Program Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space-time kinetics, coupled neutronics-hydrodynamics-thermodynamics, and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data. (RWR)
Date: October 1, 1975
Partner: UNT Libraries Government Documents Department

Application of software quality assurance to a specific scientific code development task

Description: This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program aims to be consistent with existing local traditions of scientific code development while providing a controlled development process.
Date: March 1, 1986
Creator: Dronkers, J.J.
Partner: UNT Libraries Government Documents Department

Software interoperability for energy simulation

Description: This paper provides an overview of software interoperability as it relates to the energy simulation of buildings. The paper begins with a discussion of the difficulties of using sophisticated analysis tools like energy simulation at various stages of the building life cycle, and the potential for interoperability to help overcome these difficulties. An overview is then given of the Industry Foundation Classes (IFC), a common data model for supporting interoperability that is under continuing development by the International Alliance for Interoperability (IAI). The process of creating interoperable software is described next, followed by specific details for energy simulation tools. The paper closes with the current status of, and future plans for, the ongoing efforts to achieve software interoperability.
Date: July 31, 2002
Creator: Hitchcock, Robert J.
Partner: UNT Libraries Government Documents Department

Visual Sample Plan (VSP) Models and Code Verification

Description: VSP is an easy-to-use, visual, graphical software tool being developed to select the right number and locations of environmental samples so that the results of statistical tests performed to provide input to environmental decisions have the required confidence and performance. It is a significant help in implementing the 6th and 7th steps of the Data Quality Objectives (DQO) planning process ('Specify Tolerable Limits on Decision Errors' and 'Optimize the Design for Obtaining Data,' respectively).
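
As a hedged illustration of the kind of calculation behind choosing "the right number of samples": the textbook sample size for a one-sample test of a mean, given the tolerable false-positive (alpha) and false-negative (beta) error rates set in DQO step 6. This is the standard formula, not necessarily VSP's exact method.

```python
# Textbook sample-size calculation for a one-sample mean test (assumed method).
from scipy.stats import norm

def n_samples(sigma, delta, alpha=0.05, beta=0.20):
    """Samples needed to detect a mean shift `delta` given std. dev. `sigma`."""
    z_a, z_b = norm.ppf(1.0 - alpha), norm.ppf(1.0 - beta)
    n = ((z_a + z_b) * sigma / delta) ** 2 + 0.5 * z_a ** 2
    return int(n) + 1                   # round up to stay conservative

# e.g. std. dev. of 2.0 concentration units, shift to detect of 1.0 unit
print(n_samples(sigma=2.0, delta=1.0))
```
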
Date: March 6, 2001
Creator: Gilbert, Richard O; Davidson, James R & Pulsipher, Brent A
Partner: UNT Libraries Government Documents Department

Creation of a GUI for Zori, a Quantum Monte Carlo program, using Rappture

Description: In their research laboratories, academic institutions produce some of the most advanced software for scientific applications. However, this software is usually developed only for local use in the research laboratory or for method development. In spite of incorporating the latest advances in its particular field of science, such software often lacks adequate documentation and is therefore difficult to use by anyone other than the code developers. As such codes become more complex, so typically do the input files and command statements necessary to operate them. Many programs offer the flexibility of performing calculations based on different methods, each with its own set of variables and options to be specified. Moreover, situations can arise in which certain options are incompatible with each other. For this reason, users outside the development group can be unaware of how the program runs in detail, and the opportunity to make the software readily available outside the laboratory of origin can be lost. This is a long-standing problem in scientific programming. Rappture, the Rapid Application Infrastructure [1], is a new GUI development kit that enables a developer to build an I/O interface for a specific application. Users then work only with the generated GUI, avoiding the need to learn the details of the code; the GUI also reduces input errors by explicitly specifying the variables required. Zori, a quantum Monte Carlo (QMC) program developed by the Lester group at the University of California, Berkeley [2], is one of the few free tools available in this field. Like many scientific computer packages, Zori suffers from the problems described above. Potential users outside the research group have acquired it, but some have found the code difficult to use. Furthermore, new members of the Lester group usually have to take considerable time learning all ...
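
A hedged sketch of the general Rappture driver pattern: the GUI is generated from a declarative interface description, and the wrapped program reads its inputs from, and writes its outputs to, a driver file. The element paths and QMC-specific names below are illustrative assumptions, not Zori's actual interface; consult the Rappture documentation for the exact API.

```python
# Generic Rappture driver sketch (paths and names are assumed, not Zori's).
import sys
import Rappture                       # Rappture Python bindings

io = Rappture.library(sys.argv[1])    # driver file supplied by the GUI

# Read a user input declared in the tool's interface description
n_walkers = int(io.get('input.integer(nwalkers).current'))

# ... invoke the wrapped QMC calculation here ...
energy = -1.1376                      # placeholder result, not a real run

# Report an output back to the GUI for display
io.put('output.number(energy).about.label', 'Total energy')
io.put('output.number(energy).current', str(energy))
Rappture.result(io)                   # hand the driver file back to the GUI
```
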
Date: December 1, 2007
Creator: Olivares-Amaya, R.; Salomon Ferrer, R.; Lester Jr., W.A. & Amador-Bedolla, C.
Partner: UNT Libraries Government Documents Department

Damage Detection and Identification of Finite Element Models Using State-Space Based Signal Processing a Summation of Work Completed at the Lawrence Livermore National Laboratory February 1999 to April 2000

Description: Until recently, attempts to update Finite Element Models (FEM) of large structures based upon recorded structural motions were mostly ad hoc, requiring a large amount of engineering experience and skill. Studies have been undertaken at LLNL to use state-space based signal processing techniques to locate the existence and type of model mismatches common in FEM. Two different methods (Gauss-Newton gradient search and extended Kalman filter) have been explored, and the progress made with each type of algorithm, as well as the results from several simulated models and one actual building model, will be discussed. The algorithms will be examined in detail, and the computer programs written to implement them will be documented.
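
A hedged sketch of one of the two methods named above: a Gauss-Newton search that updates a model's stiffness parameters until its natural frequencies match measured ones. The two-degree-of-freedom spring-mass "model" and all values are illustrative, not the LLNL structures or codes.

```python
# Toy Gauss-Newton model updating from modal data (assumed 2-DOF example).
import numpy as np

def frequencies(k):
    """Natural frequencies (rad/s) of a 2-DOF chain with unit masses."""
    K = np.array([[k[0] + k[1], -k[1]],
                  [-k[1],        k[1]]])
    return np.sqrt(np.sort(np.linalg.eigvalsh(K)))

k_true = np.array([100.0, 80.0])        # "damaged" stiffnesses to recover
f_meas = frequencies(k_true)            # stands in for measured modal data

k = np.array([120.0, 120.0])            # initial (undamaged) model guess
for _ in range(20):                     # Gauss-Newton iterations
    r = frequencies(k) - f_meas         # frequency residual
    # Finite-difference Jacobian of frequencies w.r.t. stiffnesses
    J = np.column_stack([
        (frequencies(k + 1e-4 * np.eye(2)[j]) - frequencies(k)) / 1e-4
        for j in range(2)
    ])
    k = k - np.linalg.solve(J.T @ J, J.T @ r)   # normal-equations step
print("recovered stiffnesses:", k)      # converges to k_true
```
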
Date: April 28, 2000
Creator: Burnett, G.C.
Partner: UNT Libraries Government Documents Department

Uranium and plutonium isotopic analysis using MGA++

Description: The Lawrence Livermore National Laboratory develops sophisticated gamma-ray analysis codes for the isotopic analysis of nuclear materials based on the principles used in the original MultiGroup Analysis (MGA) code. Over the years, the MGA methodology has been upgraded and expanded far beyond its original capabilities and now comprises a suite of codes known as MGA++. The early MGA code analyzed Pu gamma-ray data collected with high-purity germanium (HPGe) detectors to yield Pu isotopic ratios. While the original MGA code relied solely on the lower-energy gamma rays (around 100 keV), the most recent addition to the MGA++ code suite, MGAHI, analyzes Pu data using higher-energy gamma rays (200 keV and higher) and is particularly useful for Pu samples that are enclosed in thick-walled containers. The MGA++ suite also includes capabilities to perform U isotopic analysis on data collected with either HPGe or cadmium-zinc-telluride (CZT) detectors; these codes are commercially available and are known as U235 and CZTU, respectively. A graphical user interface has also been developed for viewing the data and the fitting procedure. In addition, we are developing new codes that will integrate into the MGA++ suite. These will include Pu isotopic analysis capabilities for data collected with CZT detectors, U isotopic analysis with HPGe detectors utilizing only the higher-energy gamma rays, and isotopic analyses of mixtures of Pu and U.
Date: July 1, 1998
Creator: Buckley, W; Clark, D; Friensehner, A; Parker, W; Raschke, K; Romine, W et al.
Partner: UNT Libraries Government Documents Department

ACRF Ingest Software Status: New, Current, and Future (November 2007)

Description: The purpose of this report is to provide the status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement Program Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastream names beginning in “xxx” indicate cases where an ingest runs at multiple ACRF sites, which results in a separate datastream for each location.
Date: November 1, 2007
Creator: Koontz, AS; Choudhury, S; Ermold, BD & Gaustad, KL
Partner: UNT Libraries Government Documents Department