
DOCUMENTATION FOR PROGRAM OGRE

Description: This is a brief description of a computer program written by Oleh Weres to generate discrete grids for IFD* type computer programs. The output of the program includes data which can be used directly as input to the program SHAFT78. The program is specifically intended for large-scale two- or three-dimensional reservoir simulation. The program requires, as input, the x, y, z coordinates of the discrete element locations being used to specify a particular reservoir's geological system. From the list of element locations, the program finds the midpoints of the lines joining adjacent elements. At each midpoint the program constructs a perpendicular plane. The intersections of these planes in three-space define an irregular (in general) n-sided polyhedron around each element center (see the illustrative sketch following this entry). In two dimensions the program produces a unique 'tiling' of polygons whose faces are all perpendicular to the lines joining adjacent elements. The areas between adjoining elements and the volume of each element are calculated. The end result, in general, is a three-dimensional grid of n-sided polyhedra for which the element locations, the connecting (flow) areas, and the element volumes are all known. Since the grids are finite, the program must have information about the boundary of the grid. This is supplied as a set of 'dummy' elements which are used only to limit the extent of the grid and are not intended for use in the reservoir simulation.
Date: June 1, 1978
Creator: Weres, O. & Schroeder, R.C.
Partner: UNT Libraries Government Documents Department
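
The perpendicular-bisector construction described in the OGRE entry above is, in present-day terms, a Voronoi tessellation. The following minimal two-dimensional sketch, written in Python with SciPy rather than taken from OGRE itself, illustrates how interior element centers plus a ring of 'dummy' boundary elements yield element volumes (cell areas in 2-D) and connecting flow areas (edge lengths in 2-D); the point coordinates and names are illustrative assumptions.

    # Minimal 2-D illustration (not the OGRE code) of the perpendicular-bisector
    # grid construction: element centers plus 'dummy' boundary elements give
    # cell volumes (areas in 2-D) and connecting flow areas (edge lengths in 2-D).
    import numpy as np
    from scipy.spatial import Voronoi

    interior = np.array([[0.5, 0.5], [1.5, 0.5], [0.5, 1.5], [1.5, 1.5]])
    dummy = np.array([[x, y] for x in (-1.0, 1.0, 3.0) for y in (-1.0, 1.0, 3.0)
                      if (x, y) != (1.0, 1.0)])          # ring around the interior
    vor = Voronoi(np.vstack([interior, dummy]))

    def polygon_area(xy):
        """Shoelace area; vertices are sorted by angle around the centroid."""
        c = xy.mean(axis=0)
        order = np.argsort(np.arctan2(xy[:, 1] - c[1], xy[:, 0] - c[0]))
        x, y = xy[order, 0], xy[order, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    # element 'volumes' (cell areas) for the interior elements only
    for i in range(len(interior)):
        region = vor.regions[vor.point_region[i]]
        if -1 not in region:                              # skip unbounded cells
            print(f"element {i}: volume = {polygon_area(vor.vertices[region]):.3f}")

    # connecting (flow) 'areas' (edge lengths) between adjacent interior elements
    for (p, q), ridge in zip(vor.ridge_points, vor.ridge_vertices):
        if -1 not in ridge and p < len(interior) and q < len(interior):
            a, b = vor.vertices[ridge]
            print(f"elements {p}-{q}: flow area = {np.linalg.norm(a - b):.3f}")

In three dimensions the same idea applies, with perpendicular planes, polyhedral cells, and polygonal interface areas in place of bisecting lines, polygons, and edge lengths.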

Joint inversion of geophysical data for site characterization and restoration monitoring

Description: The purpose of this project is to develop a computer code for joint inversion of seismic and electrical data, to improve underground imaging for site characterization and remediation monitoring. The computer code developed in this project will invert geophysical data to obtain direct estimates of porosity and saturation underground, rather than inverting for seismic velocity and electrical resistivity or other geophysical properties. This is intended to be a significant improvement in the state-of-the-art of underground imaging, since interpretation of data collected at a contaminated site would become much less subjective. Potential users include DOE scientists and engineers responsible for characterizing contaminated sites and monitoring remediation of contaminated sites. In this three-year project, we use a multi-phase approach consisting of theoretical and numerical code development, laboratory investigations, testing on available laboratory and borehole geophysics data sets, and a controlled field experiment, to develop practical tools for joint electrical and seismic data interpretation.
Date: May 28, 1998
Creator: Berge, P. A.
Partner: UNT Libraries Government Documents Department

On the secure obfuscation of deterministic finite automata.

Description: In this paper, we show how to construct secure obfuscation for Deterministic Finite Automata, assuming non-uniformly strong one-way functions exist. We revisit the software protection approaches originally proposed by [5, 10, 12, 17] and revise them for the current obfuscation setting of Barak et al. [2]. Under this model, we introduce an efficient oracle that retains some 'small' secret about the original program. Using this secret, we can construct an obfuscator and a two-party protocol that securely obfuscate Deterministic Finite Automata against malicious adversaries. The security of this model retains the strong 'virtual black box' property originally proposed in [2] while incorporating the stronger condition of dependent auxiliary inputs in [15]. Additionally, we show that our techniques remain secure under concurrent self-composition with adaptive inputs and that Turing machines are obfuscatable under this model.
Date: June 1, 2008
Creator: Anderson, William Erik
Partner: UNT Libraries Government Documents Department
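
For context, the object being obfuscated in the entry above is simply a finite state machine with a transition table. The minimal Python sketch below shows such a DFA in the clear; it is purely illustrative, is not the paper's obfuscation construction, and the example machine (parity of 1s) is an assumption.

    # Illustrative sketch of a plain (unobfuscated) deterministic finite automaton;
    # not the construction from the paper above.
    from typing import Dict, FrozenSet, Tuple

    class DFA:
        def __init__(self, delta: Dict[Tuple[str, str], str],
                     start: str, accepting: FrozenSet[str]):
            self.delta = delta          # transition table: (state, symbol) -> state
            self.start = start
            self.accepting = accepting

        def accepts(self, word: str) -> bool:
            state = self.start
            for symbol in word:
                state = self.delta[(state, symbol)]
            return state in self.accepting

    # Example: binary strings containing an even number of 1s
    even_ones = DFA(
        delta={("even", "0"): "even", ("even", "1"): "odd",
               ("odd", "0"): "odd", ("odd", "1"): "even"},
        start="even",
        accepting=frozenset({"even"}),
    )
    print(even_ones.accepts("1011"))    # False: three 1s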

GYRO Simulations of Core Momentum Transport in DIII-D and JET Plasmas

Description: Momentum, energy, and particle transport in DIII-D and JET ELMy H-mode plasmas is simulated with GYRO and compared with measurements analyzed using TRANSP. The simulated transport depends sensitively on the ∇T_i turbulence drive and the ∇E_r turbulence suppression inputs. With their nominal values indicated by measurements, the simulations over-predict the momentum and energy transport in the DIII-D plasmas, and under-predict in the JET plasmas. Reducing |∇T_i| and increasing |∇E_r| by up to 15% leads to approximate agreement (within a factor of two) for the DIII-D cases. For the JET cases, increasing |∇T_i| or reducing |∇E_r| results in approximate agreement for the energy flow, but the ratio of the simulated energy and momentum flows remains higher than measurements by a factor of 2-4.
Date: June 27, 2005
Creator: Budny, R. V.; Candy, J. & Waltz, R. E.
Partner: UNT Libraries Government Documents Department

Illustrating the future prediction of performance based on computer code, physical experiments, and critical performance parameter samples

Description: In this paper, we present a generic example to illustrate various points about making future predictions of population performance using a biased performance computer code, physical performance data, and critical performance parameter data sampled from the population at various times. We show how the actual performance data help to correct the biased computer code, and we illustrate the impact of uncertainty, especially when the prediction is made far from where the available data are taken. We also demonstrate how a Bayesian approach allows both inferences about the unknown parameters and predictions to be made in a consistent framework.
Date: January 1, 2009
Creator: Hamada, Michael S & Higdon, David M
Partner: UNT Libraries Government Documents Department
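
As a toy illustration only (this is not the authors' model, and the numbers, prior, and constant-bias assumption are invented for the sketch), the following Python code corrects a biased computer code with a conjugate-normal Bayesian update of a constant discrepancy term and then predicts, with uncertainty, at a point far from the data.

    # Toy sketch (not the authors' model) of Bayesian bias correction of a
    # computer code: reality = code(x) + discrepancy, with a conjugate
    # normal prior on a constant discrepancy term for simplicity.
    import numpy as np

    rng = np.random.default_rng(0)

    def code(x):
        """Biased computer code prediction (hypothetical)."""
        return 2.0 * x

    true_bias = 1.5            # unknown in practice
    sigma = 0.3                # measurement noise std, assumed known here

    # physical performance data observed at a few x locations
    x_obs = np.array([0.5, 1.0, 1.5])
    y_obs = code(x_obs) + true_bias + rng.normal(0.0, sigma, size=x_obs.size)

    # conjugate normal update for the constant discrepancy delta:
    # prior delta ~ N(0, tau^2); residuals (y - code(x)) ~ N(delta, sigma^2)
    tau = 2.0
    resid = y_obs - code(x_obs)
    post_var = 1.0 / (1.0 / tau**2 + resid.size / sigma**2)
    post_mean = post_var * resid.sum() / sigma**2

    # prediction at a new x, far from the data: code + posterior discrepancy,
    # with predictive uncertainty combining posterior and measurement noise
    x_new = 5.0
    pred_mean = code(x_new) + post_mean
    pred_std = np.sqrt(post_var + sigma**2)
    print(f"prediction at x={x_new}: {pred_mean:.2f} +/- {pred_std:.2f}")

A fuller treatment would let the discrepancy vary with x (for example as a Gaussian process), which is what makes the predictive uncertainty grow as the prediction point moves far from the available data.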

The BBP Algorithm for Pi

Description: The 'Bailey-Borwein-Plouffe' (BBP) algorithm for π is based on the BBP formula for π, which was discovered in 1995 and published in 1996 [3]: \pi = \sum_{k=0}^{\infty} \frac{1}{16^k} \left( \frac{4}{8k+1} - \frac{2}{8k+4} - \frac{1}{8k+5} - \frac{1}{8k+6} \right). This formula as it stands permits π to be computed fairly rapidly to any given precision (although it is not as efficient for that purpose as some other formulas that are now known [4, pg. 108-112]). But its remarkable property is that it permits one to calculate (after a fairly simple manipulation) hexadecimal or binary digits of π beginning at an arbitrary starting position. For example, ten hexadecimal digits of π beginning at position one million can be computed in only five seconds on a 2006-era personal computer. The formula itself was found by a computer program, and almost certainly constitutes the first instance of a computer program finding a significant new formula for π. It turns out that the existence of this formula has implications for the long-standing unsolved question of whether π is normal to commonly used number bases (a real number x is said to be b-normal if every m-long string of digits in the base-b expansion appears, in the limit, with frequency b^{-m}). Extending this line of reasoning recently yielded a proof of normality for a class of explicit real numbers (although not yet including π) [4, pg. 148-156]. (A short sketch of the digit-extraction computation follows this entry.)
Date: September 17, 2006
Creator: Bailey, David H.
Partner: UNT Libraries Government Documents Department
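
The Python sketch below illustrates the digit-extraction computation; it is not Bailey's original program, and the function name and tolerance are illustrative. It forms the fractional part of 16^(start-1)·π using modular exponentiation for the large powers of 16 and then reads off hexadecimal digits; double-precision round-off limits how many of the returned digits are trustworthy at large starting positions.

    # Illustrative BBP digit extraction (not Bailey's original code): return
    # hexadecimal digits of pi beginning at 1-based position `start`.
    def bbp_hex_digits(start: int, ndigits: int = 10) -> str:
        def series(j: int, d: int) -> float:
            # fractional part of sum_k 16^(d-1-k) / (8k + j)
            s = 0.0
            for k in range(d):                      # exponent >= 0: modular power
                s = (s + pow(16, d - 1 - k, 8 * k + j) / (8 * k + j)) % 1.0
            k, t = d, 0.0
            while True:                             # a few small tail terms
                term = 16.0 ** (d - 1 - k) / (8 * k + j)
                if term < 1e-17:
                    break
                t += term
                k += 1
            return (s + t) % 1.0

        x = (4 * series(1, start) - 2 * series(4, start)
             - series(5, start) - series(6, start)) % 1.0
        digits = ""
        for _ in range(ndigits):
            x *= 16
            digits += "0123456789ABCDEF"[int(x)]
            x -= int(x)
        return digits

    # pi = 3.243F6A8885A308D3... in hexadecimal, so this prints 243F6A8885
    print(bbp_hex_digits(1))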

USERDA computer program summaries. Numbers 177-239

Description: Since 1960 the Argonne Code Center has served as a U. S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U. S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Program Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space-time kinetics, coupled neutronics-hydrodynamics-thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data. (RWR)
Date: October 1, 1975
Partner: UNT Libraries Government Documents Department

Creation of a GUI for Zori, a Quantum Monte Carlo program, using Rappture

Description: In their research laboratories, academic institutions produce some of the most advanced software for scientific applications. However, this software is usually developed only for local application in the research laboratory or for method development. In spite of having the latest advances in the particular field of science, such software often lacks adequate documentation and therefore is difficult to use by anyone other than the code developers. As such codes become more complex, so typically do the input files and command statements necessary to operate them. Many programs offer the flexibility of performing calculations based on different methods that have their own set of variables and options to be specified. Moreover, situations can arise in which certain options are incompatible with each other. For this reason, users outside the development group can be unaware of how the program runs in detail, and the opportunity to make the software readily available outside the laboratory of origin can be lost. This is a long-standing problem in scientific programming. Rappture, Rapid Application Infrastructure [1], is a new GUI development kit that enables a developer to build an I/O interface for a specific application. This capability enables users to work only with the generated GUI, avoiding the need for the user to learn the details of the code. Further, it reduces input errors by explicitly specifying the variables required. Zori, a quantum Monte Carlo (QMC) program developed by the Lester group at the University of California, Berkeley [2], is one of the few free tools available for this field. Like many scientific computer packages, Zori suffers from the problems described above. Potential users outside the research group have acquired it, but some have found the code difficult to use. Furthermore, new members of the Lester group usually have to take considerable time learning all ...
Date: December 1, 2007
Creator: Olivares-Amaya, R.; Salomon Ferrer, R.; Lester Jr., W.A. & Amador-Bedolla, C.
Partner: UNT Libraries Government Documents Department

ACRF Ingest Software Status: New, Current, and Future (November 2007)

Description: The purpose of this report is to provide the status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement Program Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning in “xxx” indicate cases where an ingest runs at multiple ACRF sites, which results in a separate datastream for each location.
Date: November 1, 2007
Creator: Koontz, AS; Choudhury, S; Ermold, BD & Gaustad, KL
Partner: UNT Libraries Government Documents Department

Field Quality Optimization in a Common Coil Magnet Design

Description: This paper presents the results of initial field quality optimization of body and end harmonics in a 'common coil magnet design'. It is shown that a good field quality, as required in accelerator magnets, can be obtained by distributing conductor blocks in such a way that they simulate an elliptical coil geometry. This strategy assures that the amount of conductor used in this block design is similar to that used in a conventional cosine theta design. An optimized yoke that keeps all harmonics small over the entire range of operation using a single power supply is also presented. The field harmonics are primarily optimized with the computer program ROXIE.
Date: September 1, 1999
Creator: Gupta, R. & Ramberger, S.
Partner: UNT Libraries Government Documents Department

INTERFACING AUTOCAD WITH MAGNETIC DESIGN

Description: This report is a summary of work done towards developing an AutoCAD-based system for the design and analysis of magnets. The computer programs that have been developed are an attempt to integrate the new SUN-based computer system with existing software on the old HP1000 system. We believe this is a good start for the further development of the whole system. The programming languages used are AutoLISP for the programs used by AutoCAD, and Fortran (Microsoft Fortran) for all others. The entire work has been done on an IBM-AT, with the well-known limits of its memory, execution speed, and operating system; therefore, some adjustment may be needed for the more powerful SUN system.
Date: February 1, 1988
Creator: Sorin, M. & Caspi, S.
Partner: UNT Libraries Government Documents Department

Software Requirements Specification: Multi-scale Epidemiological and Economic Simulation and Analysis (MESA) Scenario Bank

Description: This document builds on the discussion notes from September 21, 2006. It provides a summary of the ideas relating to the scenario bank tables and their associated requirements. Two conceptual groupings were identified for the contents requirements of the scenario bank. The first, called ProjectTemplate, shall consist of <Project, Scenarios, and Miscellaneous Files> groups. The second, ProjectArchive, shall consist of groups of <Project, Scenarios, Results, and Miscellaneous Files>. The figure below illustrates the multiplicity of the associations between the different tables, with color coding used to distinguish between current MESA (brown) and USDA (light green) requirements. Scenario bank tables are shown in black with their general contents specified within the box. The metadata associated with each table is expected to include database key information as well as relevant timestamps. Each File entry is expected to be a file of arbitrary format. (A hypothetical sketch of these two groupings follows this entry.)
Date: November 8, 2006
Creator: Dahlgren, T L; Hazlett, S G; Slone, D M & Smith, S G
Partner: UNT Libraries Government Documents Department
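
As a purely hypothetical sketch of the two content groupings named in the entry above (ProjectTemplate and ProjectArchive), the following Python dataclasses spell out the <Project, Scenarios, [Results,] Miscellaneous Files> contents together with the key and timestamp metadata mentioned; all field names and types are assumptions, not taken from the SRS.

    # Hypothetical sketch of the scenario-bank groupings described above;
    # field names and types are illustrative assumptions, not from the SRS.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class BankRecord:
        key: int                                               # database key metadata
        created: datetime = field(default_factory=datetime.now)

    @dataclass
    class ProjectTemplate(BankRecord):
        project: str = ""
        scenarios: List[str] = field(default_factory=list)
        misc_files: List[str] = field(default_factory=list)    # arbitrary file formats

    @dataclass
    class ProjectArchive(BankRecord):
        project: str = ""
        scenarios: List[str] = field(default_factory=list)
        results: List[str] = field(default_factory=list)
        misc_files: List[str] = field(default_factory=list)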