411 Matching Results


GridRun: A lightweight packaging and execution environment for compact, multi-architecture binaries

Description: GridRun offers a simple set of tools for creating and executing multi-platform binary executables. These ''fat binaries'' archive native machine code into compact packages that are typically a fraction of the size of the original binary images they store, enabling efficient staging of executables for heterogeneous parallel jobs. GridRun interoperates with existing distributed job launchers/managers such as Condor and the Globus GRAM to greatly simplify the logic required to launch native binary applications in distributed heterogeneous environments.
Date: February 1, 2004
Creator: Shalf, John & Goodale, Tom
Partner: UNT Libraries Government Documents Department
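
The packaging idea in the GridRun description above can be sketched in a few lines: a ''fat binary'' maps architecture names to compressed native code, and the launcher extracts the member matching the host. This is an illustrative Python sketch only; the dict-based format, member names, and placeholder byte strings are assumptions, not GridRun's actual archive layout.

```python
import platform
import zlib

# Hypothetical in-memory ''fat binary'': per-architecture machine code,
# compressed so the package is a fraction of the size of the originals.
# The byte strings are placeholders, not real executables.
fat_binary = {
    "x86_64": zlib.compress(b"\x90" * 4096),
    "aarch64": zlib.compress(b"\x1f\x20\x03\xd5" * 1024),
}

def select_member(package, arch=None):
    """Pick and decompress the member matching the (host) architecture."""
    arch = arch or platform.machine()
    if arch not in package:
        raise KeyError("no native code for architecture %r" % arch)
    return zlib.decompress(package[arch])

code = select_member(fat_binary, arch="x86_64")
print(len(code))  # 4096
```

In a real package the compressed members would be executables staged to disk and exec'd; the point here is only the select-by-architecture dispatch that makes one compact package serve a heterogeneous job.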

Using GPFS 2.2 to enable cross-platform accessibility of single storage

Description: With IBM's aid, I conducted a cross-compatibility test of GPFS 2.2 between an IBM F50 Power2 running AIX 5.2 ML/3 and eight dual Pentium 4/2.2 GHz machines running Red Hat 9.0. The objective was to demonstrate a single shared instance of the file system and storage between the disparate operating systems and hardware. The cross-compatibility test was successful. The chronology of events that led to this successful test is documented below.
Date: December 1, 1994
Creator: Baird, Will
Partner: UNT Libraries Government Documents Department

Grid computing: enabling a vision for collaborative research

Description: In this paper the authors provide a motivation for Grid computing based on a vision to enable a collaborative research environment. The authors' vision goes beyond the connection of hardware resources. They argue that with an infrastructure such as the Grid, new modalities for collaborative research are enabled. They provide an overview showing why Grid research is difficult, and they present a number of management-related issues that must be addressed to make Grids a reality. They list projects that provide solutions to subsets of these issues.
Date: April 9, 2002
Creator: von Laszewski, G.
Partner: UNT Libraries Government Documents Department

Gist: A scientific graphics package for Python

Description: "Gist" is a scientific graphics library written by David H. Munro of Lawrence Livermore National Laboratory (LLNL). It features support for three common graphics output devices: X Windows, (Color) PostScript, and ANSI/ISO Standard Computer Graphics Metafiles (CGM). The library is small (written directly to Xlib), portable, efficient, and full-featured. It produces X versus Y plots with "good" tick marks and tick labels, 2-dimensional quadrilateral mesh plots with contours, vector fields, or pseudocolor maps on such meshes, with 3-dimensional plots on the way. The Python Gist module utilizes the new "Numeric" module due to J. Hugunin and others. It is therefore fast and able to handle large datasets. The Gist module includes an X Windows event dispatcher which can be dynamically added (e.g., via importing a dynamically loaded module) to the Python interpreter after a simple two-line modification to the Python core. This makes fast mouse-controlled zoom, pan, and other graphic operations available to the researcher while maintaining the usual Python command-line interface. Munro's Gist library is already freely available. The Python Gist module is currently under review and is also expected to qualify for unlimited release.
Date: May 8, 1996
Creator: Busby, L.E.
Partner: UNT Libraries Government Documents Department

Utility of coupling nonlinear optimization methods with numerical modeling software

Description: Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and the nonlinear optimization software modules GLOBAL & LOCAL. GLO is designed for controlling, and coupling easily to, any scientific software application. GLO runs the optimization module and the scientific application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem, the optimization of a material model (Taylor cylinder impact test), is presented.
Date: August 5, 1996
Creator: Murphy, M.J.
Partner: UNT Libraries Government Documents Department
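
The iterative loop the GLO description above walks through (propose parameters, insert them into the application input, run the application, extract an objective value, repeat) can be sketched as follows. Everything here is a stand-in: the quadratic "application", the finite-difference optimizer, and all function names are illustrative assumptions, not GLO's actual modules or API.

```python
def run_application(params):
    # Stand-in for the scientific application (e.g. a Taylor cylinder
    # impact simulation); here just a quadratic response surface with
    # its minimum at a = 3, b = -1.
    a, b = params
    return (a - 3.0) ** 2 + (b + 1.0) ** 2

def objective(result, target=0.0):
    # GLO-GET's role: compare the analysis result to the desired result.
    return abs(result - target)

def optimize(initial, steps=200, lr=0.1, h=1e-4):
    # Simple forward-difference gradient descent standing in for the
    # GLOBAL/LOCAL optimization modules.
    params = list(initial)
    for _ in range(steps):
        base = objective(run_application(params))
        grads = []
        for i in range(len(params)):
            bumped = params[:]
            bumped[i] += h
            grads.append((objective(run_application(bumped)) - base) / h)
        params = [p - lr * g for p, g in zip(params, grads)]
    return params

best = optimize([0.0, 0.0])
print([round(p, 2) for p in best])  # [3.0, -1.0]
```

In GLO the "insert parameters into the input file" and "extract results" steps are performed by GLO-PUT and GLO-GET on real application files; the loop structure is the same.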

New approaches to recognizing functional domains in biological sequences. Final report, April 1, 1993--March 31, 1997

Description: The purpose of this project is to develop new approaches and programs for determining the function of DNA domains. This will aid in understanding the sequence data obtained through the Human Genome Project; one of the great challenges of that project is to abstract important biological information from the raw sequences that emerge. The efforts have focused on several areas: determining the protein-coding regions in genomic DNA; recognizing patterns of DNA-binding proteins, including nucleosomes, from the sequence using multi-alphabet analyses; better recognition methods for RNA genes and other patterns where structural considerations are important along with sequence; and enhancing the "Sequence Landscape" approach to pattern recognition and applying it to various problems in domain classification. GeneParser is the program the authors developed to identify optimal classification boundaries in genomic DNA. This was the first approach to combine several types of evidence into the classification and obtain optimal and suboptimal predictions by a dynamic programming algorithm. The authors also explored the use of neural networks to obtain the optimal weighting of the different types of evidence.
Date: December 1, 1997
Partner: UNT Libraries Government Documents Department
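
The dynamic-programming idea behind GeneParser's boundary finding can be illustrated with a toy two-state labeling problem: each position carries a score for "coding" versus "non-coding" (standing in for the combined evidence), and a fixed penalty discourages spurious boundaries. This is a generic Viterbi-style sketch under those assumptions, not GeneParser's actual algorithm or scoring.

```python
def segment(scores, switch_penalty=0.5):
    """scores: list of (coding_score, noncoding_score) per position.
    Returns the max-score label sequence as 0/1 (1 = coding)."""
    n = len(scores)
    # best[s] = best total score ending at the current position in state s
    best = [scores[0][1], scores[0][0]]
    back = [[0, 1]]                     # placeholder for position 0
    for i in range(1, n):
        cur = [scores[i][1], scores[i][0]]
        new_best, row = [0.0, 0.0], [0, 0]
        for s in (0, 1):
            stay = best[s]
            switch = best[1 - s] - switch_penalty   # pay to cross a boundary
            row[s] = s if stay >= switch else 1 - s
            new_best[s] = max(stay, switch) + cur[s]
        best, back = new_best, back + [row]
    # Trace back from the better final state.
    state = 0 if best[0] >= best[1] else 1
    labels = [state]
    for i in range(n - 1, 0, -1):
        state = back[i][state]
        labels.append(state)
    return labels[::-1]

evidence = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1), (0.95, 0.05), (0.1, 0.9)]
print(segment(evidence))  # [0, 0, 1, 1, 0]
```

Combining several evidence types, as GeneParser does, amounts to making each position's scores a weighted sum of evidence terms; the neural-network work mentioned above addressed choosing those weights.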

Development of a natural gas systems analysis model (GSAM)

Description: This report provides an overview of the activities to date and the schedule for future testing, validation, and authorized enhancements of the Natural Gas Systems Analysis Model (GSAM). The goal of this report is to inform DOE managers of progress in model development and to provide a benchmark for ongoing and future research. Section II of the report provides a detailed discussion of the major GSAM development programs performed and completed during the period of performance, July 1, 1998 to September 30, 1999. Key improvements in the new GSAM version are summarized in Section III. Programmer's guides for GSAM's main modules were produced to provide detailed descriptions of all major subroutines and main variables of the computer code. General logical flowcharts of the subroutines are also presented in the guides to give an overall picture of the interactions between the subroutines. A standard structure of routine explanation is applied in every programmer's guide. The explanation begins with a brief description or main purpose of the routine, lists of input and output files read and created, and lists of invoked/child and calling/parent routines. In some of the guides, interactions between the routine and its parent and child routines are presented as a graphical flowchart. The explanation then proceeds with a step-by-step description of the computer code in the subroutine, each step covering a section of related code. Between steps, if a certain section of code needs further explanation, a note is inserted with the relevant explanation.
Date: October 1, 1999
Partner: UNT Libraries Government Documents Department

OS3D/GIMRT software for modeling multicomponent-multidimensional reactive transport

Description: OS3D/GIMRT is a numerical software package for simulating multicomponent reactive transport in porous media. The package consists of two principal components: (1) the code OS3D (Operator Splitting 3-Dimensional Reactive Transport), which simulates reactive transport either by splitting the reaction and transport steps in time, i.e., the classic time or operator splitting approach, or by iterating sequentially between reactions and transport; and (2) the code GIMRT (Global Implicit Multicomponent Reactive Transport), which treats up to two-dimensional reactive transport with a one-step or global implicit approach. Although the two codes do not yet have identical capabilities, they can be run from the same input file, allowing comparisons to be made between the two approaches in many cases. The advantages and disadvantages of the two approaches are discussed more fully below, but in general OS3D is designed for simulation of transient concentration fronts, particularly under high Peclet number transport conditions, because of its use of a total variation diminishing (TVD) transport algorithm. GIMRT is suited for simulating water-rock alteration over long periods of time where the aqueous concentration field is at or close to a quasi-stationary state and the numerical transport errors are less important. Where water-rock interaction occurs over geological periods of time, GIMRT may be preferable to OS3D because of its ability to take larger time steps.
Date: May 17, 2000
Creator: Steefel, CI & Yabusaki, SB
Partner: UNT Libraries Government Documents Department
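
The time-splitting approach that the OS3D description contrasts with GIMRT's global implicit scheme can be shown in miniature: within each time step, advance transport alone, then apply chemistry to the transported concentrations. The first-order upwind advection and single-species decay below are illustrative stand-ins for OS3D's TVD transport and multicomponent reaction network.

```python
import math

def split_step(c, velocity, dx, dt, decay_rate):
    # 1) Transport substep: explicit first-order upwind advection.
    #    The inlet cell c[0] is held as a boundary value.
    cfl = velocity * dt / dx
    assert 0.0 <= cfl <= 1.0, "CFL condition violated"
    transported = [c[0]] + [
        c[i] - cfl * (c[i] - c[i - 1]) for i in range(1, len(c))
    ]
    # 2) Reaction substep: exact solution of dc/dt = -k*c over dt,
    #    applied to the already-transported field.
    factor = math.exp(-decay_rate * dt)
    return [ci * factor for ci in transported]

c = [1.0] + [0.0] * 9            # concentration pulse at the inlet
for _ in range(20):
    c = split_step(c, velocity=1.0, dx=1.0, dt=0.5, decay_rate=0.1)
```

A global implicit scheme in the GIMRT style would instead solve the coupled transport-plus-reaction system in one nonlinear solve per step, which is what permits its larger time steps near quasi-stationary states.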

GENWPG: a WPG graphics file generator

Description: This document explains how to use a utility that can plot x-y data and bar charts and generate a DrawPerfect WPG graphics file. The format of a sample input text file (using the default page style) is described, and the graphics file it generated was printed on an HP LaserJet IV and is attached. Also attached are output graphs that demonstrate additional page styles. This utility lets you quickly generate x-y graphs and bar charts for inclusion in WordPerfect documents. It was written because no other readily available graphics package lets you overlay independent x-y curves on top of one another. Lotus 1-2-3, Harvard Graphics, and FoxGraph require all curves on the same page to share one common column of independent x-values or y-values. We were stuck in manual loops of cutting and pasting curves from one plot to another before they were included in our documents. Not any more! This document describes how to overcome this problem.
Date: February 4, 1998
Creator: Jusko, M. & Whitfield, R.
Partner: UNT Libraries Government Documents Department

Application experiences with the Globus toolkit.

Description: The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or ''computational grids'' [14]. The Globus toolkit is an implementation of a ''bag of services'' architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct ''grid-enabled'' systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4,000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high-throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.
Date: June 9, 1998
Creator: Brunett, S.
Partner: UNT Libraries Government Documents Department

Explanation of how to run the global local optimization code (GLO) to find surface heat flux

Description: From the evaluation[1] of available inverse techniques, it was determined that the Global Local Optimization Code[2] can recover the surface heat flux using known experimental data at various points in the geometry. This code uses a whole-domain approach in which an analysis code (such as TOPAZ2D or ABAQUS) is run to get the data needed to minimize the heat flux function. This document is a compilation of our notes on how to run this code to find the surface heat flux. First, the code is described and the overall set-up procedure is reviewed. Then, creation of the configuration file is described, and a specific configuration file is given with appropriate explanation. Using this information, the reader should be able to run GLO to find the surface heat flux.
Date: March 1, 1999
Creator: Aceves, S; Sahai, V & Stein, W
Partner: UNT Libraries Government Documents Department

Further Developments in Dynamic Focusing

Description: Dynamic focusing has been proposed [1] as a way to eliminate a conventional collimation and final focus system in linear colliders, and is a scheme that is more readily extended to colliders at several TeV center-of-mass energy. In this paper we examine several outstanding issues, in particular, the optimization of the lens and main beam parameters. Simulations of the lens-lens, lens-main, and main-main beam collisions using a modified version of the GUINEAPIG beam-beam code are in progress.
Date: April 9, 1999
Creator: Thompson, Kathleen A
Partner: UNT Libraries Government Documents Department

GXQ program user's guide. Revision 1

Description: This report is the user's guide for a general-purpose atmospheric dispersion code named GXQ. GXQ is an IBM-compatible microcomputer-based program for calculating atmospheric dispersion coefficients using Hanford site-specific joint frequency data. It uses the Gaussian straight-line model to calculate either an atmospheric dispersion coefficient (X/Q′) or a maximum normalized air concentration (X/Q). Several options are available to the user which alter the standard Gaussian model to allow for plume depletion, building wake, plume meander, sector averaging, gravitational settling, and plume rise. Additional options control handling of the joint frequency data and output. Combinations of the above allow calculation of X/Q′ in accordance with Nuclear Regulatory Commission Regulatory Guide 1.145.
Date: May 10, 1995
Creator: Hey, B.E.
Partner: UNT Libraries Government Documents Department
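
The Gaussian straight-line model named in the GXQ description has a standard closed form for the normalized ground-level air concentration. A minimal version (ground reflection included, with the dispersion coefficients σy and σz supplied directly rather than derived from joint frequency data or stability-class correlations, and none of GXQ's depletion, wake, or meander options) looks like this:

```python
import math

def chi_over_q(u, sigma_y, sigma_z, y=0.0, release_height=0.0):
    """Normalized ground-level air concentration X/Q (s/m^3) for a
    Gaussian straight-line plume with ground reflection.
    u: wind speed (m/s); sigma_y, sigma_z: dispersion coefficients (m);
    y: crosswind offset (m); release_height: effective stack height (m)."""
    return (math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-release_height**2 / (2.0 * sigma_z**2))
            / (math.pi * sigma_y * sigma_z * u))

# Ground-level release on the plume centerline reduces to
# X/Q = 1 / (pi * sigma_y * sigma_z * u).
print(round(chi_over_q(u=2.0, sigma_y=30.0, sigma_z=15.0), 6))  # 0.000354
```

In GXQ itself, σy and σz vary with downwind distance and stability class, and X/Q′ values are frequency-weighted over the joint frequency data; the expression above is the kernel those calculations evaluate.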

Software, component, and service deployment in computational Grids.

Description: Grids comprise an infrastructure that enables scientists to use a diverse set of distributed remote services and resources as part of complex scientific problem-solving processes. We analyze some of the challenges involved in deploying software and components transparently in Grids. We report on three practical solutions used by the Globus Project. Lessons learned from this experience lead us to believe that it is necessary to support a variety of software and component deployment strategies. These strategies are based on the hosting environment.
Date: April 18, 2002
Creator: von Laszewski, G.; Blau, E.; Bletzinger, M.; Gawor, J.; Lane, P.; Martin, S. et al.
Partner: UNT Libraries Government Documents Department

GNU debugger internal architecture

Description: This document describes the internal architecture and implementation of the GNU debugger, gdb. Topics include inferior process management, command execution, symbol table management, and remote debugging. Call graphs for specific functions are supplied. This document is not a complete description, but it offers a developer an overview that is the place to start before making modifications.
Date: December 16, 1993
Creator: Miller, P.; Nessett, D. & Pizzi, R.
Partner: UNT Libraries Government Documents Department