Search Results

A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System

Description: The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.
Date: October 1998
Creator: Slater, C. O.; Barnes, J. M.; Johnson, J. O. & Drischler, J. D.
Partner: UNT Libraries Government Documents Department
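The folding operation in the MASH abstract above, combining coupling-surface fluence with adjoint dose importance to produce a dose response, is at heart a weighted inner product over surface segments and energy groups. A minimal sketch with hypothetical array shapes and weights (not MASH's actual data formats):

```python
import numpy as np

# Hypothetical discretization: 4 coupling-surface segments x 3 energy groups.
rng = np.random.default_rng(0)
fluence = rng.random((4, 3))     # forward fluence on the coupling surface
importance = rng.random((4, 3))  # adjoint "dose importance" at the same points
areas = np.array([1.0, 1.0, 2.0, 2.0])  # segment areas used as quadrature weights

# Folding: dose response = sum over segments and groups of fluence * importance * weight.
dose = float(np.sum(fluence * importance * areas[:, None]))
```

Because the fold is linear in the fluence, the same importance map can be reused for any source orientation or distance, which is what makes the coupling step cheap relative to the transport calculations.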

GDCT user's manual

Description: This manual provides the user with instructions on how to use the Graphical Database Configuration Tool (GDCT) to build EPICS databases and visualize links between records and process variables.
Date: October 7, 1993
Creator: Kowalkowski, J.
Partner: UNT Libraries Government Documents Department

Model documentation report: Commercial Sector Demand Module of the National Energy Modeling System

Description: This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. This report serves three purposes. First, it is a reference document providing a detailed description for model analysts, users, and the public. Second, this report meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, section 57(b)(1)). Third, it facilitates continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements as future projects.
Date: February 1, 1995
Partner: UNT Libraries Government Documents Department

CAVEcomm users manual

Description: The CAVEcomm library is a set of routines designed to generalize the communications between virtual environments and supercomputers.
Date: December 1, 1996
Creator: Disz, T.L.; Papka, M.E.; Pellegrino, M. & Szymanski, M.
Partner: UNT Libraries Government Documents Department

Explanation of how to run the global local optimization code (GLO) to find surface heat flux

Description: From the evaluation[1] of the available inverse techniques, it was determined that the Global Local Optimization Code[2] can recover the surface heat flux from known experimental data at various points in the geometry. This code uses a whole domain approach in which an analysis code (such as TOPAZ2D or ABAQUS) can be run to get the appropriate data needed to minimize the heat flux function. This document is a compilation of our notes on how to run this code to find the surface heat flux. First, the code is described and the overall set-up procedure is reviewed. Then, creation of the configuration file is described. A specific configuration file is given with appropriate explanation. Using this information, the reader should be able to run GLO to find the surface heat flux.
Date: March 1, 1999
Creator: Aceves, S; Sahai, V & Stein, W
Partner: UNT Libraries Government Documents Department
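The whole-domain idea, running a forward thermal model and adjusting the surface heat flux until computed temperatures match the measurements, can be sketched with a toy one-dimensional conduction model standing in for TOPAZ2D or ABAQUS; all numbers and the brute-force search below are illustrative assumptions, not GLO's actual algorithm:

```python
import numpy as np

# Toy forward model (stand-in for TOPAZ2D/ABAQUS): steady 1-D conduction in a slab,
# T(x) = T_surface + q * x / k, so interior temperatures depend linearly on the flux q.
k = 10.0                         # thermal conductivity, W/(m K)  (assumed value)
x = np.linspace(0.01, 0.05, 5)   # sensor locations in the geometry, m

def forward(q):
    return 300.0 + q * x / k

q_true = 2000.0                  # W/m^2, the flux we pretend is unknown
measured = forward(q_true)       # synthetic "experimental data"

# Whole-domain fit: scan candidate fluxes and minimize the squared mismatch
# between model temperatures and measurements over all sensor points at once.
candidates = np.linspace(0.0, 5000.0, 501)
errors = [np.sum((forward(q) - measured) ** 2) for q in candidates]
q_est = float(candidates[int(np.argmin(errors))])
```

A real inverse run would replace the grid scan with a proper optimizer and the linear slab with the full analysis code, but the structure of the objective function is the same.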

TERRAIN: A computer program to process digital elevation models for modeling surface flow

Description: This document provides a step-by-step procedure, TERRAIN, for processing digital elevation models to calculate overland flow paths, watershed boundaries, slope, and aspect. The algorithms incorporated into TERRAIN have been used at two different geographic scales: first for small research watersheds where surface wetness measurements are made, and second for regional water modeling for entire counties. For small areas, methods based on flow distribution may be more desirable, especially if time-dependent flow models are to be used. The main improvement in TERRAIN compared with earlier programs on which it is based is that it combines the conditioning routines, which remove depressions to avoid water storage, into a single process. Efficiency has also been improved, reducing run times as much as 10:1 and enabling the processing of very large grids in strips for regional modeling. Additionally, the ability to calculate the nutrient load delivered to any cell in a watershed has been added. These improvements make TERRAIN a powerful tool for modeling surface flow.
Date: August 1, 1995
Creator: Schwartz, P.M.; Levine, D.A.; Hunsaker, C.T. & Timmins, S.P.
Partner: UNT Libraries Government Documents Department
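The overland-flow-path step can be illustrated with the standard D8 steepest-descent rule on a tiny elevation grid; the DEM values and function below are hypothetical, and TERRAIN's actual routines also condition the DEM (removing depressions) before routing flow:

```python
import numpy as np

# Tiny DEM (elevations); real inputs would be full digital elevation models.
dem = np.array([[9., 8., 7.],
                [8., 5., 4.],
                [7., 4., 1.]])

def d8_direction(dem, r, c):
    """Return (dr, dc) toward the steepest-descent neighbor, or None for a pit/outlet."""
    best, step = 0.0, None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                dist = (dr * dr + dc * dc) ** 0.5   # diagonal neighbors are farther
                drop = (dem[r, c] - dem[rr, cc]) / dist
                if drop > best:
                    best, step = drop, (dr, dc)
    return step

# The center cell (elevation 5) drains diagonally toward the lowest corner (elevation 1).
```

Chaining these directions cell-to-cell yields the overland flow paths; cells whose paths converge on a common outlet define a watershed boundary.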

Design intent tool: User guide

Description: This database tool provides a structured approach to recording design decisions that impact a facility's design intent in areas such as energy efficiency. Owners and designers alike can plan, monitor, and verify that a facility's design intent is being met during each stage of the design process. Additionally, the Tool gives commissioning agents, facility operators, and future owners and renovators an understanding of how the building and its subsystems are intended to operate, allowing them to track and benchmark performance.
Date: August 23, 2002
Creator: Mills, Evan; Abell, Daniel; Bell, Geoffrey; Faludi, Jeremy; Greenberg, Steve; Hitchcock, Rob et al.
Partner: UNT Libraries Government Documents Department

SLURM: Simple Linux Utility for Resource Management

Description: Simple Linux Utility for Resource Management (SLURM) is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for Linux clusters of thousands of nodes. Components include machine status, partition management, job management, scheduling and stream copy modules. This paper presents an overview of the SLURM architecture and functionality.
Date: December 19, 2002
Creator: Jette, M & Grondona, M
Partner: UNT Libraries Government Documents Department

Tool Gear Documentation

Description: Tool Gear is designed to allow tool developers to insert instrumentation code into target programs using the DPCL library. This code can gather data and send it back to the Client for display or analysis. Tools can use the Tool Gear client without using the DPCL Collector. Any collector using the right protocols can send data to the Client for display and analysis. However, this document will focus on how to gather data with the DPCL Collector. There are three parts to the task of using Tool Gear to gather data through DPCL: (1) Write the instrumentation code that will be loaded and run in the target program. The code should be in the form of one or more functions, which can pass data structures back to the Client by way of DPCL. The collection of functions is compiled into a library, as described in this report. (2) Write the code that tells the DPCL Collector about the instrumentation and how to forward data back to the Client. (3) Extend the client to accept data from the Collector and display it in a useful way. The rest of this report describes how to carry out each of these steps.
Date: April 3, 2002
Creator: May, J & Gyllenhaal, J
Partner: UNT Libraries Government Documents Department

SLURM: Simple Linux Utility for Resource Management

Description: Simple Linux Utility for Resource Management (SLURM) is an open source, fault-tolerant, and highly scalable cluster management and job scheduling system for Linux clusters of thousands of nodes. Components include machine status, partition management, job management, scheduling and stream copy modules. The design also includes a scalable, general-purpose communication infrastructure. This paper presents an overview of the SLURM architecture and functionality.
Date: July 8, 2002
Creator: Jette, M; Dunlap, C; Garlick, J & Grondona, M
Partner: UNT Libraries Government Documents Department

HBTprogs Version 1.0

Description: This is the manual for a collection of programs that can be used to invert angle-averaged (i.e., one-dimensional) two-particle correlation functions. This package consists of several programs that generate kernel matrices (basically the relative wavefunction of the pair, squared), programs that generate test correlation functions from test sources of various types, and the program that actually inverts the data using the kernel matrix.
Date: March 15, 2002
Creator: Brown, D & Danielewicz, P
Partner: UNT Libraries Government Documents Department
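The inversion step the abstract describes (correlation data equals kernel matrix times source) can be illustrated with a toy discretized system; the kernel, source, and plain least-squares solve below are hypothetical stand-ins for the package's actual matrices and its regularized, error-propagating inversion:

```python
import numpy as np

# Hypothetical discretization: correlation vector C = K @ S, where K is the
# kernel matrix (pair relative wavefunction squared, binned) and S is the
# one-dimensional source function to be recovered.
rng = np.random.default_rng(1)
n = 6
K = np.eye(n) + 0.1 * rng.random((n, n))  # well-conditioned stand-in kernel
S_true = np.exp(-np.arange(n) / 2.0)      # test source, as in the package's test mode
C = K @ S_true                            # synthetic "measured" correlation data

# Invert the data using the kernel matrix (plain least squares here; real
# correlation data is noisy, so the actual codes must also regularize).
S_est, *_ = np.linalg.lstsq(K, C, rcond=None)
```

With noisy data the kernel inversion becomes ill-posed, which is why the package separates kernel generation, test-source generation, and the inversion program itself.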

MPPL Reference Manual

Description: MPPL (''More Productive Programming Language'') allows programmers to write in a language that is more convenient and powerful than Fortran 77. MPPL then transforms statements written in the MPPL language into standard Fortran 77. This language is essentially an extension to Fortran 77 that provides free-form input and many structured constructs such as ''while'' and ''for'' loops. MPPL's macro pre-processor and file-inclusion facility encourage the creation of structured, easy-to-read programs that contain fewer labels. MPPL provides a more productive programming environment for Fortran 77 users on the Unix, Linux, AIX, IRIX, Solaris, HP-UX, and Tru64 operating systems. MPPL can be used independently as well as with Basis.
Date: July 1, 2002
Creator: Dubois, P F; Motteler, Z C; Willmann, P A; Allsman, R A; Benedetti, C M; Crotinger, J A et al.
Partner: UNT Libraries Government Documents Department

An introduction to computer viruses

Description: This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.
Date: March 1, 1992
Creator: Brown, D.R.
Partner: UNT Libraries Government Documents Department

NMG documentation, part 1: user's guide

Description: This is the first of a three-part report documenting NMG, the Numerical Mathematics Guide. Part I is aimed at the user of the system. It contains an introduction, with an outline of the complete report, and Chapter 1, User's Point of View. Part II is aimed at the programmer and contains Chapter 2, How It Works. Part III is aimed at the maintainer of NMG and contains Chapter 3, Maintenance, and Chapter 4, Validation. Each chapter has its own page numbering and table of contents.
Date: July 1, 1996
Creator: Fritsch, F.N. & Dickinson, R.P. Jr.
Partner: UNT Libraries Government Documents Department

Development of automated image co-registration techniques: Part II - multisensor imagery

Description: This is the second in a series of PNNL Multispectral Imagery (ST474D) reports on automated co-registration and rectification of multisensor imagery. In the first report, a semi-automated registration procedure was introduced based on methods proposed by Chen and Lee which emphasized registration of same-sensor imagery. The Chen and Lee approach is outlined in Figure 1, and is described in detail in the first report. PNNL made several enhancements to the Chen and Lee approach; these modifications are outlined in Figure 2 and are also described in detail in the first report. The PNNL enhancements to the Chen and Lee approach introduced in the first phase have been named Multisensor Image Registration Automation (MIRA). These improvements increased computational efficiency and offered additional algorithms for coarse matching of disparate image types. In the MIRA approach, one set of optimum GCP locations is determined based on a Delaunay triangulation technique using an initial set of GCPs provided by the user, rather than repeating this step for each added control point as is proposed by Chen and Lee. The Chen and Lee approach uses an adjacent pixel difference algorithm for coarse matching patches of the reference image with the source image, while the MIRA approach adds other algorithms. Also, the MIRA approach checks whether a newly determined GCP fits the existing warping equation.
Date: October 1, 1996
Creator: Lundeen, T.F.; Andrews, A.K.; Perry, E.M.; Whyatt, M.V. & Steinmaus, K.L.
Partner: UNT Libraries Government Documents Department
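The final consistency check described above, testing whether a newly determined GCP fits the existing warping equation, can be sketched by fitting an affine transform to known GCP pairs and thresholding the prediction error for the candidate point; the points, transform, and tolerance below are hypothetical, and MIRA's actual warping equations may be higher order:

```python
import numpy as np

# Known ground control point (GCP) pairs: source-image coordinates and the
# corresponding reference-image coordinates (here generated by a known affine map).
src = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
ref = src @ np.array([[1.0, 0.1], [-0.1, 1.0]]) + [5.0, -3.0]

# Fit the affine "warping equation" by least squares on the design matrix [x, y, 1].
A = np.hstack([src, np.ones((4, 1))])
coeffs, *_ = np.linalg.lstsq(A, ref, rcond=None)

def fits_warp(src_pt, ref_pt, tol=1.0):
    """Accept a candidate GCP only if the fitted warp predicts it within tol pixels."""
    pred = np.append(src_pt, 1.0) @ coeffs
    return float(np.hypot(*(pred - ref_pt))) <= tol
```

Rejecting candidate GCPs that disagree with the current warp keeps one bad coarse match from distorting the rectification of the whole image.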

User documentation for KINSOL, a nonlinear solver for sequential and parallel computers

Description: KINSOL is a general purpose nonlinear system solver callable from either C or Fortran programs. It is based on NKSOL [3], but is written in ANSI-standard C rather than Fortran 77. Its most notable feature is that it uses Krylov Inexact Newton techniques in the system's approximate solution, thus sharing significant modules previously written within CASC at LLNL to support CVODE [6, 7]/PVODE [9, 5]. It also requires almost no matrix storage for solving the Newton equations as compared to direct methods. The name KINSOL is derived from those techniques: Krylov Inexact Newton SOLver. The package was arranged so that selecting one of two forms of a single module in the compilation process will allow the entire package to be created in either sequential (serial) or parallel form. The parallel version of KINSOL uses MPI (Message-Passing Interface) [8] and an appropriately revised version of the vector module NVECTOR, as mentioned above, to achieve parallelism and portability. KINSOL in parallel form is intended for the SPMD (Single Program Multiple Data) model with distributed memory, in which all vectors are identically distributed across processors. In particular, the vector module NVECTOR is designed to help the user assign a contiguous segment of a given vector to each of the processors for parallel computation. Several primitives were added to NVECTOR as originally written for PVODE to implement KINSOL. KINSOL has been run on a Cray-T3D, an eight-processor DEC ALPHA, and a cluster of workstations. It is currently being used in a simulation of tokamak edge plasmas and in groundwater two-phase flow studies at LLNL. The remainder of this paper is organized as follows. Section 2 sets the mathematical notation and summarizes the basic methods. Section 3 summarizes the organization of the KINSOL solver, while Section 4 summarizes its usage. Section 5 describes a preconditioner module, Section ...
Date: July 1, 1998
Creator: Taylor, A. G., LLNL
Partner: UNT Libraries Government Documents Department

Questionnaire for sensitive positions (QSP) version 4.0 -- Users guide document

Description: The US Government conducts background investigations and reinvestigations to establish that applicants are eligible for required security clearances. The QSP system is an automated Paradox application developed by Boeing in 1988 and used by DOE-RL for data collection, retention, and printing by facsimile of the Standard Form 86 containing a person's data needed to conduct an investigation. In March 1991 the QSP form was revised by the Office of Personnel Management (OPM). The QSP system was modified and enhanced to QSP version 3.0 and released for use in 1992. Copies of QSP version 3.0 were provided to approximately 20 other sites when requested. In February 1995 the OPM approved the new Standard Form 86, "Questionnaire for National Security Positions." The QSP system was modified and upgraded to QSP version 4.0 to agree with the revised form.
Date: May 21, 1996
Creator: Hausel, J.M.
Partner: UNT Libraries Government Documents Department

Model Commissioning Plan and Guide Specifications

Description: The objectives of Model Commissioning Plan and Guide Specifications are to ensure that the design team applies commissioning concepts to the design and prepares commissioning specifications and a commission plan for inclusion in the bid construction documents.
Date: March 1, 1997
Partner: UNT Libraries Government Documents Department

xdamp Version 2: An IDL®-based data manipulation program

Description: The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of UNIX®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP. IDL is currently supported on a wide variety of UNIX platforms such as IBM® workstations, Hewlett Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers, and Digital Equipment Corporation VMS® systems. Thus, xdamp is portable across many platforms. The authors have verified operation, albeit with some minor IDL bugs, on IBM PC computers using Windows, Windows 95, and Windows NT; IBM UNIX platforms; DEC Alpha and VMS systems; HP 9000/700 series workstations; and Macintosh computers, both regular and PowerPC™ versions. Version 2 updates xdamp to require IDL version 4.0.1, adds many enhancements, and fixes a number of bugs.
Date: December 1, 1996
Creator: Ballard, W.P.
Partner: UNT Libraries Government Documents Department

Users guide for mpich, a portable implementation of MPI

Description: MPI (Message Passing Interface) is a standard specification for message-passing libraries. mpich is a portable implementation of the full MPI specification for a wide variety of parallel computing environments. This report describes how to build and run MPI programs using the mpich implementation of MPI.
Date: July 1, 1996
Creator: Gropp, W. & Lusk, E.
Partner: UNT Libraries Government Documents Department

Simple relationships for estimating intraparticle transport effects for catalytically promoted endothermic reactions

Description: Relationships for estimating effectiveness factors for porous-solid-catalyzed fluid reactions can result from assuming approximations to temperature and concentration profiles. Approximations designed to simplify the outcome result in simple, explicit, analytic relationships for both isothermal and nonisothermal nth-order reaction systems. For isothermal systems, the formulas developed predict effectiveness within 25% of the true isothermal effectiveness factors (η's) over the range 0.1 < η < 0.99. For isothermal or endothermic reaction systems with η > 0.65, errors are less than 10%. Even in the maximum-error region, estimates for endothermic systems are within a factor of two of those obtained by solution of the rigorous heat and mass transfer equations. For isothermal or endothermic systems with η > 0.95, errors are less than 1%. Thus the formulas can also serve diagnostic uses that confirm the presence or absence of significant internal heat or mass transport effects in porous reacting systems. Extension of the approach to non-nth-order reactions is possible; formulas are derived for simple isothermal and nonisothermal Langmuir-Hinshelwood reaction systems. Application of the work to exothermic reactions was not tested, but steeper gradients in such systems would tend to degrade the accuracy of the relationships. The equations derived in this work are simpler and easier to apply than any others proposed thus far.
Date: June 16, 1998
Creator: Brown, L. F.
Partner: UNT Libraries Government Documents Department
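For orientation, the exact benchmark that approximate effectiveness-factor formulas are usually compared against is the classical first-order isothermal slab result, η = tanh(φ)/φ in the Thiele modulus φ. The sketch below shows that textbook formula (the function name is ours; this is not the paper's own approximation):

```python
import math

# Classical first-order, isothermal effectiveness factor for a slab catalyst:
# eta = tanh(phi) / phi, where phi is the Thiele modulus.
def eta_slab(phi):
    return math.tanh(phi) / phi

# Limiting behavior: small phi means kinetics-limited (eta -> 1);
# large phi means diffusion-limited (eta ~ 1/phi).
```

An approximate formula of the kind the paper derives is judged by how closely it tracks results like this one across the 0.1 < η < 0.99 range, which is exactly the error comparison quoted in the abstract.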