
A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System

Description: The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.
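The "folding" of fluence with dose importance described above is, at its core, an inner product over energy groups and coupling-surface cells. A minimal sketch of that idea in Python, with made-up numbers; MASH's actual coupling files also carry angular and spatial detail that is collapsed here:

```python
# Sketch of the fluence/importance "folding" step: dose response
# R = sum over energy groups g and surface cells c of
# fluence[g][c] * importance[g][c]. Illustration only.

def fold_dose(fluence, importance):
    """Fold the coupling-surface fluence with the adjoint dose
    importance to obtain the detector dose response."""
    return sum(
        f * i
        for f_row, i_row in zip(fluence, importance)
        for f, i in zip(f_row, i_row)
    )

# Two energy groups, three coupling-surface cells (hypothetical values).
fluence = [[1.0, 2.0, 0.5],
           [0.2, 0.1, 0.3]]
importance = [[0.01, 0.02, 0.04],
              [0.10, 0.05, 0.02]]

dose = fold_dose(fluence, importance)
print(dose)
```

Repeating the fold with importance arrays from different detector responses, or with fluence arrays for different source distances and orientations, gives the parametric dose studies the abstract describes.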
Date: October 1998
Creator: Slater, C. O.; Barnes, J. M.; Johnson, J. O. & Drischler, J. D.
Partner: UNT Libraries Government Documents Department

GDCT user's manual

Description: This manual instructs the user on how to use the Graphical Database Configuration Tool (GDCT) to build EPICS databases and visualize links between records and process variables.
Date: October 7, 1993
Creator: Kowalkowski, J.
Partner: UNT Libraries Government Documents Department

Model documentation report: Commercial Sector Demand Module of the National Energy Modeling System

Description: This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. This report serves three purposes. First, it is a reference document providing a detailed description for model analysts, users, and the public. Second, this report meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, section 57(b)(1)). Third, it facilitates continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements as future projects.
Date: February 1, 1995
Partner: UNT Libraries Government Documents Department

CAVEcomm users manual

Description: The CAVEcomm library is a set of routines designed to generalize the communications between virtual environments and supercomputers.
Date: December 1, 1996
Creator: Disz, T.L.; Papka, M.E.; Pellegrino, M. & Szymanski, M.
Partner: UNT Libraries Government Documents Department

Explanation of how to run the global local optimization code (GLO) to find surface heat flux

Description: From the evaluation [1] of the available inverse techniques, it was determined that the Global Local Optimization Code [2] can determine the surface heat flux using known experimental data at various points in the geometry. This code uses a whole-domain approach in which an analysis code (such as TOPAZ2D or ABAQUS) can be run to get the appropriate data needed to minimize the heat flux function. This document is a compilation of our notes on how to run this code to find the surface heat flux. First, the code is described and the overall set-up procedure is reviewed. Then, creation of the configuration file is described. A specific configuration file is given with appropriate explanation. Using this information, the reader should be able to run GLO to find the surface heat flux.
Date: March 1, 1999
Creator: Aceves, S; Sahai, V & Stein, W
Partner: UNT Libraries Government Documents Department

TERRAIN: A computer program to process digital elevation models for modeling surface flow

Description: This document provides a step-by-step procedure, TERRAIN, for processing digital elevation models to calculate overland flow paths, watershed boundaries, slope, and aspect. The algorithms incorporated into TERRAIN have been used at two different geographic scales: first for small research watersheds where surface wetness measurements are made, and second for regional water modeling for entire counties. For small areas, methods based on flow distribution may be more desirable, especially if time-dependent flow models are to be used. The main improvement in TERRAIN compared with the earlier programs on which it is based is that it combines the conditioning routines, which remove depressions to avoid water storage, into a single process. Efficiency has also been improved, reducing run times by as much as 10:1 and enabling the processing of very large grids in strips for regional modeling. Additionally, the ability to calculate the nutrient load delivered to any cell in a watershed has been added. These improvements make TERRAIN a powerful tool for modeling surface flow.
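Flow-path calculations of the kind TERRAIN performs typically begin by assigning each DEM cell a flow direction toward its steepest downslope neighbor (the classical D8 scheme). A short sketch of that step, not TERRAIN's actual code; TERRAIN's conditioning routines would first remove depressions so that every interior cell has a downslope path:

```python
# D8-style flow direction: each cell drains toward the neighbor with
# the steepest drop (elevation difference over distance). Returns None
# for a pit (no lower neighbor), which DEM conditioning would remove.

def d8_direction(dem, r, c):
    """Return the (dr, dc) offset toward the steepest downslope
    neighbor of cell (r, c), or None if the cell is a pit."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5  # 1 or sqrt(2)
                drop = (dem[r][c] - dem[rr][cc]) / dist
                if drop > best_drop:
                    best, best_drop = (dr, dc), drop
    return best

# Hypothetical 3x3 elevation grid sloping toward the lower-right corner.
dem = [[9.0, 8.0, 7.0],
       [8.0, 6.0, 5.0],
       [7.0, 5.0, 3.0]]

print(d8_direction(dem, 1, 1))  # center cell drains diagonally: (1, 1)
```

Chaining these directions from cell to cell yields the overland flow paths; accumulating the cells that drain through each point delineates watershed boundaries.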
Date: August 1, 1995
Creator: Schwartz, P.M.; Levine, D.A.; Hunsaker, C.T. & Timmins, S.P.
Partner: UNT Libraries Government Documents Department

Design intent tool: User guide

Description: This database tool provides a structured approach to recording design decisions that impact a facility's design intent in areas such as energy efficiency. Owners and designers alike can plan, monitor, and verify that a facility's design intent is being met during each stage of the design process. Additionally, the Tool gives commissioning agents, facility operators, and future owners and renovators an understanding of how the building and its subsystems are intended to operate, allowing them to track and benchmark performance.
Date: August 23, 2002
Creator: Mills, Evan; Abell, Daniel; Bell, Geoffrey; Faludi, Jeremy; Greenberg, Steve; Hitchcock, Rob et al.
Partner: UNT Libraries Government Documents Department

NMG documentation, part 1: user's guide

Description: This is the first of a three-part report documenting NMG, the Numerical Mathematics Guide. Part I is aimed at the user of the system. It contains an introduction, with an outline of the complete report, and Chapter 1, User's Point of View. Part II is aimed at the programmer and contains Chapter 2, How It Works. Part III is aimed at the maintainer of NMG and contains Chapter 3, Maintenance, and Chapter 4, Validation. Each chapter has its own page numbering and table of contents.
Date: July 1, 1996
Creator: Fritsch, F.N. & Dickinson, R.P. Jr.
Partner: UNT Libraries Government Documents Department

Development of automated image co-registration techniques: Part II - multisensor imagery

Description: This is the second in a series of PNNL Multispectral Imagery (ST474D) reports on automated co-registration and rectification of multisensor imagery. In the first report, a semi-automated registration procedure was introduced based on methods proposed by Chen and Lee, which emphasized registration of same-sensor imagery. The Chen and Lee approach is outlined in Figure 1 and is described in detail in the first report. PNNL made several enhancements to the Chen and Lee approach; these modifications are outlined in Figure 2 and are also described in detail in the first report. The PNNL enhancements introduced in the first phase have been named Multisensor Image Registration Automation (MIRA). These improvements increased computational efficiency and offered additional algorithms for coarse matching of disparate image types. In the MIRA approach, one set of optimum GCP locations is determined based on a Delaunay triangulation technique using an initial set of GCPs provided by the user, rather than repeating this step for each added control point as proposed by Chen and Lee. The Chen and Lee approach uses an adjacent-pixel-difference algorithm for coarse matching patches of the reference image with the source image, while the MIRA approach adds other algorithms. The MIRA approach also checks whether a newly determined GCP fits the existing warping equation.
Date: October 1, 1996
Creator: Lundeen, T.F.; Andrews, A.K.; Perry, E.M.; Whyatt, M.V. & Steinmaus, K.L.
Partner: UNT Libraries Government Documents Department

User documentation for KINSOL, a nonlinear solver for sequential and parallel computers

Description: KINSOL is a general-purpose nonlinear system solver callable from either C or Fortran programs. It is based on NKSOL [3], but is written in ANSI-standard C rather than Fortran 77. Its most notable feature is that it uses Krylov Inexact Newton techniques in the system's approximate solution, thus sharing significant modules previously written within CASC at LLNL to support CVODE [6, 7]/PVODE [9, 5]. It also requires almost no matrix storage for solving the Newton equations as compared to direct methods. The name KINSOL is derived from those techniques: Krylov Inexact Newton SOLver. The package was arranged so that selecting one of two forms of a single module in the compilation process will allow the entire package to be created in either sequential (serial) or parallel form. The parallel version of KINSOL uses MPI (Message-Passing Interface) [8] and an appropriately revised version of the vector module NVECTOR, as mentioned above, to achieve parallelism and portability. KINSOL in parallel form is intended for the SPMD (Single Program Multiple Data) model with distributed memory, in which all vectors are identically distributed across processors. In particular, the vector module NVECTOR is designed to help the user assign a contiguous segment of a given vector to each of the processors for parallel computation. Several primitives were added to NVECTOR as originally written for PVODE to implement KINSOL. KINSOL has been run on a Cray-T3D, an eight-processor DEC ALPHA, and a cluster of workstations. It is currently being used in a simulation of tokamak edge plasmas and in groundwater two-phase flow studies at LLNL. The remainder of this paper is organized as follows. Section 2 sets the mathematical notation and summarizes the basic methods. Section 3 summarizes the organization of the KINSOL solver, while Section 4 summarizes its usage. Section 5 describes a preconditioner module, Section ...
Date: July 1, 1998
Creator: Taylor, A. G., LLNL
Partner: UNT Libraries Government Documents Department

Questionnaire for sensitive positions (QSP) version 4.0 -- Users guide document

Description: The US Government conducts background investigations and reinvestigations to establish that applicants are eligible for required security clearances. The QSP system is an automated Paradox application developed by Boeing in 1988 and used by DOE-RL for data collection, retention, and printing by facsimile of the Standard Form 86 containing a person's data needed to conduct an investigation. In March 1991 the QSP form was revised by the Office of Personnel Management (OPM). The QSP system was modified and enhanced to QSP version 3.0 and released for use in 1992. Copies of QSP version 3.0 were provided to approximately 20 other sites when requested. In February 1995 the OPM approved the new Standard Form 86, "Questionnaire for National Security Positions." The QSP system was modified and upgraded to QSP version 4.0 to agree with the revised form.
Date: May 21, 1996
Creator: Hausel, J.M.
Partner: UNT Libraries Government Documents Department

Model Commissioning Plan and Guide Specifications

Description: The objectives of the Model Commissioning Plan and Guide Specifications are to ensure that the design team applies commissioning concepts to the design and prepares commissioning specifications and a commissioning plan for inclusion in the bid construction documents.
Date: March 1, 1997
Partner: UNT Libraries Government Documents Department

xdamp Version 2: An IDL®-based data manipulation program

Description: The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of UNIX®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP. IDL is currently supported on a wide variety of UNIX platforms such as IBM® workstations, Hewlett Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers, and Digital Equipment Corporation VMS® systems. Thus, xdamp is portable across many platforms. The authors have verified operation, albeit with some minor IDL bugs, on IBM PC computers using Windows, Windows 95, and Windows NT; IBM UNIX platforms; DEC Alpha and VMS systems; HP 9000/700 series workstations; and Macintosh computers, both regular and PowerPC™ versions. Version 2 updates xdamp to require IDL version 4.0.1, adds many enhancements, and fixes a number of bugs.
Date: December 1, 1996
Creator: Ballard, W.P.
Partner: UNT Libraries Government Documents Department

Users guide for mpich, a portable implementation of MPI

Description: MPI (Message Passing Interface) is a standard specification for message-passing libraries. mpich is a portable implementation of the full MPI specification for a wide variety of parallel computing environments. This report describes how to build and run MPI programs using the mpich implementation of MPI.
Date: July 1, 1996
Creator: Gropp, W. & Lusk, E.
Partner: UNT Libraries Government Documents Department

Simple relationships for estimating intraparticle transport effects for catalytically promoted endothermic reactions

Description: Relationships for estimating effectiveness factors for porous-solid-catalyzed fluid reactions can result from assuming approximations to temperature and concentration profiles. Approximations designed to simplify the outcome result in simple, explicit, analytic relationships for both isothermal and nonisothermal nth-order reaction systems. For isothermal systems, the formulas developed predict effectiveness within 25% of the true isothermal effectiveness factors (η's) over the range 0.1 < η < 0.99. For isothermal or endothermic reaction systems with η > 0.65, errors are less than 10%. Even in the maximum-error region, estimates for endothermic systems are within a factor of two of those obtained by solution of the rigorous heat and mass transfer equations. For isothermal or endothermic systems with η > 0.95, errors are less than 1%. Thus the formulas can also serve diagnostic uses that confirm the presence or absence of significant internal heat or mass transport effects in porous reacting systems. Extension of the approach to non-nth-order reactions is possible; formulas are derived for simple isothermal and nonisothermal Langmuir-Hinshelwood reaction systems. Application of the work to exothermic reactions was not tested, but steeper gradients in such systems would tend to degrade the accuracy of the relationships. The equations derived in this work are simpler and easier to apply than any others proposed thus far.
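For the simplest case such formulas approximate, a first-order isothermal reaction in a slab catalyst, the classical textbook result is η = tanh(φ)/φ in the Thiele modulus φ. A short check of its limiting behavior (this is the standard relation, not the report's own approximation):

```python
import math

def eta_slab_first_order(phi):
    """Classical effectiveness factor for a first-order isothermal
    reaction in a slab catalyst: eta = tanh(phi) / phi."""
    return math.tanh(phi) / phi

# Kinetic limit (small phi): the whole pellet is utilized, eta -> 1,
# so internal transport effects are negligible.
print(eta_slab_first_order(0.1))

# Diffusion limit (large phi): eta -> 1/phi, transport controls the
# observed rate -- the diagnostic situation the abstract mentions.
print(eta_slab_first_order(10.0))
```

Simple explicit estimates of η in this spirit let one confirm the presence or absence of significant internal transport effects without solving the rigorous heat and mass transfer equations.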
Date: June 16, 1998
Creator: Brown, L. F.
Partner: UNT Libraries Government Documents Department

SECPOP90: Sector population, land fraction, and economic estimation program

Description: In 1973 Mr. W. Athey of the Environmental Protection Agency wrote a computer program called SECPOP which calculated population estimates. Since that time, two things have changed which suggested the need for updating the original program: more recent population censuses and the widespread use of personal computers (PCs). The revised computer program uses the 1990 and 1992 Population Census information and runs on current PCs as "SECPOP90." SECPOP90 consists of two parts: site and regional. The site portion provides population and economic data estimates for any location within the continental United States. Siting analysis is relatively fast running. The regional portion assesses site availability for different siting policy decisions, i.e., the impact of available sites given specific population density criteria within the continental United States. Regional analysis is slow. This report compares the SECPOP90 population estimates and the nuclear power reactor licensee-provided information. Although the source, and therefore the accuracy, of the licensee information is unknown, this comparison suggests SECPOP90 makes reasonable estimates. Given the total uncertainty in any current calculation of severe accidents, including the potential offsite consequences, the uncertainty within SECPOP90 population estimates is expected to be insignificant. 12 refs., 55 figs., 7 tabs.
Date: September 1997
Creator: Humphreys, S. L.; Rollstin, J. A. & Ridgely, J. N.
Partner: UNT Libraries Government Documents Department

Winnetka deformation zone: Surface expression of coactive slip on a blind fault during the Northridge earthquake sequence, California. Evidence that coactive faulting occurred in the Canoga Park, Winnetka, and Northridge areas during the 17 January 1994, Northridge, California earthquake

Description: Measurements of normalized length changes of streets over an area of 9 km² in the San Fernando Valley of Los Angeles, California, define a distinctive strain pattern that may well reflect blind faulting during the 1994 Northridge earthquake. Strain magnitudes are about 3 × 10⁻⁴, locally 10⁻³. They define a deformation zone trending diagonally from near Canoga Park in the southwest, through Winnetka, to near Northridge in the northeast. The deformation zone is about 4.5 km long and 1 km wide. The northwestern two-thirds of the zone is a belt of extension of streets, and the southeastern one-third is a belt of shortening of streets. On the northwest and southeast sides of the deformation zone the magnitude of the strains is too small to measure, less than 10⁻⁴. Complete states of strain measured in the northeastern half of the deformation zone show that the directions of principal strains are parallel and normal to the walls of the zone, so the zone is not a strike-slip zone. The magnitudes of strains measured in the northeastern part of the Winnetka area were large enough to fracture concrete and soils, and the area of larger strains correlates with the area of greater damage to such roads and sidewalks. All parts of the pattern suggest a blind fault at depth, most likely a reverse fault dipping northwest but possibly a normal fault dipping southeast. The magnitudes of the strains in the Winnetka area are consistent with the strains produced at the ground surface by a blind fault plane extending to depth on the order of 2 km and a net slip on the order of 1 m, within a distance of about 100 to 500 m of the ground surface. The pattern of damage in the San Fernando Valley suggests a ...
Date: December 31, 1996
Creator: Cruikshank, K.M.; Johnson, A.M.; Fleming, R.W. & Jones, R.L.
Partner: UNT Libraries Government Documents Department

Optical Data Library #5 for use with the WINDOW 4.1 computer program including NFRC certified data

Description: This report contains additional data for 783 glazing products that can be used with the Window 4.1 energy analysis program. The NFRC requires that all simulations must be carried out using NFRC-certified optical data only. Solar heat gain coefficient and visible transmittance must be calculated using spectral transmittance and reflectance in the solar range. U-factor calculations must use spectral or integrated emittance. NFRC-certified data within the Window 4.1 program is indicated.
Date: January 1, 1998
Creator: Rubin, M.
Partner: UNT Libraries Government Documents Department

Theory manual for FAROW version 1.1: A numerical analysis of the Fatigue And Reliability Of Wind turbine components

Description: Because the fatigue lifetime of wind turbine components depends on several factors that are highly variable, a numerical analysis tool called FAROW has been created to cast the problem of component fatigue life in a probabilistic framework. The probabilistic analysis is accomplished using methods of structural reliability (FORM/SORM). While the workings of the FAROW software package are defined in the user's manual, this theory manual outlines the mathematical basis. A deterministic solution for the time to failure is made possible by assuming analytical forms for the basic inputs of wind speed, stress response, and material resistance. Each parameter of the assumed forms for the inputs can be defined to be a random variable. The analytical framework is described and the solution for time to failure is derived.
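The deterministic fatigue-life core that such a probabilistic framework wraps is commonly Miner's-rule damage accumulation against an S-N curve. A hedged sketch of that standard idea (textbook relations and made-up constants, not FAROW's actual formulation):

```python
# Miner's rule: the component fails when accumulated damage
# D = sum(n_i / N_i) reaches 1, where N_i = C * S_i**(-m) is the
# S-N curve's cycles-to-failure at stress amplitude S_i.
# C and m here are illustrative constants, not FAROW inputs.

def cycles_to_failure(stress, C=1e12, m=3.0):
    """Cycles to failure at a given stress amplitude (S-N curve)."""
    return C * stress ** (-m)

def miners_damage(load_spectrum):
    """Accumulated damage for a list of (stress_amplitude, cycles)."""
    return sum(n / cycles_to_failure(s) for s, n in load_spectrum)

# A simple two-level load spectrum (hypothetical numbers):
spectrum = [(100.0, 2.5e5),   # N = 1e6 cycles to failure at S = 100
            (200.0, 6.25e4)]  # N = 1.25e5 cycles to failure at S = 200

damage = miners_damage(spectrum)
print(damage)  # 0.25 + 0.5 = 0.75: three-quarters of life consumed
```

In a FORM/SORM setting, parameters such as C, m, and the load spectrum become random variables, and the probability that the accumulated damage reaches 1 within the design life is what the reliability analysis estimates.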
Date: January 1, 2000
Partner: UNT Libraries Government Documents Department

Army National Guard (ARNG) Objective Supply Capability Adaptive Redesign (OSCAR) end-user manual

Description: The Objective Supply Capability Adaptive Redesign (OSCAR) project is designed to identify and develop programs which automate requirements not included in standard army systems. This includes providing automated interfaces between standard army systems at the National Guard Bureau (NGB) level and at the state/territory level. As part of the OSCAR project, custom software has been installed at NGB to streamline management of major end items. This software allows item managers to provide automated disposition on excess equipment to states operating the Standard Army Retail Supply System Objective (SARSS-O). It also accelerates movement of excess assets to improve the readiness of the Army National Guard (ARNG)--while reducing excess on hand. The purpose of the End-User Manual is to provide direction and guidance to the customer for implementing the ARNG Excess Management Program.
Date: December 1, 1997
Creator: Pelath, R.P. & Rasch, K.A.
Partner: UNT Libraries Government Documents Department

A summary of recent refinements to the WAKE dispersion model, a component of the HGSYSTEM/UF₆ model suite

Description: The original WAKE dispersion model, a component of the HGSYSTEM/UF₆ model suite, is based on Shell Research Ltd.'s HGSYSTEM Version 3.0 and was developed by the US Department of Energy for use in estimating downwind dispersion of materials due to accidental releases from gaseous diffusion plant (GDP) process buildings. The model is applicable to scenarios involving both ground-level and elevated releases into building wake cavities of non-reactive plumes that are either neutrally or positively buoyant. Over the two years since its creation, the WAKE model has been used to perform consequence analyses for Safety Analysis Reports (SARs) associated with gaseous diffusion plants in Portsmouth (PORTS), Paducah (PGDP), and Oak Ridge. These applications have identified the need for additional model capabilities (such as the treatment of complex terrain and time-variant releases) not present in the original utilities, which, in turn, has resulted in numerous modifications to these codes as well as the development of additional, stand-alone postprocessing utilities. Consequently, application of the model has become increasingly complex as the number of executable, input, and output files associated with a single model run has steadily grown. In response to these problems, a streamlined version of the WAKE model has been developed which integrates all calculations currently performed by the existing WAKE and the various post-processing utilities. This report summarizes the efforts involved in developing this revised version of the WAKE model.
Date: August 1, 1998
Creator: Yambert, M.W.; Lombardi, D.A.; Goode, W.D. Jr. & Bloom, S.G.
Partner: UNT Libraries Government Documents Department