
LSC Users Manual

Description: The Lower Hybrid Simulation Code (LSC) is a computational model of lower hybrid current drive in the presence of an electric field. Details of geometry, plasma profiles, and circuit equations are treated. Two-dimensional velocity space effects are approximated in a one-dimensional Fokker-Planck treatment. The LSC was originally written to be a module for lower hybrid current drive called by the Tokamak Simulation Code (TSC), which is a numerical model of an axisymmetric tokamak plasma and the associated control systems. The TSC simulates the time evolution of a free boundary plasma by solving the MHD equations on a rectangular computational grid. The MHD equations are coupled to the external circuits (representing poloidal field coils) through the boundary conditions. The code includes provisions for modeling the control system, external heating, and fusion heating. The LSC module can also be called by the TRANSP code. TRANSP represents the plasma with an axisymmetric, fixed-boundary model and focuses on calculation of plasma transport to determine transport coefficients from data on power inputs and parameters reached. This manual covers the basic material needed to use the LSC. If run in conjunction with TSC, the "TSC Users Manual" should be consulted. If run in conjunction with TRANSP, on-line documentation will be helpful. A theoretical background of the governing equations and numerical methods is given. Information on obtaining, compiling, and running the code is also provided.
Date: February 1, 1998
Creator: Redd, A.J. & Ignat, D.W.
Partner: UNT Libraries Government Documents Department
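
As an illustration of the one-dimensional Fokker-Planck treatment mentioned in this entry, the Python sketch below relaxes a 1-D velocity distribution under a simple model collision operator with an explicit finite-difference update. It is a hedged, generic sketch: the operator, grid, and all parameters are illustrative choices, not the LSC implementation.

    import numpy as np

    # Hedged illustration only (not the LSC algorithm): explicit finite-difference
    # relaxation of a 1-D velocity distribution f(v) under a model collision operator
    #   df/dt = d/dv [ nu * ( v*f + vth**2 * df/dv ) ],
    # whose steady state is a Maxwellian with thermal speed vth.
    nv, vmax = 201, 5.0
    v = np.linspace(-vmax, vmax, nv)
    dv = v[1] - v[0]
    nu, vth, dt, nsteps = 1.0, 1.0, 1e-4, 20000   # illustrative parameters

    f = np.exp(-((v - 1.5) / 0.7) ** 2)           # arbitrary shifted initial bump
    f /= f.sum() * dv                             # normalize to unit density

    for _ in range(nsteps):
        # collisional flux evaluated at the cell faces
        v_face = 0.5 * (v[1:] + v[:-1])
        f_face = 0.5 * (f[1:] + f[:-1])
        flux = nu * (v_face * f_face + vth**2 * (f[1:] - f[:-1]) / dv)
        f[1:-1] += dt * (flux[1:] - flux[:-1]) / dv   # zero-flux ends left untouched

    mean_v = (v * f).sum() / f.sum()
    print(f"mean velocity after relaxation: {mean_v:.3f} (drifts toward 0)")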

Shielding materials for high-energy neutrons

Description: The authors used the Monte Carlo transport code Los Alamos High-Energy Transport (LAHET) to study the shielding effectiveness of common shielding materials for high-energy neutrons. The source neutron spectrum was from the interaction of an 800-MeV proton beam with an iron target. At normal incidence, the neutrons struck walls made of six common shielding materials: water, concrete, iron, lead, polyethylene, and soil. The walls were of four different thicknesses: 25, 50, 75, and 100 cm. The authors then tallied the neutron spectra on the other side of the shielding wall and calculated the neutron doses. For the high-Z materials--iron and lead--they found that many neutrons with energies between 1 and 10 MeV are created when high-energy neutrons interact with the shielding material. For materials containing low-Z elements--water, soil, concrete, and polyethylene--the spectra show higher-energy peaks at about 100 MeV. The studies show that for a given wall thickness, concrete is more effective than the other materials. They also studied the effectiveness of combinations of materials, such as concrete and water, concrete and soil, iron and polyethylene, or iron, polyethylene, and concrete.
Date: May 1, 1997
Creator: Hsu, H.H.
Partner: UNT Libraries Government Documents Department
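
The dose figures described in this entry come from folding tallied neutron spectra with fluence-to-dose conversion coefficients. The sketch below illustrates only that folding step; the bin edges, fluences, and coefficients are made-up placeholders rather than LAHET tallies or published conversion tables.

    import numpy as np

    # Hedged illustration: convert a binned neutron fluence spectrum to dose by
    # folding it with fluence-to-dose conversion coefficients. All numbers are
    # invented; real work would use tallied LAHET spectra and published tables.
    e_bins  = np.array([1.0, 10.0, 100.0, 800.0])     # bin edges, MeV (illustrative)
    fluence = np.array([2.0e4, 5.0e3, 8.0e2])         # neutrons/cm^2 per bin (illustrative)
    coeff   = np.array([4.0e-10, 5.0e-10, 6.0e-10])   # Sv*cm^2 per neutron (illustrative)

    dose_per_bin = fluence * coeff                    # Sv contributed by each energy bin
    total_dose = dose_per_bin.sum()

    for lo, hi, d in zip(e_bins[:-1], e_bins[1:], dose_per_bin):
        print(f"{lo:6.1f}-{hi:6.1f} MeV : {d:.2e} Sv")
    print(f"total dose behind shield: {total_dose:.2e} Sv")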

LATIS modeling of laser induced midplane and backplane spallation

Description: The computer code LATIS is used to simulate midplane and backplane spallation resulting from short-pulse laser absorption. A 1-D planar geometry is simulated with an exponential laser absorption profile. The laser pulse length is assumed to be much shorter than the sound transit time across the laser absorption length. The boundary conditions are a fixed front plane and a free backplane (backplane spall), or a free front plane and a fixed midplane (midplane spall). The NBS/NRC equation of state for water is used with a self-consistent yet empirical material strength and failure model. The failure model includes the effects of void nucleation, growth, and coalescence. Definite signatures of the nucleation and coalescence thresholds are found in the back-surface motion for backplane spallation.
Date: March 5, 1997
Creator: Glinsky, M.E.; Bailey, D.S. & London, R.A.
Partner: UNT Libraries Government Documents Department
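
For context on reading a spall threshold from back-surface motion, the sketch below applies the simple acoustic "pullback" estimate, sigma_spall = 0.5 * rho * c0 * delta_u_fs, to a hypothetical free-surface velocity history. This is a hedged illustration only; it is far simpler than the nucleation-growth-coalescence failure model used in LATIS, and all numbers are placeholders.

    # Hedged illustration (not the LATIS failure model): acoustic estimate of
    # spall strength from the free-surface velocity "pullback",
    #   sigma_spall = 0.5 * rho * c0 * delta_u_fs,
    # where delta_u_fs is the drop from the peak free-surface velocity to the
    # first minimum. Values below are placeholders for a water-like material.
    rho = 1000.0       # density, kg/m^3 (illustrative)
    c0 = 1500.0        # sound speed, m/s (illustrative)
    u_peak = 120.0     # peak free-surface velocity, m/s (illustrative)
    u_min = 80.0       # velocity at the pullback minimum, m/s (illustrative)

    delta_u = u_peak - u_min
    sigma_spall = 0.5 * rho * c0 * delta_u      # Pa
    print(f"estimated spall strength: {sigma_spall / 1e6:.1f} MPa")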

Applications of evaluated nuclear data in the LAHET code

Description: We investigate the use of evaluated cross section data to define the nonelastic interaction rate for reactions described by the intranuclear cascade code in LAHET. We find that improved predictions of total neutron production within stopping-length target assemblies are obtained.
Date: August 1, 1997
Creator: Prael, R.E. & Chadwick, M.B.
Partner: UNT Libraries Government Documents Department

A generalized fitting technique for the LIFE2 fatigue analysis code

Description: The analysis of component fatigue lifetime for a wind energy conversion system (WECS) requires that the component load spectrum be formulated in terms of stress cycles. Typically, these stress cycles are obtained from time series data using a cycle identification scheme. As discussed by many authors, the matrix or matrices of cycle counts that describe the stresses on a turbine are constructed from relatively short, representative samples of time series data. The ability to correctly represent the long-term behavior of the distribution of stress cycles from these representative samples is critical to the analysis of service lifetimes. Several techniques are currently used to convert representative samples to the lifetime cyclic loads on the turbine. A set of fitting algorithms has recently been developed that is particularly useful for matching the body of the distribution of fatigue stress cycles on a turbine component. These fitting techniques have now been incorporated into the LIFE2 fatigue/fracture analysis code for wind turbines. In this paper, the authors provide an overview of the fitting algorithms and describe the pre- and post-count algorithms developed to permit their use in the LIFE2 code. Typical case studies are used to illustrate the use of the technique.
Date: August 1996
Creator: Sutherland, H. J. & Wilson, T.
Partner: UNT Libraries Government Documents Department
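
As a hedged illustration of fitting the body of a stress-cycle distribution from a short representative sample, the sketch below fits a two-parameter Weibull distribution to synthetic cycle amplitudes by the method of moments and uses the fit to extrapolate the count of rare large cycles. It is not the LIFE2 fitting algorithm; the data and the exceedance level are invented for illustration.

    import math
    import random

    # Hedged illustration (not the LIFE2 algorithm): method-of-moments fit of a
    # two-parameter Weibull distribution to counted cycle amplitudes, then use
    # of the fit to extrapolate the expected number of large cycles.
    random.seed(0)
    amplitudes = [random.weibullvariate(20.0, 2.2) for _ in range(2000)]  # synthetic cycles

    mean = sum(amplitudes) / len(amplitudes)
    var = sum((a - mean) ** 2 for a in amplitudes) / (len(amplitudes) - 1)
    cv2 = var / mean**2

    def cv2_of_shape(k):
        g1 = math.gamma(1.0 + 1.0 / k)
        g2 = math.gamma(1.0 + 2.0 / k)
        return g2 / g1**2 - 1.0

    # solve cv2_of_shape(k) = cv2 by bisection (the function decreases with k)
    lo, hi = 0.2, 20.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if cv2_of_shape(mid) > cv2:
            lo = mid
        else:
            hi = mid
    shape = 0.5 * (lo + hi)
    scale = mean / math.gamma(1.0 + 1.0 / shape)

    # probability that a counted cycle exceeds a high amplitude
    s_big = 60.0
    p_exceed = math.exp(-((s_big / scale) ** shape))
    print(f"fitted Weibull: shape={shape:.2f}, scale={scale:.2f}")
    print(f"expected cycles above {s_big} per 1e6 cycles: {1e6 * p_exceed:.1f}")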

An investigation of radon release and mobility in the subsurface environment. Final project report

Description: Processes affecting transport of volatile species in the shallow soil column have recently been recognized as having a substantial impact on a broad array of real-world problems. Investigations of volatile transport have ranged from studies of probable health impacts of radon infiltration into homes to pesticide and volatile organic contaminant mobility in the soil column. The objective of many of these studies has been the development of numerical models of vapor phase (and solute) transport in shallow soils. An early model, LEACHM, developed by Hutson and Wagenet, was recently modified to enable it to describe both solute and vapor phase transport of volatile chemicals in the soil. Subsequent tests of the latter model, named LEACHV, showed that use outside of a very restricted range of soil conditions resulted in large mass-balance errors and unreasonable values for soil gas concentrations and vapor flux. The present research was undertaken in an effort to identify and correct the subroutines responsible for the problems and to allow the model to describe vapor phase transport in a much broader range of soil conditions.
Date: June 1, 1997
Creator: Thomas, D.
Partner: UNT Libraries Government Documents Department
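
Mass-balance error is the diagnostic cited in this entry, so the sketch below shows a generic, conservative explicit finite-difference step for 1-D gas-phase diffusion in a soil column together with a running mass-balance check. It is a hedged illustration only, not LEACHM or LEACHV code, and the column depth, diffusivity, and time step are placeholders.

    import numpy as np

    # Hedged illustration (not LEACHM/LEACHV): explicit 1-D diffusion of a soil-gas
    # concentration profile with closed (no-flux) ends, plus the kind of running
    # mass-balance check used to detect a flawed transport scheme.
    nz, depth = 100, 1.0                 # cells, column depth in m (illustrative)
    dz = depth / nz
    D, dt, nsteps = 1.0e-6, 25.0, 4000   # effective diffusivity m^2/s, time step s (illustrative)

    c = np.zeros(nz)
    c[:10] = 1.0                         # unit concentration near the surface initially
    mass0 = c.sum() * dz

    for _ in range(nsteps):
        flux = -D * (c[1:] - c[:-1]) / dz           # Fickian flux at interior faces
        c[1:-1] -= dt * (flux[1:] - flux[:-1]) / dz
        c[0]  -= dt * flux[0] / dz                  # no-flux ends: only interior faces move mass
        c[-1] += dt * flux[-1] / dz

    mass = c.sum() * dz
    print(f"relative mass-balance error: {abs(mass - mass0) / mass0:.2e}")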

Linux support at Fermilab

Description: In January of 1998 Fermilab issued an official statement of support for the Linux operating system. This was the result of a groundswell of interest in the possibilities of a cheap, easily used platform for computation and analysis, culminating in the successful demonstration of a small computation farm as reported at CHEP97. This paper describes the current status of Linux support and deployment at Fermilab. The collaborative development process for Linux creates some problems for traditional support models. A primary example is that there is no single definitive OS distribution, such as the CD distribution available from a traditional Unix vendor. For this reason, Fermilab has had to make a more definite statement about what is meant by Linux. Linux support at Fermilab is restricted to the Intel processor platform. A central distribution system has been created to mitigate problems with multiple distribution and configuration options. This system is based on the Red Hat distribution with the Fermi Unix Environment (FUE) layered above it. Deployment of Linux at the lab has been growing rapidly, and by CHEP hundreds of machines are expected to be running Linux. These include computational farms, trigger processing farms, and desktop workstations. The former groups are described in other talks and consist of clusters of many tens of very similar machines devoted to a few tasks. The latter group is more diverse and challenging. The user community has been very supportive and active in defining needs for Linux features and solving various compatibility issues. We will discuss the support arrangements currently in place.
Date: December 1, 1998
Creator: Yocum, D.R.; Sieh, C.; Skow, D.; Kovich, S.; Holmgren, D. & Kennedy, R.
Partner: UNT Libraries Government Documents Department

LAGRANGIAN PARTICLE DISPERSION MODEL (LPDM) TECHNICAL DESCRIPTION (U)

Description: The Savannah River National Laboratory (SRNL) uses the Lagrangian Particle Dispersion Model (LPDM) in conjunction with the Regional Atmospheric Modeling System (RAMS) as an operational tool for emergency response consequence assessments for the Savannah River Site (SRS). The LPDM is an advanced stochastic atmospheric transport model used to transport and disperse passive tracers, released from sources of varying number and shape, subject to the meteorological fields generated by RAMS. The Atmospheric Technologies Group (ATG) of the SRNL is undertaking the task of reviewing documentation and code for LPDM Quality Assurance (QA). The LPDM QA task will include a model technical description, computer coding descriptions, model applications, and configuration control. This report provides a comprehensive technical description of the LPDM model.
Date: July 20, 2006
Creator: Chen, K
Partner: UNT Libraries Government Documents Department
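
As a hedged illustration of the Lagrangian stochastic approach described in this entry, the sketch below advects particles with a resolved wind and adds a random displacement representing turbulent diffusion. It is not the SRNL LPDM; the wind function stands in for the RAMS-supplied meteorological field, and the diffusivity and release parameters are placeholders.

    import numpy as np

    # Hedged sketch of the generic Lagrangian particle dispersion idea (not the
    # SRNL LPDM): each particle is advected by the resolved wind and given a
    # random displacement representing unresolved turbulent diffusion,
    #   x(t+dt) = x(t) + u(x) * dt + sqrt(2 * K * dt) * N(0, 1).
    rng = np.random.default_rng(42)
    n_particles, dt, nsteps = 10_000, 60.0, 120   # 2-hour release at 60 s steps (illustrative)
    K = 50.0                                      # eddy diffusivity, m^2/s (illustrative)

    def wind(x, y):
        """Placeholder for the meteorological wind field: uniform 5 m/s toward +x."""
        return 5.0, 0.0

    x = np.zeros(n_particles)                     # all particles released at the origin
    y = np.zeros(n_particles)
    for _ in range(nsteps):
        u, v = wind(x, y)
        x += u * dt + np.sqrt(2.0 * K * dt) * rng.standard_normal(n_particles)
        y += v * dt + np.sqrt(2.0 * K * dt) * rng.standard_normal(n_particles)

    print(f"plume center: ({x.mean():.0f} m, {y.mean():.0f} m), spread (std): {x.std():.0f} m")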

LADTAP-PROB: A PROBABILISTIC MODEL TO ASSESS RADIOLOGICAL CONSEQUENCES FROM LIQUID RADIOACTIVE RELEASES

Description: The potential radiological consequences to humans resulting from aqueous releases at the Savannah River Site (SRS) have usually been assessed using the computer code LADTAP or deterministic variations of this code. Over the years, LADTAP has advanced through LADTAP II (a computer program that still resides on the mainframe at SRS) [1], LADTAP XL© (a Microsoft Excel® spreadsheet) [2], and other versions specific to SRS areas such as [3]. The spreadsheet variations of LADTAP contain two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including ingestion of water and fish and external exposure resulting from recreational activities. IRRIDOSE estimates potential dose to individuals from irrigation of food crops with contaminated water. A new version of this deterministic methodology, LADTAP-PROB, was developed at Savannah River National Laboratory (SRNL) to (1) consider the complete range of the model parameter values (not just maximum or mean values), (2) determine the influence of parameter uncertainties within the LADTAP methodology, (3) perform a sensitivity analysis of all model parameters (to identify the input parameters to which the model results are most sensitive), and (4) probabilistically assess radiological consequences from contaminated water. This study presents the methodology applied in LADTAP-PROB.
Date: January 26, 2009
Creator: Farfan, E.; Foley, T. & Jannik, T.
Partner: UNT Libraries Government Documents Department
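
The probabilistic approach described in this entry can be illustrated with a small Monte Carlo sketch: sample uncertain inputs from assumed distributions, propagate them through a simple ingestion-dose expression, and rank the inputs by their correlation with the resulting dose. The pathway equation, distributions, and parameter values below are placeholders, not the LADTAP-PROB model.

    import numpy as np

    # Hedged illustration of a probabilistic dose assessment (not LADTAP-PROB):
    # sample uncertain inputs, propagate them through a placeholder drinking-water
    # ingestion-dose expression, and rank inputs by correlation with the dose.
    rng = np.random.default_rng(1)
    n = 100_000

    conc   = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)      # Bq/L in water (illustrative)
    intake = rng.triangular(1.0, 2.0, 3.0, size=n)                   # L/day consumed (illustrative)
    dcf    = rng.lognormal(mean=np.log(1.8e-8), sigma=0.3, size=n)   # Sv/Bq dose coefficient (illustrative)

    dose = conc * intake * dcf * 365.0      # Sv/year from this single pathway

    print(f"mean dose: {dose.mean():.2e} Sv/y, 95th percentile: {np.percentile(dose, 95):.2e} Sv/y")
    for name, s in [("concentration", conc), ("intake rate", intake), ("dose coefficient", dcf)]:
        r = np.corrcoef(s, dose)[0, 1]      # crude sensitivity ranking
        print(f"correlation of {name} with dose: {r:+.2f}")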

Understanding Lustre Internals

Description: Lustre was initiated and funded, almost a decade ago, by the U.S. Department of Energy (DOE) Office of Science and National Nuclear Security Administration laboratories to address the need for an open-source, highly scalable, high-performance parallel filesystem on then-present and future supercomputing platforms. Throughout the last decade, it has been deployed over numerous medium- to large-scale supercomputing platforms and clusters, and it has performed well and met the expectations of the Lustre user community. As of this writing, according to the Top500 list, 15 of the top 30 supercomputers in the world use the Lustre filesystem. This report aims to present a streamlined overview of how Lustre works internally, in reasonable detail, including the relevant data structures, APIs, protocols, and algorithms in the Lustre version 1.6 source code base. More importantly, it tries to explain how the various components interconnect with each other and function as a system. Portions of this report are based on discussions with Oak Ridge National Laboratory Lustre Center of Excellence team members, and portions are based on our own understanding of how the code works. We, as the author team, bear all responsibility for errors and omissions in this document. We can only hope it helps current and future Lustre users and Lustre code developers as much as it helped us understand the Lustre source code and its internal workings.
Date: April 1, 2009
Creator: Wang, Feiyi; Oral, H Sarp; Shipman, Galen M; Drokin, Oleg; Wang, Di & Huang, He
Partner: UNT Libraries Government Documents Department

The SANDmath package.

Description: This is basic documentation explaining how to use the SANDmath macros in a LaTeX 2ε document based on the SANDreport class.
Date: August 1, 2004
Creator: Pébay, Philippe Pierre
Partner: UNT Libraries Government Documents Department

Coupled light transport-heat diffusion model for laser dosimetry with dynamic optical properties

Description: The effect of dynamic optical properties on the spatial distribution of light in laser therapy is studied via numerical simulations. A two-dimensional, time-dependent computer program called LATIS is used. Laser light transport is simulated with a Monte Carlo technique including anisotropic scattering and absorption. Thermal heat transport is calculated with a finite-difference algorithm. Material properties are specified on a 2-D mesh and can be arbitrary functions of space and time. Arrhenius rate equations are solved for tissue damage caused by elevated temperatures. Optical properties are functions of tissue damage, as determined by previous measurements. Results are presented for the time variation of the light distribution and damage within the tissue as the optical properties of the tissue are altered.
Date: March 1, 1995
Creator: London, R.A.; Glinsky, M.E.; Zimmerman, G.B.; Eder, D.C. & Jacques, S.L.
Partner: UNT Libraries Government Documents Department
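
The Arrhenius rate equation mentioned in this entry accumulates thermal damage as an integral of A*exp(-Ea/(R*T(t))) over the temperature history. The sketch below evaluates that integral for a hypothetical temperature ramp; the rate coefficients are commonly used illustrative values for soft tissue, not the coefficients used in LATIS.

    import math

    # Hedged illustration of the Arrhenius thermal-damage integral used in laser
    # dosimetry models:  Omega(t) = integral of A * exp(-Ea / (R * T(t'))) dt'.
    # Omega ~ 1 is commonly read as the damage threshold. The temperature history
    # and rate coefficients below are illustrative, not LATIS values.
    A  = 3.1e98     # frequency factor, 1/s (illustrative)
    Ea = 6.28e5     # activation energy, J/mol (illustrative)
    R  = 8.314      # gas constant, J/(mol K)

    def temperature(t):
        """Placeholder temperature history: ramp from about 37 C to about 60 C over 1 s, then hold."""
        return 310.0 + min(t, 1.0) * 23.0

    dt, t_end = 1e-3, 5.0
    omega, t = 0.0, 0.0
    while t < t_end:
        omega += A * math.exp(-Ea / (R * temperature(t))) * dt
        t += dt

    print(f"accumulated damage integral Omega = {omega:.3f} (Omega ~ 1 means threshold reached)")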

Molecular Simulation of Reacting Systems

Description: The final report for a Laboratory Directed Research and Development project entitled "Molecular Simulation of Reacting Systems" is presented. It describes efforts to incorporate chemical reaction events into the LAMMPS massively parallel molecular dynamics code. This was accomplished using a scheme in which several classes of reactions are allowed to occur in a probabilistic fashion at specified times during the MD simulation. Three classes of reaction were implemented: addition, chain transfer, and scission. A fully parallel implementation was achieved using a checkerboarding scheme, which avoids conflicts due to reactions occurring on neighboring processors. The observed chemical evolution is independent of the number of processors used. The code was applied to two test applications: irreversible linear polymerization and thermal degradation chemistry.
Date: March 1, 2002
Creator: Thompson, Aidan P.
Partner: UNT Libraries Government Documents Department
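
As a hedged, serial illustration of probabilistic reaction events, the sketch below performs a single "addition" sweep: every unreacted pair of monomers closer than a capture radius bonds with a fixed probability. It is a toy model, not the LAMMPS implementation, and it ignores the parallel checkerboarding scheme described in this entry.

    import random

    # Hedged toy illustration (not the LAMMPS implementation) of a probabilistic
    # "addition" reaction sweep: at a prescribed time, every unreacted pair of
    # monomers closer than a capture radius forms a bond with probability p.
    random.seed(3)
    n, box, r_capture, p_react = 200, 10.0, 0.6, 0.5   # illustrative parameters

    positions = [(random.uniform(0, box), random.uniform(0, box), random.uniform(0, box))
                 for _ in range(n)]
    bonded = set()        # monomers already consumed by an addition reaction
    bonds = []

    def dist2(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3))

    for i in range(n):
        for j in range(i + 1, n):
            if i in bonded or j in bonded:
                continue
            if dist2(positions[i], positions[j]) < r_capture ** 2 and random.random() < p_react:
                bonds.append((i, j))
                bonded.update((i, j))

    print(f"reaction sweep created {len(bonds)} new bonds out of {n} monomers")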

Damage estimates from long-term structural analysis of a wind turbine in a US wind farm environment

Description: Time-domain simulations of the loads on wind energy conversion systems have been hampered in the past by the relatively long computational times for nonlinear structural analysis codes. However, recent advances in both the level of sophistication and computational efficiency of available computer hardware and the codes themselves now permit long-term simulations to be conducted in reasonable times. Thus, these codes provide a unique capability to evaluate the spectral content of the fatigue loads on a turbine. To demonstrate these capabilities, a Micon 65/13 turbine is analyzed using the YawDyn and the ADAMS dynamic analysis codes. The SNLWIND-3D simulator and measured boundary conditions are used to simulate the inflow environment that can be expected during a single, 24-hour period by a turbine residing in Row 41 of a wind farm located in San Gorgonio Pass, California. Also, long-term simulations (up to 8 hours of simulated time) with constant average inflow velocities are used to better define the characteristics of the fatigue load on the turbine. Damage calculations, using the LIFE2 fatigue analysis code and the MSU/DOE fatigue data base for composite materials, are then used to determine minimum simulation times for consistent estimates of service lifetimes.
Date: October 1, 1996
Creator: Kelley, N.D. & Sutherland, H.J.
Partner: UNT Libraries Government Documents Department
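
The damage calculations referenced in this entry reduce to a Miner's-rule sum over counted stress cycles, D = sum_i n_i / N(S_i), with N(S) taken from a material S-N curve. The sketch below shows that bookkeeping with a placeholder power-law S-N curve; the coefficients and cycle counts are invented, not values from the MSU/DOE database or the LIFE2 code.

    # Hedged illustration of a Miner's-rule damage sum (not the LIFE2 code):
    #   D = sum_i  n_i / N(S_i),  with N(S) = C * S**(-m) as a placeholder S-N curve.
    # Service life is estimated as (simulated exposure time) / D.
    C, m = 5.0e22, 9.0          # illustrative S-N coefficients
    hours_simulated = 24.0

    # (stress amplitude in MPa, counted cycles in the 24-hour simulation), all illustrative
    cycle_table = [(10.0, 5.0e5), (20.0, 6.0e4), (40.0, 3.0e3), (80.0, 40.0)]

    damage = sum(n / (C * s ** (-m)) for s, n in cycle_table)
    life_hours = hours_simulated / damage
    print(f"damage per simulated day: {damage:.3e}")
    print(f"implied service life: {life_hours / 8766.0:.1f} years")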

Lilith: A scalable secure tool for massively parallel distributed computing

Description: Changes in high performance computing have necessitated the ability to utilize and interrogate potentially many thousands of processors. The ASCI (Accelerated Strategic Computing Initiative) program conducted by the United States Department of Energy, for example, envisions thousands of distinct operating systems connected by low-latency gigabit-per-second networks. In addition, multiple systems of this kind will be linked via high-capacity networks with latencies as low as the speed of light will allow. Code which spans systems of this sort must be scalable; yet constructing such code, whether for applications, debugging, or maintenance, is an unsolved problem. Lilith is a research software platform that attempts to answer these questions with an eye toward meeting these needs. Presently, Lilith exists as a test-bed, written in Java, for various spanning algorithms and security schemes. The test-bed software has, and enforces, hooks allowing implementation and testing of various security schemes.
Date: June 1, 1997
Creator: Armstrong, R.C.; Camp, L.J.; Evensky, D.A. & Gentile, A.C.
Partner: UNT Libraries Government Documents Department
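
To illustrate why a spanning structure matters at this scale, the sketch below organizes nodes into a k-ary spanning tree so that a command fans out in a logarithmic number of hops rather than the root contacting every node directly. It is written in Python for brevity and is a generic illustration, not Lilith's spanning algorithm or its Java test-bed.

    import math

    # Hedged illustration (not Lilith's algorithm): organize N nodes into an
    # implicit k-ary spanning tree so a command reaches everyone in O(log_k N)
    # hops instead of the root contacting all N nodes itself.
    def children(rank, k, n):
        """Ranks that node `rank` forwards to in an implicit k-ary tree of n nodes."""
        return [c for c in range(rank * k + 1, rank * k + k + 1) if c < n]

    n_nodes, k = 4096, 8
    depth = math.ceil(math.log(n_nodes * (k - 1) + 1, k)) - 1   # levels below the root
    print(f"{n_nodes} nodes, fan-out {k}: root reaches everyone in {depth} hops")
    print("root forwards to:", children(0, k, n_nodes))
    print("node 5 forwards to:", children(5, k, n_nodes))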

Measurement of neutron attenuation through thick shields and comparison with calculation

Description: The large neutrino experiments conducted over the last several years at the Los Alamos Neutron Science Center (LANSCE) have provided the opportunity to measure the effects of neutron attenuation in very thick shields. These experiments have featured detectors with active masses of 6 to 150 tons and shield thicknesses ranging from 3000 to 5280 g/cm². An absolute measurement of the high-energy neutron flux from the beam stop was made in a neutrino cave at ninety degrees and nine meters from the beam stop. Differential neutron shielding measurements in iron were also performed, resulting in an attenuation length of 148 g/cm². These measurements allow for the testing of radiation shielding codes for deep-penetration problems. The measured flux and attenuation length are compared to calculations using the LAHET Code System (LCS). These codes incorporate biasing techniques, allowing for direct calculation of deep-penetration shielding problems. Calculations of the neutron current and attenuation length are presented and compared with measured values. Results from the shielding codes show good agreement with the measured values.
Date: December 31, 1998
Creator: Bull, J.S.; Donahue, J.B. & Burman, R.L.
Partner: UNT Libraries Government Documents Department
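
Given the measured attenuation length of 148 g/cm² quoted in this entry, a first-order estimate of transmission through a thick shield is simple exponential attenuation. The sketch below evaluates that factor over the range of shield thicknesses mentioned; the unit source term is a placeholder, and a real deep-penetration calculation would be done with LCS and its biasing techniques.

    import math

    # Simple exponential-attenuation estimate using the attenuation length
    # measured in iron (148 g/cm^2, from the abstract). The unshielded current
    # is a placeholder; real deep-penetration problems would be run with LCS.
    atten_length = 148.0     # g/cm^2, measured value quoted above
    phi0 = 1.0               # relative high-energy neutron current entering the shield

    for thickness in (3000.0, 4000.0, 5280.0):   # g/cm^2, spanning the shields described
        phi = phi0 * math.exp(-thickness / atten_length)
        print(f"{thickness:6.0f} g/cm^2 of shielding: transmission factor {phi:.2e}")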

Parallel unconstrained minimization of potential energy in LAMMPS

Description: This report describes a new minimization capability added to LAMMPS V4.0. Minimization of potential energy is used to find molecular conformations that are close to structures found in nature. The new minimization algorithm uses LAMMPS subroutines for calculating energy and force vectors, and follows the LAMMPS partitioning scheme for distributing large data objects on multiprocessor machines. Since gradient-based algorithms cannot tolerate nonsmoothness, a new Coulomb style that smoothly cuts off to zero at a finite distance is provided. This report explains the minimization algorithm and its parallel implementation within LAMMPS. Guidelines are given for invoking the algorithm and interpreting results.
Date: October 13, 1997
Creator: Plantenga, T.
Partner: UNT Libraries Government Documents Department
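
The abstract notes that gradient-based minimizers require a Coulomb interaction that goes smoothly to zero at the cutoff. The sketch below shows one common way to achieve that, a shifted-force form whose energy and force both vanish at the cutoff radius; it illustrates the idea and is not the specific Coulomb style added to LAMMPS.

    # Hedged illustration (not the specific LAMMPS Coulomb style): a shifted-force
    # Coulomb interaction whose energy AND force go smoothly to zero at the
    # cutoff r_c, which is what a gradient-based minimizer needs:
    #   E(r) = k*q1*q2 * ( 1/r - 1/rc + (r - rc)/rc**2 )   for r < rc, else 0
    #   F(r) = k*q1*q2 * ( 1/r**2 - 1/rc**2 )              for r < rc, else 0
    def coulomb_shifted_force(r, q1, q2, rc, k=1.0):
        """Return (energy, force magnitude) for the smoothly truncated pair interaction."""
        if r >= rc:
            return 0.0, 0.0
        e = k * q1 * q2 * (1.0 / r - 1.0 / rc + (r - rc) / rc**2)
        f = k * q1 * q2 * (1.0 / r**2 - 1.0 / rc**2)
        return e, f

    rc = 10.0
    for r in (2.0, 5.0, 9.99, 10.0):
        e, f = coulomb_shifted_force(r, 1.0, -1.0, rc)
        print(f"r = {r:5.2f}:  E = {e:+.6f}   F = {f:+.6f}")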