36 Matching Results


Modeling of Cation Binding in Hydrated 2:1 Clay Minerals - Final Report

Description: Hydrated 2:1 clay minerals are high surface area, layered silicates that play a unique role in determining the fate of radionuclides in the environment. This project consisted of developing and implementing computer simulation methods for molecular characterization of the swelling and ion exchange properties of hydrated 2:1 clay minerals, and the subsequent analysis and theoretical modeling with a view toward improving contaminant transport modeling as well as soil remediation and radionuclide containment strategies. Project results included the (a) development of simulation methods to treat clays under environmentally relevant conditions of variable water vapor pressure; (b) calculation of clay swelling thermodynamics as a function of interlayer ion size and charge (calculated quantities include immersion energies, free energies, and entropies of swelling); and (c) calculation of ion exchange free energies, including contributions from changing interlayer water contents and layer spacing.
Date: September 14, 2000
Creator: Smith, David E.
Partner: UNT Libraries Government Documents Department

Initial Results in the Use of Prony Methods to Determine the Damping and Modal Composition of Power System Dynamic Response Signals.

Description: Prony analysis is an emerging method that extends Fourier analysis by directly estimating the frequency, damping, strength, and relative phase of modal components present in a given signal. This is precisely the kind of information that power system engineers would like to extract from transient stability program (TSP) simulations and from large-scale system tests or disturbances. A tool of this sort would be particularly valuable for TSP output analysis, where it promises to provide: parametric summaries for damping studies (data compression), quantified information for adjusting remedial controls (sensitivity analysis and performance evaluation), insight into modal interaction mechanisms (modal analysis), and reduced simulation times for damping evaluation (prediction). These considerations led BPA to produce the interactive FORTRAN programs TRANSCIENT and DTRANSCIENT. The objectives are to evaluate the method, to revise the code for utility applications, and to fortify both for use with larger models. Polynomial rooting, a critical and numerically demanding task, is now accomplished by a routine (QPOLY) that was extracted from the NASA program SAMSAN and converted to quadruple precision. The revised DTRANSCIENT is now accessed as a subroutine, PRSPAK. For batch use PRSPAK has been converted to a more comprehensive program, SIGPAKZ. This report presents early results in the application of Prony analysis to power system problems. Key objectives are to: provide a brief mathematical description of Prony analysis, report on progress in applying and evaluating SIGPAKZ, and outline the development status of the Prony code itself and needed enhancements to it. 21 refs., 12 figs.
Date: October 1, 1988
Creator: Hauer, John F.
Partner: UNT Libraries Government Documents Department
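The Prony fit described in the abstract above is, at its core, a few lines of linear algebra. The following minimal sketch (in Python, not the BPA FORTRAN codes TRANSCIENT/SIGPAKZ) fits a linear-prediction model, roots the characteristic polynomial (the numerically demanding step the report delegates to QPOLY), and recovers damping, frequency, strength, and relative phase:

```python
import numpy as np

def prony(signal, dt, order):
    """Estimate modal damping/frequency (and complex amplitudes) of a
    uniformly sampled signal via Prony's linear-prediction method."""
    n = len(signal)
    # 1. Linear prediction: each sample is a weighted sum of the
    #    preceding `order` samples.
    A = np.array([[signal[m - 1 - k] for k in range(order)]
                  for m in range(order, n)])
    lp = np.linalg.lstsq(A, signal[order:], rcond=None)[0]
    # 2. Root the characteristic polynomial z**p - lp0*z**(p-1) - ...
    z = np.roots(np.concatenate(([1.0], -lp))).astype(complex)
    # 3. Map discrete-time roots to continuous s = sigma + j*omega
    #    (sigma = damping, omega = angular frequency).
    s = np.log(z) / dt
    # 4. Least-squares fit of complex amplitudes (strength and phase).
    V = np.power.outer(z, np.arange(n)).T    # V[m, k] = z_k**m
    amps = np.linalg.lstsq(V, signal.astype(complex), rcond=None)[0]
    return s, amps
```

For a noiseless damped cosine, an order-2 fit recovers the damping constant and frequency essentially exactly; real TSP output requires model-order selection and noise handling beyond this sketch.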

Agent review phase one report.

Description: This report summarizes the findings for phase one of the agent review and discusses the review methods and results. The phase one review identified a short list of agent systems that would prove most useful in the service architecture of an information management, analysis, and retrieval system. Reviewers evaluated open-source and commercial multi-agent systems and scored them based upon viability, uniqueness, ease of development, ease of deployment, and ease of integration with other products. Based on these criteria, reviewers identified the ten most appropriate systems. The report also mentions several systems that reviewers deemed noteworthy for the ideas they implement, even if those systems are not the best choices for information management purposes.
Date: December 1, 2009
Creator: Zubelewicz, Alex Tadeusz; Davis, Christopher Edward & Bauer, Travis LaDell
Partner: UNT Libraries Government Documents Department

Application of the DG-1199 methodology to the ESBWR and ABWR.

Description: Appendix A-5 of Draft Regulatory Guide DG-1199 'Alternative Radiological Source Term for Evaluating Design Basis Accidents at Nuclear Power Reactors' provides guidance - applicable to RADTRAD MSIV leakage models - for scaling containment aerosol concentration to the expected steam dome concentration in order to preserve the simplified use of the Accident Source Term (AST) in assessing containment performance under assumed design basis accident (DBA) conditions. In this study Economic and Safe Boiling Water Reactor (ESBWR) and Advanced Boiling Water Reactor (ABWR) RADTRAD models are developed using the DG-1199, Appendix A-5 guidance. The models were run using RADTRAD v3.03. Low Population Zone (LPZ), control room (CR), and worst-case 2-hr Exclusion Area Boundary (EAB) doses were calculated and compared to the relevant accident dose criteria in 10 CFR 50.67. For the ESBWR, the dose results were all lower than the MSIV leakage doses calculated by General Electric/Hitachi (GEH) in their licensing technical report. There are no comparable ABWR MSIV leakage doses; however, the ABWR doses are lower than the ESBWR doses. In addition, sensitivity cases were evaluated to ascertain the influence/importance of key input parameters/features of the models.
Date: September 1, 2010
Creator: Kalinich, Donald A.; Gauntt, Randall O. & Walton, Fotini
Partner: UNT Libraries Government Documents Department

Parallel algorithm strategies for circuit simulation.

Description: Circuit simulation tools (e.g., SPICE) have become invaluable in the development and design of electronic circuits. However, they have been pushed to their performance limits in addressing circuit design challenges that come from the technology drivers of smaller feature scales and higher integration. Improving the performance of circuit simulation tools through exploiting new opportunities in widely-available multi-processor architectures is a logical next step. Unfortunately, not all traditional simulation applications are inherently parallel, and quickly adapting mature application codes (even codes designed as parallel applications) to new parallel paradigms can be prohibitively difficult. In general, performance is influenced by many choices: hardware platform, runtime environment, languages and compilers used, algorithm choice and implementation, and more. In this complicated environment, the use of mini-applications, small self-contained proxies for real applications, is an excellent approach for rapidly exploring the parameter space of all these choices. In this report we present a multi-core performance study of Xyce, a transistor-level circuit simulation tool, and describe the future development of a mini-application for circuit simulation.
Date: January 1, 2010
Creator: Thornquist, Heidi K.; Schiek, Richard Louis & Keiter, Eric Richard
Partner: UNT Libraries Government Documents Department

Hydrocarbon characterization experiments in fully turbulent fires.

Description: As the capabilities of numerical simulations increase, decision makers are increasingly relying upon simulations rather than experiments to assess risks across a wide variety of accident scenarios including fires. There are still, however, many aspects of fires that are either not well understood or are difficult to treat from first principles due to the computational expense. For a simulation to be truly predictive and to provide decision makers with information which can be reliably used for risk assessment the remaining physical processes must be studied and suitable models developed for the effects of the physics. The model for the fuel evaporation rate in a liquid fuel pool fire is significant because in well-ventilated fires the evaporation rate largely controls the total heat release rate from the fire. A set of experiments are outlined in this report which will provide data for the development and validation of models for the fuel regression rates in liquid hydrocarbon fuel fires. The experiments will be performed on fires in the fully turbulent scale range (> 1 m diameter) and with a number of hydrocarbon fuels ranging from lightly sooting to heavily sooting. The importance of spectral absorption in the liquid fuels and the vapor dome above the pool will be investigated and the total heat flux to the pool surface will be measured. The importance of convection within the liquid fuel will be assessed by restricting large scale liquid motion in some tests. These data sets will provide a sound, experimentally proven basis for assessing how much of the liquid fuel needs to be modeled to enable a predictive simulation of a fuel fire given the couplings between evaporation of fuel from the pool and the heat release from the fire which drives the evaporation.
Date: May 1, 2007
Creator: Ricks, Allen & Blanchat, Thomas K.
Partner: UNT Libraries Government Documents Department

Advanced engineering environment collaboration project.

Description: The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer aided design tool) and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.
Date: December 1, 2008
Creator: Lamph, Jane Ann; Pomplun, Alan R.; Kiba, Grant W.; Dutra, Edward G.; Dankiewicz, Robert J. & Marburger, Scot J.
Partner: UNT Libraries Government Documents Department

On the need and use of models to explore the role of economic confidence: a survey.

Description: Empirical studies suggest that consumption is more sensitive to current income than suggested under the permanent income hypothesis, which raises questions regarding expectations for future income, risk aversion, and the role of economic confidence measures. This report surveys a body of fundamental economic literature as well as burgeoning computational modeling methods to support efforts to better anticipate cascading economic responses to terrorist threats and attacks. This is a three part survey to support the incorporation of models of economic confidence into agent-based microeconomic simulations. We first review broad underlying economic principles related to this topic. We then review the economic principle of confidence and related empirical studies. Finally, we provide a brief survey of efforts and publications related to agent-based economic simulation.
Date: April 1, 2005
Creator: Sprigg, James A.; Paez, Paul J. (University of New Mexico, Albuquerque, NM) & Hand, Michael S. (University of New Mexico, Albuquerque, NM)
Partner: UNT Libraries Government Documents Department

Full employment and competition in the Aspen economic model: implications for modeling acts of terrorism.

Description: Acts of terrorism could have a range of broad impacts on an economy, including changes in consumer (or demand) confidence and the ability of productive sectors to respond to changes. As a first step toward a model of terrorism-based impacts, we develop here a model of production and employment that characterizes dynamics in ways useful toward understanding how terrorism-based shocks could propagate through the economy; subsequent models will introduce the role of savings and investment into the economy. We use Aspen, a powerful economic modeling tool developed at Sandia, to demonstrate for validation purposes that a single-firm economy converges to the known monopoly equilibrium price, output, and employment levels, while multiple-firm economies converge toward the competitive equilibria typified by lower prices and higher output and employment. However, we find that competition also leads to churn by consumers seeking lower prices, making it difficult for firms to optimize with respect to wages, prices, and employment levels. Thus, competitive firms generate market ''noise'' in the steady state as they search for prices and employment levels that will maximize profits. In the context of this model, not only could terrorism depress overall consumer confidence and economic activity but terrorist acts could also cause normal short-run dynamics to be misinterpreted by consumers as a faltering economy.
Date: November 1, 2004
Creator: Sprigg, James A. & Ehlen, Mark Andrew
Partner: UNT Libraries Government Documents Department
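The monopoly and competitive equilibria that the Aspen runs above converge toward are standard textbook benchmarks. A minimal sketch, assuming linear inverse demand P = a - b*Q and constant marginal cost c (the abstract does not state Aspen's functional forms, and Aspen itself is agent-based rather than closed-form), reproduces the qualitative validation result: competition lowers price and raises output.

```python
def monopoly_equilibrium(a, b, c):
    """Single firm facing inverse demand P = a - b*Q with constant
    marginal cost c: profit (P - c)*Q is maximized at Q = (a - c)/(2b)."""
    q = (a - c) / (2 * b)
    return q, a - b * q

def competitive_equilibrium(a, b, c):
    """Many price-taking firms: entry drives price down to marginal
    cost, so P = c and Q = (a - c)/b."""
    q = (a - c) / b
    return q, a - b * q

# With a = 100, b = 1, c = 20: the monopolist sells 40 units at price
# 60, while competition yields 80 units at price 20.
```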

Exploration of new multivariate spectral calibration algorithms.

Description: A variety of multivariate calibration algorithms for quantitative spectral analyses were investigated and compared, and new algorithms were developed in the course of this Laboratory Directed Research and Development project. We were able to demonstrate the ability of the hybrid classical least squares/partial least squares (CLSIPLS) calibration algorithms to maintain calibrations in the presence of spectrometer drift and to transfer calibrations between spectrometers from the same or different manufacturers. These methods were found to be as good or better in prediction ability as the commonly used partial least squares (PLS) method. We also present the theory for an entirely new class of algorithms labeled augmented classical least squares (ACLS) methods. New factor selection methods are developed and described for the ACLS algorithms. These factor selection methods are demonstrated using near-infrared spectra collected from a system of dilute aqueous solutions. The ACLS algorithm is also shown to provide improved ease of use and better prediction ability than PLS when transferring calibrations between near-infrared calibrations from the same manufacturer. Finally, simulations incorporating either ideal or realistic errors in the spectra were used to compare the prediction abilities of the new ACLS algorithm with that of PLS. We found that in the presence of realistic errors with non-uniform spectral error variance across spectral channels or with spectral errors correlated between frequency channels, ACLS methods generally out-performed the more commonly used PLS method. These results demonstrate the need for realistic error structure in simulations when the prediction abilities of various algorithms are compared. The combination of equal or superior prediction ability and the ease of use of the ACLS algorithms make the new ACLS methods the preferred algorithms to use for multivariate spectral calibrations.
Date: March 1, 2004
Creator: Van Benthem, Mark Hilary; Haaland, David Michael; Melgaard, David Kennett; Martin, Laura Elizabeth; Wehlburg, Christine Marie; Pell, Randy J. (The Dow Chemical Company, Midland, MI) et al.
Partner: UNT Libraries Government Documents Department
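The classical least squares (CLS) step that the hybrid CLS/PLS and augmented (ACLS) methods build on can be sketched with synthetic spectra; the ACLS augmentation and factor selection themselves are not reproduced here. CLS models measured spectra as a linear mixture, A ≈ C K, estimates the pure-component spectra K during calibration, and inverts the model to predict concentrations:

```python
import numpy as np

def cls_calibrate(C, A):
    """Classical least squares calibration: given known concentrations
    C (samples x components) and measured spectra A (samples x
    channels), estimate pure-component spectra K with A ~ C @ K."""
    return np.linalg.lstsq(C, A, rcond=None)[0]

def cls_predict(K, A_new):
    """Predict concentrations for new spectra by solving A_new ~ C @ K
    for C in the least-squares sense."""
    return np.linalg.lstsq(K.T, A_new.T, rcond=None)[0].T

# Synthetic demonstration: two components, fifty spectral channels.
rng = np.random.default_rng(0)
K_true = rng.random((2, 50))
C_cal = rng.random((12, 2))
A_cal = C_cal @ K_true                 # noiseless calibration spectra
K_est = cls_calibrate(C_cal, A_cal)
pred = cls_predict(K_est, np.array([[0.3, 0.7]]) @ K_true)
```

With noiseless synthetic data the known mixture is recovered exactly; the report's comparisons concern behavior under realistic, correlated spectral errors, which this sketch does not model.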

MatSeis developer's guide: version 1.0.

Description: This guide is intended to enable researchers working with seismic data, but lacking backgrounds in computer science and programming, to develop seismic algorithms using the MATLAB-based MatSeis software. Specifically, it presents a series of step-by-step instructions to write four specific functions of increasing complexity, while simultaneously explaining the notation, syntax, and general program design of the functions being written. The ultimate goal is that the user can use this guide as a jumping-off point from which he or she can write new functions that are compatible with and expand the capabilities of the current MatSeis software that has been developed as part of the Ground-based Nuclear Explosion Monitoring Research and Engineering (GNEMRE) program at Sandia National Laboratories.
Date: May 1, 2007
Creator: McConnell, Lane Christopher & Young, Christopher John
Partner: UNT Libraries Government Documents Department

Integrating software architectures for distributed simulations and simulation analysis communities.

Description: The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.
Date: October 1, 2005
Creator: Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J. & Hawley, Marilyn F.
Partner: UNT Libraries Government Documents Department

Modeling and simulation technology readiness levels.

Description: This report summarizes the results of an effort to establish a framework for assigning and communicating technology readiness levels (TRLs) for the modeling and simulation (ModSim) capabilities at Sandia National Laboratories. This effort was undertaken as a special assignment for the Weapon Simulation and Computing (WSC) program office led by Art Hale, and lasted from January to September 2006. This report summarizes the results, conclusions, and recommendations, and is intended to help guide the program office in their decisions about the future direction of this work. The work was broken out into several distinct phases, starting with establishing the scope and definition of the assignment. These are characterized in a set of key assertions provided in the body of this report. Fundamentally, the assignment involved establishing an intellectual framework for TRL assignments to Sandia's modeling and simulation capabilities, including the development and testing of a process to conduct the assignments. To that end, we proposed a methodology for both assigning and understanding the TRLs, and outlined some of the restrictions that need to be placed on this process and the expected use of the result. One of the first assumptions we overturned was the notion of a ''static'' TRL--rather we concluded that problem context was essential in any TRL assignment, and that leads to dynamic results (i.e., a ModSim tool's readiness level depends on how it is used, and by whom). While we leveraged the classic TRL results from NASA, DoD, and Sandia's NW program, we came up with a substantially revised version of the TRL definitions, maintaining consistency with the classic level definitions and the Predictive Capability Maturity Model (PCMM) approach. In fact, we substantially leveraged the foundation the PCMM team provided, and augmented that as needed. Given the modeling and simulation TRL definitions and our proposed assignment methodology, ...
Date: January 1, 2006
Creator: Clay, Robert L.; Shneider, Max S.; Marburger, S. J. & Trucano, Timothy Guy
Partner: UNT Libraries Government Documents Department

Modeling threat assessments of water supply systems using markov latent effects methodology.

Description: Recent amendments to the Safe Drinking Water Act emphasize efforts toward safeguarding our nation's water supplies against attack and contamination. Specifically, the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 established requirements for each community water system serving more than 3300 people to conduct an assessment of the vulnerability of its system to a terrorist attack or other intentional acts. Integral to evaluating system vulnerability is the threat assessment, which is the process by which the credibility of a threat is quantified. Unfortunately, full probabilistic assessment is generally not feasible, as there is insufficient experience and/or data to quantify the associated probabilities. For this reason, an alternative approach is proposed based on Markov Latent Effects (MLE) modeling, which provides a framework for quantifying imprecise subjective metrics through possibilistic or fuzzy mathematics. Here, an MLE model for water systems is developed and demonstrated to determine threat assessments for different scenarios identified by the assailant, asset, and means. Scenario assailants include terrorists, insiders, and vandals. Assets include a water treatment plant, water storage tank, node, pipeline, well, and a pump station. Means used in attacks include contamination (onsite chemicals, biological and chemical), explosives, and vandalism. Results demonstrated that the highest threats are vandalism events and that the least likely are those performed by a terrorist.
Date: December 1, 2006
Creator: Silva, Consuelo Juanita
Partner: UNT Libraries Government Documents Department
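The report's MLE equations are not reproduced in this summary. The sketch below shows only the generic building block such hierarchical possibilistic models share: combining subjective child metrics on [0, 1] into a parent score through a convex weighting, applied node by node up a decision tree. All scores and weights here are illustrative, not taken from the report.

```python
def aggregate(scores, weights):
    """Combine child metrics in [0, 1] into a parent score via a convex
    weighting -- the basic node operation of a hierarchical assessment."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))

# Illustrative scenario scoring over (assailant capability/intent,
# asset attractiveness, means accessibility), equal tree depth:
w = [0.4, 0.3, 0.3]
vandal    = aggregate([0.9, 0.6, 0.9], w)   # easy means, willing actor
terrorist = aggregate([0.1, 0.8, 0.2], w)   # high-value asset, rare actor
```

With these illustrative inputs the vandalism scenario scores highest, mirroring the qualitative ranking reported in the abstract.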

Molecular simulations of beta-amyloid protein near hydrated lipids (PECASE).

Description: We performed molecular dynamics simulations of beta-amyloid (Aβ) protein and Aβ fragment (31-42) in bulk water and near hydrated lipids to study the mechanism of neurotoxicity associated with the aggregation of the protein. We constructed full atomistic models using Cerius2 and ran simulations using LAMMPS. MD simulations with different conformations and positions of the protein fragment were performed. Thermodynamic properties were compared with previous literature and the results were analyzed. Longer simulations and data analyses based on the free energy profiles along the distance between the protein and the interface are ongoing.
Date: December 1, 2005
Creator: Thompson, Aidan Patrick; Han, Kunwoo (Texas A&M University, College Station, TX) & Ford, David M. (Texas A&M University, College Station, TX)
Partner: UNT Libraries Government Documents Department

Molecule-based approach for computing chemical-reaction rates in upper atmosphere hypersonic flows.

Description: This report summarizes the work completed during FY2009 for the LDRD project 09-1332 'Molecule-Based Approach for Computing Chemical-Reaction Rates in Upper-Atmosphere Hypersonic Flows'. The goal of this project was to apply a recently proposed approach for the Direct Simulation Monte Carlo (DSMC) method to calculate chemical-reaction rates for high-temperature atmospheric species. The new DSMC model reproduces measured equilibrium reaction rates without using any macroscopic reaction-rate information. Since it uses only molecular properties, the new model is inherently able to predict reaction rates for arbitrary nonequilibrium conditions. DSMC non-equilibrium reaction rates are compared to Park's phenomenological non-equilibrium reaction-rate model, the predominant model for hypersonic-flow-field calculations. For near-equilibrium conditions, Park's model is in good agreement with the DSMC-calculated reaction rates. For far-from-equilibrium conditions, corresponding to a typical shock layer, the difference between the two models can exceed 10 orders of magnitude. The DSMC predictions are also found to be in very good agreement with measured and calculated non-equilibrium reaction rates. Extensions of the model to reactions typically found in combustion flows and ionizing reactions are also found to be in very good agreement with available measurements, offering strong evidence that this is a viable and reliable technique to predict chemical reaction rates.
Date: August 1, 2009
Creator: Gallis, Michail A.; Bond, Ryan Bomar & Torczynski, John Robert
Partner: UNT Libraries Government Documents Department
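Park's phenomenological model, the comparison baseline named above, evaluates an equilibrium Arrhenius fit at a controlling temperature that blends the translational and vibrational temperatures. A minimal sketch of that evaluation follows; the blending exponent and the rate coefficients (loosely resembling a nitrogen-dissociation fit) are illustrative assumptions, not values from the report, and the DSMC molecular-level model itself is not shown.

```python
import math

def arrhenius(T, A, n, theta):
    """Equilibrium rate coefficient in modified Arrhenius form:
    k(T) = A * T**n * exp(-theta / T), theta = activation temperature."""
    return A * T**n * math.exp(-theta / T)

def park_rate(T_trans, T_vib, A, n, theta, q=0.5):
    """Park-style two-temperature rate: evaluate the equilibrium fit at
    a controlling temperature Ta = T_trans**q * T_vib**(1 - q)."""
    Ta = T_trans**q * T_vib**(1.0 - q)
    return arrhenius(Ta, A, n, theta)

# Illustrative coefficients (order-of-magnitude N2-dissociation-like):
A_, n_, theta_ = 7.0e21, -1.6, 113200.0
```

At thermal equilibrium (equal temperatures) the two evaluations coincide; for a cold vibrational bath behind a shock the Park rate falls far below the equilibrium value, which is the regime where the abstract reports order-of-magnitude disagreement with DSMC.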

Advances in radiation modeling in ALEGRA: a final report for LDRD-67120, efficient implicit multigroup radiation calculations.

Description: The original LDRD proposal was to use a nonlinear diffusion solver to compute estimates for the material temperature that could then be used in an Implicit Monte Carlo (IMC) calculation. At the end of the first year of the project, it was determined that this was not going to be effective, partially due to the concept, and partially due to the fact that the radiation diffusion package was not as efficient as it could be. The second, and final year, of the project focused on improving the robustness and computational efficiency of the radiation diffusion package in ALEGRA. To this end, several new multigroup diffusion methods have been developed and implemented in ALEGRA. While these methods have been implemented, their effectiveness of reducing overall simulation run time has not been fully tested. Additionally a comprehensive suite of verification problems has been developed for the diffusion package to ensure that it has been implemented correctly. This process took considerable time, but exposed significant bugs in both the previous and new diffusion packages, the linear solve packages, and even the NEVADA Framework's parser. In order to manage this large suite of problems, a new tool called Tampa has been developed. It is a general tool for automating the process of running and analyzing many simulations. Ryan McClarren, at the University of Michigan, has been developing a Spherical Harmonics capability for unstructured meshes. While still in the early phases of development, this promises to bridge the gap in accuracy between a full transport solution using IMC and the diffusion approximation.
Date: November 1, 2005
Creator: Mehlhorn, Thomas Alan; Kurecka, Christopher J. (University of Michigan, Ann Arbor, MI); McClarren, Ryan (University of Michigan, Ann Arbor, MI); Brunner, Thomas A. & Holloway, James Paul (University of Michigan, Ann Arbor, MI)
Partner: UNT Libraries Government Documents Department

ChemCell: a particle-based model of protein chemistry and diffusion in microbial cells.

Description: Prokaryotic single-cell microbes are the simplest of all self-sufficient living organisms. Yet microbes create and use much of the molecular machinery present in more complex organisms, and the macro-molecules in microbial cells interact in regulatory, metabolic, and signaling pathways that are prototypical of the reaction networks present in all cells. We have developed a simple simulation model of a prokaryotic cell that treats proteins, protein complexes, and other organic molecules as particles which diffuse via Brownian motion and react with nearby particles in accord with chemical rate equations. The code models protein motion and chemistry within an idealized cellular geometry. It has been used to simulate several simple reaction networks and compared to more idealized models which do not include spatial effects. In this report we describe an initial version of the simulation code that was developed with FY03 funding. We discuss the motivation for the model, highlight its underlying equations, and describe simulations of a 3-stage kinase cascade and a portion of the carbon fixation pathway in the Synechococcus microbe.
Date: December 1, 2003
Creator: Plimpton, Steven James & Slepoy, Alexander
Partner: UNT Libraries Government Documents Department
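The particle treatment described above, Brownian diffusion plus reaction between nearby particles, can be illustrated in a few lines. This is a toy analogue, not the ChemCell code: real simulations tie the reaction radius and per-step reaction probability to chemical rate constants, which this sketch omits.

```python
import numpy as np

def brownian_step(pos, D, dt, rng):
    """Diffuse particles: each coordinate takes a Gaussian step with
    variance 2*D*dt (free Brownian motion, diffusivity D)."""
    return pos + rng.normal(0.0, np.sqrt(2.0 * D * dt), size=pos.shape)

def react_pairs(a, b, radius):
    """Bimolecular A + B -> C: each A reacts with at most one unreacted
    B within `radius`. Returns surviving A, surviving B, and the number
    of C particles produced. (O(N^2) search for clarity, not speed.)"""
    used_b, keep_a, made = set(), [], 0
    for i, pa in enumerate(a):
        hit = next((j for j, pb in enumerate(b)
                    if j not in used_b and np.linalg.norm(pa - pb) < radius),
                   None)
        if hit is None:
            keep_a.append(i)
        else:
            used_b.add(hit)
            made += 1
    keep_b = [j for j in range(len(b)) if j not in used_b]
    return a[keep_a], b[keep_b], made
```

A simulation loop alternates `brownian_step` on every species with `react_pairs` over reactive pairs, which is the spatial behavior that distinguishes this model from well-mixed rate-equation treatments.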

An example uncertainty and sensitivity analysis at the Horonobe site for performance assessment calculations.

Description: Given pre-existing Groundwater Modeling System (GMS) models of the Horonobe Underground Research Laboratory (URL) at both the regional and site scales, this work performs an example uncertainty analysis for performance assessment (PA) applications. After a general overview of uncertainty and sensitivity analysis techniques, the existing GMS site-scale model is converted to a PA model of the steady-state conditions expected after URL closure. This is done to examine the impact of uncertainty in site-specific data in conjunction with conceptual model uncertainty regarding the location of the Oomagari Fault. In addition, a quantitative analysis of the ratio of dispersive to advective forces, the F-ratio, is performed for stochastic realizations of each conceptual model. All analyses indicate that accurate characterization of the Oomagari Fault with respect to both location and hydraulic conductivity is critical to PA calculations. This work defines and outlines typical uncertainty and sensitivity analysis procedures and demonstrates them with example PA calculations relevant to the Horonobe URL.
Date: August 1, 2004
Creator: James, Scott Carlton
Partner: UNT Libraries Government Documents Department
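The abstract does not give the report's exact F-ratio definition; a common form for a dispersive-to-advective ratio is the inverse Peclet number, and the sketch below assumes that form. The Monte Carlo loop over log-normal hydraulic conductivity mirrors the "stochastic realizations" idea, with all parameter values purely illustrative.

```python
import math, random

def dispersive_to_advective(D, v, L):
    """Assumed F-ratio form: dispersive over advective transport, i.e.
    the inverse Peclet number D / (v * L) for dispersion coefficient D,
    seepage velocity v, and characteristic path length L."""
    return D / (v * L)

def realizations(n, seed=42):
    """One F-ratio per stochastic realization of hydraulic conductivity
    K (log-normal). Gradient, porosity, D, and L are illustrative."""
    rng = random.Random(seed)
    gradient, porosity = 0.01, 0.3      # hydraulic gradient, porosity
    D, L = 1.0e-9, 100.0                # dispersion [m^2/s], path [m]
    out = []
    for _ in range(n):
        K = math.exp(rng.gauss(math.log(1.0e-6), 1.0))  # K [m/s]
        v = K * gradient / porosity                     # seepage velocity
        out.append(dispersive_to_advective(D, v, L))
    return out
```

Comparing the distribution of such ratios between conceptual models (with and without a displaced fault) is the style of analysis the abstract describes.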

Substructured multibody molecular dynamics.

Description: We have enhanced our parallel molecular dynamics (MD) simulation software LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator, lammps.sandia.gov) to include many new features for accelerated simulation including articulated rigid body dynamics via coupling to the Rensselaer Polytechnic Institute code POEMS (Parallelizable Open-source Efficient Multibody Software). We use new features of the LAMMPS software package to investigate rhodopsin photoisomerization, and water model surface tension and capillary waves at the vapor-liquid interface. Finally, we motivate the recipes of MD for practitioners and researchers in numerical analysis and computational mechanics.
Date: November 1, 2006
Creator: Grest, Gary Stephen; Stevens, Mark Jackson; Plimpton, Steven James; Woolf, Thomas B. (Johns Hopkins University, Baltimore, MD); Lehoucq, Richard B.; Crozier, Paul Stewart et al.
Partner: UNT Libraries Government Documents Department

Pamgen, a library for parallel generation of simple finite element meshes.

Description: Generating finite-element meshes is a serious bottleneck for large parallel simulations. When mesh generation is limited to serial machines and element counts approach a billion, this bottleneck becomes a roadblock. Pamgen is a parallel mesh generation library that allows on-the-fly scalable generation of hexahedral and quadrilateral finite element meshes for several simple geometries. It has been used to generate more than 1.1 billion elements on 17,576 processors. Pamgen generates an unstructured finite element mesh on each processor at the start of a simulation. The mesh is specified by commands passed to the library as a 'C'-programming-language string. The resulting mesh geometry, topology, and communication information can then be queried through an API. Pamgen allows specification of boundary condition application regions using sidesets (element faces) and nodesets (collections of nodes). It supports several simple geometry types. It has multiple alternatives for mesh grading. It has several alternatives for the initial domain decomposition. Pamgen makes it easy to change details of the finite element mesh and is very useful for performance studies and scoping calculations.
Date: April 1, 2008
Creator: Foucar, James G.; Drake, Richard Roy; Hensinger, David M. & Gardiner, Thomas Anthony
Partner: UNT Libraries Government Documents Department
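The key idea in the Pamgen abstract above is that each processor generates only its own piece of a global mesh, so no single process ever holds the full element list. A toy sketch of that idea (purely illustrative; this is not the Pamgen API, and the decomposition shown is a simple 1-D row split rather than Pamgen's actual alternatives):

```python
def local_quad_mesh(nx, ny, rank, num_ranks):
    """Generate only this rank's strip of a global nx-by-ny
    quadrilateral mesh, identified by global node ids."""
    # 1-D decomposition: divide the ny rows of elements among ranks.
    def rows_for(r):
        return ny // num_ranks + (1 if r < ny % num_ranks else 0)

    row0 = sum(rows_for(r) for r in range(rank))
    elements = []
    for j in range(row0, row0 + rows_for(rank)):
        for i in range(nx):
            # Global node ids of the quad's four corners (counter-clockwise).
            n0 = j * (nx + 1) + i
            elements.append((n0, n0 + 1, n0 + nx + 2, n0 + nx + 1))
    return elements

# Across all ranks, every element of the 4x6 global mesh appears exactly once.
parts = [local_quad_mesh(4, 6, r, 3) for r in range(3)]
total = sum(len(p) for p in parts)
print(total)  # 24 elements == 4 * 6
```

Because each rank derives its patch directly from the global description (in Pamgen's case, the command string), the generation cost and memory footprint stay per-processor even as the global element count grows into the billions.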

The Sandia GeoModel : theory and user's guide.

Description: The mathematical and physical foundations and domain of applicability of Sandia's GeoModel are presented along with descriptions of the source code and user instructions. The model is designed to be used in conventional finite element architectures, and (to date) it has been installed in five host codes without requiring customization of the model subroutines for any of these different installations. Although developed for application to geological materials, the GeoModel actually applies to a much broader class of materials, including rock-like engineered materials (such as concretes and ceramics) and even metals when simplified parameters are used. Nonlinear elasticity is supported through an empirically fitted function that has been found to be well-suited to a wide variety of materials. Fundamentally, the GeoModel is a generalized plasticity model. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response, including microcrack growth and pore collapse. The GeoModel supports deformation-induced anisotropy in a limited capacity through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). Aside from kinematic hardening, however, the governing equations are otherwise isotropic. The GeoModel is a genuine unification and generalization of simpler models. It can employ up to 40 material input and control parameters in the rare case when all features are used. Simpler idealizations (such as linear elasticity, or von Mises yield, or Mohr-Coulomb failure) can be replicated by simply using fewer parameters. For high-strain-rate applications, the GeoModel supports rate dependence through an overstress model.
Date: August 1, 2004
Creator: Brannon, Rebecca Moss & Fossum, Arlo Frederick
Partner: UNT Libraries Government Documents Department
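The abstract above notes that simpler idealizations such as von Mises yield are special cases of the GeoModel's generalized yield surface. As a hedged, minimal sketch of just that special case (the GeoModel's actual yield function is far richer, with pressure dependence, cap behavior, and kinematic hardening), a von Mises yield check looks like:

```python
def von_mises_yield(stress, sigma_y):
    """Von Mises yield function: f < 0 is elastic, f >= 0 means the
    yield surface is reached. `stress` is a symmetric 3x3 Cauchy stress
    (units assumed consistent with sigma_y)."""
    # Mean (hydrostatic) stress and deviatoric part s = stress - p*I.
    p = (stress[0][0] + stress[1][1] + stress[2][2]) / 3.0
    s = [[stress[i][j] - (p if i == j else 0.0) for j in range(3)]
         for i in range(3)]
    # Second deviatoric invariant J2 = (1/2) s:s.
    j2 = 0.5 * sum(s[i][j] ** 2 for i in range(3) for j in range(3))
    return (3.0 * j2) ** 0.5 - sigma_y

# Uniaxial tension at exactly the yield strength sits on the surface (f ~ 0).
uniaxial = [[250.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
f = von_mises_yield(uniaxial, sigma_y=250.0)
print(f)
```

Note how the hydrostatic part `p` drops out entirely: von Mises yield is pressure-insensitive, which is precisely what makes it inadequate for geological materials and motivates the pressure-dependent generalizations the GeoModel provides.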

Evaluation of risk from acts of terrorism: the adversary/defender model using belief and fuzzy sets.

Description: Risk from an act of terrorism is a combination of the likelihood of an attack, the likelihood of success of the attack, and the consequences of the attack. The considerable epistemic uncertainty in each of these three factors can be addressed using the belief/plausibility measure of uncertainty from the Dempster-Shafer theory of evidence. The adversary determines the likelihood of the attack. The success of the attack and the consequences of the attack are determined by the security system and mitigation measures put in place by the defender. This report documents a process for evaluating risk of terrorist acts using an adversary/defender model with belief/plausibility as the measure of uncertainty. In addition, the adversary model is a linguistic model that applies belief/plausibility to fuzzy sets used in an approximate reasoning rule base.
Date: September 1, 2006
Creator: Darby, John L.
Partner: UNT Libraries Government Documents Department
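The belief/plausibility measures from Dempster-Shafer evidence theory mentioned in the abstract above are straightforward to compute from a basic probability assignment. A minimal sketch (the frame and mass values below are illustrative, not taken from the report):

```python
def belief_plausibility(masses, query):
    """Dempster-Shafer belief and plausibility of `query`.
    `masses` maps focal sets (frozensets) to basic probability
    assignments summing to 1. Bel sums mass committed entirely inside
    the query; Pl sums mass merely consistent with it, so the interval
    [Bel, Pl] brackets the unknown probability."""
    query = frozenset(query)
    bel = sum(m for focal, m in masses.items() if focal <= query)
    pl = sum(m for focal, m in masses.items() if focal & query)
    return bel, pl

# Toy attack-likelihood frame {low, medium, high} with partly ambiguous
# evidence, the kind of epistemic uncertainty the report addresses.
m = {frozenset({"high"}): 0.4,
     frozenset({"medium", "high"}): 0.3,         # cannot decide between the two
     frozenset({"low", "medium", "high"}): 0.3}  # total ignorance
bel, pl = belief_plausibility(m, {"high"})
print(bel, pl)  # 0.4 1.0
```

The gap between Bel and Pl (here 0.4 to 1.0) quantifies how much the evidence leaves undetermined, which is exactly what a single point probability cannot express for sparse terrorism data.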

EMPHASIS/Nevada UTDEM user guide : version 1.0.

Description: The Unstructured Time-Domain ElectroMagnetics (UTDEM) portion of the EMPHASIS suite solves Maxwell's equations using finite-element techniques on unstructured meshes. This document provides user-specific information to facilitate the use of the code for applications of interest.
Date: March 1, 2005
Creator: Turner, C. David; Seidel, David Bruce & Pasik, Michael Francis
Partner: UNT Libraries Government Documents Department