
Description: The objectives of the VITALINKS tabletop exercise are to: Raise awareness of infrastructure interdependency issues; Identify and focus on the most important vulnerabilities and restoration priorities resulting from infrastructure disruptions; Examine the resources (people and equipment) required to sustain systems under emergency conditions; Identify and highlight roles, responsibilities, and authorities (including trans-border issues); and Continue to foster a more effective interface among public and private sector service providers and public officials in developing and implementing critical infrastructure protection, mitigation, response, and recovery options.
Date: March 11, 2002
Creator: Baldwin, T. B.
Partner: UNT Libraries Government Documents Department

Verification of VENTSAR XL - a spreadsheet version of VENTSAR

Description: VENTSAR is a computer model that analyzes flow patterns of pollutants on or near buildings. Plume rise may be considered. VENTSAR has been modified to allow for execution on a Macintosh using Microsoft Excel. This new version is called VENTSAR XL. All methodologies are identical to those within VENTSAR. This report provides verification of all models within VENTSAR XL. Strict comparisons were made with VENTSAR to ensure consistency between the two models.
Date: May 1, 1996
Creator: Simpkins, A.A.
Partner: UNT Libraries Government Documents Department

Monte Carlo stratified source-sampling

Description: In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "eigenvalue of the world" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration and removed by a version of stratified source-sampling. The original test problem was treated by a special code designed specifically for that purpose. Recently ANL started work on a method for dealing with more realistic eigenvalue-of-the-world configurations, and has been incorporating this method into VIM. The original method has been modified to take into account real-world statistical noise sources not included in the model problem. This paper constitutes a status report on work still in progress.
Date: September 1, 1997
Creator: Blomquist, R.N. & Gelbard, E.M.
Partner: UNT Libraries Government Documents Department

Visualization of Information Spaces with VxInsight

Description: VxInsight provides a visual mechanism for browsing, exploring and retrieving information from a database. The graphical display conveys information about the relationship between objects in several ways and on multiple scales. In this way, individual objects are always observed within a larger context. For example, consider a database consisting of a set of scientific papers. Imagine that the papers have been organized in a two-dimensional geometry so that related papers are located close to each other. Now construct a landscape where the altitude reflects the local density of papers. Papers on physics will form a mountain range, and a different range will stand over the biological papers. In between will be research reports from biophysics and other bridging disciplines. Now, imagine exploring these mountains. If we zoom in closer, the physics mountains will resolve into a set of sub-disciplines. Eventually, by zooming in far enough, the individual papers become visible. By pointing and clicking you can learn more about papers of interest or retrieve their full text. Although physical proximity conveys a great deal of information about the relationship between documents, you can also see which papers reference which others, by drawing lines between the citing and cited papers. For even more information, you can choose to highlight papers by a particular researcher or a particular institution, or show the accumulation of papers through time, watching some disciplines explode and others stagnate. VxInsight is a general-purpose tool that enables this kind of interaction with a wide variety of relational data: documents, patents, web pages, and financial transactions are just a few examples. The tool allows users to interactively browse, explore and retrieve information from the database in an intuitive way.
Date: December 1, 2000
Creator: Wylie, B.N.; Boyack, K.W.; Davidson, G.S. & Johnson, D.K.
Partner: UNT Libraries Government Documents Department
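The density-landscape idea in the abstract above can be illustrated with a short sketch. This is not VxInsight's actual implementation, only a minimal Python analogue: given 2-D coordinates for documents, bin them into a grid so that cell counts play the role of "altitude", with peaks where related documents cluster.

```python
import numpy as np

def density_landscape(xy, bins=64):
    """Bin 2-D document coordinates into a grid; cell count = local density,
    i.e. the 'altitude' of the landscape at that location."""
    hist, xedges, yedges = np.histogram2d(xy[:, 0], xy[:, 1], bins=bins)
    return hist

# Toy layout: two clusters standing in for 'physics' and 'biology' papers.
rng = np.random.default_rng(0)
physics = rng.normal(loc=(-2.0, 0.0), scale=0.5, size=(500, 2))
biology = rng.normal(loc=(2.0, 0.0), scale=0.5, size=(500, 2))
surface = density_landscape(np.vstack([physics, biology]))
# 'surface' now has two mountains, one over each cluster of papers.
```

Rendering the surface (and zooming until individual points resolve) would be a separate visualization step; the sketch only covers the layout-to-landscape idea.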

Taming the Viper: Software Upgrade for VFAUser and Viper

Description: This report describes the procedure and properties of the software upgrade for the Vibration Performance Recorder. The upgrade will check the 20 memory cards for proper read/write operation. The upgrade was successfully installed and uploaded into the Viper and the field laptop. The memory checking routine must run overnight to complete the test, although the laptop need only be connected to the Viper unit until the downloading routine is finished. The routine has limited ability to recognize incomplete or corrupt header and footer files. The routine requires 400 Megabytes of free hard disk space. There is one minor technical flaw detailed in the conclusion.
Date: August 8, 2000
Partner: UNT Libraries Government Documents Department

VALDRIFT 1.0: A valley atmospheric dispersion model with deposition

Description: VALDRIFT version 1.0 is an atmospheric transport and diffusion model for use in well-defined mountain valleys. It is designed to determine the extent of drift from aerial pesticide spraying activities, but can also be applied to estimate the transport and diffusion of various air pollutants in valleys. The model is phenomenological -- that is, the dominant meteorological processes governing the behavior of the valley atmosphere are formulated explicitly in the model, albeit in a highly parameterized fashion. The key meteorological processes treated are: (1) nonsteady and nonhomogeneous along-valley winds and turbulent diffusivities, (2) convective boundary layer growth, (3) inversion descent, (4) nocturnal temperature inversion breakup, and (5) subsidence. The model is applicable under relatively cloud-free, undisturbed synoptic conditions and is configured to operate through one diurnal cycle for a single valley. The inputs required are the valley topographical characteristics, pesticide release rate as a function of time and space, along-valley wind speed as a function of time and space, temperature inversion characteristics at sunrise, and sensible heat flux as a function of time following sunrise. Default values are provided for certain inputs in the absence of detailed observations. The outputs are three-dimensional air concentration and ground-level deposition fields as a function of time.
Date: May 1, 1995
Creator: Allwine, K.J.; Bian, X. & Whiteman, C.D.
Partner: UNT Libraries Government Documents Department

Parallel contingency statistics with Titan.

Description: This report summarizes existing statistical engines in VTK/Titan and presents the recently parallelized contingency statistics engine. It is a sequel to [PT08] and [BPRT09], which studied the parallel descriptive, correlative, multi-correlative, and principal component analysis engines. The ease of use of this new parallel engine is illustrated by means of C++ code snippets. Furthermore, this report justifies the design of these engines with parallel scalability in mind; however, the very nature of contingency tables prevents this new engine from exhibiting the optimal parallel speed-up that the aforementioned engines do. This report therefore discusses the design trade-offs we made and studies performance with up to 200 processors.
Date: September 1, 2009
Creator: Thompson, David C. & Pebay, Philippe Pierre
Partner: UNT Libraries Government Documents Department
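The scalability issue mentioned in the abstract above can be illustrated with a small sketch. This is a hypothetical Python analogue, not Titan's C++ implementation: each "rank" tabulates joint category counts for its slice of the data, and partial tables merge by addition, but the merged table can grow as large as the union of all key sets, which is what limits speed-up for contingency statistics.

```python
from collections import Counter

def contingency(pairs):
    """Count joint occurrences of (x, y) category pairs."""
    return Counter(pairs)

def merge(tables):
    """Partial tables from different ranks combine by simple addition.
    Unlike moment-based engines, the merged result can be as large as
    the union of every rank's key set, so communication volume grows
    with the number of distinct category pairs."""
    total = Counter()
    for t in tables:
        total.update(t)
    return total

# Two 'ranks' each tabulate a slice of the data.
data = [("a", 0), ("a", 1), ("b", 0), ("a", 0), ("b", 1), ("b", 1)]
t0 = contingency(data[:3])
t1 = contingency(data[3:])
table = merge([t0, t1])
```

The same merge-by-addition pattern works for any number of ranks, since Counter addition is associative.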

Scalable k-means statistics with Titan.

Description: This report summarizes existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine.
Date: November 1, 2009
Creator: Thompson, David C.; Bennett, Janine C. & Pebay, Philippe Pierre
Partner: UNT Libraries Government Documents Department
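For readers unfamiliar with the algorithm behind the engine described above, here is a minimal serial k-means (Lloyd's iteration) sketch in Python. It is an illustrative textbook version, not the Titan engine: assign each point to its nearest center, then move each center to the mean of its points, and repeat.

```python
import numpy as np

def kmeans(data, k, iters=20):
    """Plain Lloyd's iteration: assignment step, then update step."""
    # Deterministic spread-out initialization (a simplifying assumption;
    # production codes use better seeding such as k-means++).
    centers = data[:: max(1, len(data) // k)][:k].copy()
    for _ in range(iters):
        # Assignment: index of the nearest center for every point.
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update: each center moves to the mean of its assigned points;
        # an empty cluster keeps its previous center.
        centers = np.array([data[labels == j].mean(axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])
    return centers, labels

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
                 rng.normal(10.0, 0.1, (50, 2))])
centers, labels = kmeans(pts, k=2)
```

The parallel version the report describes distributes the assignment step across ranks and reduces per-cluster sums globally; the serial core above is the piece each rank runs locally.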

A Contract Based System For Large Data Visualization

Description: VisIt is a richly featured visualization tool that is used to visualize some of the largest simulations ever run. The scale of these simulations requires that optimizations are incorporated into every operation VisIt performs. But the set of applicable optimizations that VisIt can perform is dependent on the types of operations being done. Complicating the issue, VisIt has a plugin capability that allows new, unforeseen components to be added, making it even harder to determine which optimizations can be applied. We introduce the concept of a contract to the standard data flow network design. This contract enables each component of the data flow network to modify the set of optimizations used. In addition, the contract allows for new components to be accommodated gracefully within VisIt's data flow network system.
Date: April 12, 2005
Creator: Childs, H R; Brugger, E S; Bonnell, K S; Meredith, J S; Miller, M C; Whitlock, B J et al.
Partner: UNT Libraries Government Documents Department

The office of real soon now, western pilot (projectors in offices project)

Description: The ASCI VIEWS program at Lawrence Livermore National Laboratory (LLNL) has been investigating a variety of display technologies, motivated by the large size, high resolution and complexity of some data sets that ASCI users explore and analyze. The purpose of this report is to describe the design, deployment and initial user reactions to one display system. The inspiration for the system comes from a similar experimental deployment at the University of North Carolina at Chapel Hill (UNC), one of the VIEWS program's academic partners. The display system features the use of multiple projectors in individual offices creating oversized displays to replace standard monitors. Some discussion on alternative multi-projector display systems provides context for this description. The VIEWS program began exploring the possibilities of alternative displays by building large, tiled displays and supporting the development of extremely high-pixel-density LCD panels [ASCI]. The same considerations have led to partnerships with several groups of researchers working on various aspects of multi-projector display systems including groups at UNC, Stanford University, Princeton University, the University of Utah, Argonne National Lab, and the two NSF supercomputer centers, NCSA and SDSC. This report is divided into eight sections. The following section describes the background for the development of this multi-projector display system, including brief descriptions of other large-format and high-resolution display projects, and provides some LLNL motivations for exploring further. Section III covers the evolution of the design intended specifically for LLNL and explains some of the factors that influenced the decisions made. Section IV provides a detailed description of the two installations, including materials and resources involved. After a few weeks of experience with the systems, the users were interviewed and their reactions and comments are summarized in Section V. Conclusions, recommendations, and a short list of references complete this report.
Date: March 11, 2002
Creator: Uselton, S L
Partner: UNT Libraries Government Documents Department

Visual Sample Plan Version 2.0 User's Guide

Description: This user's guide describes Visual Sample Plan (VSP) Version 2.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to environmental decisions have the required confidence and performance. VSP Version 1.0 provides sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (95, 98, Millennium Edition, 2000, and Windows NT). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to any two-dimensional geographical population to be sampled (e.g., surface soil, a defined layer of subsurface soil, building surfaces, water bodies, and other similar applications) for studies of environmental quality.
Date: September 23, 2002
Creator: Hassig, Nancy L.; Wilson, John E.; Gilbert, Richard O.; Carlson, Deborah K.; O'Brien, Robert F.; Pulsipher, Brent A. et al.
Partner: UNT Libraries Government Documents Department
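The kind of sample-size equation the abstract above refers to can be sketched with a classic example. This is the standard one-sided, one-sample z-test formula from introductory texts, not VSP's actual algorithm: it gives the number of measurements needed to detect a mean shift of `delta` at false-positive rate `alpha` and false-negative rate `beta`.

```python
import math
from statistics import NormalDist

def n_for_one_sample_mean(sigma, delta, alpha=0.05, beta=0.10):
    """Textbook one-sided z-test sample size:
    n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2,
    rounded up since sample sizes are whole numbers.
    (Illustrative only; VSP implements test-specific equations.)"""
    z = NormalDist().inv_cdf
    n = ((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2
    return math.ceil(n)

# Detect a shift of 1 unit when the measurement standard deviation is 2.
n = n_for_one_sample_mean(sigma=2.0, delta=1.0)
```

As the formula shows, halving the detectable shift `delta` quadruples the required sample size, which is why tools like VSP let users trade off confidence against sampling cost interactively.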

Experimental Investigation and High Resolution Simulator of In-Situ Combustion Processes

Description: Accurate simulation of in-situ combustion (ISC) processes is computationally very challenging because the spatial and temporal scales over which the combustion process takes place are very small. In this, the eleventh report, we describe the development of a virtual kinetic cell (VKC) that aids the study of the interaction between kinetics and phase behavior. The VKC also provides an excellent tool for developing and testing specialized solvers for the stiff kinetics encountered in ISC processes.
Date: July 1, 2006
Creator: Gerritsen, Margot & Kovscek, Anthony R.
Partner: UNT Libraries Government Documents Department

The Vibration Virtual Environment for Test Optimization (VETO)

Description: A new test simulation tool is being developed to support vibration test design and to evaluate the overall testability of a component or system. This environment, the Vibration Virtual Environment for Test Optimization (VETO), is utilized to optimally place vibration control and response transducers and to investigate the selection of test parameters needed in the design and performance of a vibration experiment. The engineer can investigate the effects of different control parameters prior to performing an actual vibration test. Additionally, new and existing fixture designs can be evaluated through the development of analytical or experimental models that can be integrated into the simulation environment. This test design environment also provides the engineer with the ability to combine analytically or experimentally derived models of the vibration test hardware, instrumentation and equipment into a simulation model that represents the vibration testing capability. Hardware-in-the-loop simulations can be conducted using this model to examine multiple facets of the test design. This paper presents a new tool that will assist test engineers in maximizing the value of vibration tests through the use of hardware-in-the-loop simulations.
Date: October 1, 1996
Creator: Klenke, S. E.; Lauffer, J. P.; Gregory, D. L. & Togami, T. C.
Partner: UNT Libraries Government Documents Department

Comparison of simplified and standard spherical harmonics in the variational nodal method

Description: Recently, the variational nodal method has been extended through the use of the Rumyantsev interface conditions to solve the spherical harmonics (P_N) equations of arbitrary odd order. In this paper, the authors generalize earlier x-y geometry work to fit the corresponding simplified spherical harmonics (SP_N) equations into the variational nodal framework. Both P_N and SP_N approximations are implemented in the multigroup VARIANT code at Argonne National Laboratory in two- and three-dimensional Cartesian and hexagonal geometries. The availability of angular approximations through P_5 and SP_5, and of flat, linear, and quadratic spatial interface approximations, allows investigation of both spatial truncation and angular approximation errors. Moreover, the SP_3 approximation offers a cost-effective method for reducing transport errors.
Date: December 31, 1995
Creator: Lewis, E.E. & Palmiotti, G.
Partner: UNT Libraries Government Documents Department

Production properties of jets in Z boson events at CDF

Description: We present a study of the production properties of hadronic jets in Z boson events from 1.8 TeV p-pbar collisions using 106 pb^-1 of Run 1A and 1B CDF data. We compare distributions of several kinematic variables in the data to leading-order QCD predictions generated using the VECBOS Monte Carlo program.
Date: August 1, 1996
Creator: Dittmann, J.R.
Partner: UNT Libraries Government Documents Department

Using High-Speed WANs and Network Data Caches to Enable Remote and Distributed Visualization

Description: Visapult is a prototype application and framework for remote visualization of large scientific datasets. We approach the technical challenges of tera-scale visualization with a unique architecture that employs high-speed WANs and network data caches for data staging and transmission. This architecture allows for the use of available cache and compute resources at arbitrary locations on the network. High data throughput rates and network utilization are achieved by parallelizing I/O at each stage in the application, and by pipelining the visualization process. On the desktop, the graphics interactivity is effectively decoupled from the latency inherent in network applications. We present a detailed performance analysis of the application, and improvements resulting from field-test analysis conducted as part of the DOE Combustion Corridor project.
Date: April 18, 2000
Creator: Bethel, Wes; Lau, Stephen; Tierney, Brian; Lee, Jason & Gunter, Dan
Partner: UNT Libraries Government Documents Department
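The pipelining idea in the abstract above can be shown with a toy sketch. This is a hypothetical Python analogue of the pattern, not Visapult's architecture: one stage "stages" the next chunk while the other "renders" the previous one, with a small queue decoupling the two so their work overlaps.

```python
import threading
import queue

def pipeline(chunks):
    """Two-stage pipeline: stage 1 processes chunk i+1 while stage 2
    processes chunk i. The stage bodies below are stand-ins, not real
    network or rendering code."""
    q = queue.Queue(maxsize=2)   # small buffer decouples the stages
    out = []

    def stage_one():             # stand-in for staging data off the network
        for c in chunks:
            q.put(c * 2)
        q.put(None)              # sentinel: no more data

    def stage_two():             # stand-in for the rendering stage
        while (c := q.get()) is not None:
            out.append(c + 1)

    t1 = threading.Thread(target=stage_one)
    t2 = threading.Thread(target=stage_two)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return out

result = pipeline([1, 2, 3])     # each chunk passes through both stages in order
```

The bounded queue is the key design choice: it lets a fast producer run ahead by a little without unbounded memory growth, which is what keeps desktop interactivity decoupled from network latency.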

Vectors a Fortran 90 module for 3-dimensional vector and dyadic arithmetic

Description: A major advance contained in the new Fortran 90 language standard is the ability to define new data types and the operators associated with them. Writing computer code to implement computations with real and complex three-dimensional vectors and dyadics is greatly simplified if the equations can be implemented directly, without the need to code the vector arithmetic explicitly. The Fortran 90 module described here defines new data types for real and complex three-dimensional vectors and dyadics, along with the common operations needed to work with these objects. Routines to allow convenient initialization and output of the new types are also included. In keeping with the philosophy of data abstraction, the details of the implementation of the data types are kept private, and the functions and operators are made generic to simplify the combining of real, complex, single-, and double-precision vectors and dyadics.
Date: February 1, 1998
Creator: Brock, B.C.
Partner: UNT Libraries Government Documents Department
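The operator-overloading technique the abstract above describes can be illustrated outside Fortran. Below is a minimal Python analogue (not the module's actual interface): a vector type whose overloaded operators let vector equations be written as they appear on paper, with a dyadic (outer) product included.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vec3:
    """3-D vector with overloaded arithmetic, analogous to a derived
    type whose operators are defined in a Fortran 90 module."""
    x: float
    y: float
    z: float

    def __add__(self, o):
        return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)

    def __mul__(self, s):
        # Scaling by a scalar.
        return Vec3(self.x * s, self.y * s, self.z * s)

    def dot(self, o):
        return self.x * o.x + self.y * o.y + self.z * o.z

    def outer(self, o):
        """Dyadic (outer) product, returned as a 3x3 nested tuple."""
        a, b = (self.x, self.y, self.z), (o.x, o.y, o.z)
        return tuple(tuple(ai * bj for bj in b) for ai in a)

u = Vec3(1.0, 0.0, 0.0)
v = Vec3(0.0, 2.0, 0.0)
w = u + v * 0.5   # equations read as written, no explicit component loops
```

Hiding the component fields and dispatching on operand type, as the module does with private implementations and generic interfaces, is the same data-abstraction idea the dunder methods provide here.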

Need for higher order polynomial basis for polynomial nodal methods employed in LWR calculations

Description: The paper evaluates the accuracy and efficiency of sixth order polynomial solutions and the use of one radial node per core assembly for pressurized water reactor (PWR) core power distributions and reactivities. The computer code VARIANT was modified to calculate sixth order polynomial solutions for a hot zero power benchmark problem in which a control assembly along a core axis is assumed to be out of the core. Results are presented for the VARIANT, DIF3D-NODAL, and DIF3D-finite difference codes. The VARIANT results indicate that second order expansion of the within-node source and linear representation of the node surface currents are adequate for this problem. The results also demonstrate the improvement in the VARIANT solution when the order of the polynomial expansion of the within-node flux is increased from fourth to sixth order. There is a substantial saving in computational time for using one radial node per assembly with the sixth order expansion compared to using four or more nodes per assembly and fourth order polynomial solutions.
Date: August 1, 1997
Creator: Taiwo, T.A. & Palmiotti, G.
Partner: UNT Libraries Government Documents Department

An overview of EXTOOL: An analysis tool for V-TOUGH and NUFT

Description: Several post-processors have been used in connection with V-TOUGH. Initially, a sequence of utilities was used to extract and plot V-TOUGH information. This changed in 1991 when a new post-processor, EXTOOL, was developed. Currently, EXTOOL is the main post-processor for the modeling codes V-TOUGH and NUFT. In the following sections, a history of V-TOUGH post-processing is discussed along with an overview of EXTOOL. This overview describes some of EXTOOL's capabilities and suggests reasons for using this code instead of another post-processor. More detailed information on EXTOOL can be found in the EXTOOL User's Manual and the EXTOOL Programmer's Guide. Both manuals are drafts and can be requested by sending email to daveler2@llnl.gov.
Date: August 1, 1995
Creator: Daveler, S.
Partner: UNT Libraries Government Documents Department