Search Results

Experimental Investigation and High Resolution Simulator of In-Situ Combustion Processes

Description: Accurate simulation of in-situ combustion processes is computationally very challenging because the spatial and temporal scales over which the combustion process takes place are very small. In this, the eleventh report, we describe the development of a virtual kinetic cell (VKC) that aids the study of the interaction between kinetics and phase behavior. The VKC also provides an excellent tool for developing and testing specialized solvers for the stiff kinetics encountered in ISC processes.
Date: July 1, 2006
Creator: Gerritsen, Margot & Kovscek, Anthony R.
Partner: UNT Libraries Government Documents Department

A Contract Based System For Large Data Visualization

Description: VisIt is a richly featured visualization tool that is used to visualize some of the largest simulations ever run. The scale of these simulations requires that optimizations be incorporated into every operation VisIt performs. But the set of applicable optimizations that VisIt can perform depends on the types of operations being done. Complicating the issue, VisIt has a plugin capability that allows new, unforeseen components to be added, making it even harder to determine which optimizations can be applied. We introduce the concept of a contract to the standard data flow network design. This contract enables each component of the data flow network to modify the set of optimizations used. In addition, the contract allows for new components to be accommodated gracefully within VisIt's data flow network system. (A minimal sketch of the contract idea follows this entry.)
Date: April 12, 2005
Creator: Childs, H R; Brugger, E S; Bonnell, K S; Meredith, J S; Miller, M C; Whitlock, B J et al.
Partner: UNT Libraries Government Documents Department
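
The abstract describes the contract idea only at a high level; the following is a minimal, hypothetical C++ sketch of how a contract object might propagate through a data flow network so that each component can switch optimizations on or off before any data is read. The Contract fields, the Filter interface, and the pipeline walk are illustrative assumptions, not VisIt's actual API.

```cpp
#include <iostream>
#include <memory>
#include <vector>

// Hypothetical contract: each field gates an optimization the executive may use.
struct Contract {
    bool canDoStreaming    = true;   // process data piecewise to save memory
    bool needGhostZones    = false;  // some filters require ghost data
    bool canUseSpatialMeta = true;   // cull domains using spatial extents
};

// Hypothetical filter interface: before execution, every component gets a
// chance to restrict (or relax) the contract as it passes from sink to source.
class Filter {
public:
    virtual ~Filter() = default;
    virtual const char* name() const = 0;
    virtual void modifyContract(Contract&) const {}  // default: no changes
};

class SliceFilter : public Filter {
public:
    const char* name() const override { return "Slice"; }
    void modifyContract(Contract&) const override {
        // A slice leaves spatial-metadata culling and streaming enabled.
    }
};

class ExternalSurfaceFilter : public Filter {
public:
    const char* name() const override { return "ExternalSurface"; }
    void modifyContract(Contract& c) const override {
        c.needGhostZones = true;    // needs ghost zones to remove internal faces
        c.canDoStreaming = false;   // assumption: whole domains must be resident
    }
};

int main() {
    std::vector<std::unique_ptr<Filter>> pipeline;
    pipeline.emplace_back(std::make_unique<SliceFilter>());
    pipeline.emplace_back(std::make_unique<ExternalSurfaceFilter>());

    // Walk the network from the sink back toward the source, letting each
    // component adjust the contract before any data is read.
    Contract contract;
    for (auto it = pipeline.rbegin(); it != pipeline.rend(); ++it) {
        (*it)->modifyContract(contract);
        std::cout << "after " << (*it)->name()
                  << ": streaming=" << contract.canDoStreaming
                  << " ghosts=" << contract.needGhostZones << "\n";
    }
    // The source/executive would now read data honoring the final contract.
}
```

A previously unforeseen plugin filter would fit in the same way: it simply overrides modifyContract, so the executive can honor its requirements without knowing about the component in advance.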

Parallel contingency statistics with Titan.

Description: This report summarizes existing statistical engines in VTK/Titan and presents the recently parallelized contingency statistics engine. It is a sequel to [PT08] and [BPRT09], which studied the parallel descriptive, correlative, multi-correlative, and principal component analysis engines. The ease of use of this new parallel engine is illustrated by means of C++ code snippets. Furthermore, this report justifies the design of these engines with parallel scalability in mind; however, the very nature of contingency tables prevents this new engine from exhibiting the optimal parallel speed-up that the aforementioned engines do. This report therefore discusses the design trade-offs we made and studies performance with up to 200 processors. (An illustrative sketch of the parallel reduction pattern follows this entry.)
Date: September 1, 2009
Creator: Thompson, David C. & Pebay, Philippe Pierre
Partner: UNT Libraries Government Documents Department
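
The report itself uses C++ snippets against the VTK/Titan statistics API; the sketch below is not that API but a stripped-down illustration (raw MPI, dense tables) of why a parallel contingency engine cannot scale as freely as cheaper engines: every rank must contribute an entire table, not a handful of moments, to the global reduction. The fixed category counts NX and NY are an assumption made to keep the table dense; handling arbitrary category values is part of what complicates the real engine.

```cpp
#include <mpi.h>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    // Assumption: two categorical variables with small, known domains.
    const int NX = 4, NY = 3;

    // Each rank tallies its local rows into a local contingency table.
    std::vector<long> local(NX * NY, 0);
    const int nLocalRows = 100000;
    for (int i = 0; i < nLocalRows; ++i) {
        int x = (i + rank) % NX;        // stand-in for real categorical data
        int y = (i * 7 + rank) % NY;
        ++local[x * NY + y];
    }

    // The whole table is reduced, which is the scalability bottleneck the
    // report discusses: message volume grows with table size, not with the
    // handful of aggregates the descriptive/correlative engines exchange.
    std::vector<long> global(NX * NY, 0);
    MPI_Reduce(local.data(), global.data(), NX * NY, MPI_LONG,
               MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        for (int x = 0; x < NX; ++x) {
            for (int y = 0; y < NY; ++y)
                std::printf("%8ld", global[x * NY + y]);
            std::printf("\n");
        }
    }
    MPI_Finalize();
    return 0;
}
```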

Scalable k-means statistics with Titan.

Description: This report summarizes existing statistical engines in VTK/Titan and presents both the serial and parallel k-means statistics engines. It is a sequel to [PT08], [BPRT09], and [PT09], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, and contingency engines. The ease of use of the new parallel k-means engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the k-means engine. (A generic sketch of the parallel k-means pattern follows this entry.)
Date: November 1, 2009
Creator: Thompson, David C.; Bennett, Janine C. & Pebay, Philippe Pierre
Partner: UNT Libraries Government Documents Department
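
As above, the report's own examples target the Titan statistics classes; what follows is only a generic MPI sketch of the parallel k-means pattern the abstract refers to: each rank assigns its local points to the nearest centroid and accumulates partial sums, a single all-reduce combines them, and every rank then updates identical centroids. One-dimensional data and a fixed iteration count are simplifying assumptions.

```cpp
#include <mpi.h>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    const int K = 3;
    std::vector<double> centroids = {0.0, 5.0, 10.0};  // initial guesses

    // Each rank owns a slice of the (1-D) data set.
    std::vector<double> points(10000);
    for (std::size_t i = 0; i < points.size(); ++i)
        points[i] = (rank + 1) * 0.001 * static_cast<double>(i);

    for (int iter = 0; iter < 10; ++iter) {
        std::vector<double> sum(K, 0.0);
        std::vector<double> cnt(K, 0.0);

        // Local assignment and partial accumulation.
        for (double p : points) {
            int best = 0;
            double bestDist = std::fabs(p - centroids[0]);
            for (int k = 1; k < K; ++k) {
                double d = std::fabs(p - centroids[k]);
                if (d < bestDist) { bestDist = d; best = k; }
            }
            sum[best] += p;
            cnt[best] += 1.0;
        }

        // One small reduction per iteration combines all ranks' partial
        // sums/counts; the message volume is O(K), independent of data size.
        std::vector<double> gSum(K), gCnt(K);
        MPI_Allreduce(sum.data(), gSum.data(), K, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
        MPI_Allreduce(cnt.data(), gCnt.data(), K, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        for (int k = 0; k < K; ++k)
            if (gCnt[k] > 0.0) centroids[k] = gSum[k] / gCnt[k];
    }

    if (rank == 0)
        for (int k = 0; k < K; ++k)
            std::printf("centroid %d: %f\n", k, centroids[k]);

    MPI_Finalize();
    return 0;
}
```

The per-iteration exchange being a fixed-size vector (rather than a data-dependent table, as with contingency statistics) is what lets a k-means engine of this shape scale well.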

Vitalinks

Description: The objectives of the VITALINKS tabletop exercise are to: Raise awareness of infrastructure interdependency issues; Identify and focus on the most important vulnerabilities and restoration priorities resulting from infrastructure disruptions; Examine the resources (people and equipment) required to sustain systems under emergency conditions; Identify and highlight roles, responsibilities, and authorities (including trans-border issues); and Continue to foster a more effective interface among public and private sector service providers and public officials in developing and implementing critical infrastructure protection, mitigation, response, and recovery options.
Date: March 11, 2002
Creator: Baldwin, T. B.
Partner: UNT Libraries Government Documents Department

The Vibration Virtual Environment for Test Optimization (VETO)

Description: A new test simulation tool is being developed to support vibration test design and to evaluate the overall testability of a component or system. This environment, the Vibration Virtual Environment for Test Optimization (VETO), is utilized to optimally place vibration control and response transducers and to investigate the selection of test parameters needed in the design and performance of a vibration experiment. The engineer can investigate the effects of different control parameters prior to performing an actual vibration test. Additionally, new and existing fixture designs can be evaluated through the development of analytical or experimental models that can be integrated into the simulation environment. This test design environment also provides the engineer with the ability to combine analytically or experimentally derived models of the vibration test hardware, instrumentation and equipment into a simulation model that represents the vibration testing capability. Hardware-in-the-loop simulations can be conducted using this model to examine multiple facets of the test design. This paper presents a new tool that will assist test engineers in maximizing the value of vibration tests through the use of hardware-in-the-loop simulations.
Date: October 1, 1996
Creator: Klenke, S.E.; Lauffer, J.P.; Gregory, D.L. & Togami, T.C.
Partner: UNT Libraries Government Documents Department

Verification of VENTSAR XL - a spreadsheet version of VENTSAR

Description: VENTSAR is a computer model that analyzes flow patterns of pollutants on or near buildings. Plume rise may be considered. VENTSAR has been modified to allow for execution on a Macintosh using Microsoft Excel. This new version is called VENTSAR XL. All methodologies are identical to those within VENTSAR. This report provides verification of all models within VENTSAR XL. Strict comparisons were made with VENTSAR to ensure consistency between the two models.
Date: May 1, 1996
Creator: Simpkins, A.A.
Partner: UNT Libraries Government Documents Department

Monte Carlo stratified source-sampling

Description: In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "eigenvalue of the world" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. The original test problem was treated by a special code designed specifically for that purpose. Recently, ANL started work on a method for dealing with more realistic eigenvalue-of-the-world configurations, and has been incorporating this method into VIM. The original method has been modified to take into account real-world statistical noise sources not included in the model problem. This paper constitutes a status report on work still in progress. (A generic illustration of stratified source-sampling follows this entry.)
Date: September 1, 1997
Creator: Blomquist, R.N. & Gelbard, E.M.
Partner: UNT Libraries Government Documents Department
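
The abstract does not reproduce the method itself, so the following is only a generic illustration of stratified source-sampling, not the ANL/VIM implementation: instead of drawing every history independently from the global source distribution, the source is split into strata (for example, weakly coupled regions) and each stratum is guaranteed its proportional share of histories, removing the run-to-run noise that lets a lightly sampled region's source wander.

```cpp
#include <cstddef>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(12345);

    // Relative source strengths of three weakly coupled regions (strata).
    const std::vector<double> strength = {0.90, 0.07, 0.03};
    const int totalHistories = 10000;

    // Naive (unstratified) sampling: every history independently picks a
    // stratum, so small strata see large relative fluctuations run to run.
    std::discrete_distribution<int> pick(strength.begin(), strength.end());
    std::vector<int> naive(strength.size(), 0);
    for (int i = 0; i < totalHistories; ++i) ++naive[pick(rng)];

    // Stratified sampling: each stratum is allotted its share up front, so
    // its history count (and hence statistical weight) cannot drift.
    std::vector<int> stratified(strength.size(), 0);
    int assigned = 0;
    for (std::size_t s = 0; s < strength.size(); ++s) {
        int n = static_cast<int>(strength[s] * totalHistories + 0.5);
        stratified[s] = n;
        assigned += n;
    }
    stratified.back() += totalHistories - assigned;  // absorb rounding remainder

    for (std::size_t s = 0; s < strength.size(); ++s)
        std::printf("stratum %zu: naive=%d stratified=%d\n",
                    s, naive[s], stratified[s]);
    return 0;
}
```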

Visualization of Information Spaces with VxInsight

Description: VxInsight provides a visual mechanism for browsing, exploring and retrieving information from a database. The graphical display conveys information about the relationship between objects in several ways and on multiple scales. In this way, individual objects are always observed within a larger context. For example, consider a database consisting of a set of scientific papers. Imagine that the papers have been organized in a two-dimensional geometry so that related papers are located close to each other. Now construct a landscape where the altitude reflects the local density of papers. Papers on physics will form a mountain range, and a different range will stand over the biological papers. In between will be research reports from biophysics and other bridging disciplines. Now, imagine exploring these mountains. If we zoom in closer, the physics mountains will resolve into a set of sub-disciplines. Eventually, by zooming in far enough, the individual papers become visible. By pointing and clicking you can learn more about papers of interest or retrieve their full text. Although physical proximity conveys a great deal of information about the relationship between documents, you can also see which papers reference which others, by drawing lines between the citing and cited papers. For even more information, you can choose to highlight papers by a particular researcher or a particular institution, or show the accumulation of papers through time, watching some disciplines explode and others stagnate. VxInsight is a general purpose tool, which enables this kind of interaction with a wide variety of relational data: documents, patents, web pages, and financial transactions are just a few examples. The tool allows users to interactively browse, explore and retrieve information from the database in an intuitive way. (A small sketch of the density-landscape idea follows this entry.)
Date: December 1, 2000
Creator: Wylie, B.N.; Boyack, K.W.; Davidson, G.S. & Johnson, D.K.
Partner: UNT Libraries Government Documents Department
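
The landscape metaphor the abstract describes (altitude equals local density of documents in a 2-D layout) can be made concrete with a small sketch. This is a hypothetical illustration using a simple Gaussian kernel density on a grid and a text rendering; it is not VxInsight's actual layout or rendering code.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Paper { double x, y; };  // position from some 2-D layout of related papers

int main() {
    // Two hypothetical clusters: "physics" papers near (2,2), "biology" near (7,7).
    std::vector<Paper> papers;
    for (int i = 0; i < 50; ++i) papers.push_back({2.0 + 0.02 * i, 2.0 + 0.015 * i});
    for (int i = 0; i < 30; ++i) papers.push_back({7.0 + 0.03 * i, 7.0 - 0.02 * i});

    // Altitude on a grid = sum of Gaussian bumps centered on each paper.
    const int N = 20;            // grid resolution
    const double extent = 10.0;  // layout spans [0, 10] x [0, 10]
    const double bandwidth = 0.8;
    std::vector<double> altitude(N * N, 0.0);
    for (int gy = 0; gy < N; ++gy) {
        for (int gx = 0; gx < N; ++gx) {
            double cx = extent * gx / (N - 1);
            double cy = extent * gy / (N - 1);
            double h = 0.0;
            for (const Paper& p : papers) {
                double dx = cx - p.x, dy = cy - p.y;
                h += std::exp(-(dx * dx + dy * dy) / (2.0 * bandwidth * bandwidth));
            }
            altitude[gy * N + gx] = h;
        }
    }

    // Crude text rendering: taller "mountains" print as denser characters.
    const char shades[] = " .:-=+*#%@";
    double maxH = 0.0;
    for (double h : altitude) maxH = std::max(maxH, h);
    for (int gy = 0; gy < N; ++gy) {
        for (int gx = 0; gx < N; ++gx) {
            int level = static_cast<int>(9.0 * altitude[gy * N + gx] / maxH);
            std::putchar(shades[level]);
        }
        std::putchar('\n');
    }
    return 0;
}
```

Zooming, in this picture, amounts to recomputing the grid over a smaller extent with a smaller bandwidth, so a single "physics" mountain resolves into sub-discipline peaks.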

Taming the Viper: Software Upgrade for VFAUser and Viper

Description: This report describes the procedure and properties of the software upgrade for the Vibration Performance Recorder. The upgrade will check the 20 memory cards for proper read/write operation. The upgrade was successfully installed and uploaded into the Viper and the field laptop. The memory checking routine must run overnight to complete the test, although the laptop need only be connected to the Viper unit until the downloading routine is finished. The routine has limited ability to recognize incomplete or corrupt header and footer files. The routine requires 400 Megabytes of free hard disk space. There is one minor technical flaw detailed in the conclusion.
Date: August 8, 2000
Creator: DORIN,RANDALL T. & MOSER III,JOHN C.
Partner: UNT Libraries Government Documents Department

VALDRIFT 1.0: A valley atmospheric dispersion model with deposition

Description: VALDRIFT version 1.0 is an atmospheric transport and diffusion model for use in well-defined mountain valleys. It is designed to determine the extent of drift from aerial pesticide spraying activities, but can also be applied to estimate the transport and diffusion of various air pollutants in valleys. The model is phenomenological -- that is, the dominant meteorological processes governing the behavior of the valley atmosphere are formulated explicitly in the model, albeit in a highly parameterized fashion. The key meteorological processes treated are: (1) nonsteady and nonhomogeneous along-valley winds and turbulent diffusivities, (2) convective boundary layer growth, (3) inversion descent, (4) nocturnal temperature inversion breakup, and (5) subsidence. The model is applicable under relatively cloud-free, undisturbed synoptic conditions and is configured to operate through one diurnal cycle for a single valley. The inputs required are the valley topographical characteristics, pesticide release rate as a function of time and space, along-valley wind speed as a function of time and space, temperature inversion characteristics at sunrise, and sensible heat flux as a function of time following sunrise. Default values are provided for certain inputs in the absence of detailed observations. The outputs are three-dimensional air concentration and ground-level deposition fields as a function of time. (An illustrative sketch of this input/output organization follows this entry.)
Date: May 1, 1995
Creator: Allwine, K.J.; Bian, X. & Whiteman, C.D.
Partner: UNT Libraries Government Documents Department
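
The abstract's list of required inputs and produced outputs maps naturally onto a small data structure. The sketch below is purely illustrative, with hypothetical field names and units rather than VALDRIFT's actual input format; it simply shows how the time- and space-dependent inputs differ from the scalar valley characteristics.

```cpp
#include <vector>

// Hypothetical organization of the VALDRIFT-style inputs listed in the abstract.
struct TimeSeries {             // value sampled at regular intervals after sunrise
    double dtSeconds = 0.0;
    std::vector<double> values;
};

struct ValleyInputs {
    // Valley topographical characteristics (scalars for a single valley).
    double valleyLengthM = 0.0;
    double floorWidthM = 0.0;
    double sidewallSlopeDeg = 0.0;

    // Release rate as a function of time and along-valley position.
    std::vector<TimeSeries> releaseRateByStation;       // kg/s per station

    // Along-valley wind speed as a function of time and position.
    std::vector<TimeSeries> alongValleyWindByStation;   // m/s per station

    // Temperature inversion characteristics at sunrise.
    double inversionDepthM = 0.0;
    double inversionStrengthK = 0.0;

    // Sensible heat flux as a function of time following sunrise.
    TimeSeries sensibleHeatFlux;                         // W/m^2
};

struct ValleyOutputs {
    // 3-D air concentration and 2-D ground deposition fields, per output time.
    std::vector<std::vector<double>> concentration;  // [time][cell], kg/m^3
    std::vector<std::vector<double>> deposition;      // [time][surface cell], kg/m^2
};

int main() {
    ValleyInputs in;
    in.valleyLengthM = 20000.0;
    in.sensibleHeatFlux.dtSeconds = 1800.0;
    in.sensibleHeatFlux.values = {10.0, 60.0, 150.0, 250.0};  // morning ramp-up
    ValleyOutputs out;  // would be filled by the transport/diffusion solver
    (void)in; (void)out;
    return 0;
}
```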

The office of real soon now, western pilot (projectors in offices project)

Description: The ASCI VIEWS program at Lawrence Livermore National Laboratory (LLNL) has been investigating a variety of display technologies, motivated by the large size, high resolution and complexity of some data sets that ASCI users explore and analyze. The purpose of this report is to describe the design, deployment and initial user reactions to one display system. The inspiration for the system comes from a similar experimental deployment at the University of North Carolina at Chapel Hill (UNC), one of the VIEWS program's academic partners. The display system features the use of multiple projectors in individual offices creating oversized displays to replace standard monitors. Some discussion on alternative multi-projector display systems provides context for this description. The VIEWS program began exploring the possibilities of alternative displays by building large, tiled displays and supporting the development of extremely high-pixel density LCD panels [ASCI]. The same considerations have led to partnerships with several groups of researchers working on various aspects of multi-projector display systems including groups at UNC, Stanford University, Princeton University, the University of Utah, Argonne National Lab, and the two NSF supercomputer centers, NCSA and SDSC. This report is divided into eight sections. The following section describes the background for the development of this multi-projector display system, including brief descriptions of other large-format and high-resolution display projects, and provides some LLNL motivations for exploring further. Section III covers the evolution of the design intended specifically for LLNL and explains some of the factors that influenced the decisions made. Section IV provides a detailed description of the two installations, including materials and resources involved. After a few weeks of experience with the systems, the users were interviewed and their reactions and comments are summarized in Section V. Conclusions, recommendations, and a short list of references complete this report.
Date: March 11, 2002
Creator: Uselton, S L
Partner: UNT Libraries Government Documents Department

Visual Sample Plan Version 2.0 User's Guide

Description: This user's guide describes Visual Sample Plan (VSP) Version 2.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to environmental decisions have the required confidence and performance. VSP Version 1.0 provides sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (95, 98, Millennium Edition, 2000, and Windows NT). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to any two-dimensional geographical population to be sampled (e.g., surface soil, a defined layer of subsurface soil, building surfaces, water bodies, and other similar applications) for studies of environmental quality. (A worked example of a typical sample-size equation follows this entry.)
Date: September 23, 2002
Creator: Hassig, Nancy L.; Wilson, John E.; Gilbert, Richard O.; Carlson, Deborah K.; O'Brien, Robert F.; Pulsipher, Brent A. et al.
Partner: UNT Libraries Government Documents Department
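
As a concrete example of the kind of sample-size equation such a tool provides, the sketch below computes n for a one-sample test of a mean against an action level at given false-rejection and false-acceptance rates. The exact equations VSP applies for each sampling design are documented in its user's guide; treat this as a generic textbook form with hard-coded normal quantiles and made-up inputs.

```cpp
#include <cmath>
#include <cstdio>

// Sample size for a one-sample test of a mean against an action level:
//   n = (z_{1-alpha} + z_{1-beta})^2 * s^2 / delta^2  +  0.5 * z_{1-alpha}^2
// (a commonly used form; VSP's per-design equations are in its documentation).
// The z values below are standard normal quantiles for the chosen rates.
int main() {
    const double alpha = 0.05;   // false rejection (Type I) rate
    const double beta  = 0.10;   // false acceptance (Type II) rate
    const double z1a   = 1.645;  // z_{0.95}
    const double z1b   = 1.282;  // z_{0.90}
    const double s     = 2.5;    // estimated standard deviation (same units as delta)
    const double delta = 1.0;    // width of the gray region (action level - mean)

    double n = (z1a + z1b) * (z1a + z1b) * s * s / (delta * delta)
             + 0.5 * z1a * z1a;
    std::printf("alpha=%.2f beta=%.2f -> n = %.1f, round up to %d samples\n",
                alpha, beta, n, static_cast<int>(std::ceil(n)));
    return 0;
}
```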

Visual Spreadsheets in VisIt

Description: The VACET team would like to add visual spreadsheeting capability to the visualization tool VisIt, to make it a viable tool for current users of AMRVis and ChomboVis. This document describes the AMRVis and ChomboVis approaches to visual spreadsheets and describes a proposed visual spreadsheet mechanism for VisIt.
Date: February 5, 2007
Creator: Whitlock, B & Childs, H
Partner: UNT Libraries Government Documents Department

The Challenges to Coupling Dynamic Geospatial Models

Description: Many applications of modeling spatial dynamic systems focus on a single system and a single process, ignoring the geographic and systemic context of the processes being modeled. A solution to this problem is the coupled modeling of spatial dynamic systems. Coupled modeling is challenging for both technical and conceptual reasons. This paper explores the benefits and challenges of coupling or linking spatial dynamic models, from loose coupling, where information transfer between models is done by hand, to tight coupling, where two (or more) models are merged as one. To illustrate the challenges, a coupled model of Urbanization and Wildfire Risk is presented. This model, called Vesta, was applied to the Santa Barbara, California region (using real geospatial data), where Urbanization and Wildfires occur and recur, respectively. The preliminary results of the model coupling illustrate that coupled modeling can lead to insight into the consequences of processes acting on their own. (A toy sketch of loose coupling follows this entry.)
Date: June 23, 2006
Creator: Goldstein, N
Partner: UNT Libraries Government Documents Department
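
A hedged sketch of the loose-coupling end of the spectrum described above: two independent models run side by side and exchange only a small summary of state each year, by explicit calls rather than by hand. The UrbanGrowthModel/FireRiskModel classes and the quantities they exchange are hypothetical stand-ins, not the Vesta model's actual interfaces.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Hypothetical stand-in for an urban growth model on a 1-D cell grid.
class UrbanGrowthModel {
public:
    explicit UrbanGrowthModel(int cells) : developed_(cells, 0) {}
    void step(const std::vector<double>& fireRisk) {
        // Development slows where last year's fire risk was high.
        for (std::size_t i = 0; i < developed_.size(); ++i)
            if (fireRisk[i] < 0.5 && i % 3 == 0) developed_[i] = 1;
    }
    const std::vector<int>& developedCells() const { return developed_; }
private:
    std::vector<int> developed_;
};

// Hypothetical stand-in for a wildfire risk model on the same grid.
class FireRiskModel {
public:
    explicit FireRiskModel(int cells) : risk_(cells, 0.2) {}
    void step(const std::vector<int>& developed) {
        // Risk rises at the wildland-urban interface (next to developed cells).
        for (std::size_t i = 1; i + 1 < risk_.size(); ++i)
            if (developed[i - 1] || developed[i + 1]) risk_[i] = 0.8;
    }
    const std::vector<double>& risk() const { return risk_; }
private:
    std::vector<double> risk_;
};

int main() {
    const int cells = 12;
    UrbanGrowthModel urban(cells);
    FireRiskModel fire(cells);

    // Loose coupling: each model advances one year, then hands its output to
    // the other. Tight coupling would instead merge both processes into a
    // single update over shared state.
    for (int year = 0; year < 5; ++year) {
        urban.step(fire.risk());
        fire.step(urban.developedCells());
    }
    for (int i = 0; i < cells; ++i)
        std::printf("cell %2d: developed=%d risk=%.1f\n",
                    i, urban.developedCells()[i], fire.risk()[i]);
    return 0;
}
```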

Verification of the Accuracy of Sample-Size Equation Calculations for Visual Sample Plan Version 0.9C

Description: Visual Sample Plan (VSP) is a software tool being developed to facilitate the design of environmental sampling plans using a site-map visual interface, standard sample-size equations, a variety of sampling grids and random sampling plans, and graphs to visually depict the results to the user. This document provides comparisons between sample sizes calculated by VSP Version 0.9C and sample sizes calculated by test code written in the S-Plus language. All sample sizes calculated by VSP matched the independently calculated sample sizes. Also, the VSP implementation of the ELIPGRID-PC algorithm for hot spot probabilities is shown to match previous results for 100 standard test cases. The Conclusions and Limitations section of this document lists some aspects of VSP that were not tested by this suite of tests and recommends simulation-based enhancements for future versions of VSP.
Date: January 29, 2001
Creator: Davidson, James R.
Partner: UNT Libraries Government Documents Department

Comparison of simplified and standard spherical harmonics in the variational nodal method

Description: Recently, the variational nodal method has been extended through the use of the Rumyantsev interface conditions to solve the spherical harmonics (P_N) equations of arbitrary odd order. In this paper, the authors generalize earlier x-y geometry work to fit the corresponding simplified spherical harmonics (SP_N) equations into the variational nodal framework. Both P_N and SP_N approximations are implemented in the multigroup VARIANT code at Argonne National Laboratory in two and three dimensional Cartesian and hexagonal geometries. The availability of angular approximations through P_5 and SP_5, and of flat, linear and quadratic spatial interface approximations allows investigation of both spatial truncation and angular approximation errors. Moreover, the SP_3 approximation offers a cost-effective method for reducing transport errors.
Date: December 31, 1995
Creator: Lewis, E.E. & Palmiotti, G.
Partner: UNT Libraries Government Documents Department

Production properties of jets in Z boson events at CDF

Description: We present a study of the production properties of hadronic jets in Z boson events from 1.8 TeV p-pbar collisions using 106 pb^-1 of Run 1A and 1B CDF data. We compare distributions of several kinematic variables in the data to leading order QCD predictions generated using the VECBOS Monte Carlo program.
Date: August 1, 1996
Creator: Dittmann, J.R.
Partner: UNT Libraries Government Documents Department

Using High-Speed WANs and Network Data Caches to Enable Remote and Distributed Visualization

Description: Visapult is a prototype application and framework for remote visualization of large scientific datasets. We approach the technical challenges of tera-scale visualization with a unique architecture that employs high speed WANs and network data caches for data staging and transmission. This architecture allows for the use of available cache and compute resources at arbitrary locations on the network. High data throughput rates and network utilization are achieved by parallelizing I/O at each stage in the application, and by pipelining the visualization process. On the desktop, the graphics interactivity is effectively decoupled from the latency inherent in network applications. We present a detailed performance analysis of the application, and improvements resulting from field-test analysis conducted as part of the DOE Combustion Corridor project. (A small sketch of the interactivity/latency decoupling follows this entry.)
Date: April 18, 2000
Creator: Bethel, Wes; Lau, Stephen; Tierney, Brian; Lee, Jason & Gunter, Dan
Partner: UNT Libraries Government Documents Department
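
One of the architectural points above, decoupling desktop interactivity from network latency, can be illustrated with a small double-buffer sketch: a background thread keeps fetching and preparing the next frame of remote data while the foreground render loop always draws the most recently completed one. This is a generic illustration under assumed names, not Visapult's code.

```cpp
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

// Shared double buffer: the fetch thread fills one slot while the render
// loop reads the other, so rendering never blocks on the network.
struct FrameBufferPair {
    std::vector<float> slots[2];
    std::atomic<int> readable{0};   // index the renderer may read
    std::mutex swapMutex;
};

// Stand-in for pulling one frame's worth of data over a high-latency link.
std::vector<float> fetchRemoteFrame(int frame) {
    std::this_thread::sleep_for(std::chrono::milliseconds(200));  // fake latency
    return std::vector<float>(1024, static_cast<float>(frame));
}

int main() {
    FrameBufferPair buffers;
    buffers.slots[0] = fetchRemoteFrame(0);  // prime the first frame
    std::atomic<bool> done{false};

    std::thread fetcher([&] {
        for (int frame = 1; frame <= 5; ++frame) {
            int writable = 1 - buffers.readable.load();
            std::vector<float> data = fetchRemoteFrame(frame);
            std::lock_guard<std::mutex> lock(buffers.swapMutex);
            buffers.slots[writable] = std::move(data);
            buffers.readable.store(writable);   // publish the new frame
        }
        done.store(true);
    });

    // Render loop: runs at its own rate, always drawing the latest frame.
    while (!done.load()) {
        float sample;
        {
            std::lock_guard<std::mutex> lock(buffers.swapMutex);
            sample = buffers.slots[buffers.readable.load()][0];
        }
        std::printf("render pass using frame %.0f\n", sample);
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }

    fetcher.join();
    return 0;
}
```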