Search Results

Modeling Complex Forest Ecology in a Parallel Computing Infrastructure

Description: Effective stewardship of forest ecosystems makes it imperative to measure, monitor, and predict the dynamic changes of forest ecology. Measuring and monitoring provide a picture of a forest's current state and the data needed to formulate predictive models. However, societal and natural events alter the course of a forest's development, so a simulation environment that takes these events into account will facilitate forest management. In this thesis, we describe an efficient parallel implementation of a land cover use model, Mosaic, and discuss the development efforts to incorporate spatial interaction and succession dynamics into the model. To evaluate the performance of our implementation, an extensive set of simulation experiments was carried out using a dataset representing the H.J. Andrews Forest in the Oregon Cascades. Results indicate that our parallel model achieves a significant reduction in simulation execution time compared to uniprocessor simulations.
Date: August 2003
Creator: Mayes, John
Partner: UNT Libraries

COLLABORATIVE: FUSION SIMULATION PROGRAM

Description: New York University, Courant Institute of Mathematical Sciences, participated in the “Fusion Simulation Program (FSP) Planning Activities” [http://www.pppl.gov/fsp], with C.S. Chang as the institutional PI. FSP’s mission was to enable scientific discovery of important new plasma phenomena, with the associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically confined fusion plasmas, properly validated against experiments in regimes relevant to producing practical fusion energy. New York University's specific institutional goal was to participate in the planning of the edge integrated simulation, with emphasis on the use of large-scale HPCs, in connection with the SciDAC CPES project that the PI was leading. New York University successfully completed its mission by participating in the various planning activities, including the edge physics integration, the edge science drivers, and the mathematical verification. The activity resulted in the combined report that can be found at http://www.pppl.gov/fsp/Overview.html. Participation and presentations as part of this project are listed in a separate file.
Date: June 5, 2012
Creator: Chang, Choong Seock
Partner: UNT Libraries Government Documents Department

Co-simulation of innovative integrated HVAC systems in buildings

Description: Integrated performance simulation of buildings' HVAC systems can help reduce energy consumption and increase occupant comfort. However, no single building performance simulation (BPS) tool offers sufficient capability and flexibility to analyze integrated building systems and to enable rapid prototyping of innovative building and system technologies. One way to alleviate this problem is co-simulation, an integrated approach to simulation. This article elaborates on issues important for realizing co-simulation and discusses multiple possibilities in order to justify the particular approach implemented in the co-simulation prototype described here. The prototype is validated against results obtained with the traditional simulation approach. It is further used in a proof-of-concept case study to demonstrate the applicability of the method and to highlight its benefits. The stability and accuracy of different coupling strategies are analyzed to give a guideline for the required coupling time step.
Date: June 21, 2010
Creator: Trcka, Marija; Hensen, Jan L.M. & Wetter, Michael
Partner: UNT Libraries Government Documents Department
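
The loose-coupling idea the article analyzes can be illustrated with a toy sketch: two subsystems advance with a fixed coupling time step, exchanging interface values only at step boundaries. The models, names, and constants below are illustrative assumptions, not the prototype described in the abstract.

```python
# Hedged sketch of loose co-simulation coupling: a toy room thermal model and
# a toy HVAC controller each see only the other's output from the previous
# coupling step. All constants are assumed values for illustration.

def room_model(temp, q_hvac, dt):
    """Toy first-order room thermal model (temp in degC, power in W)."""
    capacity = 1.0e5        # thermal capacity, J/K (assumed)
    loss = 50.0             # heat loss coefficient to outdoors, W/K (assumed)
    t_out = 0.0             # outdoor temperature, degC
    return temp + dt * (q_hvac - loss * (temp - t_out)) / capacity

def hvac_model(temp, setpoint=20.0, gain=500.0, q_max=5000.0):
    """Toy proportional heater: output in W, clamped to heater capacity."""
    return max(0.0, min(q_max, gain * (setpoint - temp)))

def cosimulate(t_end=3600.0, dt_couple=60.0, temp0=10.0):
    """Each coupling step, every tool uses the other's last exchanged value."""
    temp, t = temp0, 0.0
    while t < t_end:
        q = hvac_model(temp)                  # HVAC sees the room's last state
        temp = room_model(temp, q, dt_couple) # room sees the last HVAC output
        t += dt_couple
    return temp
```

Because the exchanged values are held constant over each coupling step, shrinking `dt_couple` trades simulation speed for accuracy and stability, which is exactly the tradeoff behind the article's coupling-time-step guideline.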

Energy Usage While Maintaining Thermal Comfort : A Case Study of a UNT Dormitory

Description: Campus dormitories at the University of North Texas house over 5,500 students per year, each of whom requires comfortable living conditions while living there. Achieving even minimal comfort levels carries an inherent cost, mostly natural gas for water and room heating and electricity for cooling, lighting, and peripherals. The US Department of Energy has developed several programs for performing energy simulations that help designers create more cost-effective buildings. Energy-10 is one such program; it allows users to conduct whole-house evaluations by reviewing and altering a few parameters such as building materials, solar heating, and energy-efficient windows. The idea of this project was to model a campus dormitory, reproduce its existing energy consumption, and then find ways of lowering that usage while maintaining a high level of personal comfort.
Date: December 2011
Creator: Gambrell, Dusten
Partner: UNT Libraries

PEBBLES Operation and Theory Manual

Description: The PEBBLES manual describes the PEBBLES code, a computer program designed to simulate the motion, packing, and vibration of spheres subject to various mechanical forces, including gravitation, a Hooke's-law force, and several friction forces. The frictional forces include true static friction, which allows non-zero angles of repose. Each pebble is individually simulated using the distinct element method.
Date: September 1, 2010
Creator: Cogliati, Joshua J.
Partner: UNT Libraries Government Documents Department
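
The distinct-element contact interaction mentioned above can be sketched minimally: a Hooke's-law normal spring plus a velocity damping term acting when two spheres overlap. The constants and function names here are illustrative assumptions, not PEBBLES's actual API.

```python
# Hedged sketch of a distinct-element-method (DEM) normal contact force
# between two spheres: Hooke's-law spring on the overlap, minus damping on
# the relative normal velocity. Friction terms (which PEBBLES also models)
# are omitted for brevity.
import math

def contact_force(sphere1, sphere2, radius=0.03, k=1.0e4, damping=5.0):
    """Normal contact force on sphere 1 from sphere 2, in N.

    Each sphere is a (position, velocity) pair of 3-vectors.
    k is the spring constant (N/m); damping is in N*s/m.
    """
    (x1, v1), (x2, v2) = sphere1, sphere2
    d = [a - b for a, b in zip(x1, x2)]
    dist = math.sqrt(sum(c * c for c in d))
    overlap = 2.0 * radius - dist
    if overlap <= 0.0:
        return [0.0, 0.0, 0.0]              # spheres not in contact
    n = [c / dist for c in d]               # unit normal pointing toward sphere 1
    v_rel = sum((a - b) * c for a, b, c in zip(v1, v2, n))
    mag = k * overlap - damping * v_rel     # spring force minus damping
    return [mag * c for c in n]
```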

PEBBLES Operation and Theory Manual

Description: The PEBBLES manual describes the PEBBLES code, a computer program designed to simulate the motion, packing, and vibration of spheres subject to various mechanical forces, including gravitation, a Hooke's-law force, and several friction forces. The frictional forces include true static friction, which allows non-zero angles of repose. Each pebble is individually simulated using the distinct element method.
Date: February 1, 2011
Creator: Cogliati, Joshua J.
Partner: UNT Libraries Government Documents Department

A View on Future Building System Modeling and Simulation

Description: This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).
Date: April 1, 2011
Creator: Wetter, Michael
Partner: UNT Libraries Government Documents Department

Diagnostics for the Combustion Science Workbench

Description: As the cost of computers declines relative to outfitting and maintaining laser spectroscopy laboratories, computers will account for an increasing proportion of the research conducted in fundamental combustion science. W.C. Gardiner foresaw that progress will be limited by the ability to understand the implications of what has been computed and to draw inferences about the elementary components of the combustion models. Yet the diagnostics that are routinely applied to computer experiments have changed little from the sensitivity analyses included with the original chemkin software distribution. This paper describes some diagnostics capabilities that may be found on the virtual combustion science workbench of the future. These diagnostics are illustrated by some new results concerning which of the hydrogen/oxygen chain branching reactions actually occur in flames, the increased formation of NOx in wrinkled flames versus flat flames, and the adequacy of theoretical predictions of the effects of stretch. Several areas are identified where work is needed, including the areas of combustion chemistry and laser diagnostics, to make the virtual laboratory a reality.
Date: February 21, 2007
Creator: Grcar, J.F.; Day, M.S. & Bell, J.B.
Partner: UNT Libraries Government Documents Department

Three-dimensional Simulation of Gas Conductance Measurement Experiments on Alcator C-Mod

Description: Three-dimensional Monte Carlo neutral transport simulations of gas flow through the Alcator C-Mod subdivertor yield conductances comparable to those found in dedicated experiments. All are significantly smaller than the conductance found with the previously used axisymmetric geometry. A benchmarking exercise of the code against known conductance values for gas flow through a simple pipe provides a physical basis for interpreting the comparison of the three-dimensional and experimental C-Mod conductances.
Date: June 15, 2004
Creator: Stotler, D.P. & LaBombard, B.
Partner: UNT Libraries Government Documents Department
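
The pipe benchmark mentioned above rests on a standard closed form: in the free-molecular regime, a long circular tube has conductance C = pi * v_mean * d^3 / (12 * L), with v_mean the mean thermal speed. This is textbook Knudsen-flow physics, sketched here for illustration; it is not the Monte Carlo code or the specific C-Mod geometry used in the paper.

```python
# Hedged sketch of the known conductance value used to benchmark a neutral
# transport code: free-molecular conductance of a long circular tube.
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K

def mean_thermal_speed(temp_k, mass_kg):
    """Mean speed of a Maxwellian gas, sqrt(8 k T / (pi m)), in m/s."""
    return math.sqrt(8.0 * K_B * temp_k / (math.pi * mass_kg))

def long_tube_conductance(diameter_m, length_m, temp_k, mass_kg):
    """Free-molecular conductance of a long circular tube, in m^3/s."""
    v_mean = mean_thermal_speed(temp_k, mass_kg)
    return math.pi * v_mean * diameter_m**3 / (12.0 * length_m)

# Example (illustrative numbers): room-temperature deuterium gas flowing
# through a 10 cm diameter, 1 m long tube.
MASS_D2 = 6.69e-27          # kg, approximate mass of a D2 molecule
c = long_tube_conductance(0.1, 1.0, 300.0, MASS_D2)
```

Comparing a Monte Carlo simulation of this geometry against the closed form gives a physical baseline for interpreting simulated conductances, which is the role the pipe exercise plays in the paper.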

2-D Reflectometer Modeling for Optimizing the ITER Low-field Side Reflectometer System

Description: The response of a low-field side reflectometer system for ITER is simulated with a 2-D reflectometer code using a realistic plasma equilibrium. It is found that the reflected beam will often miss its launch point by as much as 40 cm and that a vertical array of receiving antennas is essential in order to observe a reflection on the low-field side of ITER.
Date: September 2, 2005
Creator: Kramer, G. J.; Nazikian, R.; Valeo, E. J.; Budny, R. V.; Kessel, C. & Johnson, D.
Partner: UNT Libraries Government Documents Department

A corrected and generalized successive random additions algorithm for simulating fractional levy motions

Description: Simulation of subsurface heterogeneity is important for modeling subsurface flow and transport processes. Previous studies have indicated that subsurface property variations can often be characterized by fractional Brownian motion (fBm) or (truncated) fractional Levy motion (fLm). Because Levy-stable distributions have many novel and often unfamiliar properties, studies on generating fLm distributions are rare in the literature. In this study, we generalize a relatively simple and computationally efficient successive random additions (SRA) algorithm, originally developed for generating Gaussian fractals, to simulate fLm distributions. We also propose an additional important step in response to continued observations that the traditional SRA algorithm often generates fractal distributions having poor scaling and correlation properties. Finally, the generalized and modified SRA algorithm is validated through numerical tests.
Date: May 29, 2002
Creator: Liu, Hui-Hai; Bodvarsson, Gudmundur S.; Lu, Silong & Molz, Fred J.
Partner: UNT Libraries Government Documents Department
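
The baseline the paper generalizes can be sketched compactly: the classic successive random additions (SRA) algorithm for a 1-D Gaussian fractal (fractional Brownian motion with Hurst exponent H). The abstract's contribution, Levy-stable increments plus a corrective step, is not reproduced here; this is only the standard Gaussian version for orientation.

```python
# Hedged sketch of classic successive random additions (SRA): repeatedly
# interpolate midpoints, then add Gaussian noise to EVERY point (the defining
# feature of SRA), shrinking the noise standard deviation by 2**-H per level.
import random

def sra_fbm(levels, hurst, sigma0=1.0, seed=0):
    """Return 2**levels + 1 points of an approximate 1-D fBm trace."""
    rng = random.Random(seed)
    pts = [0.0, rng.gauss(0.0, sigma0)]
    sigma = sigma0
    for _ in range(levels):
        # Midpoint interpolation doubles the resolution.
        refined = []
        for a, b in zip(pts, pts[1:]):
            refined.extend([a, 0.5 * (a + b)])
        refined.append(pts[-1])
        # Additions shrink by 2**-H per level and touch all points.
        sigma *= 2.0 ** (-hurst)
        pts = [p + rng.gauss(0.0, sigma) for p in refined]
    return pts
```

The paper's generalization replaces `rng.gauss` with Levy-stable draws and adds a correction step, in response to the poor scaling and correlation properties the traditional algorithm can produce.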

Speeding Up Simulations of Relativistic Systems using an Optimal Boosted Frame

Description: It can be computationally advantageous to perform computer simulations in a Lorentz-boosted frame for a certain class of systems. However, even if the computer model relies on a covariant set of equations, it has been pointed out that algorithmic difficulties related to discretization errors may have to be overcome in order to take full advantage of the potential speedup. We summarize the findings, the difficulties and their solutions, and show that the technique enables simulations important to several areas of accelerator physics that are otherwise problematic, including self-consistent three-dimensional modeling of laser wakefield accelerator stages at energies of 10 GeV and above.
Date: January 27, 2009
Creator: Vay, J. L.; Fawley, W. M.; Geddes, C. G. R.; Cormier-Michel, E. & Grote, D. P.
Partner: UNT Libraries Government Documents Department

Rollback Reduction Techniques Through Load Balancing in Optimistic Parallel Discrete Event Simulation

Description: Discrete event simulation is an important tool for modeling and analysis. Some simulation applications, such as telecommunication network performance, VLSI logic circuit design, and battlefield simulation, require an enormous amount of computing resources. One way to satisfy this demand for computing power is to decompose the simulation system into several logical processes (LPs) and run them concurrently. In any parallel discrete event simulation (PDES) system, events are ordered according to their time of occurrence, and for the simulation to be correct this ordering has to be preserved. There are three approaches to maintaining this ordering. In a conservative system, no LP executes an event unless it is certain that all events with earlier timestamps have been executed; such systems are prone to deadlock. In an optimistic system, on the other hand, simulation progresses disregarding this ordering while saving system states regularly. Whenever a causality violation is detected, the system rolls back to a state saved earlier and restarts processing after correcting the error. In a third approach, all the LPs participate in the computation of a safe time window, and all events with timestamps within this window are processed concurrently. In optimistic simulation systems, there is a global virtual time (GVT), the minimum of the timestamps of all the events existing in the system. The system cannot roll back to a state prior to GVT, so all such states can be discarded. GVT is used for memory management, load balancing, termination detection, and committing of events; however, GVT computation introduces additional overhead. In optimistic systems, a large number of rollbacks can degrade system performance considerably. We have studied the effect of load balancing in reducing the number of rollbacks in such systems. We have designed three load balancing algorithms and implemented two of …
Date: May 1996
Creator: Sarkar, Falguni
Partner: UNT Libraries
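
The GVT mechanism described above can be sketched in a few lines: GVT is the minimum timestamp over all unprocessed events and in-transit messages, so no rollback can ever reach below it. The class and function names are illustrative, not a real PDES engine; a production engine would also keep the last checkpoint at or before GVT so it always has a state to roll back to.

```python
# Hedged sketch of global virtual time (GVT) bookkeeping in an optimistic
# parallel discrete event simulation: each LP holds pending events and saved
# checkpoints; GVT bounds how far back any rollback can go.
import heapq

class LogicalProcess:
    def __init__(self, name):
        self.name = name
        self.pending = []        # min-heap of (timestamp, event) not yet processed
        self.saved_states = []   # (timestamp, state) checkpoints for rollback

    def schedule(self, timestamp, event):
        heapq.heappush(self.pending, (timestamp, event))

    def min_pending(self):
        return self.pending[0][0] if self.pending else float("inf")

def compute_gvt(lps, in_transit):
    """GVT = min over every LP's pending events and all messages in flight."""
    lows = [lp.min_pending() for lp in lps] + [ts for ts, _ in in_transit]
    return min(lows, default=float("inf"))

def fossil_collect(lp, gvt):
    """Discard checkpoints strictly older than GVT (simplified fossil collection)."""
    lp.saved_states = [(ts, s) for ts, s in lp.saved_states if ts >= gvt]
```

The overhead the abstract mentions comes from gathering these minima across processors (and accounting for in-transit messages) without stopping the simulation.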

Implementations of mesh refinement schemes for particle-in-cell plasma simulations

Description: Plasma simulations are often rendered challenging by the disparity of scales in time and in space which must be resolved. When these disparities are in distinctive zones of the simulation region, a method which has proven to be effective in other areas (e.g. fluid dynamics simulations) is the mesh refinement technique. We briefly discuss the challenges posed by coupling this technique with plasma Particle-In-Cell simulations and present two implementations in more detail, with examples.
Date: October 20, 2003
Creator: Vay, J. L.; Colella, P.; Friedman, A.; Grote, D. P.; McCorquodale, P. & Serafini, D. B.
Partner: UNT Libraries Government Documents Department

List mode reconstruction for PET with motion compensation: A simulation study

Description: Motion artifacts can be a significant factor that limits the image quality in high-resolution PET. Surveillance systems have been developed to track the movements of the subject during a scan. Development of reconstruction algorithms that are able to compensate for the subject motion will increase the potential of PET. In this paper we present a list mode likelihood reconstruction algorithm with the ability of motion compensation. The subject motion is explicitly modeled in the likelihood function. The detections of each detector pair are modeled as a Poisson process with time-varying rate function. The proposed method has several advantages over the existing methods. It uses all detected events and does not introduce any interpolation error. Computer simulations show that the proposed method can compensate simulated subject movements and that the reconstructed images have no visible motion artifacts.
Date: July 1, 2002
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department
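
The event model in this abstract, detections on each detector pair as a Poisson process with a time-varying rate, can be illustrated by simulating such a process via thinning (the Lewis-Shedler method). The rate function below is an illustrative stand-in for the effect of subject motion, not the paper's actual model.

```python
# Hedged sketch of an inhomogeneous Poisson process simulated by thinning:
# draw candidates from a homogeneous process at the maximum rate, then accept
# each with probability rate(t) / rate_max.
import random

def simulate_detections(rate, rate_max, t_end, seed=0):
    """Draw event times on [0, t_end) from a Poisson process with rate(t)."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)          # candidate inter-arrival time
        if t >= t_end:
            return events
        if rng.random() < rate(t) / rate_max:   # thinning acceptance step
            events.append(t)

# Example: the detection rate drops when the subject moves out of the
# detector pair's line of response at t = 50 s (assumed numbers).
events = simulate_detections(lambda t: 10.0 if t < 50.0 else 4.0, 10.0, 100.0)
```

A list-mode reconstruction works directly with such per-event timestamps, which is why it can use all detected events without the interpolation error that binning into frames would introduce.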

List mode reconstruction for PET with motion compensation: A simulation study

Description: Motion artifacts can be a significant factor that limits the image quality in high-resolution PET. Surveillance systems have been developed to track the movements of the subject during a scan. Development of reconstruction algorithms that are able to compensate for the subject motion will increase the potential of PET. In this paper we present a list mode likelihood reconstruction algorithm with the ability of motion compensation. The subject motion is explicitly modeled in the likelihood function. The detections of each detector pair are modeled as a Poisson process with time-varying rate function. The proposed method has several advantages over the existing methods. It uses all detected events and does not introduce any interpolation error. Computer simulations show that the proposed method can compensate simulated subject movements and that the reconstructed images have no visible motion artifacts.
Date: July 3, 2002
Creator: Qi, Jinyi & Huesman, Ronald H.
Partner: UNT Libraries Government Documents Department

Hydra: a service oriented architecture for scientific simulation integration

Description: One of the current major challenges in scientific modeling and simulation, particularly in the infrastructure-analysis community, is the development of techniques for efficiently and automatically coupling disparate tools that exist in separate locations on different platforms, are implemented in a variety of languages, and were designed to be standalone. Recent advances in web-based platforms for integrating systems, such as service-oriented architecture (SOA), provide an opportunity to address these challenges in a systematic fashion. This paper describes Hydra, an integrating architecture for infrastructure modeling and simulation that defines geography-based schemas that, when used to wrap existing tools as web services, allow for seamless plug-and-play composability. Existing users of these tools can enhance the value of their analysis by assessing how the simulations of one tool affect the behavior of another, and can automate existing ad hoc processes and workflows for integrating tools.
Date: January 1, 2008
Creator: Bent, Russell; Djidjev, Tatiana; Hayes, Birch P; Holland, Joe V; Khalsa, Hari S; Linger, Steve P et al.
Partner: UNT Libraries Government Documents Department

Benchmarking Nonlinear Turbulence Simulations on Alcator C-Mod

Description: Linear simulations of plasma microturbulence are used with recent radial profiles of toroidal velocity from similar plasmas to consider nonlinear microturbulence simulations and observed transport analysis on Alcator C-Mod. We focus on internal transport barrier (ITB) formation in fully equilibrated H-mode plasmas with nearly flat velocity profiles. Velocity profile data, transport analysis and linear growth rates are combined to integrate data and simulation, and explore the effects of toroidal velocity on benchmarking simulations. Areas of interest for future nonlinear simulations are identified. A good gyrokinetic benchmark is found in the plasma core, without extensive nonlinear simulations. RF-heated C-Mod H-mode experiments, which exhibit an ITB, have been studied with the massively parallel code GS2 towards validation of gyrokinetic microturbulence models. New, linear, gyrokinetic calculations are reported and discussed in connection with transport analysis near the ITB trigger time of shot No.1001220016.
Date: June 22, 2004
Creator: Redi, M.H.; Fiore, C.L.; Dorland, W.; Greenwald, M.J.; Hammett, G.W.; Hill, K. et al.
Partner: UNT Libraries Government Documents Department

Efficient Modeling of Laser-Plasma Accelerators with INF&RNO

Description: The numerical modeling code INF&RNO (INtegrated Fluid & paRticle simulatioN cOde, pronounced "inferno") is presented. INF&RNO is an efficient 2D cylindrical code for modeling the interaction of a short laser pulse with an underdense plasma. The code is based on an envelope model for the laser, while either a PIC or a fluid description can be used for the plasma. The effect of the laser pulse on the plasma is modeled with the time-averaged ponderomotive force. These and other features allow for a speedup of 2-4 orders of magnitude compared to standard full PIC simulations while still retaining physical fidelity. The code has been benchmarked against analytical solutions and 3D PIC simulations; a set of validation tests is presented here, together with a discussion of performance.
Date: June 1, 2010
Creator: Benedetti, C.; Schroeder, C. B.; Esarey, E.; Geddes, C. G. R. & Leemans, W. P.
Partner: UNT Libraries Government Documents Department

Simulation of lean premixed turbulent combustion

Description: There is considerable technological interest in developing new fuel-flexible combustion systems that can burn fuels such as hydrogen or syngas. Lean premixed systems have the potential to burn these types of fuels with high efficiency and low NOx emissions due to reduced burnt gas temperatures. Although traditional scientific approaches based on theory and laboratory experiment have played essential roles in developing our current understanding of premixed combustion, they are unable to meet the challenges of designing fuel-flexible lean premixed combustion devices. Computation, with its ability to deal with complexity and its unlimited access to data, has the potential for addressing these challenges. Realizing this potential requires the ability to perform high-fidelity simulations of turbulent lean premixed flames under realistic conditions. In this paper, we examine the specialized mathematical structure of these combustion problems and discuss simulation approaches that exploit this structure. Using these ideas we can dramatically reduce computational cost, making it possible to perform high-fidelity simulations of realistic flames. We illustrate this methodology by considering ultra-lean hydrogen flames and discuss how this type of simulation is changing the way researchers study combustion.
Date: June 25, 2006
Creator: Bell, John B.; Day, Marcus S.; Almgren, Ann S.; Lijewski, MichaelJ.; Rendleman, Charles A.; Cheng, Robert K. et al.
Partner: UNT Libraries Government Documents Department

Relation of validation experiments to applications.

Description: Computational and mathematical models are developed in engineering to represent the behavior of physical systems to various system inputs and conditions. These models are often used to predict at other conditions, rather than to just reproduce the behavior of data obtained at the experimental conditions. For example, the boundary or initial conditions, time of prediction, geometry, material properties, and other model parameters can be different at test conditions than those for an anticipated application of a model. Situations for which the conditions may differ include those for which (1) one is in the design phase and a prototype of the system has not been constructed and tested under the anticipated conditions, (2) only one version of a final system can be built and destructive testing is not feasible, or (3) the anticipated design conditions are variable and one cannot easily reproduce the range of conditions with a limited number of carefully controlled experiments. Because data from these supporting experiments have value in model validation, even if the model was tested at different conditions than an anticipated application, methodology is required to evaluate the ability of the validation experiments to resolve the critical behavior for the anticipated application. The methodology presented uses models for the validation experiments and a model for the application to address how well the validation experiments resolve the application. More specifically, the methodology investigates the tradeoff that exists between the uncertainty (variability) in the behavior of the resolved critical variables for the anticipated application and the ability of the validation experiments to resolve this behavior. The important features of this approach are demonstrated through simple linear and non-linear heat conduction examples.
Date: February 1, 2009
Creator: Hamilton, James R. (New Mexico State University, Las Cruces, NM) & Hills, Richard Guy
Partner: UNT Libraries Government Documents Department

Computer Simulation Placements in a Unit of Instruction

Description: Educators considering implementing a computer simulation must decide on the optimum placement of the simulation in the unit of instruction to maximize student learning. This study examined student achievement using two different placements for the computer simulation, The Civil War, in a unit of instruction of 8th grade American History students in a suburban middle school.
Date: December 1994
Creator: Naumann, Steve E. (Steve Eugene)
Partner: UNT Libraries

Simulations of Temperatures in Burning Tokamak Plasmas using the GLF23 Model in the TRANSP Code

Description: The GLF23 prediction model, incorporated in the TRANSP plasma analysis code, is used to predict temperatures for burning plasmas in the proposed FIRE and ITER-FEAT tokamaks. Flat electron density profiles with various central values are assumed. Scaling of the fusion power P_dt and gain Q_dt with density and pedestal temperature is given. The effects of helium ash transport and sawteeth on P_dt in long-duration ITER-FEAT plasmas are also examined.
Date: August 13, 2002
Creator: Budny, R.V.
Partner: UNT Libraries Government Documents Department