Search Results

Syncretisms for wind quintet and percussion: A study in combining organizational principles from Southeast Asian music with western stylistic elements.

Description: Syncretisms is an original composition scored for flute, oboe, clarinet, horn, bassoon, and marimba (2-mallet minimum, 4 recommended), with an optional percussion part requiring glockenspiel and chimes; the approximate duration is 6 min. 45 sec. The composition combines modern western tuning, timbre, and harmonic language with organizational principles identified in music from Southeast Asia (including music from cultures found in Thailand, Cambodia, Malaysia, and Indonesia). The accompanying paper describes each of these organizational principles, drawing on the work of scholars who have performed fieldwork, and describes the way in which each principle was employed in Syncretisms. The conclusion speculates on a method for comparing musical organizational systems cross-culturally.
Date: May 2008
Creator: Seymour, John
Partner: UNT Libraries

XML-Based Agent Scripts and Inference Mechanisms

Description: Natural language understanding has been a persistent challenge to researchers in various computer science fields, in a number of applications ranging from user support systems to entertainment and online teaching. A long-term goal of the Artificial Intelligence field is to implement mechanisms that enable computers to emulate human dialogue. The recently developed ALICEbots, virtual agents created by the A.L.I.C.E. Foundation, use AIML scripts - a subset of XML - as the underlying pattern database for question answering. Their goal is to enable pattern-based, stimulus-response knowledge content to be served, received, and processed over the Web, or offline, in a manner similar to HTML and XML. In this thesis, we describe a system that converts the AIML scripts to Prolog clauses and reuses them as part of a knowledge processor. The inference mechanism developed in this thesis is able to successfully match the input pattern with our clause database even if words are missing. We also emulate the pattern deduction algorithm of the original logic deduction mechanism. Our rules, compatible with Semantic Web standards, bring structure to the meaningful content of Web pages and support interactive content retrieval using natural language.
Date: August 2003
Creator: Sun, Guili
Partner: UNT Libraries
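
As a rough illustration of the pattern matching the abstract above describes, the following Java sketch matches an input sentence against an AIML-style pattern in which '*' stands for one or more words. It is an assumption-laden toy: the thesis's actual matcher is written in Prolog and also tolerates missing words, which this sketch does not attempt.

    public class PatternMatch {
        // True if the pattern matches the input word-for-word, with '*'
        // standing for one or more arbitrary words.
        static boolean matches(String[] pattern, String[] input, int p, int i) {
            if (p == pattern.length) return i == input.length;
            if (pattern[p].equals("*")) {
                for (int k = i + 1; k <= input.length; k++)  // '*' consumes >= 1 word
                    if (matches(pattern, input, p + 1, k)) return true;
                return false;
            }
            return i < input.length
                && pattern[p].equalsIgnoreCase(input[i])
                && matches(pattern, input, p + 1, i + 1);
        }

        public static void main(String[] args) {
            String[] pattern = "WHAT IS *".split(" ");
            String[] input = "what is the meaning of life".split(" ");
            System.out.println(matches(pattern, input, 0, 0));  // prints true
        }
    }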

General Index to Experiment Station Record Volumes 01-12, 1889-1901 and to Experiment Station Bulletin Number 2

Description: A topical, alphabetically arranged index to volumes 1-12 including experiment station records, publications reviewed, and foreign publications. It has a 'Consolidated Table of Contents' which lists all editorial notes and publications of the experiment stations and Department of Agriculture from the referenced volumes.
Date: 1903
Creator: United States. Office of Experiment Stations.
Partner: UNT Libraries Government Documents Department

Virtual Human Problem Solving Environments

Description: Interest in complex integrated digital or virtual human modeling has seen a significant increase over the last decade. Coincident with that increased interest, Oak Ridge National Laboratory (ORNL) initiated the development of a human simulation tool, the Virtual Human. The Virtual Human includes a problem-solving environment (PSE) for implementing the integration of physiological models in different programming languages and connecting physiological function to anatomy. The Virtual Human PSE (VHPSE) provides the computational framework with which to develop the concept of a "Virtual Human." Supporting the framework is a data definition for modeling parameters, PhysioML, a Virtual Human Database (VHDB), and a Web-based graphical user interface (GUI) developed using Java. Following description of the VHPSE, we discuss four example implementations of models within the framework. Further expansion of a human modeling environment was carried out in the Defense Advanced Research Projects Agency Virtual Soldier Project. SCIRun served as the Virtual Soldier problem solving environment (VSPSE). We review and compare specific developments in these projects that have significant potential for the future of Virtual Human modeling and simulation. We conclude with an evaluation of areas of future work that will provide important extensions to the VHPSE and VSPSE and make possible a fully-integrated environment for human anatomical and physiological modeling: the Virtual Human.
Date: January 1, 2008
Creator: Ward, Richard C.; Pouchard, Line Catherine; Munro, Nancy B. & Fischer, Sarah Kathleen
Partner: UNT Libraries Government Documents Department

ENHANCING SEISMIC CALIBRATION RESEARCH THROUGH SOFTWARE AUTOMATION AND SCIENTIFIC INFORMATION MANAGEMENT

Description: The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program at LLNL has made significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Several achievements in schema design, data visualization, synthesis, and analysis were completed this year. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. As data volumes have increased, scientific information management issues such as data quality assessment, ontology mapping, and metadata collection that are essential for production and validation of derived calibrations have negatively impacted researchers' ability to produce products. New information management and analysis tools have resulted in demonstrated gains in the efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. Significant software engineering and development efforts have produced an object-oriented framework that provides database-centric coordination between scientific tools, users, and data. Nearly half a billion parameters, signals, measurements, and metadata entries are stored in a relational database accessed by an extensive object-oriented, multi-technology software framework that includes stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable recording of processing flow and metadata. A core capability is the ability to rapidly select and present subsets of related signals and measurements to researchers for analysis and distillation, both visually (Java GUI client applications) and in batch mode (instantiation of multi-threaded applications on clusters of processors). Development of efficient data exploitation methods has become increasingly important throughout academic and government seismic research communities to address multi-disciplinary, large-scale initiatives. Effective frameworks must also simultaneously provide the researcher ...
Date: July 6, 2007
Creator: Ruppert, S D; Dodge, D A; Ganzberger, M D; Hauk, T F & Matzel, E M
Partner: UNT Libraries Government Documents Department
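
A minimal sketch of the database-centric selection step the abstract above describes: pulling a subset of related signals for one event over JDBC. The connection URL, table, and column names are hypothetical; the real framework's schema, triggers, and validation layers are not reproduced here.

    import java.sql.*;

    public class SignalSubsetQuery {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:oracle:thin:@//dbhost:1521/calib";  // placeholder URL
            try (Connection con = DriverManager.getConnection(url, "user", "pass");
                 PreparedStatement ps = con.prepareStatement(
                     "SELECT station, channel, arrival_time "
                     + "FROM signal WHERE event_id = ? ORDER BY arrival_time")) {
                ps.setLong(1, 12345L);  // hypothetical event id
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next())
                        System.out.printf("%s %s %s%n", rs.getString(1),
                            rs.getString(2), rs.getTimestamp(3));
                }
            }
        }
    }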

Summary Report for the SINBAD Search Tool Project

Description: The Shielding Integral Benchmark Archive Database (SINBAD) Search Tool has been developed to serve as an interface with the SINBAD database, providing a simple and quick means of searching for information related to experimental benchmark problems. The Search Tool is written in Java and offers a more efficient way to retrieve information from the SINBAD database. Searches can be performed quickly and easily. Notably, users are no longer required to know the name of a benchmark in order to search the database. Instead, a search can be performed by specifying the experimental facility, constituents of the experimental benchmark, etc. In summary, a new, powerful database search tool has been developed for SINBAD.
Date: June 1, 2012
Creator: Cunha Da Silva, Alice
Partner: UNT Libraries Government Documents Department
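
A hedged sketch of the criteria-based searching the abstract above highlights: filtering benchmark records by facility and constituent rather than by name. The Benchmark record and its fields are invented for illustration and are not the SINBAD schema. (Requires Java 16+ for records.)

    import java.util.List;
    import java.util.stream.Collectors;

    public class SinbadSearch {
        record Benchmark(String name, String facility, List<String> constituents) {}

        // Null criteria are treated as "match anything".
        static List<Benchmark> search(List<Benchmark> db, String facility, String constituent) {
            return db.stream()
                     .filter(b -> facility == null || b.facility().equalsIgnoreCase(facility))
                     .filter(b -> constituent == null || b.constituents().contains(constituent))
                     .collect(Collectors.toList());
        }

        public static void main(String[] args) {
            List<Benchmark> db = List.of(
                new Benchmark("ASPIS-Iron88", "ASPIS", List.of("iron")),
                new Benchmark("Winfrith-Water", "ASPIS", List.of("water", "steel")));
            System.out.println(search(db, "ASPIS", "iron"));  // matches the first record
        }
    }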

NIF ICCS Test Controller for Automated & Manual Testing

Description: The National Ignition Facility (NIF) Integrated Computer Control System (ICCS) is a large (1.5 MSLOC), hierarchical, distributed system that controls all aspects of the NIF laser [1]. The ICCS team delivers software updates to the NIF facility throughout the year to support shot operations and commissioning activities. In 2006, there were 48 releases of ICCS: 29 full releases and 19 patches. To ensure the quality of each delivery, thousands of manual and automated tests are performed using the ICCS Test Controller test infrastructure. The TestController system provides test inventory management, test planning, automated test execution, manual test logging, release testing summaries, and test results search, all through a web browser interface. Automated tests include command-line based framework server tests and Graphical User Interface (GUI) based Java tests. Manual tests are presented as a checklist-style web form to be completed by the tester. The results of all tests, automated and manual, are kept in a common repository that provides data to dynamic status reports. As part of the 3-stage ICCS release testing strategy, the TestController system helps plan, evaluate and track the readiness of each release to the NIF facility.
Date: October 3, 2007
Creator: Zielinski, J S
Partner: UNT Libraries Government Documents Department
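
A small sketch of the common-repository idea described above: automated and manual results stored as one record type so release reports can draw on both. The TestResult shape and field names are invented; the real TestController is a web application with far richer planning and search features. (Requires Java 16+ for records.)

    import java.util.*;

    public class TestRepository {
        enum Kind { AUTOMATED, MANUAL }
        record TestResult(String name, Kind kind, boolean passed, String release) {}

        private final List<TestResult> results = new ArrayList<>();

        void add(TestResult r) { results.add(r); }

        // Count of passing tests for a given release, automated or manual.
        long passed(String release) {
            return results.stream()
                          .filter(r -> r.release().equals(release) && r.passed())
                          .count();
        }

        public static void main(String[] args) {
            TestRepository repo = new TestRepository();
            repo.add(new TestResult("server.startup", Kind.AUTOMATED, true, "R4.2"));
            repo.add(new TestResult("gui.checklist.7", Kind.MANUAL, true, "R4.2"));
            System.out.println(repo.passed("R4.2") + " tests passed for R4.2");
        }
    }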

Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

Description: It is becoming increasingly important to be able to accurately forecast flooding, as flooding causes greater losses than any other natural disaster, both worldwide and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated by these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g., dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments. Because two-dimensional models based on the shallow water equations are significantly better able to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used instead of one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computation time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated by performing computations only on inundated cells. The drastic reduction in computation time shown here enhances the ability of two-dimensional flood inundation models to serve as near-real-time flood forecasting, engineering design, or planning tools. Perhaps of even greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al. 2000).
Date: January 1, 2009
Creator: Judi, David R; Mcpherson, Timothy N & Burian, Steven J
Partner: UNT Libraries Government Documents Department
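
A toy Java sketch of the two ideas the abstract above credits for the speedup: multithreading across cores, and domain tracking so that work is done only on currently inundated cells. The depth update is a placeholder, not the paper's shallow-water solver.

    import java.util.*;
    import java.util.concurrent.*;

    public class FloodStep {
        public static void main(String[] args) throws InterruptedException {
            double[] depth = new double[1_000_000];
            depth[0] = 2.0; depth[1] = 1.5;  // a few wet cells

            // Domain tracking: build the active list of inundated cells only.
            List<Integer> wet = new ArrayList<>();
            for (int i = 0; i < depth.length; i++) if (depth[i] > 0) wet.add(i);

            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);
            int chunk = Math.max(1, wet.size() / cores);
            for (int start = 0; start < wet.size(); start += chunk) {
                final int s = start, e = Math.min(start + chunk, wet.size());
                pool.execute(() -> {
                    for (int k = s; k < e; k++)
                        depth[wet.get(k)] *= 0.99;  // placeholder update
                });
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
            System.out.println(depth[0]);
        }
    }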

A System for Exchanging Control and Status Messages in the NOvA Data Acquisition

Description: In preparation for NOvA, a future neutrino experiment at Fermilab, we are developing a system for passing control and status messages in the data acquisition system. The DAQ system will consist of applications running on approximately 450 nodes. The message passing system will use a publish-subscribe model and will provide support for sending messages and receiving the associated replies. Additional features of the system include a layered architecture with custom APIs tailored to the needs of a DAQ system, the use of an open source messaging system for handling the reliable delivery of messages, the ability to send broadcasts to groups of applications, and APIs in Java, C++, and Python. Our choice for the open source system to deliver messages is EPICS. We will discuss the architecture of the system, our experience with EPICS, and preliminary test results.
Date: April 1, 2007
Creator: Biery, K.A.; Cooper, R.G.; Foulkes, S.C.; Guglielmo, G.M.; Piccoli, L.P.; Votava, M.E.V. et al.
Partner: UNT Libraries Government Documents Department
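
A minimal in-process sketch of the publish-subscribe model described above. It is an illustration only: the NOvA system is distributed across roughly 450 nodes, layers its own APIs over an open-source delivery system (EPICS), and supports request/reply, none of which this toy attempts.

    import java.util.*;
    import java.util.function.Consumer;

    public class MessageBus {
        private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

        public synchronized void subscribe(String topic, Consumer<String> handler) {
            subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
        }

        public synchronized void publish(String topic, String message) {
            for (Consumer<String> h : subscribers.getOrDefault(topic, List.of()))
                h.accept(message);
        }

        public static void main(String[] args) {
            MessageBus bus = new MessageBus();
            // A broadcast to a group of applications maps naturally onto a topic.
            bus.subscribe("daq.control", msg -> System.out.println("node A: " + msg));
            bus.subscribe("daq.control", msg -> System.out.println("node B: " + msg));
            bus.publish("daq.control", "begin-run 42");
        }
    }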

Teaching object concepts for XML-based representations.

Description: Students learned about object-oriented design concepts and knowledge representation through the use of a set of toy blocks. The blocks represented a limited and focused domain of knowledge and one that was physical and tangible. The blocks helped the students to better visualize, communicate, and understand the domain of knowledge as well as how to perform object decomposition. The blocks were further abstracted to an engineering design kit for water park design. This helped the students to work on techniques for abstraction and conceptualization. It also led the project from tangible exercises into software and programming exercises. Students employed XML to create object-based knowledge representations and Java to use the represented knowledge. The students developed and implemented software allowing a lay user to design and create their own water slide and then to take a simulated ride on their slide.
Date: January 1, 2002
Creator: Kelsey, R. L. (Robert L.)
Partner: UNT Libraries Government Documents Department
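
A small sketch of the XML-plus-Java pairing the abstract above describes: a toy block encoded as XML and read back by a Java program. The element and attribute names are invented for illustration.

    import java.io.ByteArrayInputStream;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.*;

    public class BlockReader {
        public static void main(String[] args) throws Exception {
            String xml = "<block shape=\"cube\" color=\"red\" size=\"2\"/>";
            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
            Element block = doc.getDocumentElement();
            System.out.printf("a %s %s of size %s%n",
                block.getAttribute("color"),
                block.getAttribute("shape"),
                block.getAttribute("size"));
        }
    }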

Generic Optimization Program User Manual Version 3.0.0

Description: GenOpt is an optimization program for the minimization of a cost function that is evaluated by an external simulation program. It has been developed for optimization problems where the cost function is computationally expensive and its derivatives are not available or may not even exist. GenOpt can be coupled to any simulation program that reads its input from text files and writes its output to text files. The independent variables can be continuous (possibly with lower and upper bounds), discrete, or a mix of continuous and discrete variables. Constraints on dependent variables can be implemented using penalty or barrier functions. GenOpt uses parallel computing to evaluate the simulations. GenOpt has a library of local and global multi-dimensional and one-dimensional optimization algorithms, as well as algorithms for doing parametric runs. An algorithm interface allows adding new minimization algorithms without knowing the details of the program structure. GenOpt is written in Java so that it is platform independent. The platform independence and the general interface make GenOpt applicable to a wide range of optimization problems. GenOpt has not been designed for linear programming problems, quadratic programming problems, or problems where the gradient of the cost function is available. For such problems, specially tailored software exists that is more efficient.
Date: May 11, 2009
Creator: Wetter, Michael
Partner: UNT Libraries Government Documents Department
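
A hedged sketch of the text-file coupling GenOpt is built around: write the current design variables to an input file, run the external simulation, read the cost back from an output file. The file names, formats, and './simulate' command are placeholders, not GenOpt's actual configuration.

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.List;

    public class SimulationCost {
        static double evaluate(double x1, double x2) throws IOException, InterruptedException {
            Files.write(Path.of("sim.in"), List.of("x1 = " + x1, "x2 = " + x2));
            Process p = new ProcessBuilder("./simulate", "sim.in", "sim.out")
                .inheritIO().start();  // the external simulation program
            if (p.waitFor() != 0) throw new IOException("simulation failed");
            return Double.parseDouble(Files.readString(Path.of("sim.out")).trim());
        }

        public static void main(String[] args) throws Exception {
            System.out.println("cost = " + evaluate(0.5, 1.25));
        }
    }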

RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

Description: Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. "Expert" systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration "expert" tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of "fuzzy" logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated Fuzzy Expert Exploration (FEE) Tool. This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds and lower product prices for consumers. This second annual report contains a summary of progress to date, problems encountered, plans for the next quarter, and an assessment of the prospects for future progress. During the second year of the project, data acquisition for the Brushy Canyon Formation was completed with the compiling and analyzing of well logs, geophysical data, and production information needed to characterize production potential in the Delaware Basin. A majority of this data now resides in several online databases on our servers and is in proper form to be accessed by external programs such as Web applications. ...
Date: May 17, 2001
Creator: Weiss, William W.
Partner: UNT Libraries Government Documents Department
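
A toy sketch of the fuzzy-logic idea behind the FEE Tool described above: a measured value belongs partially to linguistic sets such as 'low' and 'high', and a rule maps memberships to a risk estimate. The membership shape, variable, and rule are all invented for illustration.

    public class FuzzyRisk {
        // Rising ramp membership: 0 below a, 1 above b, linear in between.
        static double rising(double x, double a, double b) {
            return Math.max(0.0, Math.min(1.0, (x - a) / (b - a)));
        }

        public static void main(String[] args) {
            double porosity = 0.12;  // example formation value
            double highPorosity = rising(porosity, 0.05, 0.20);
            double lowPorosity = 1.0 - highPorosity;
            // Toy rule: prospect risk is high to the degree porosity is low.
            double risk = lowPorosity;
            System.out.printf("membership(high porosity) = %.2f, risk = %.2f%n",
                highPorosity, risk);
        }
    }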

RISK REDUCTION WITH A FUZZY EXPERT EXPLORATION TOOL

Description: Incomplete or sparse information on geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. Expert systems have been developed and used in several disciplines and industries, including medical diagnostics, with favorable results. A state-of-the-art exploration "expert" tool, relying on a computerized database and computer maps generated by neural networks, is proposed through the use of "fuzzy" logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. This project will develop an Artificial Intelligence system that will draw upon a wide variety of information to provide realistic estimates of risk. "Fuzzy logic," a system of integrating large amounts of inexact, incomplete information with modern computational methods to derive usable conclusions, has been demonstrated as a cost-effective computational technology in many industrial applications. During project year 1, 90% of geologic, geophysical, production, and price data were assimilated for installation into the database. Logs provided geologic data consisting of formation tops of the Brushy Canyon, Lower Brushy Canyon, and Bone Springs zones for 700 wells used to construct regional cross sections. Regional structure and isopach maps were constructed using kriging to interpolate between the measured points. One of the structure derivative maps (azimuth of curvature) visually correlates with Brushy Canyon fields on the maximum-change contours. Derivatives of the regional geophysical data also visually correlate with the locations of the fields. The azimuth of maximum dip approximately locates fields on the maximum-change contours. In a similar manner, the second derivative in the x-direction of the gravity map visually correlates with the alignment of the known fields. These visual correlations strongly suggest that neural network architectures will be found to correlate regional attributes with individual well production. On a local scale, given open-hole log information, a neural network was trained to predict ...
Date: June 30, 2000
Creator: Weiss, William W.
Partner: UNT Libraries Government Documents Department

Java XMGR

Description: The XMGR5 graphing package [1] for drawing RELAP5 [2] plots is being re-written in Java [3]. Java is a robust programming language that is available at no cost for most computer platforms from Sun Microsystems, Inc. XMGR5 is an extension of an XY plotting tool called ACE/gr extended to plot data from several US Nuclear Regulatory Commission (NRC) applications. It is also the most popular graphing package worldwide for making RELAP5 plots. In Section 1, a short review of XMGR5 is given, followed by a brief overview of Java. In Section 2, shortcomings of both tkXMGR [4] and XMGR5 are discussed and the value of converting to Java is given. Details of the conversion to Java are given in Section 3. The progress to date, some conclusions and future work are given in Section 4. Some screen shots of the Java version are shown.
Date: August 1, 2004
Creator: Mesina, Dr. George L. & Miller, Steven P.
Partner: UNT Libraries Government Documents Department

Architectural Advancements in RELAP5-3D

Description: As both the computer industry and the field of nuclear science and engineering move forward, there is a need to improve the computing tools used in the nuclear industry to keep pace with these changes. By increasing the capability of the codes, the growing modeling needs of nuclear plant analysis will be met and advantage can be taken of more powerful computer languages and architectures. In the past eighteen months, improvements have been made to RELAP5-3D [1] for these reasons. These architectural advances include code restructuring, conversion to Fortran 90, high-performance computing upgrades, and the rewriting of the RELAP5 Graphical User Interface (RGUI) [2] and XMGR5 [3] in Java. These architectural changes will extend the lifetime of RELAP5-3D, reduce the costs of development and maintenance, and improve its speed and reliability.
Date: November 1, 2005
Creator: Mesina, Dr. George L.
Partner: UNT Libraries Government Documents Department

Lambda Station: Alternate network path forwarding for production SciDAC applications

Description: The LHC era will start very soon, creating immense data volumes capable of demanding allocation of an entire network circuit for task-driven applications. Circuit-based alternate network paths are one solution to meeting the LHC's high-bandwidth network requirements. The Lambda Station project is aimed at addressing growing requirements for dynamic allocation of alternate network paths. Lambda Station facilitates the rerouting of designated traffic through site LAN infrastructure onto so-called 'high-impact' wide-area networks. The prototype Lambda Station, developed with a Service Oriented Architecture (SOA) approach in mind, is presented. Lambda Station has been successfully integrated into the production version of the Storage Resource Manager (SRM) and deployed at the US-CMS Tier-1 center at Fermilab, as well as at the US-CMS Tier-2 site at Caltech. This paper discusses experiences using the prototype system with production SciDAC applications for data movement between Fermilab and Caltech. The architecture and design principles of the production version of the Lambda Station software, currently being implemented as Java-based web services, are also presented in this paper.
Date: September 1, 2007
Creator: Grigoriev, Maxim; Bobyshev, Andrey; Crawford, Matt; DeMar, Phil; Grigaliunas, Vyto; Moibenko, Alexander et al.
Partner: UNT Libraries Government Documents Department
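
A speculative sketch of the kind of call a client might make to a path-allocation web service like the one described above. The endpoint, parameters, and response are entirely hypothetical; Lambda Station's real web-service interface is not reproduced here. (Uses the java.net.http client from Java 11+.)

    import java.net.URI;
    import java.net.http.*;

    public class PathRequestClient {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            String body = "src=fnal.gov&dst=caltech.edu&bandwidth=1000&duration=3600";
            HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://lambdastation.example.org/requestPath"))  // placeholder
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("ticket: " + response.body());
        }
    }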

The CMS dataset bookkeeping service

Description: The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger sources, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS, with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
Date: October 1, 2007
Creator: Afaq, Anzar; /Fermilab; Dolgert, Andrew; /Cornell U., Phys. Dept.; Guo, Yuyi; /Fermilab et al.
Partner: UNT Libraries Government Documents Department
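
A minimal sketch of the middle-tier query pattern the abstract above describes: a Java tier answering a dataset lookup over JDBC. The table and column names are hypothetical; in DBS, servlets running under Tomcat would issue comparable queries against Oracle or MySQL.

    import java.sql.*;

    public class DatasetLookup {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:mysql://dbhost/dbs";  // placeholder backend
            try (Connection con = DriverManager.getConnection(url, "user", "pass");
                 PreparedStatement ps = con.prepareStatement(
                     "SELECT path, created_by FROM dataset WHERE path LIKE ?")) {
                ps.setString(1, "/QCD/%/RECO");  // hypothetical dataset path pattern
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next())
                        System.out.println(rs.getString(1) + "  " + rs.getString(2));
                }
            }
        }
    }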

Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

Description: This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling, at Kitware Inc. in collaboration with the Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays, and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies, including WebGL, JavaScript, Java, and Flash, to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally intensive problem important to the nation's scientific progress, as described shortly. Further, SLAC researchers routinely generate ...
Date: November 13, 2011
Creator: Schroeder, William J.
Partner: UNT Libraries Government Documents Department

DAKOTA JAGUAR 2.1 user's Manual.

Description: JAGUAR (JAva GUi for Applied Research) is a Java software tool providing an advanced text editor and graphical user interface (GUI) to manipulate DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) input specifications. This document focuses on the features necessary for a user to use JAGUAR.
Date: June 1, 2011
Creator: Adams, Brian M.; Lefantzi, Sophia; Chan, Ethan & Ruthruff, Joseph R.
Partner: UNT Libraries Government Documents Department

User Interface Framework for the National Ignition Facility (NIF)

Description: A user interface (UI) framework supports the development of user interfaces to operate the National Ignition Facility (NIF) using the Integrated Computer Control System (ICCS). [1] This framework simplifies UI development and ensures consistency for NIF operators. A comprehensive, layered collection of UIs in ICCS provides interaction with system-level processes, shot automation, and subsystem-specific devices. All user interfaces are written in Java, employing CORBA to interact with other ICCS components. ICCS developers use this framework to compose two major types of user interfaces: broadviews and control panels. Broadviews provide a visual representation of the NIF beamlines through interactive schematic drawings. Control panels provide status and control at the device level. The UI framework includes a suite of display components that standardize user interaction through data entry behaviors, common connection and threading mechanisms, and a common appearance. With these components, ICCS developers can more efficiently address usability issues in the facility when needed. The ICCS UI framework helps developers create consistent and easy-to-understand user interfaces for NIF operators.
Date: October 1, 2007
Creator: Fisher, J M; Bowers, G A; Carey, R W; Daveler, S A; Herndon Ford, K B; Ho, J C et al.
Partner: UNT Libraries Government Documents Department
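
A toy Swing sketch of the 'control panel' notion described above: device status plus a control in one small panel. The device, its states, and the layout are invented; the real framework standardizes far more (data entry behaviors, CORBA connections, threading) than this shows.

    import java.awt.GridLayout;
    import javax.swing.*;

    public class ControlPanelDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Amplifier A1 - control panel");
                JLabel status = new JLabel("status: STANDBY");
                JButton arm = new JButton("Arm");
                arm.addActionListener(e -> status.setText("status: ARMED"));
                JPanel panel = new JPanel(new GridLayout(2, 1, 4, 4));
                panel.add(status);
                panel.add(arm);
                frame.add(panel);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }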

Use of the target diagnostic control system in the National Ignition Facility

Description: The extreme physics of targets shocked by NIF's 192-beam laser is observed by a diverse suite of diagnostics, including optical backscatter; time-integrated, time-resolved, and gated X-ray sensors; laser velocity interferometry; and neutron time-of-flight. Diagnostics for fusion ignition implosions and neutron emissions have also been developed. A Diagnostic Control System (DCS) for both hardware and software facilitates development and eases integration. Each complex diagnostic typically uses an ensemble of electronic instruments attached to sensors, digitizers, cameras, and other devices. In the DCS architecture, each instrument is interfaced to a low-cost Windows XP processor and Java application. Instruments are aggregated as needed in the supervisory system to form an integrated diagnostic. The Java framework provides data management, control services, and operator GUI generation. During the past several years, over thirty-six diagnostics have been deployed using this architecture in support of the National Ignition Campaign (NIC). The DCS architecture facilitates the expected additions and upgrades to diagnostics as more experiments are performed. This paper presents the DCS architecture and framework and our experiences using it during the NIC to operate, upgrade, and maintain a large set of diagnostic instruments.
Date: July 25, 2011
Creator: Shelton, R; Lagin, L & Nelson, J
Partner: UNT Libraries Government Documents Department
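
A hedged sketch of the per-instrument pattern the abstract above describes: each device sits behind a small Java interface, and the supervisory layer aggregates instruments into an integrated diagnostic. All names are invented. (Requires Java 16+ for records.)

    import java.util.List;

    public class DiagnosticDemo {
        interface Instrument {
            void arm();
            double[] readout();
        }

        static class Digitizer implements Instrument {
            public void arm() { System.out.println("digitizer armed"); }
            public double[] readout() { return new double[] { 0.1, 0.4, 0.2 }; }
        }

        // An integrated diagnostic is an aggregation of instruments.
        record Diagnostic(String name, List<Instrument> instruments) {
            void acquire() {
                instruments.forEach(Instrument::arm);
                instruments.forEach(i ->
                    System.out.println(name + ": " + i.readout().length + " samples"));
            }
        }

        public static void main(String[] args) {
            new Diagnostic("x-ray streak", List.of(new Digitizer())).acquire();
        }
    }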

CartaBlanca - rapid prototyping development environment for non-linear systems on unstructured grids.

Description: This talk describes a component-based nonlinear physical system simulation prototyping package written entirely in Java using object-oriented design. The package provides scientists and engineers with a 'developer-friendly' software environment for large-scale computational algorithm and physical model development, built on the Jacobian-Free Newton-Krylov solution method surrounding a finite-volume treatment of conservation equations. This enables a clean, component-like implementation. We first provide motivation for the development of the software and then discuss software structure. Discussion includes a description of the use of Java's built-in thread facility, which enables parallel, shared-memory computations on a wide variety of unstructured grids with triangular, quadrilateral, tetrahedral, and hexahedral elements. We also discuss the use of Java's inheritance mechanism in the construction of a hierarchy of physics system objects and linear and nonlinear solver objects that simplify development and foster software re-use. Following this, we show results from example calculations and then discuss plans, including the extension of the software to distributed-memory computer systems.
Date: January 1, 2002
Creator: VanderHeyden, W. B. (William Brian); Livescu, D. (Daniel) & Padial-Collins, N. T. (Nely T.)
Partner: UNT Libraries Government Documents Department
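
A toy sketch of the two Java facilities the abstract above highlights: an inheritance hierarchy of solver objects, and built-in threads for shared-memory parallelism over cells. The class names and the relaxation update are invented; this is not CartaBlanca code and omits the Jacobian-Free Newton-Krylov machinery entirely.

    public class SolverDemo {
        static abstract class Solver {
            abstract void sweep(double[] uOld, double[] uNew, int lo, int hi);
        }

        static class JacobiSolver extends Solver {
            void sweep(double[] uOld, double[] uNew, int lo, int hi) {
                for (int i = Math.max(lo, 1); i < Math.min(hi, uOld.length - 1); i++)
                    uNew[i] = 0.5 * (uOld[i - 1] + uOld[i + 1]);  // placeholder update
            }
        }

        public static void main(String[] args) throws InterruptedException {
            double[] uOld = new double[100_000], uNew = new double[100_000];
            uOld[uOld.length - 1] = 1.0;
            Solver solver = new JacobiSolver();
            int nThreads = Runtime.getRuntime().availableProcessors();
            Thread[] workers = new Thread[nThreads];
            int chunk = uOld.length / nThreads;
            for (int t = 0; t < nThreads; t++) {
                final int lo = t * chunk;
                final int hi = (t == nThreads - 1) ? uOld.length : lo + chunk;
                workers[t] = new Thread(() -> solver.sweep(uOld, uNew, lo, hi));
                workers[t].start();
            }
            for (Thread w : workers) w.join();  // one parallel Jacobi sweep
            System.out.println(uNew[uNew.length - 2]);
        }
    }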

Application of a Java-based, univel geometry, neutral particle Monte Carlo code to the searchlight problem

Description: A univel geometry, neutral particle Monte Carlo transport code, written entirely in the Java programming language, is under development for medical radiotherapy applications. The code uses ENDF-VI based continuous energy cross section data in a flexible XML format. Full neutron-photon coupling, including detailed photon production and photonuclear reactions, is included. Charged particle equilibrium is assumed within the patient model so that detailed transport of electrons produced by photon interactions may be neglected. External beam and internal distributed source descriptions for mixed neutron-photon sources are allowed. Flux and dose tallies are performed on a univel basis. A four-tap, shift-register-sequence random number generator is used. Initial verification and validation testing of the basic neutron transport routines is underway. The searchlight problem was chosen as a suitable first application because of the simplicity of the physical model. Results show excellent agreement with analytic solutions. Computation times for similar numbers of histories are comparable to other neutron MC codes written in C and FORTRAN.
Date: April 1, 2005
Creator: Wemple, Charles A. & Cogliati, Joshua J.
Partner: UNT Libraries Government Documents Department
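
A hedged sketch of a four-tap shift-register-sequence generator of the kind the abstract above mentions. The lags (471, 1586, 6988, 9689) follow Ziff's published four-tap rule; the paper does not state which taps its generator uses, and the java.util.Random bootstrap seeding here is a simple placeholder.

    import java.util.Random;

    public class FourTapRng {
        private static final int A = 471, B = 1586, C = 6988, D = 9689;
        private final int[] state = new int[D + 1];
        private int n = 0;

        FourTapRng(long seed) {
            Random boot = new Random(seed);  // bootstrap the register
            for (int i = 0; i < state.length; i++) state[i] = boot.nextInt();
        }

        int next() {  // x[n] = x[n-A] ^ x[n-B] ^ x[n-C] ^ x[n-D]
            n = (n + 1) % state.length;
            state[n] = tap(A) ^ tap(B) ^ tap(C) ^ tap(D);
            return state[n];
        }

        private int tap(int lag) {
            return state[((n - lag) % state.length + state.length) % state.length];
        }

        double nextDouble() {  // uniform in [0, 1), from the top 21 bits
            return (next() >>> 11) * 0x1.0p-21;
        }

        public static void main(String[] args) {
            FourTapRng rng = new FourTapRng(12345L);
            for (int i = 0; i < 3; i++) System.out.println(rng.nextDouble());
        }
    }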