550 Matching Results

Search Results


A Formal Notation for Hardware and Software Verification

Description: Some years ago a theory was published describing the compilation of digital logic system build descriptions (lists of behaviors for elementary components and of the connections between them) into recursively computable system behavior descriptions. The current work describes an analogous theory for computer programs.
Date: October 1986
Creator: Chapman, Richard O. & Gabriel, John R.
Partner: UNT Libraries Government Documents Department

Generating Machine Code for High-Level Programming Languages

Description: The purpose of this research was to investigate the generation of machine code from a high-level programming language. The following steps were undertaken: 1) Choose a high-level programming language as the source language and a computer as the target computer. 2) Examine all stages during the compiling of a high-level programming language and all data sets involved in the compilation. 3) Discover the mechanism for generating machine code and the mechanism to generate more efficient machine code from the language. 4) Construct an algorithm for generating machine code for the target computer. The results suggest that a compiler is best implemented in a high-level programming language, and that the SCANNER and PARSER should be independent of target representations, if possible.
Date: December 1976
Creator: Chao, Chia-Huei
Partner: UNT Libraries

INTERFACING AUTOCAD WITH MAGNETIC DESIGN

Description: This report is a summary of work done toward developing an AutoCAD-based system for the design and analysis of magnets. The computer programs that have been developed are an attempt to integrate the new SUN computer based system with existing software on the old HP1000 system. We believe this is a good start for the further development of the whole system. The programming languages used are AutoLISP for the programs used by AutoCAD, and Fortran (Microsoft Fortran) for all others. The entire work has been done on an IBM-AT, with its well-known limits on memory, speed of execution, and operating system; therefore, some adjustment may be needed for the more powerful SUN system.
Date: February 1, 1988
Creator: Sorin, M. & Caspi, S.
Partner: UNT Libraries Government Documents Department

Babel Fortran 2003 Binding for Structured Data Types

Description: Babel is a tool aimed at the high-performance computing community that addresses the need for mixing programming languages (Java, Python, C, C++, Fortran 90, FORTRAN 77) in order to leverage the specific benefits of those languages. Scientific codes often rely on structured data types (structs, derived data types) to encapsulate data, and Babel had been lacking in this type of support until recently. We present a new language binding that focuses on the interoperability of C/C++ with Fortran 2003. The new binding builds on the existing Fortran 90 infrastructure by using the iso_c_binding module defined in the Fortran 2003 standard as the basis for C/C++ interoperability. We present the technical approach for the new binding and discuss our initial experiences in applying the binding in FACETS (Framework Application for Core-Edge Transport Simulations) to integrate C++ with legacy Fortran codes.
Date: May 2, 2008
Creator: Muszala, S; Epperly, T & Wang, N
Partner: UNT Libraries Government Documents Department

DATAPLOT: Introduction and Overview

Description: Abstract: This manual provides DATAPLOT code solutions to a variety of commonly occurring graphical problems. A line-by-line explanation of the code is given, along with illustrations and general discussion.
Date: June 1984
Creator: Filliben, James J.
Partner: UNT Libraries Government Documents Department

Flowchart-to-source-code conversion program for GPSS/360

Description: Thesis. An interactive program is described which allows a user to specify a GPSS program as a block diagram. The user draws the block diagram using the Graphical Remote Access Support System (GRASS) and sends it to the remote computer, where it is converted to an equivalent source program. The user can then run the program, if desired, and have the output from GPSS routed back to the graphics terminal, selecting the portion for which hard copy is desired. The program is written in 360 Assembler Language. (auth)
Date: December 1, 1973
Creator: Purvy, R.E.
Partner: UNT Libraries Government Documents Department

''Ideal'' directly executed languages: an analytical argument for emulation

Description: Several methods of evaluating user programs are analyzed with respect to space and time requirements. The concept of an ''ideal'' directly executed language is introduced, and it is argued that the ''ideal'' directly executed language for a contemporary computing system will be neither its source language nor the language accepted by its base machine. (auth)
Date: December 1, 1973
Creator: Hoevel, L. W.
Partner: UNT Libraries Government Documents Department

Fortran M Language Definition

Description: This document defines the Fortran M extensions to Fortran 77. It updates an earlier definition, dated June 1992, in several minor respects.
Date: August 1993
Creator: Foster, Ian & Chandy, K. Mani
Partner: UNT Libraries Government Documents Department

The machine protection system for the R&D energy recovery LINAC

Description: The Machine Protection System (MPS) is a device-safety system that is designed to prevent damage to hardware by generating interlocks, based upon the state of input signals generated by selected sub-systems. It protects all the key machinery in the R&D Project called the Energy Recovery LINAC (ERL) against the high beam current. The MPS is capable of responding to a fault with an interlock signal within several microseconds. The ERL MPS is based on a National Instruments CompactRIO platform, and is programmed by utilizing National Instruments' development environment for a visual programming language. The system also transfers data (interlock status, time of fault, etc.) to the main server. Transferred data is integrated into the pre-existing software architecture which is accessible by the operators. This paper will provide an overview of the hardware used, its configuration and operation, as well as the software written both on the device and the server side.
Date: March 28, 2011
Creator: Altinbas, Z.; Kayran, D.; Jamilkowski, J.; Lee, R.C. & Oerter, B.
Partner: UNT Libraries Government Documents Department

Center for Technology for Advanced Scientific Component Software (TASCS)

Description: The UO portion of the larger TASCS project was focused on the usability subproject identified in the original project proposal. The key usability issue that we tackled was that of supporting legacy code developers in migrating to a component-oriented design pattern and development model with minimal manual labor. It was observed during the lifetime of TASCS (and previous CCA efforts) that more often than not, users would arrive with existing code that was developed prior to their exposure to component design methods. As such, they were faced with the task of both learning the CCA toolchain and, at the same time, manually deconstructing and reassembling their existing code to fit the design constraints imposed by components. This was a common complaint (and an occasional reason for a user to abandon components altogether), so our task was to remove this manual labor as much as possible to lessen the burden placed on the end-user when adopting components for existing codes. To accomplish this, we created a source-based static analysis tool that used code annotations to drive code generation and transformation operations. The use of code annotations is due to one of the key technical challenges facing this work: programming languages are limited in the degree to which application-specific semantics can be represented in code. For example, data types are often ambiguous. The C pointer is the most common example cited in practice. Given a pointer to a location in memory, should it be interpreted as a singleton or an array? If it is to be interpreted as an array, how many dimensions does the array have? What are their extents? The annotation language that we designed and implemented addresses this ambiguity issue by allowing users to decorate their code in places where ambiguity exists in order to guide tools to ...
Date: June 30, 2010
Creator: Sottile, Dr. Mathew
Partner: UNT Libraries Government Documents Department

COMPOSE-HPC: A Transformational Approach to Exascale

Description: The goal of the COMPOSE-HPC project is to 'democratize' tools for automatic transformation of program source code so that it becomes tractable for the developers of scientific applications to create and use their own transformations reliably and safely. This paper describes our approach to this challenge, the creation of the KNOT tool chain, which includes tools for the creation of annotation languages to control the transformations (PAUL), to perform the transformations (ROTE), and optimization and code generation (BRAID), which can be used individually and in combination. We also provide examples of current and future uses of the KNOT tools, which include transforming code to use different programming models and environments, providing tests that can be used to detect errors in software or its execution, as well as composition of software written in different programming languages, or with different threading patterns.
Date: April 1, 2012
Creator: Bernholdt, David E; Allan, Benjamin A.; Armstrong, Robert C.; Chavarria-Miranda, Daniel; Dahlgren, Tamara L.; Elwasif, Wael R et al.
Partner: UNT Libraries Government Documents Department

An introduction to LIME 1.0 and its use in coupling codes for multiphysics simulations.

Description: LIME is a small software package for creating multiphysics simulation codes. The name was formed as an acronym denoting 'Lightweight Integrating Multiphysics Environment for coupling codes.' LIME is intended to be especially useful when separate computer codes (which may be written in any standard computer language) already exist to solve different parts of a multiphysics problem. LIME provides the key high-level software (written in C++), a well defined approach (with example templates), and interface requirements to enable the assembly of multiple physics codes into a single coupled-multiphysics simulation code. In this report we introduce important software design characteristics of LIME, describe key components of a typical multiphysics application that might be created using LIME, and provide basic examples of its use - including the customized software that must be written by a user. We also describe the types of modifications that may be needed to individual physics codes in order for them to be incorporated into a LIME-based multiphysics application.
Date: November 1, 2011
Creator: Belcourt, Noel; Pawlowski, Roger Patrick; Schmidt, Rodney Cannon & Hooper, Russell Warren
Partner: UNT Libraries Government Documents Department

Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling

Description: Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are interested not only in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 C in region R if global CO2 emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer. Answering these kinds of questions is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) Uncertainty about future emission of greenhouse gases (GHG). (2) Given a future GHG emission scenario, what is its impact on the global climate? (3) Given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflect, to some degree, our uncertainty in being able to simulate future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods ...
Date: March 17, 2010
Creator: Johannesson, G
Partner: UNT Libraries Government Documents Department

Prometheus Reactor I&C Software Development Methodology, for Action

Description: The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.
Date: July 30, 2005
Creator: Hamilton, T.
Partner: UNT Libraries Government Documents Department

Runtime Detection of C-Style Errors in UPC Code

Description: Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.
Date: September 29, 2011
Creator: Pirkelbauer, P; Liao, C; Panas, T & Quinlan, D
Partner: UNT Libraries Government Documents Department

Using architectures for semantic interoperability to create journal clubs for emergency response

Description: In certain types of 'slow burn' emergencies, careful accumulation and evaluation of information can offer a crucial advantage. The SARS outbreak in the first decade of the 21st century was such an event, and ad hoc journal clubs played a critical role in assisting scientific and technical responders in identifying and developing various strategies for halting what could have become a dangerous pandemic. This research-in-progress paper describes a process for leveraging emerging semantic web and digital library architectures and standards to (1) create a focused collection of bibliographic metadata, (2) extract semantic information, (3) convert it to the Resource Description Framework/Extensible Markup Language (RDF/XML), and (4) integrate it so that scientific and technical responders can share and explore critical information in the collections.
Date: January 1, 2009
Creator: Powell, James E; Collins, Linn M & Martinez, Mark L B
Partner: UNT Libraries Government Documents Department

Process for selecting engineering tools : applied to selecting a SysML tool.

Description: Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature, and users can apply it to select most engineering tools and software applications.
Date: February 1, 2011
Creator: De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L. & De Jong, Kent
Partner: UNT Libraries Government Documents Department

USER 2.1; User Specified Estimation Routine, Technical Manual 2003.

Description: This document is primarily a description of the user interface for USER 2.1; it is not a description of the statistical theory and calculations behind USER. This project is funded by the Bonneville Power Administration, U.S. Department of Energy, under Contract No. 004126, Project No. 198910700, as part of the BPA's program to protect, mitigate, and enhance fish and wildlife affected by the development and operation of hydroelectric facilities on the Columbia River and its tributaries. The analysis of fish and wildlife data requires investigators to have the ability to develop statistical models tailored to their study requirements. Hence, a flexible platform for developing statistical likelihood models to estimate demographic parameters is necessary. To this end, Program USER (User Specified Estimation Routine) was developed to provide a convenient platform for investigators to develop statistical models and analyze tagging and count data. The program is capable of developing models and analyzing any count data that can be described by multinomial or product-multinomial distributions. Such data include counts from release-recapture studies using PIT-tags, radio-tags, balloon-tags, and acoustic-tags to estimate survival, movement, and demographic data on the age and/or sex structure of wild populations. The user of the program can specify the parameters and model structure at will to tailor the analyses to the specific requirements of the field sampling program, the data, and the populations under investigation. All of this is available without the need for the user to know advanced programming languages or numerical analysis techniques, and without involving cumbersome software developed for extraneous purposes. Program USER represents a powerful statistical modeling routine that can be readily used by investigators with a wide range of interests and quantitative skills.
Date: July 1, 2003
Creator: Lady, James; Westhagen, Peter & Skalski, John
Partner: UNT Libraries Government Documents Department