5,270 Matching Results

Search Results


MGMRES: A generalization of GMRES for solving large sparse nonsymmetric linear systems

Description: This paper is concerned with the solution of the linear system Au = b, where A is a real square nonsingular matrix that is large, sparse, and nonsymmetric. We consider the use of Krylov subspace methods. We first choose an initial approximation u^(0) to the solution ū = A^(-1)b. The GMRES (Generalized Minimal Residual) method was developed by Saad and Schultz (1986) and has been used extensively for many years on sparse systems. This paper considers a generalization of GMRES; it is similar to GMRES except that we let Z = A^T Y, where Y is a nonsingular matrix that is symmetric but not necessarily symmetric positive definite (SPD).
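The paper's MGMRES variant is not reproduced here; as a point of reference, a minimal restart-free GMRES (the Saad-Schultz method the paper generalizes) can be sketched via the Arnoldi process and a small least-squares solve:

```python
import numpy as np

def gmres(A, b, x0, m=30, tol=1e-10):
    """Minimal GMRES sketch: minimize ||b - Ax|| over x0 + K_m(A, r0)."""
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta < tol:
        return x0
    Q = np.zeros((len(b), m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = r0 / beta
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):            # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < tol:             # "happy breakdown": exact solution found
            m = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    # Small (m+1) x m least-squares problem: min ||beta*e1 - H y||
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return x0 + Q[:, :m] @ y
```

In practice one would use a restarted variant (and preconditioning) for large sparse systems; this sketch shows only the core projection idea.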
Date: November 1, 1996
Creator: Young, D.M. & Chen, Jen Yuan
Partner: UNT Libraries Government Documents Department

TWRSview system requirements specification

Description: This document provides the system requirements specification for the TWRSview software system, which is being developed to integrate electronic data supporting the development of the TWRS technical baseline.
Date: December 1, 1995
Creator: Caldwell, J.A. & Lee, A.K.
Partner: UNT Libraries Government Documents Department

Culturally relevant science: An approach to math and science education for Hispanics. Final technical report

Description: This progress report summarizes the results of a teacher workshop. A letter sent to 17 teachers who had participated in the workshop requested a report of any activities undertaken and copies of lesson plans and materials developed. Only nine responses were received, and not all of them demonstrated a satisfactory level of activity. Teachers who submitted the most promising materials were invited to participate in the Summer Writing Workshop. A partial first draft of a companion volume to the teacher's manual was written; it provides a rationale for culturally relevant science and presents the necessary cultural and scientific background. The outline of the book is presented in Appendix 1, and Appendix 2 is a sample chapter from the book.
Date: November 14, 1996
Creator: Montellano, B.O. de
Partner: UNT Libraries Government Documents Department

The Fermilab computing farms in 1997

Description: The farms went through a variety of changes in 1997. First, the farms expansion begun in 1996 was completed, boosting the computing capacity to roughly 20,000 MIPS (where one MIPS is a unit defined by running a benchmark program, TINY, on a machine and comparing its performance to a VAX 11/780); in SpecInt92 terms, this would probably rate close to 40,000. Utilization of the farms was not especially heavy: the fixed-target experiments were generally not in full production in 1997 but spent the time tuning their code, and other users processed on the farms but tended to come and go without saturating the resource. Some of the old farms were retired, saving the lab money on maintenance and saving the farms support staff effort.
Date: February 15, 1998
Creator: Wolbers, S.
Partner: UNT Libraries Government Documents Department

Go ahead, visit those web sites, you can't get hurt, can you?

Description: Browsing (surfing) the World Wide Web (the web) has exploded onto the Internet with unprecedented popularity. Fueled by massive acceptance, web client/server technology is leaping forward at a speed that no other software technology matches. The primary force behind this phenomenon is the simplicity of the web browsing experience: people who have never touched a computer before can now perform sophisticated network tasks with a simple point-and-click. Unfortunately, this simplicity gives many, if not most, web wanderers the impression that the web browser is risk free, nothing more than a high-powered television. This misconception is dangerous because it creates the myth that a user visiting a web site is immune from subversive or malicious intent. While many want you to believe that surfing the web is as simple as using any other household appliance, it is not like surfing television channels: it is bi-directional. You can learn a lot of useful information from web sites, but, either directly or indirectly, others can also learn quite a bit about you. Of even more concern is a web site's potential ability to exert control over the local computer. This paper tries to consolidate some of the current concerns that you should consider as you jump into the surf.
Date: February 1, 1997
Creator: Rothfuss, J.S. & Parrett, J.W.
Partner: UNT Libraries Government Documents Department

Fourier analysis

Description: What follows is a description of my analysis. First, the FFT that I use is described on the attached pages. Note that the scaling factor for the forward transform is 1/N. I compute the following rms values: rms(original data) = 64.9463 nm; rms(data*Hanning) = 55.7723 nm (before renormalization). The use of the Hanning filter is accompanied by a renormalization to ensure that the rms value is maintained. I also fit to the curvature of the scan. The data corrected for focus give the following rms values: rms(corrected data) = 56.8835 nm; rms(corrected data*Hanning) = 53.2179 nm (before renormalization). The PSD is shown for various data. The PSD is calculated as PSD = |FFT(y)|^2 * xl, where xl is the length of the x axis, 45.9952. I did find an error in the plot that you were sent: if kx is the frequency axis, i.e., values from (0, Nyquist), then kx(1, Nyquist) was plotted versus PSD(0, Nyquist). This error is corrected in the attached plots. The plot you have appears to be the PSD of the original data with no Hanning applied. The removal of the quadratic term appears to have a negligible effect on the PSD; it changes only the first couple of terms (which lie outside the valid data range). The removal of the center feature has a much stronger effect.
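The steps described (1/N forward-FFT scaling, Hanning window with rms-preserving renormalization, PSD = |FFT(y)|^2 * xl, frequencies from 0 to Nyquist) can be sketched as follows; the function name and interface are illustrative, not the author's code:

```python
import numpy as np

def psd_with_hanning(y, xl):
    """PSD as described in the abstract: forward FFT scaled by 1/N,
    Hanning window renormalized so the windowed data keeps its rms."""
    n = len(y)
    yw = y * np.hanning(n)
    # Renormalization step: restore the original rms after windowing.
    yw *= np.sqrt(np.mean(y**2) / np.mean(yw**2))
    Y = np.fft.fft(yw) / n                 # note the 1/N forward scaling
    psd = np.abs(Y)**2 * xl                # PSD = |FFT(y)|^2 * xl
    k = np.arange(n // 2 + 1)              # frequency bins 0..Nyquist
    return k, psd[:n // 2 + 1]             # plot psd[k] against k, not k+1
```

The returned bin indices start at 0, matching the corrected plots (the reported error was plotting kx(1, Nyquist) against PSD(0, Nyquist)).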
Date: March 3, 1997
Creator: Lawson, J.
Partner: UNT Libraries Government Documents Department

ARISE: American renaissance in science education

Description: The national standards and their state derivatives must be reinforced by models of curricular reform. In this paper, ARISE presents one model based on a set of principles: coherence, integration of the sciences, movement from concrete ideas to abstract ones, inquiry, connection and application, and sequencing that is responsive to how people learn.
Date: September 14, 1998
Partner: UNT Libraries Government Documents Department

A Feature Extraction Toolbox for Pattern Recognition Applications

Description: Feature extraction and evaluation are procedures common to the development of all pattern recognition applications. Features are the primary pieces of information used to train the pattern recognition engine, whether that engine is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features used by the pattern recognition engine can significantly streamline the overall development and training of the solution. Presently, AlliedSignal Federal Manufacturing & Technologies (FM&T) is developing an integrated, computer-based software package called the Feature Extraction Toolbox, which will be used for developing and deploying solutions to generic pattern recognition problems. The toolbox integrates a variety of software techniques for signal processing, feature extraction and evaluation, and pattern recognition under a single, user-friendly development environment. While a feature extraction toolbox can help in the selection process, it is the user who ultimately must make all decisions. A prototype version of this toolbox has been developed and is currently being used for applications development on several projects in support of the Department of Energy. The toolbox runs on a laptop computer so that it can be taken to a site and used in the field.
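The toolbox's actual feature set is not given in the abstract; as a generic illustration of what "feature extraction" means for a 1-D signal, here is a small, invented feature vector of the kind a pattern recognition engine might be trained on:

```python
import numpy as np

def extract_features(signal):
    """Compute a small, generic feature vector from a 1-D signal.
    These are illustrative features, not FM&T's actual toolbox set."""
    x = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(x))
    sign_changes = np.diff(np.signbit(x).astype(int))
    return {
        "rms": float(np.sqrt(np.mean(x**2))),
        "peak": float(np.max(np.abs(x))),
        # Fraction of samples where the signal changes sign.
        "zero_crossing_rate": float(np.count_nonzero(sign_changes)) / len(x),
        # Amplitude-weighted mean frequency bin.
        "spectral_centroid": float(
            np.sum(np.arange(len(spectrum)) * spectrum)
            / max(np.sum(spectrum), 1e-12)
        ),
    }
```

A classifier (neural network, rulebase, or genetic algorithm, as the abstract lists) would then be trained on such vectors rather than on raw samples.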
Date: November 23, 1998
Creator: Baumgart, C.W.; Linder, K.E. & Nelson, L.K.
Partner: UNT Libraries Government Documents Department

Fermilab central mass storage system as a test bed for HPSS

Description: The Fermilab Central Mass Storage System (FCMSS) provides mass-storage-related services to 16 groups of users, mainly Fermilab fixed-target experiments. FCMSS is also used as a test bed for the High Performance Storage System (HPSS). We report on more than a year and a half of production experience with HPSS and show the results of various performance tests.
Date: December 1, 1998
Creator: Genser, K.; Moibenko, A.; Petravick, D.; Sachs, D. & Syu, J.
Partner: UNT Libraries Government Documents Department

SMARTARRAY: A C++ class template for self-describing, resizable, error-resistant arrays

Description: The SmartArray class template supports one-dimensional (single-index) arrays and provides four major features that make it superior to built-in C++ arrays: a SmartArray is self-describing (both capacity and content), a SmartArray can be dynamically resized, the index supplied to the subscript operator of a SmartArray is bounds-checked, and the lower bound of a SmartArray can be chosen by the programmer. Additionally, the SmartArray class provides a full set of traversal functions, an assignment operator, editing functions, and an error-handling mechanism, yet remains small, self-contained, portable, efficient, and easy to master. The class template SmartArray<T> requires that T be either a built-in type or a class that provides an assignment operator, a default (no-argument) constructor, a copy constructor, and a destructor. If T does not contain any pointers, the compiler-generated versions of these four functions will probably be adequate.
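SmartArray itself is a C++ class template; to illustrate the four features listed above (self-describing, resizable, bounds-checked, programmer-chosen lower bound), here is a rough Python analogue. The class name and interface are invented for illustration, not SmartArray's actual API:

```python
class SmartArrayLike:
    """Python analogue of the SmartArray features described above."""

    def __init__(self, capacity, lower_bound=0, fill=None):
        self._lb = lower_bound
        self._data = [fill] * capacity

    def __len__(self):                    # self-describing: reports capacity
        return len(self._data)

    def bounds(self):                     # self-describing: (lower, upper), inclusive
        return self._lb, self._lb + len(self._data) - 1

    def _check(self, i):
        lo, hi = self.bounds()
        if not lo <= i <= hi:             # bounds-checked subscripting
            raise IndexError(f"index {i} outside [{lo}, {hi}]")
        return i - self._lb

    def __getitem__(self, i):
        return self._data[self._check(i)]

    def __setitem__(self, i, value):
        self._data[self._check(i)] = value

    def resize(self, new_capacity, fill=None):   # dynamic resizing
        self._data = (self._data + [fill] * new_capacity)[:new_capacity]
```

In the C++ original these checks happen in `operator[]` and the element type T must supply the four member functions listed in the abstract; Python's dynamic typing hides that requirement.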
Date: April 1, 1996
Creator: Perano, K.J. & Nielan, P.E.
Partner: UNT Libraries Government Documents Department

Dynamical hierarchies - A summary

Description: This paper summarizes some of the problems associated with the generation of higher-order emergent structures in formal dynamical systems. In biological systems, higher-order hyperstructures occur in both an intuitive and a formal sense: monomers, polymers, membranes, organelles, cells, tissues, organs, etc. constitute an observable hierarchy, apparently generated by the underlying biomolecular process. However, in models and simulations of these systems, it has turned out to be quite difficult to produce higher-order emergent structures from first principles. The first problem is to agree on what a higher-order structure is. An emergent structure can be defined through the introduction of an observational function: if a property can be observed in the dynamics, but not at the level of the fundamental first-order interacting structures, we define it to be emergent. It is well known that second-order structures occur relatively easily in simulation, so the problem is how to proceed to third and higher order without external interference. A third-order structure is defined through the interaction of second-order structures forming a new observable not found at the lower levels.
Date: April 1, 1996
Creator: Rasmussen, S.; Barrett, C.L. & Olesen, M.W.
Partner: UNT Libraries Government Documents Department

Hard chaos, quantum billiards, and quantum dot computers

Description: This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). Research was performed in analytic and computational techniques for dealing with hard chaos, especially the powerful tool of cycle expansions. This work has direct application to the understanding of electrons in nanodevices, such as junctions of quantum wires, or in arrays of dots or antidots. We developed a series of techniques for computing the properties of quantum systems with hard chaos, in particular the flow of electrons through nanodevices. These techniques are providing the insight and tools to design computers with nanoscale components. Recent efforts concentrated on understanding the effects of noise and orbit pruning in chaotic dynamical systems. We showed that most complicated chaotic systems (not just those equivalent to a finite shift) will develop branch points in their cycle expansion. Once the singularity is known to exist, it can be removed with a dramatic increase in the speed of convergence of quantities of physical interest.
Date: July 1, 1996
Creator: Mainieri, R.; Cvitanovic, P. & Hasslacher, B.
Partner: UNT Libraries Government Documents Department

A case study in automated theorem proving: A difficult problem about commutators

Description: This paper shows how the automated deduction system OTTER was used to prove the group theory theorem x^3 = e implies [[[y, z], u], v] = e, where e is the identity and [x, y] is the commutator x^(-1) y^(-1) x y. This is a difficult problem for automated provers, and several lengthy searches were run before a proof was found. Problem formulation and search strategy played a key role in the success. I believe that ours is the first automated proof of the theorem.
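This is of course no substitute for the OTTER proof, which covers all groups of exponent 3; but the identity can be checked by brute force in one concrete exponent-3 group, the Heisenberg group over Z_3 (27 elements, where it holds for the stronger reason that all commutators are central):

```python
import numpy as np

MOD = 3  # exponent of the group: every element satisfies x^3 = e

def mul(g, h):
    m = (np.array(g) @ np.array(h)) % MOD
    return tuple(tuple(int(v) for v in row) for row in m)

def inv(g):
    return mul(g, g)                      # since g^3 = e, g^-1 = g^2

def comm(g, h):
    return mul(mul(inv(g), inv(h)), mul(g, h))   # [g, h] = g^-1 h^-1 g h

# The Heisenberg group over Z_3: unitriangular 3x3 matrices mod 3.
elems = [((1, a, c), (0, 1, b), (0, 0, 1))
         for a in range(MOD) for b in range(MOD) for c in range(MOD)]
E = elems[0]                              # a = b = c = 0: the identity

assert all(mul(mul(x, x), x) == E for x in elems)    # x^3 = e holds

# Verify [[[y, z], u], v] = e in stages: collect the possible values
# of [y, z], then of [[y, z], u], then check the final commutator.
C1 = {comm(y, z) for y in elems for z in elems}
C2 = {comm(c, u) for c in C1 for u in elems}
assert all(comm(c, v) == E for c in C2 for v in elems)
```

The staged check is equivalent to looping over all 27^4 quadruples, since the inner commutators can only take the values collected in C1 and C2.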
Date: February 1, 1995
Creator: McCune, W.
Partner: UNT Libraries Government Documents Department

Edge detection by nonlinear dynamics

Description: We demonstrate how the formulation of a nonlinear scale-space filter can be used for edge detection and junction analysis. By casting edge-preserving filtering in terms of maximizing information content subject to an average cost function, the computed cost at each pixel location becomes a local measure of edgeness. This computation depends on a single scale parameter and the given image data. Unlike previous approaches, which require careful tuning of the filter kernels for various types of edges, our scheme is general enough to handle different edges, such as lines, step-edges, corners, and junctions. Anisotropy in the data is handled automatically by the nonlinear dynamics.
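The paper's information-theoretic filter is not reproduced in the abstract; as a generic illustration of nonlinear, edge-preserving scale-space filtering with a single scale parameter, here is one explicit step of Perona-Malik anisotropic diffusion, a different but related scheme:

```python
import numpy as np

def perona_malik_step(img, kappa=0.1, dt=0.2):
    """One explicit Perona-Malik diffusion step: smooths flat regions
    while preserving strong edges; kappa is the single scale parameter."""
    # Finite differences toward each 4-neighbor (periodic boundaries).
    n = np.roll(img, -1, axis=0) - img
    s = np.roll(img, 1, axis=0) - img
    e = np.roll(img, -1, axis=1) - img
    w = np.roll(img, 1, axis=1) - img
    g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
    return img + dt * (g(n) * n + g(s) * s + g(e) * e + g(w) * w)
```

Where the edge-stopping function is small (large gradients), diffusion halts, so edges survive smoothing; this is the general behavior the abstract's filter also exhibits.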
Date: July 1994
Creator: Wong, Yiu-fai
Partner: UNT Libraries Government Documents Department

Meta-transport library user's guide

Description: Development of new transport protocols or protocol algorithms suffers from the complexity of the environment in which they are intended to run; modeling techniques attempt to relieve this by simulating the environment. Our approach to promoting rapid prototyping of protocols and protocol algorithms is to provide a pre-built infrastructure that is common to all transport protocols, so that the focus is placed on the protocol-specific aspects. The Meta-Transport Library is a library of base classes that implement or abstract out the mundane functions of a protocol; new protocol implementations are derived from the base classes. The result is a fully viable transport protocol implementation, with emphasis on modularity. The collection of base classes forms a "class-chest" of tools from which protocols can be developed and studied with as little change to a normal Unix environment as possible. In addition to supporting protocol designers, this approach has pedagogical uses.
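The Meta-Transport Library itself is a class library with its own API; the base-class idea it describes can be sketched in a toy Python analogue, where the infrastructure handles the common plumbing and a derived class supplies only the protocol-specific hooks (all names here are invented):

```python
class TransportProtocolBase:
    """Infrastructure common to all protocols; derived classes
    override only the protocol-specific hooks."""

    def __init__(self):
        self._send_queue = []

    def send(self, payload):              # mundane path, shared by all protocols
        segment = self.build_header(payload) + payload
        self._send_queue.append(segment)
        return segment

    # Protocol-specific hooks: the part a new protocol must supply.
    def build_header(self, payload):
        raise NotImplementedError

    def on_receive(self, segment):
        raise NotImplementedError


class StopAndWait(TransportProtocolBase):
    """Toy derived protocol: 1-bit alternating sequence number."""

    def __init__(self):
        super().__init__()
        self.seq = 0

    def build_header(self, payload):
        header = bytes([self.seq, len(payload)])
        self.seq ^= 1                     # alternate the sequence bit
        return header

    def on_receive(self, segment):
        seq, length = segment[0], segment[1]
        return seq, segment[2:2 + length]
```

The point, as in MTL, is that the derived class is small: queueing, buffering, and the send path live once in the base class.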
Date: July 1, 1996
Creator: Strayer, W.T.
Partner: UNT Libraries Government Documents Department

Off-training-set error for the Gibbs and the Bayes optimal generalizers

Description: In this paper we analyze the average off-training-set behavior of the Bayes-optimal and Gibbs learning algorithms. We do this by exploiting the concept of refinement, which concerns the relationship between probability distributions. For non-uniform sampling distributions, the expected off-training-set error for both learning algorithms can rise with training set size. However, we show in this paper that for uniform sampling and either algorithm, the expected error is a non-increasing function of training set size. For uniform sampling distributions, we also characterize the priors for which the expected error of the Bayes-optimal algorithm stays constant. In addition, we show that when the target function is fixed, expected off-training-set error can increase with training set size if and only if the expected error averaged over all targets decreases with training set size. Our results hold for arbitrary noise and arbitrary loss functions.
Date: January 3, 1995
Creator: Grossman, T.; Knill, E. & Wolpert, D.
Partner: UNT Libraries Government Documents Department

Speech coding

Description: Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. The original analog methods of telephony had the disadvantage that the speech signal was corrupted by noise, cross-talk, and distortion; long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk, and distortion, primarily because the digital signal can be faithfully regenerated at each repeater purely on the basis of a binary decision; the end-to-end performance of a digital link thus becomes essentially independent of the length and operating frequency bands of the link. Hence, from a transmission point of view, digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech also became extremely important from a service-provision point of view. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding and is applicable to these techniques is the ...
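As one concrete, standard instance of the waveform-coding approach mentioned above, µ-law companding (the compression law used in ITU-T G.711 telephony) maps samples nonlinearly so that quiet speech gets finer quantization:

```python
import numpy as np

MU = 255.0  # mu-law parameter used in G.711 (North America / Japan)

def mu_law_encode(x):
    """Compress samples in [-1, 1] with the mu-law characteristic."""
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mu_law_decode(y):
    """Inverse expansion back to linear samples."""
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU
```

In G.711 the companded value is then quantized to 8 bits per sample (64 kbit/s at 8 kHz); parametric coders, the other family the abstract names, instead transmit analysis parameters rather than the waveform itself.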
Date: May 8, 1998
Creator: Ravishankar, C., Hughes Network Systems, Germantown, MD
Partner: UNT Libraries Government Documents Department

A synchronous paradigm for modeling stable reactive systems

Description: This paper describes a modeling technique for single-agent reactive systems that is influenced by the modeling paradigm of Parnas as well as by the synchronous paradigms of LUSTRE and ESTEREL. In this paradigm, single-agent reactive systems are modeled in a universe having a discrete clock. This discretization of time greatly reduces the temporal complexity of the model. The author believes that the advantage of this reduction in temporal complexity is that the resulting model is in many ways better suited to automated software construction and analysis techniques (e.g., deductive synthesis, transformation, and verification) than models based on continuous representations of time.
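The discrete-clock idea can be sketched minimally: the system is a pure function from (state, inputs-at-tick) to (new state, outputs-at-tick), executed once per clock tick, in the spirit of LUSTRE/ESTEREL. The example system below (an event counter with reset and alarm) is invented for illustration:

```python
def step(state, inputs):
    """One reaction of a toy single-agent reactive system per clock tick.
    state: current event count; inputs: signals present this tick."""
    if inputs.get("reset"):
        return 0, {"alarm": False}
    count = state + (1 if inputs.get("event") else 0)
    return count, {"alarm": count >= 3}

def run(ticks):
    """Drive the system on a discrete clock; time advances only in ticks."""
    state, trace = 0, []
    for inputs in ticks:
        state, outputs = step(state, inputs)
        trace.append(outputs["alarm"])
    return trace
```

Because nothing happens between ticks, verification only needs to reason about the finite step function, which is the reduction in temporal complexity the abstract refers to.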
Date: December 1, 1998
Creator: Winter, V.L.
Partner: UNT Libraries Government Documents Department

Future algorithm research needs for partitioning in solid mechanics and coupled mechanical models

Description: Exceptional progress has been made in mathematical algorithm research, leading to optimized mesh partitions for the highly unstructured grids that occur in finite element applications in solid mechanics. Today another research challenge presents itself: research is needed to incorporate boundary conditions into the algorithms for partitioning meshes. We describe two methods we currently use to accomplish this and propose that a more general approach be developed that would apply both to our problems today and to the coupled models we envision for the future. Finally, we suggest research that would incorporate partitioning methods into parallel mesh generation.
Date: October 6, 1997
Creator: Hoover, C. G.; DeGroot, A. J. & Sherwood, R. J.
Partner: UNT Libraries Government Documents Department