3,172 Matching Results

Search Results

Dimension of chaotic attractors

Description: Dimension is perhaps the most basic property of an attractor. In this paper we discuss a variety of different definitions of dimension, compute their values for a typical example, and review previous work on the dimension of chaotic attractors. The relevant definitions of dimension are of two general types, those that depend only on metric properties, and those that depend on probabilistic properties (that is, they depend on the frequency with which a typical trajectory visits different regions of the attractor). Both our example and the previous work that we review support the conclusion that all of the probabilistic dimensions take on the same value, which we call the dimension of the natural measure, and all of the metric dimensions take on a common value, which we call the fractal dimension. Furthermore, the dimension of the natural measure is typically equal to the Lyapunov dimension, which is defined in terms of Lyapunov numbers, and thus is usually far easier to calculate than any other definition. Because it is computable and more physically relevant, we feel that the dimension of the natural measure is more important than the fractal dimension.
Date: September 1, 1982
Creator: Farmer, J.D.; Ott, E. & Yorke, J.A.
Partner: UNT Libraries Government Documents Department
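The Lyapunov (Kaplan-Yorke) dimension favored in this abstract can be computed directly from a spectrum of Lyapunov exponents (the logarithms of the Lyapunov numbers). A minimal sketch, assuming the exponent spectrum is already known; the example spectrum is hypothetical:

```python
def lyapunov_dimension(exponents):
    """Kaplan-Yorke (Lyapunov) dimension: find the largest j for which
    the partial sum of the exponents (sorted decreasing) stays
    non-negative, then interpolate a fraction into the next exponent."""
    exps = sorted(exponents, reverse=True)
    total = 0.0
    for j, lam in enumerate(exps):
        if total + lam < 0:
            # partial sum through exps[:j] is still >= 0
            return j + total / abs(lam)
        total += lam
    return float(len(exps))  # the sum never goes negative

# Example: a typical chaotic spectrum (one positive, one zero, one negative)
print(lyapunov_dimension([0.9, 0.0, -1.2]))  # 2 + 0.9/1.2 = 2.75
```

This illustrates why the abstract calls the Lyapunov dimension easy to calculate: it needs only the exponents, not a box-counting pass over the attractor.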

Asymptotic normality of X² in m×n tables with n large and small cell expectations

Description: Asymptotic normality for χ² used as a test for homogeneity is established under nonstandard conditions. The case of an m×n table with m fixed and the total number of observations proportional to n is studied for n large. Results are obtained under very mild assumptions on the marginal totals.
Date: January 1, 1977
Creator: Cuzick, J.
Partner: UNT Libraries Government Documents Department
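The quantity whose asymptotic normality is at issue can be sketched concretely. The table counts below are hypothetical, and the standardization (χ² − df)/√(2df) is the usual centering and scaling for a normal approximation:

```python
import math

def chi_square_homogeneity(table):
    """Pearson chi-square statistic for an m x n contingency table,
    with expected counts from the row and column totals."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total
            stat += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

table = [[10, 20, 30], [20, 20, 20]]  # hypothetical counts
stat, df = chi_square_homogeneity(table)
z = (stat - df) / math.sqrt(2 * df)  # standardized form studied in the paper
```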

Teaching old Fortran programmers new tricks

Description: For a number of valid reasons, Fortran remains in widespread use. It can be difficult to get long-time Fortran programmers to accept the use of new software tools that are increasingly required to lower software costs. In order to gain acceptance for a new software tool, it is necessary for it to be easy to learn and use, as well as to provide new benefits. In the process of introducing the use of the Ratfor preprocessor for Fortran, a number of useful guidelines were defined for gaining the acceptance of any new software tool in an existing environment.
Date: January 1, 1980
Creator: Wampler, B. E.
Partner: UNT Libraries Government Documents Department

Visual tools and languages: Directions for the '90s

Description: We identify and discuss three domains where we believe that innovative application of visual programming languages is likely to make a significant impact in the near term: concurrent computing, computer-based assistance for people with disabilities, and the multimedia/multimodal environments of tomorrow in which it will be possible to hear and physically interact with information as well as see it. 33 refs., 3 figs.
Date: January 1, 1991
Creator: Glinert, E.P. (Rensselaer Polytechnic Inst., Troy, NY (United States). Dept. of Computer Science); Blattner, M.M. (Lawrence Livermore National Lab., CA (United States)) & Frerking, C.J. (California Univ., Davis, CA (United States))
Partner: UNT Libraries Government Documents Department

A sendmail.cf scheme for a large network

Description: As at most large networked sites, our users depend heavily on the electronic mail system for both internal and off-site communications. Unfortunately, the sendmail.cf file, which controls the behavior of the sendmail program, is cryptic and difficult for the neophyte to decipher. So on one hand you have a highly visible, frequently used utility, and on the other a not-so-easily acquired system administration forte. Here we describe the sendmail topology of our site, the premises on which we based it, and the parts of the sendmail.cf files that support that topology.
Date: August 14, 1991
Creator: Darmohray, T.M.
Partner: UNT Libraries Government Documents Department
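For context, the kind of sendmail.cf settings such a topology turns on can look like the fragment below. The hostnames are hypothetical and this is illustrative only, not the paper's actual configuration:

```
# Illustrative sendmail.cf fragment; hostnames are hypothetical.
# DS: relay mail for non-local recipients to a central hub machine.
DSmailhub.example.gov
# DM: masquerade outgoing sender addresses as the site-wide domain.
DMexample.gov
```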

Application of automated deduction to the search for single axioms for exponent groups

Description: We present new results in axiomatic group theory obtained by using automated deduction programs. The results include single axioms, some with the identity and others without, for groups of exponents 3, 4, 5, and 7, and a general form for single axioms for groups of odd exponent. The results were obtained by using the programs in three separate ways: as a symbolic calculator, to search for proofs, and to search for counterexamples. We also touch on relations between logic programming and automated reasoning.
Date: February 11, 1992
Creator: McCune, W. & Wos, L.
Partner: UNT Libraries Government Documents Department

Floating point hardware emulator for RSX-11D

Description: An RSX-11D task was written to simulate the FP-11 floating point hardware on systems that lack this hardware. The simulation is transparent to tasks using floating point instructions. All normal features of the hardware are simulated exactly, including its action on exception conditions. The emulator is a privileged task occupying about 2.7K words of memory. When it is loaded and run, it sets up a linkage to intercept the reserved instruction trap before it reaches the executive, and route it to a service routine that can decode and simulate the floating point instruction set. The results of a benchmark timing test are given, as are notes on converting the emulator to run under RSX-11M. 1 figure, 2 tables.
Date: January 1, 1977
Creator: Kellogg, M. & Long, M.
Partner: UNT Libraries Government Documents Department

Current status of link access control and encryption system

Description: The purpose of this project is to develop necessary technologies for the secure protection of data communication networks. Data encryption equipment, using the federal government's Data Encryption Standard (DES) algorithm, was designed and developed. This equipment is the Link Access Control and Encryption (Link ACE) system. It protects unclassified sensitive data transmissions over unprotected lines between central computers and remote terminals. Link ACE units have been installed and are operational in the Department of Energy's Central Personnel Clearance Index (CPCI) system.
Date: January 1, 1984
Creator: Springer, E.
Partner: UNT Libraries Government Documents Department

Two characterizations of sufficient matrices

Description: Two characterizations are given for the class of sufficient matrices defined by Cottle, Pang, and Venkateswaran. The first is a direct translation of the definition into linear programming terms. The second can be thought of as a generalization of a theorem of T. D. Parsons on P-matrices. 19 refs.
Date: December 1, 1990
Creator: Cottle, R.W. & Guu, Sy-Ming.
Partner: UNT Libraries Government Documents Department
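Since the abstract relates sufficient matrices to a theorem on P-matrices, a brute-force P-matrix test (all principal minors positive) can be sketched. This is illustrative background only, not a test of sufficiency itself:

```python
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def is_p_matrix(m):
    """True if every principal minor of the square matrix m is positive."""
    n = len(m)
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            sub = [[m[i][j] for j in idx] for i in idx]
            if det(sub) <= 0:
                return False
    return True

print(is_p_matrix([[2, -1], [-1, 2]]))  # True: minors 2, 2, and det = 3
print(is_p_matrix([[0, 1], [1, 0]]))    # False: zero diagonal minors
```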

Sensitivity-analysis techniques: self-teaching curriculum

Description: This self-teaching curriculum on sensitivity-analysis techniques consists of three parts: (1) use of the Latin Hypercube Sampling Program (Iman, Davenport and Ziegler, Latin Hypercube Sampling (Program User's Guide), SAND79-1473, January 1980); (2) use of the Stepwise Regression Program (Iman, et al., Stepwise Regression with PRESS and Rank Regression (Program User's Guide), SAND79-1472, January 1980); and (3) application of the procedures to sensitivity and uncertainty analyses of the groundwater transport model MWFT/DVM (Campbell, Iman and Reeves, Risk Methodology for Geologic Disposal of Radioactive Waste - Transport Model Sensitivity Analysis, SAND80-0644, NUREG/CR-1377, June 1980; Campbell, Longsine, and Reeves, The Distributed Velocity Method of Solving the Convective-Dispersion Equation, SAND80-0717, NUREG/CR-1376, July 1980). This curriculum is one in a series developed by Sandia National Laboratories for transfer of the capability to use the technology developed under the NRC-funded High Level Waste Methodology Development Program.
Date: June 1, 1982
Creator: Iman, R.L. & Conover, W.J.
Partner: UNT Libraries Government Documents Department
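The Latin hypercube sampling used in part (1) of the curriculum can be sketched as follows. This is a generic LHS implementation, not the SAND79-1473 program: each variable's range is split into equal-probability strata, one point is drawn per stratum, and the strata are shuffled independently across variables:

```python
import random

def latin_hypercube(n_samples, n_vars, seed=0):
    """Latin hypercube sample on [0,1]^n_vars: each variable gets
    exactly one draw from each of n_samples equal strata, with the
    stratum order shuffled independently per variable."""
    rng = random.Random(seed)
    design = []
    for _ in range(n_vars):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        design.append([(s + rng.random()) / n_samples for s in strata])
    return list(zip(*design))  # one row per sample point

pts = latin_hypercube(5, 2)
```

Compared with plain random sampling, this guarantees full coverage of each input's range even with few samples, which is why it is a staple of sensitivity studies.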

Improved selection in totally monotone arrays

Description: This paper's main result is an O((√(m lg m))(n lg n) + m lg n)-time algorithm for computing the kth smallest entry in each row of an m × n totally monotone array. (A two-dimensional array A = {a(i,j)} is totally monotone if for all i₁ < i₂ and j₁ < j₂, a(i₁,j₁) < a(i₁,j₂) implies a(i₂,j₁) < a(i₂,j₂).) For large values of k (in particular, for k = ⌈n/2⌉), this algorithm is significantly faster than the O(k(m+n))-time algorithm for the same problem due to Kravets and Park. An immediate consequence of this result is an O(n^(3/2) lg² n)-time algorithm for computing the kth nearest neighbor of each vertex of a convex n-gon. In addition to the main result, we also give an O(n lg m)-time algorithm for computing an approximate median in each row of an m × n totally monotone array; this approximate median is an entry whose rank in its row lies between ⌈n/4⌉ and ⌈3n/4⌉ − 1. 20 refs., 3 figs.
Date: January 1, 1991
Creator: Mansour, Y. (Harvard Univ., Cambridge, MA (United States). Aiken Computation Lab.); Park, J.K. (Sandia National Labs., Albuquerque, NM (United States)); Schieber, B. (International Business Machines Corp., Yorktown Heights, NY (United States). Thomas J. Watson Research Center) & Sen, S. (AT and T Bell Labs., Murray Hill, NJ (United States))
Partner: UNT Libraries Government Documents Department
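The total-monotonicity condition and the problem being sped up can be made concrete. The check below applies the 2×2 implication directly (under one common sign convention), and the baseline solves the row-selection problem naively by sorting, which is what the paper's algorithm improves on:

```python
from itertools import combinations

def is_totally_monotone(a):
    """2x2 condition (one common convention): for all i1 < i2 and
    j1 < j2, a[i1][j1] < a[i1][j2] implies a[i2][j1] < a[i2][j2]."""
    for i1, i2 in combinations(range(len(a)), 2):
        for j1, j2 in combinations(range(len(a[0])), 2):
            if a[i1][j1] < a[i1][j2] and not a[i2][j1] < a[i2][j2]:
                return False
    return True

def row_kth_smallest(a, k):
    """Naive baseline: kth smallest (1-based) entry of each row,
    by sorting every row."""
    return [sorted(row)[k - 1] for row in a]

a = [[1, 2, 4], [3, 5, 9], [6, 8, 13]]  # hypothetical totally monotone array
print(is_totally_monotone(a), row_kth_smallest(a, 2))
```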

Overview of the ANS (American Nuclear Society) mathematics and computation software standards

Description: The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs.
Date: January 1, 1991
Creator: Smetana, A.O.
Partner: UNT Libraries Government Documents Department

A dynamic menuing and security system

Description: Commonly, a system creator will seek to limit access to various parts of an information system based on who the user is and what authorities should be granted to that person. With a payroll system, it would be expected that only a limited number of people would be able to change data, a larger segment of managers would be able to view information on their particular departments, and perhaps everyone would be able to see their own information. This sort of situation presents an interesting problem for the system designer, who would like to minimize the amount of coding necessary to accomplish this level of flexibility and to simplify maintenance of the application system. We were presented with such a problem in a project accounting system. The first two implementations of this application system were based on simply locking users out of screens for which they had no authority. A polite message was displayed indicating that the user did not have authority to access the screen in question. This method had the potential of becoming a security problem because the code was replicated in each of fifteen menus. Changes had to be propagated around several different menus for several different levels of authorization. This created difficulty for us in maintaining a consistent set of keystrokes throughout the system. Exceptions had to be handled via replicated code. These problems led us to re-evaluate the existing system and to create one based on the following set of user requirements: access to functions must be based on the user's level of authority; all menus should use the same basic keystrokes; menus to which a user does not have authority should not be displayed to that user; and finally, so-called "super users" must be able to access data from the point of view of various sub-organizations.
Date: January 1, 1992
Creator: Koon, D.M. & Zowin, J.A.
Partner: UNT Libraries Government Documents Department
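The requirement that unauthorized menu entries simply not be displayed suggests a data-driven design rather than per-screen lockout code. A minimal sketch with hypothetical menu items and authority levels:

```python
def visible_items(menu, user_level):
    """Data-driven menu filtering: items above the user's authority
    level are omitted entirely rather than shown and then refused."""
    return [name for name, required in menu if user_level >= required]

# Hypothetical payroll-style menu: (item, minimum authority level)
menu = [("View own record", 1), ("View department", 2), ("Change data", 3)]
print(visible_items(menu, 2))  # ['View own record', 'View department']
```

Because the authority levels live in data rather than in fifteen copies of menu code, a change to one item's level propagates everywhere automatically.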

Introduction to human factors

Description: Some background is given on the field of human factors. The nature of problems with current human/computer interfaces is discussed, some costs are identified, ideal attributes of graceful system interfaces are outlined, and some reasons are indicated why it's not easy to fix the problems. (LEW)
Date: March 1, 1988
Creator: Winters, J.M.
Partner: UNT Libraries Government Documents Department

Discrete Pearson distributions

Description: These distributions are generated by a first-order recursive scheme which equates the ratio of successive probabilities to the ratio of two corresponding quadratics. The use of a linearized form of this model will produce equations in the unknowns matched by an appropriate set of moments (assumed to exist). Given the moments, we may find valid solutions. There are two cases: (1) distributions defined on the non-negative integers (finite or infinite), and (2) distributions defined on negative integers as well. For (1), given the first four moments, it is possible to set this up as equations of finite or infinite degree in the probability of a zero occurrence, the sth component being, in general, a product of s ratios of linear forms in this probability. For (2) the equation for the zero probability is purely linear but may involve slowly converging series; here a particular case is the discrete normal. Regions of validity are being studied. 11 refs.
Date: November 1, 1991
Creator: Bowman, K.O. (Oak Ridge National Lab., TN (United States)); Shenton, L.R. (Georgia Univ., Athens, GA (United States)) & Kastenbaum, M.A. (Kastenbaum (M.A.), Basye, VA (United States))
Partner: UNT Libraries Government Documents Department
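The recursive scheme can be illustrated with its simplest special case. Here the ratio of successive probabilities is taken to be λ/(x+1), a degenerate quadratic ratio, which recovers the Poisson distribution; the truncation point is an assumption for the sketch:

```python
def pearson_pmf(ratio, n_max):
    """Build a pmf on {0, ..., n_max} from the first-order recursion
    p(x+1)/p(x) = ratio(x), then normalize.  Poisson(lam) is the
    special case ratio(x) = lam/(x+1)."""
    p = [1.0]
    for x in range(n_max):
        p.append(p[-1] * ratio(x))
    total = sum(p)
    return [v / total for v in p]

lam = 2.0
pmf = pearson_pmf(lambda x: lam / (x + 1), 60)  # truncation at 60 is ample here
```

Any other quadratic-over-quadratic ratio plugs into the same recursion, which is the family the abstract describes.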

NA-NET numerical analysis net

Description: This report describes a facility called NA-NET, created to allow numerical analysts (na) an easy method of communicating with one another. The main advantage of the NA-NET is uniformity of addressing. All mail is addressed to the Internet host na-net.ornl.gov at Oak Ridge National Laboratory. Hence, members of the NA-NET do not need to remember complicated addresses or even where a member is currently located. As long as moving members change their e-mail address in the NA-NET, everything works smoothly. The NA-NET system is currently located at Oak Ridge National Laboratory. It is running on the same machine that serves netlib, a separate facility that distributes mathematical software via electronic mail. For more information on netlib, send the one-line message "send index" to netlib@ornl.gov. The following report describes the current NA-NET system from both a user's perspective and an implementation perspective. Currently, there are over 2100 members in the NA-NET. An average of 110 mail messages pass through this facility daily.
Date: December 1, 1991
Creator: Dongarra, J. (Tennessee Univ., Knoxville, TN (United States). Dept. of Computer Science Oak Ridge National Lab., TN (United States)) & Rosener, B. (Tennessee Univ., Knoxville, TN (United States). Dept. of Computer Science)
Partner: UNT Libraries Government Documents Department

Using fullscreen CMS at CERN

Description: Fullscreen CMS is an optional console environment, introduced in Release 5 of CMS, which maintains the context of a VM session across invocations of full-screen commands like XEDIT, FILELIST or MAIL. In addition, it provides limited scrolling and windowing capabilities. This write-up gives CERNVM users who are interested in Fullscreen CMS an overview of the concepts and operations involved. Because it is an optional environment, this write-up does not constitute an endorsement of Fullscreen CMS.
Date: May 1, 1991
Creator: White, B.
Partner: UNT Libraries Government Documents Department

Limit theorem for the maximum term in an EARMA(1,1) sequence when the parameter rho is one

Description: The EARMA(1,1) process was defined by Jacobs and Lewis (Advances in Applied Probability, 1977). Chernick (Ph.D. dissertation, Stanford University, 1978) showed that the limit for the maximum term is the same as for a sequence of independent, identically distributed exponential random variables when the parameter rho is less than one. When rho equals one, a different limit theorem is obtained. The resulting limit distribution is not an extreme value type. It is, however, of the general form given by Galambos (The Asymptotic Theory of Extreme Order Statistics, Wiley, 1978). The sequence is exchangeable.
Date: January 1, 1979
Creator: Chernick, M R
Partner: UNT Libraries Government Documents Department
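The ρ = 1 degeneracy is easy to see in the autoregressive component of the process. The sketch below is one standard form of the EAR(1) recursion, not the full EARMA(1,1) construction of Jacobs and Lewis, which adds a moving-average mixing step; the seed and lengths are arbitrary:

```python
import random

def ear1(rho, n, seed=1):
    """EAR(1) sketch: X[t] = rho*X[t-1] + B[t]*E[t], where B[t] is
    Bernoulli(1 - rho) and E[t] is unit exponential, so the marginal
    distribution stays Exp(1)."""
    rng = random.Random(seed)
    x = rng.expovariate(1.0)
    out = [x]
    for _ in range(n - 1):
        innov = rng.expovariate(1.0) if rng.random() < 1 - rho else 0.0
        x = rho * x + innov
        out.append(x)
    return out

# With rho = 1 the innovation never fires: the sequence is constant,
# so the maximum cannot grow the way an i.i.d. exponential maximum does.
path = ear1(1.0, 100)
print(max(path) == path[0])  # True
```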

Rationale of the ratio image in multispectral remote sensing

Description: A simple mathematical interpretation of the properties of ratio images derived from LANDSAT and other sources of multispectral imagery is presented. A spectral signature is defined which is well represented by ratios of pairs of spectral bands and can be related to the problem of clustering and unsupervised learning. Some practical problems arising in the generation of LANDSAT ratio images are considered, and an effective, simple method for reduction of the dynamic range of such images is presented along with digital image processing examples. 6 figures.
Date: January 1, 1977
Creator: Wecksung, G.W. & Breedlove, J.R. Jr.
Partner: UNT Libraries Government Documents Department
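The ratio-plus-range-reduction pipeline can be sketched per pixel. The arctangent compression below is a generic choice that maps the unbounded ratio into a fixed interval; it is illustrative and not necessarily the paper's own reduction method, and the pixel values are hypothetical:

```python
import math

def ratio_image(band_a, band_b, eps=1e-6):
    """Per-pixel band ratio followed by dynamic-range reduction:
    atan maps the unbounded ratio a/b into [0, pi/2)."""
    return [[math.atan(a / (b + eps)) for a, b in zip(ra, rb)]
            for ra, rb in zip(band_a, band_b)]

band_a = [[10, 200], [30, 40]]   # hypothetical digital numbers, band i
band_b = [[20, 10], [30, 400]]   # hypothetical digital numbers, band j
out = ratio_image(band_a, band_b)
```

Equal pixel values in both bands map to atan(1) = π/4, so the midpoint of the output range marks a ratio of one regardless of overall illumination.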

Automated tools for the generation of performance-based training

Description: The field of educational technology is not a new one, but the emphasis in the past has been on the use of technologies for the delivery of instruction and tests. This paper explores the application of technology to the development of performance-based instruction and to the analyses leading up to the development of the instruction. Several technologies are discussed, with specific software packages described. The purpose of these technologies is to streamline the instructional analysis and design process, using the computer for its strengths to aid the human-in-the-loop. Currently, the process is all accomplished manually. Applying automated tools to the process frees the humans from some of the tedium involved so that they can be dedicated to the more complex aspects of the process. 12 refs.
Date: January 1, 1990
Creator: Trainor, M. S. & Fries, J.
Partner: UNT Libraries Government Documents Department