Search Results

Eliminating columns in the simplex method for linear programming

Description: In this paper we pose and answer two questions about solutions of the linear complementarity problem (LCP). The first question is concerned with the conditions on a square matrix M which guarantee that for every vector q, the solutions of LCP (q,M) are identical to the Karush-Kuhn-Tucker points of the natural quadratic program associated with (q,M). In answering this question we introduce the class of "row sufficient" matrices. The transpose of such a matrix is what we call "column sufficient." The latter matrices turn out to furnish the answer to our second question, which asks for the conditions on M under which the solution set of (q,M) is convex for every q. In addition to these two main results, we discuss the connections of these two new matrix classes with other well-known matrix classes in linear complementarity theory. 23 refs.
Date: November 1, 1987
Creator: Ye, Yinyu
Partner: UNT Libraries Government Documents Department
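
For reference, the abstract above works in terms of LCP(q,M) and its associated quadratic program. A standard textbook formulation (the paper's own notation may differ) is:

\[
\mathrm{LCP}(q,M):\quad \text{find } z \in \mathbb{R}^n \text{ such that } z \ge 0,\quad q + Mz \ge 0,\quad z^{\top}(q + Mz) = 0,
\]
\[
\text{associated quadratic program:}\quad \min_{z}\; z^{\top}(q + Mz) \quad \text{subject to}\quad q + Mz \ge 0,\; z \ge 0.
\]

Every solution of LCP(q,M) is a global minimizer of this program with objective value zero; the first question above asks when, conversely, every Karush-Kuhn-Tucker point of the program solves the LCP.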

Monitoring the usage of a computer system

Description: Controlling the usage of computer systems, particularly those operated for the federal government, is an important topic today. Audit requirements have grown to the point where they can be a significant burden to the proprietors of the system. The paper briefly mentions several proposals for responding to increased audit requirements and for monitoring a system to detect unauthorized activity. A technique is proposed for situations where the proscribed or the intended activity can be characterized in terms of program or system performance parameters. The design of a usage monitoring system is outlined. The design is based on enhancing the audit data provided by the monitored system, capturing the audit data in a separate system to protect it from user access, and implementing one of the audit trail analysis systems currently under development.
Date: November 1, 1987
Creator: Bailey, D.J.
Partner: UNT Libraries Government Documents Department

Electronic authoring tools

Description: More than a decade ago, word processing software revolutionized the way documents were prepared, and productivity was increased. But the editing and formatting capabilities of most word processors did very little to prevent errors in spelling, typing, grammar, diction, style, or organization from slipping through to the final documents. In the past few years, the number of software tools that aid the author has increased substantially. They now vary in scope from simple spelling checkers to sophisticated diction analyzers and idea processors. Moreover, the writing-aid software described in this report is now available for many types of computing systems, including personal computers, scientific workstations, and mainframes. The various pieces of software can be used in interactive or non-interactive (batch) modes.
Date: January 1, 1987
Creator: Hurford, J.M.
Partner: UNT Libraries Government Documents Department

LaTeX memos and letters

Description: Letters and memos at Los Alamos National Laboratory (LANL) are formatted in accordance with rules established in the Laboratory's Office Procedures Manual. LaTeX style files were written to let people produce letters and memos without worrying about a complicated set of rules. Macro and template files are distributed through the Laboratory's Change Control system for use on VAX/UNIX, VAX/VMS, Sun, Apollo, and IBM PC. A testbed of several hundred test files is used to minimize bugs in the distributed versions. There is a choice of Computer Modern fonts or PostScript fonts. Memos and letters can be printed in Roman or typewriter typefaces. When called for, classification labels will be printed on every page. Headers on pages following the first page are compiled from information found on the first page. Letters can handle multiple addresses. Default options are provided where applicable, and error messages warn users about missing information fields. 2 refs., 6 figs.
Date: January 1, 1989
Creator: Sydoriak, S.J.
Partner: UNT Libraries Government Documents Department

Los Alamos CCS (Center for Computer Security) formal computer security model

Description: This paper provides a brief presentation of the formal computer security model currently being developed at the Los Alamos Department of Energy (DOE) Center for Computer Security (CCS). The initial motivation for this effort was the need to provide a method by which DOE computer security policy implementation could be tested and verified. The actual analytical model was a result of the integration of current research in computer security and previous modeling and research experiences. The model is being developed to define a generic view of the computer and network security domains, to provide a theoretical basis for the design of a security model, and to address the limitations of present models. Formal mathematical models for computer security have been designed and developed in conjunction with attempts to build secure computer systems since the early 1970s. The foundation of the Los Alamos DOE CCS model is a series of functionally dependent probability equations, relations, and expressions. The mathematical basis appears to be justified and is undergoing continued discrimination and evolution. We expect to apply the model to the discipline of the Bell-LaPadula abstract sets of objects and subjects. 5 refs.
Date: January 1, 1989
Creator: Dreicer, J.S. & Hunteman, W.J. (Los Alamos National Lab., NM (USA))
Partner: UNT Libraries Government Documents Department
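
The abstract above refers to the Bell-LaPadula sets of objects and subjects. For orientation only (the CCS model's own probability equations are not given in the abstract), the two core Bell-LaPadula properties in their standard textbook form are:

\[
\text{simple security property (no read up):}\quad s \text{ may read } o \;\Longrightarrow\; L(s) \ge L(o),
\]
\[
\text{\(\ast\)-property (no write down):}\quad s \text{ may write } o \;\Longrightarrow\; L(o) \ge L(s),
\]

where \(L(\cdot)\) denotes the security level of a subject \(s\) or object \(o\).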

Software support: Pre-empting the quick question. [User's support group at Fermilab]

Description: High energy physicists, researchers, and graduate students from universities all around the world come to Fermi National Accelerator Laboratory to do their experiments. They use our computer facilities to perform all phases of their data analysis and presentation. We have a large turnover of users and a rather small support group, in a multi-vendor environment. We strive to make our users self-sufficient through the use of well-publicized maintenance procedures, documentation systems, and product support standards. By these pre-emptive measures we attempt to have quick answers at hand for the truly quick questions, leaving us time for the interesting problems.
Date: September 1, 1987
Creator: Loebel, L.
Partner: UNT Libraries Government Documents Department

A multiple node software development environment

Description: Experimenters on over 30 DECnet nodes at Fermilab use software developed, distributed, and maintained by the Data Acquisition Software Group. A general methodology and set of tools have been developed to distribute, use and manage the software on different sites. The methodology and tools are of interest to any group developing and using software on multiple nodes.
Date: June 1, 1987
Creator: Heinicke, P.; Nicinski, T.; Constanta-Fanourakis, P.; Petravick, D.; Pordes, R.; Ritchie, D. et al.
Partner: UNT Libraries Government Documents Department

Modeling and generating input processes

Description: This tutorial paper provides information relevant to the selection and generation of stochastic inputs to simulation studies. The primary area considered is multivariate input modeling, but much of the philosophy is relevant to univariate inputs as well. 14 refs.
Date: January 1, 1987
Creator: Johnson, M.E.
Partner: UNT Libraries Government Documents Department
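
To make the multivariate case concrete, the following Python sketch generates correlated normal inputs via Cholesky factorization of the covariance matrix. This is a standard technique offered only as an illustration; it is not necessarily one of the methods covered in the tutorial.

import numpy as np

def correlated_normals(mean, cov, n, rng=None):
    # Draw n samples from N(mean, cov) by Cholesky factorization:
    # if cov = L L^T and z ~ N(0, I), then mean + L z ~ N(mean, cov).
    rng = rng or np.random.default_rng()
    L = np.linalg.cholesky(cov)
    z = rng.standard_normal((n, len(mean)))  # independent N(0,1) draws
    return mean + z @ L.T

# Example: two positively correlated service times for a simulation
mean = np.array([5.0, 8.0])
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])
draws = correlated_normals(mean, cov, 1000)
print(np.corrcoef(draws.T))  # off-diagonal near 0.6/sqrt(2) = 0.42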

A compendium of computer codes used in particle accelerator design and analysis

Description: In preparing this compilation, we came across the names of more than 150 programs that have been used in the design and analysis of accelerators. Many are obsolete and some are not easily transported from the institution where they were created. All are included in this compilation and filed at Los Alamos, but the obsolete codes have been removed from this article. Other than the judgment as to whether or not a particular code is obsolete, we have not made any critical evaluations. Computer codes and code compilations share the common problem of obsolescence. This compilation will probably be almost useless in three years or less. Useful codes become widely distributed. Users make improvements in distributed code, and the original author loses control over the evolution of the code as variations proliferate. Every generation of accelerator physicists produces code-builders: persons who feel that they can design more comprehensive or easier-to-use codes to do the tasks done by previous codes. This document is organized so that each code is described on a one- or two-page data sheet. The data sheets are arranged alphabetically by code name but are not numbered. In this way, it will be easy to insert new codes as they are discovered. There are a number of simulation codes that have no names, and we have not taken the time to obtain detailed information on all of them. However, there are two fairly current codes that we thought worthwhile to include here. Therefore we have arbitrarily assigned them names. One of these is a CERN code by Myers, which we have called BEAMBEAM, and the other is a DESY code by Piwinski, which we have called BMBMI. The code data sheets are preceded by three indexes: subject, person-to-contact, and code acronym. It was not useful to list codes ...
Date: January 1, 1987
Partner: UNT Libraries Government Documents Department

How to write application code even a security auditor could love

Description: In the past the application programmer was frequently isolated from the computer security professional. The target machine might have various access controls and security plans, but when the programmer delivered a new application, it was rarely scrutinized from a security standpoint. Security reviews of application code are now being used to overcome this apparent oversight, but these reviews are often hampered by a lack of knowledge among programmers of techniques that make code secure and facilitate security analysis of the code. This paper informally describes fifteen general principles for producing good code that is easily reviewed. This paper is not a formal guideline, but is intended as an inside view of how one reviewer looks at code from a security standpoint.
Date: January 1, 1989
Creator: Barlich, G.L.
Partner: UNT Libraries Government Documents Department
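
The fifteen principles are not enumerated in the abstract, but the flavor of reviewable, security-conscious code can be suggested with one widely cited practice: validate input against an explicit allow-list and fail closed. The Python sketch below is a generic illustration, not an example drawn from the paper.

import re

# Accept only names matching an explicit allow-list pattern; reject
# everything else rather than attempting to sanitize it. A reviewer
# can verify the pattern at a glance.
VALID_USERNAME = re.compile(r"^[a-z][a-z0-9_]{0,31}$")

def profile_path(username: str) -> str:
    if not VALID_USERNAME.match(username):
        raise ValueError("invalid username")  # fail closed
    return f"/home/{username}/profile"

print(profile_path("alice"))  # accepted
try:
    profile_path("../etc/passwd")
except ValueError as e:
    print("rejected:", e)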

Detection of anomalous computer session activity

Description: This paper describes recent Los Alamos National Laboratory (LANL) applications of research into automated anomaly detection. In the context of computer security, anomaly detection seeks to identify events shown in audit records that are inconsistent with routine operation and therefore may be indicative of an intrusion into the computer, serious human errors, or malicious behavior by a legitimate user. Access by an intruder, execution of "Trojan horses" and "viruses," as well as malicious, destructive behavior are all assumed to produce anomalous events that are recorded in a computer audit trail. This trail, perhaps with augmented data collection capabilities, is processed in real time to detect such events, alert a knowledgeable computer security officer to the threat, and help resolve the situation. 3 refs., 6 figs.
Date: January 1, 1988
Creator: Vaccaro, H.S.
Partner: UNT Libraries Government Documents Department
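
The abstract does not specify LANL's detection algorithm, so the Python sketch below is purely illustrative: it flags session metrics that deviate strongly from a user's historical profile using a simple z-score test, which conveys the general shape of anomaly detection over audit data.

import math

def flag_anomalies(history, current, threshold=3.0):
    # Flag any metric whose current value lies more than `threshold`
    # sample standard deviations from its historical mean.
    flags = {}
    for metric, value in current.items():
        past = history.get(metric, [])
        if len(past) < 2:
            continue  # too little history to judge
        mean = sum(past) / len(past)
        var = sum((x - mean) ** 2 for x in past) / (len(past) - 1)
        std = math.sqrt(var)
        if std > 0 and abs(value - mean) / std > threshold:
            flags[metric] = (value, mean)
    return flags

# Example: a session with far more failed logins than usual
history = {"failed_logins": [0, 1, 0, 0, 1, 0], "cpu_minutes": [12, 9, 15, 11]}
print(flag_anomalies(history, {"failed_logins": 14, "cpu_minutes": 13}))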

Computer based terrain analysis for operational planning

Description: Analysis of operational capability is an ongoing task for military commanders. In peacetime, most analysis is conducted via computer-based combat simulations, where selected force structures engage in simulated combat to gain insight into specific scenarios. The command and control (C²) mechanisms that direct combat forces are often neglected relative to the fidelity of representation of mechanical and physical entities. C² capabilities should include the ability to plan a mission, monitor execution activities, and redirect combat power when appropriate. This paper discusses the development of a computer-based approach to mission planning for land warfare. The aspect emphasized is the computation and representation of relevant terrain features in the context of operational planning.
Date: January 1, 1987
Creator: Powell, D.R.
Partner: UNT Libraries Government Documents Department
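
One elementary terrain feature such a planner might compute is slope from a gridded elevation model. The Python sketch below uses central differences and is offered only as an illustration of the kind of computation involved; it is not taken from the paper.

import numpy as np

def slope_percent(elev, cell_size):
    # Approximate the slope (in percent) at each grid cell from the
    # elevation gradient, using numpy's central-difference gradient.
    dz_dy, dz_dx = np.gradient(elev, cell_size)
    return 100.0 * np.hypot(dz_dx, dz_dy)

# Example: a small elevation patch (meters) on a 30 m grid
elev = np.array([[100.0, 102.0, 105.0],
                 [101.0, 104.0, 108.0],
                 [103.0, 107.0, 112.0]])
print(slope_percent(elev, 30.0))  # steepest toward the lower-right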

VMS (Virtual Memory Systems) ALAP (Audit Log Analysis Package) 1.0: An automated audit trail analysis tool

Description: Because multiuser computer systems typically record enormous quantities of information about user and system activities into system log files, auditing computer user/system activities is a formidable task. Recognizing that a manual audit of these log files is difficult and usually ineffective, the DOE Center for Computer Security has developed an automated audit trail analysis tool, Audit Log Analysis Package (ALAP). ALAP employs methodology developed at Los Alamos National Laboratory for the detection and analysis of anomalous data in large databases. ALAP is capable of processing vast amounts of audit data for detection and analysis of anomalous computer user and system behavior. The first application tool is VMS ALAP 1.0, targeted for Digital Equipment Corporation (DEC) Virtual Memory Systems (VMS). 2 refs.
Date: January 1, 1989
Creator: Martinez, D.P.
Partner: UNT Libraries Government Documents Department

Where does CBT (computer-based training) fit in, now that we know so much? A front-end analysis study

Description: Computer-based training (CBT) has now been in existence for over two decades. It has been implemented in both the private sector and government organizations at an exponential rate. Nevertheless, many institutions, particularly educational institutions, have not yet introduced CBT. Our knowledge of what works and what does not, as well as hardware and software advances, has greatly increased in the past few years. This paper addresses many management considerations with respect to CBT. First, we consider the generic environment in which CBT might be used and then issues that affect costs and benefits, including lessons learned by the Cognitive Engineering Design and Research Team (CEDAR) of the Los Alamos National Laboratory in its assessments. The final section gives some "how-to" guidelines on increasing the probability of successfully introducing CBT into the training environment. The underlying theme of the paper is that management should be guided by what we now know about costs and benefits in its decisions regarding CBT and fight the lure of "high tech" glitter.
Date: January 1, 1987
Creator: Andrews, A.E. & Trainor, M.S.
Partner: UNT Libraries Government Documents Department

The -mdoc macro package: A software tool to support computer documentation standards

Description: At Los Alamos National Laboratory a small staff of writers and word processors in the Computer Documentation Group is responsible for producing computer documentation for the over 8000 users of the Laboratory's computer network. The -mdoc macro package was developed as a software tool to support that effort. The -mdoc macro package is used with the NROFF/TROFF document preparation system on the UNIX operating system. The -mdoc macro package incorporates the standards for computer documentation at Los Alamos that were established by the writers. Use of the -mdoc macro package has freed the staff from programming format details, allowing writers to concentrate on the content of documents and word processors to produce documents in a timely manner. It is an easy-to-use software tool that adapts to changing skills, needs, and technology. 5 refs.
Date: September 16, 1987
Creator: Sanders, C.E.
Partner: UNT Libraries Government Documents Department

HIERtalker: A default hierarchy of high order neural networks that learns to read English aloud

Description: A new learning algorithm based on a default hierarchy of high order neural networks has been developed that is able to generalize as well as handle exceptions. It learns the "building blocks" or clusters of symbols in a stream that appear repeatedly and convey certain messages. The default hierarchy prevents a combinatoric explosion of rules. A simulator of such a hierarchy, HIERtalker, has been applied to the conversion of English words to phonemes. Achieved accuracy is 99% for trained words and ranges from 76% to 96% for sets of new words. 8 refs., 4 figs., 1 tab.
Date: January 1, 1988
Creator: An, Z.G.; Mniszewski, S.M.; Lee, Y.C.; Papcun, G. & Doolen, G.D.
Partner: UNT Libraries Government Documents Department
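
The control structure of a default hierarchy, where specific contexts override a general default, can be mimicked with a toy table-driven Python lookup for the letter 'c'. HIERtalker itself realizes the hierarchy with high-order neural networks; this sketch, with hypothetical rules, illustrates only the specific-overrides-default idea.

# Toy letter-to-phoneme rules for 'c': the most specific matching
# context wins; a one-letter default catches everything else.
RULES = {
    "ce": "S",   # soft 'c' as in "cell"
    "ci": "S",   # soft 'c' as in "city"
    "ch": "CH",  # 'c' opening "chip"
    "c":  "K",   # default: hard 'c' as in "cat"
}

def phoneme_for_c(word, i):
    # Try the longer (more specific) context before the default.
    for length in (2, 1):
        context = word[i:i + length]
        if context in RULES:
            return RULES[context]
    return None

for w in ("cat", "cell", "chip"):
    print(w, "->", phoneme_for_c(w, w.index("c")))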

Dimension density: An intensive measure of chaos in spatially extended turbulent systems

Description: The determination of correlation dimensions by the Grassberger-Procaccia algorithm from an experimental time series has become a standard tool in the analysis of low dimensional chaotic systems. Here we want to carry over this method to spatially extended systems which have a decaying spatial correlation. In these cases the total number of degrees of freedom or overall "dimension" grows with the size of the system. Then in a finite-size system the dimension of the overall dynamics can already be recovered from a single-point measurement, if the resolution is greater than some size-dependent threshold. Therefore we expect that the measured dimension values will increase when smaller and smaller spatial structures are resolved. This feature is also observed in turbulence experiments (U. Frisch). Thus the objective is to get an intensive (i.e., size-independent) measure which locally characterizes turbulent systems. 5 refs., 2 figs.
Date: May 1, 1986
Creator: Kurz, T. & Mayer-Kress, G.
Partner: UNT Libraries Government Documents Department
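
For reference, the Grassberger-Procaccia quantities mentioned above, in their standard textbook form (the paper's size-dependent refinements are not reproduced here):

\[
C(r) \;=\; \lim_{N \to \infty} \frac{2}{N(N-1)} \sum_{i<j} \Theta\!\left(r - \lVert \mathbf{x}_i - \mathbf{x}_j \rVert\right),
\qquad
D_2 \;=\; \lim_{r \to 0} \frac{\log C(r)}{\log r},
\]

where the \(\mathbf{x}_i\) are (delay-embedded) points reconstructed from the time series and \(\Theta\) is the Heaviside step function.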

Design, implementation, and operation of a class based batch queue scheduler for VAX/VMS

Description: Fermilab found that the standard VMS batch configuration options were inadequate for the job mix that exists on the Fermilab central computer facility VAX cluster. Accordingly, Fermilab designed and implemented a class-based batch queue scheduler. This scheduler makes use of the standard VMS job controller and batch system. Users interact with the scheduler at job submission time by specification of CPU time limits and batch job characteristics. This scheduler allows Fermilab to make efficient use of its large heterogeneous VAX cluster, which contains machines ranging from a VAX 780 to a VAX 8800. The scheduler was implemented using the VMS system services $GETQUI and $SNDJBC, without changes to the existing VMS job scheduler. As a result, the scheduler should remain compatible with future VMS versions. This session will discuss the design goals, implementation, and operational experience with Fermilab's class-based batch queue scheduler.
Date: May 20, 1988
Creator: Chadwick, K.
Partner: UNT Libraries Government Documents Department
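
The class-based idea, assigning a job to a queue class from its user-supplied CPU time limit, can be sketched in Python as below. The class names and thresholds are hypothetical, and Fermilab's scheduler worked through the VMS $GETQUI and $SNDJBC system services rather than through code like this.

# Hypothetical CPU-time thresholds (minutes) mapping to queue classes.
CLASSES = [
    (5,    "SHORT"),   # up to 5 CPU minutes
    (60,   "MEDIUM"),  # up to 1 CPU hour
    (None, "LONG"),    # no upper limit
]

def queue_class(cpu_minutes_requested):
    # Return the first class whose limit accommodates the request.
    for limit, name in CLASSES:
        if limit is None or cpu_minutes_requested <= limit:
            return name

for request in (2, 45, 600):
    print(request, "CPU min ->", queue_class(request))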

A document preparation system in a large network environment

Description: At Los Alamos National Laboratory, we have developed an integrated document preparation system that produces publication-quality documents. This system combines text formatters and computer graphics capabilities that have been adapted to meet the needs of users in a large scientific research laboratory. This paper describes the integration of document processing technology to develop a system architecture, based on a page description language, to provide network-wide capabilities in a distributed computing environment. We describe the Laboratory requirements, the integration and implementation issues, and the challenges we faced developing this system.
Date: January 1, 1988
Creator: Vigil, M.; Bouchier, S.; Sanders, C.; Sydoriak, S. & Wheeler, K.
Partner: UNT Libraries Government Documents Department

PC proliferation: Minimizing corporate risk through planning for application maintenance

Description: The rapid proliferation of personal computers, offering tremendous productivity gains for the knowledge worker, often creates new application maintenance tasks. Specific concerns include security, data integrity, and access authorization. Distributed networks require security and communication systems. Distributed data entry requires file servers, network support personnel, and synchronization methods to preserve the integrity of corporate data. Much of the PC software that must be maintained will be developed outside standards-imposing environments and without benefit of formal training. A recommended method for limiting future maintenance problems is the formation of a staff possessing skills specific to solving problems in the areas mentioned and functioning as PC consultants in the knowledge worker's area.
Date: January 1, 1987
Creator: Shafer, L.I.
Partner: UNT Libraries Government Documents Department

How EXCELERATOR changed my job

Description: I started using a Computer-Aided Software Engineering (CASE) analysis/design tool two years ago in the Administrative Data Processing Division (ADP) and have observed that the way I approach my job has changed. The end product is the same, but the product is derived differently. The ability to change what is done gives freedom to experiment and iterate. This freedom has changed some steps and resequenced others in arriving at a structured specification for a new system. Additionally, the analysis capabilities of the tool allow the performance of work that was often disregarded because it was either too tedious or overwhelming. This presentation will address the changes observed in the process of analysis and design using CASE. 2 figs.
Date: January 1, 1988
Creator: Weir, D.
Partner: UNT Libraries Government Documents Department

Grid generation software engineering at Los Alamos

Description: We have collected and re-engineered a small library of computer codes for general-purpose grid generation in one-, two-, and three-dimensional domains. The design intent was to produce easy-to-use, general-purpose codes that are portable to as many different hardware and software environments as practical, that are consistent in programming style and user interface, and that cover a gamut of applications. The paper describes some of the features of the codes, emphasizing the perspective of the potential user or programmer, rather than that of the researcher interested in mathematical techniques. 7 refs., 3 figs.
Date: January 1, 1988
Creator: Clark, G.L. & Ankeny, L.A.
Partner: UNT Libraries Government Documents Department
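
A standard algebraic technique for the kind of general-purpose structured grid generation described above is transfinite interpolation, which blends four boundary curves into an interior grid. The Python sketch below illustrates the technique; it is not necessarily the method implemented in the Los Alamos library.

import numpy as np

def transfinite_grid(bottom, top, left, right):
    # Transfinite interpolation: blend the four boundary curves and
    # subtract the doubly counted corner contributions. bottom/top are
    # (n, 2) arrays, left/right are (m, 2) arrays; corners must agree.
    n, m = len(bottom), len(left)
    u = np.linspace(0.0, 1.0, n)[:, None, None]
    v = np.linspace(0.0, 1.0, m)[None, :, None]
    return ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])

# Example: unit square with a bulged top edge, 5 x 5 grid
s = np.linspace(0.0, 1.0, 5)
bottom = np.stack([s, np.zeros(5)], axis=1)
top = np.stack([s, 1 + 0.2 * np.sin(np.pi * s)], axis=1)
left = np.stack([np.zeros(5), s], axis=1)
right = np.stack([np.ones(5), s], axis=1)
print(transfinite_grid(bottom, top, left, right)[2, 2])  # interior point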

Mass storage system reference model system management

Description: System Management is the collection of functions that are primarily concerned with the control, performance, and utilization of the Mass Storage System defined by the Mass Storage System Reference Model. These functions are often very site-dependent, involve human decision making, and span multiple "servers" of the Mass Storage System. The functions may be implemented as standalone programs, may be integrated with the other Mass Storage System software, or may just be policy. 4 refs.
Date: January 1, 1988
Creator: Collins, B. & McLarty, T.
Partner: UNT Libraries Government Documents Department