Search Results

An Adaptive System for Process Control

Description: Researchers at the U.S. Bureau of Mines (USBM) have developed adaptive process control systems in which genetic algorithms (GAs) are used to augment fuzzy logic controllers (FLCs). GAs are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by loosely modeling the search procedures of natural genetics. FLCs are rule-based systems that efficiently manipulate a problem environment by modeling the "rule-of-thumb" strategy used in human decision-making. Together, GAs and FLCs include all of the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, an analysis element to recognize changes in the problem environment, and an adaptive element to adjust to those changes. The control system also employs a computer simulation of the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented; all results are from the physical laboratory system and not from a computer simulation.
Date: 1995
Creator: Karr, Charles L.; Gentry, E. J. & Stanley, D. A.
Partner: UNT Libraries Government Documents Department
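The GA-plus-FLC scheme summarized above can be sketched minimally in code: a genetic algorithm searching for a controller gain that minimizes accumulated error on a toy first-order plant. The plant model and every parameter below are illustrative assumptions, not the USBM system:

```python
import random

# Toy stand-in for the paper's plant: a first-order error response to a
# proportional control action (NOT the USBM acid-base pH model).
def plant_error(gain, steps=50):
    err, total = 1.0, 0.0
    for _ in range(steps):
        err -= 0.5 * gain * err   # plant responds to the control action
        total += abs(err)         # accumulate |error|; lower is better
    return total

def ga_tune(pop_size=20, gens=30, seed=1):
    """Genetic algorithm locating a near-optimum controller gain."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 4.0) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=plant_error)            # rank by fitness
        survivors = pop[:pop_size // 2]      # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.1)  # crossover + mutation
            children.append(min(4.0, max(0.0, child)))
        pop = survivors + children
    return min(pop, key=plant_error)
```

Here the GA converges toward the gain that drives the simulated error to zero, standing in for the adaptive element the abstract describes.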

Computational Intelligence Based Data Fusion Algorithm for Dynamic sEMG and Skeletal Muscle Force Modelling

Description: In this work, an array of three surface electromyography (sEMG) sensors is used to acquire muscle extension and contraction signals from 18 healthy test subjects. The skeletal muscle force is estimated from the acquired sEMG signals using a nonlinear Wiener-Hammerstein model that relates the two signals in a dynamic fashion. The model is obtained using a system identification (SI) algorithm. The force models obtained for each sensor are fused using a proposed fuzzy logic concept with the intent to improve force estimation accuracy and resilience to sensor failure or misalignment. For the fuzzy logic inference system, the sEMG entropy, the relative error, and the correlation of the force signals are considered in defining the membership functions. The proposed fusion algorithm yields an average of 92.49% correlation between the actual force and the overall estimated force output. In addition, the proposed fusion-based approach is implemented on a test platform. Experiments indicate an improvement in finger/hand force estimation.
Date: August 1, 2013
Creator: Potluri, Chandrasekhar; Anugolu, Madhavi; Schoen, Marco P. & Naidu, D. Subbaram
Partner: UNT Libraries Government Documents Department
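The fusion step described above can be illustrated with a minimal sketch: per-sensor force estimates combined by normalized weights, where the weights stand in for the fuzzy memberships derived from entropy, relative error, and correlation (all names and values here are hypothetical):

```python
# Hypothetical fusion of per-sensor force estimates via normalized weights.
# In the paper, the weights would come from the fuzzy inference over
# sEMG entropy, relative error, and force-signal correlation.
def fuse(estimates, weights):
    """Weighted average of force estimates; weights need not sum to 1."""
    total = sum(weights)
    return sum(e * w for e, w in zip(estimates, weights)) / total
```

A failed or misaligned sensor receives a near-zero weight and simply drops out of the fused estimate.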

Fuzzy Logic Connectivity in Semiconductor Defect Clustering

Description: In joining defects on semiconductor wafer maps into clusters, it is common for defects caused by different sources to overlap. Simple morphological image processing tends to either join too many unrelated defects together or not enough together. Expert semiconductor fabrication engineers have demonstrated that they can easily group clusters of defects from a common manufacturing problem source into a single signature. Capturing this thought process is ideally suited for fuzzy logic. A system of rules was developed to join disconnected clusters based on properties such as elongation, orientation, and distance. The clusters are evaluated on a pair-wise basis using the fuzzy rules and are joined or not joined based on a defuzzification and threshold. The system continuously re-evaluates the clusters under consideration as their fuzzy memberships change with each joining action. The fuzzy membership functions for each pair-wise feature, the techniques used to measure the features, and methods for improving the speed of the system are all developed. Examples of the process are shown using real-world semiconductor wafer maps obtained from chip manufacturers. The algorithm is utilized in the Spatial Signature Analyzer (SSA) software, a joint development project between Oak Ridge National Lab (ORNL) and SEMATECH.
Date: January 24, 1999
Creator: Gleason, S.S.; Karnowski, T.P. & Tobin, K.W.
Partner: UNT Libraries Government Documents Department
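A minimal sketch of the pairwise fuzzy joining decision, assuming triangular memberships over distance and orientation difference (the SSA rule base also uses elongation and other features; all thresholds here are invented):

```python
# Pairwise fuzzy rule for joining two defect clusters:
# join IF clusters are near AND their orientations agree.
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def join_score(dist, angle_diff_deg):
    near = tri(dist, -1.0, 0.0, 50.0)                # "clusters are close"
    aligned = tri(angle_diff_deg, -1.0, 0.0, 30.0)   # "orientations agree"
    return min(near, aligned)                        # fuzzy AND

def should_join(dist, angle_diff_deg, threshold=0.3):
    """Defuzzified join/no-join decision against a crisp threshold."""
    return join_score(dist, angle_diff_deg) >= threshold
```

As in the paper, after each join the features (and hence memberships) of the merged cluster would be re-evaluated against the remaining candidates.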

A comparison of approximate reasoning results using information uncertainty

Description: An Approximate Reasoning (AR) model is a useful alternative to a probabilistic model when there is a need to draw conclusions from information that is qualitative. For certain systems, much of the information available is elicited from subject matter experts (SME). One such example is the risk of attack on a particular facility by a pernicious adversary. In this example there are several avenues of attack, i.e., scenarios, and AR can be used to model the risk of attack associated with each scenario. The qualitative information available and provided by the SME is comprised of linguistic values, which are well suited for an AR model but meager for other modeling approaches. AR models can produce many competing results. Associated with each competing AR result is a vector of linguistic values and a respective degree of membership in each value. A suitable means to compare and segregate AR results would be an invaluable tool to analysts and decision makers. A viable method would be to quantify the information uncertainty present in each AR result and then use the measured quantity comparatively. One issue of concern for measuring the information uncertainty involved with fuzzy uncertainty is that previously proposed approaches focus on the information uncertainty involved within the entire fuzzy set. This paper proposes extending measures of information uncertainty to AR results, which involve only one degree of membership for each fuzzy set included in the AR result. An approach to quantify the information uncertainty in the AR result is presented.
Date: January 1, 2009
Creator: Chavez, Gregory; Key, Brian; Zerkle, David & Shevitz, Daniel
Partner: UNT Libraries Government Documents Department
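One simple way to score the uncertainty of an AR result's vector of membership degrees is a normalized Shannon-style entropy. The paper develops its own extension of information-uncertainty measures, so this is only an illustration of the general idea, not the authors' measure:

```python
import math

# Treat the degrees of membership across the output linguistic values as
# weights and compute a normalized entropy: 0 when one value dominates,
# 1 when all values are equally supported.
def ar_uncertainty(degrees):
    total = sum(degrees)
    if total == 0:
        return 0.0
    p = [d / total for d in degrees]
    h = -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return h / math.log2(len(degrees))
```

Two AR results over the same linguistic values can then be compared by this scalar, the lower-uncertainty result being the more informative.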

Towards Resilient Critical Infrastructures: Application of Type-2 Fuzzy Logic in Embedded Network Security Cyber Sensor

Description: Resiliency and cyber security of modern critical infrastructures are becoming increasingly important with the growing number of threats in the cyber-environment. This paper proposes an extension to a previously developed fuzzy logic based anomaly detection network security cyber sensor via incorporating Type-2 Fuzzy Logic (T2 FL). In general, fuzzy logic provides a framework for system modeling in linguistic form capable of coping with imprecise and vague meanings of words. T2 FL is an extension of Type-1 FL which has proved successful in modeling and minimizing the effects of various kinds of dynamic uncertainties. In this paper, T2 FL provides a basis for robust anomaly detection and cyber security state awareness. In addition, the proposed algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental cyber-security test-bed.
Date: August 1, 2011
Creator: Linda, Ondrej; Vollmer, Todd; Alves-Foss, Jim & Manic, Milos
Partner: UNT Libraries Government Documents Department
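The interval type-2 construction the paper relies on can be sketched with a Gaussian membership whose standard deviation is uncertain, yielding lower and upper membership bounds (the footprint of uncertainty). The parameter values are illustrative, not the sensor's:

```python
import math

# Interval Type-2 Gaussian membership with an uncertain standard deviation:
# sigma is only known to lie in [sigma_lo, sigma_hi], so every input x maps
# to an interval of membership degrees rather than a single number.
def it2_gaussian(x, mean, sigma_lo, sigma_hi):
    g = lambda s: math.exp(-0.5 * ((x - mean) / s) ** 2)
    return g(sigma_lo), g(sigma_hi)   # (lower membership, upper membership)
```

The width of the returned interval is exactly the dynamic uncertainty T2 FL is designed to carry through the inference.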

Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

Description: Resiliency and security in critical infrastructure control systems in the modern world of cyber terrorism constitute a relevant concern. Developing a network security system specifically tailored to the requirements of such critical assets is of primary importance. This paper proposes a novel learning algorithm for an anomaly-based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.
Date: April 1, 2011
Creator: Linda, Ondrej; Vollmer, Todd; Wright, Jason & Manic, Milos
Partner: UNT Libraries Government Documents Department
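The rule-extraction step can be sketched as a single-pass nearest-prototype clustering of packet feature vectors, where each resulting prototype would seed one fuzzy rule. The distance threshold and feature layout are assumptions, not the paper's algorithm:

```python
# Minimal online (single-pass) clustering suited to a constrained embedded
# sensor: each incoming feature vector either updates its nearest prototype
# or, if too far from all of them, starts a new cluster (a new fuzzy rule).
def online_cluster(points, radius=2.0):
    protos, counts = [], []
    for p in points:
        best, best_d = None, float("inf")
        for i, c in enumerate(protos):
            d = sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5
            if d < best_d:
                best, best_d = i, d
        if best is None or best_d > radius:
            protos.append(list(p))      # new cluster / new rule seed
            counts.append(1)
        else:
            counts[best] += 1           # fold the point into the prototype
            protos[best] = [c + (a - c) / counts[best]
                            for a, c in zip(p, protos[best])]
    return protos, counts
```

The incremental-mean update keeps memory constant per cluster, which matches the low-cost embedded setting the abstract emphasizes.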

Improving Cyber-Security of Smart Grid Systems via Anomaly Detection and Linguistic Domain Knowledge

Description: The planned large-scale deployment of smart grid network devices will generate a large amount of information exchanged over various types of communication networks. The implementation of these critical systems will require appropriate cyber-security measures. A network anomaly detection solution is considered in this work. In common network architectures multiple communication streams are simultaneously present, making it difficult to build an anomaly detection solution for the entire system. In addition, common anomaly detection algorithms require specification of a sensitivity threshold, which inevitably leads to a tradeoff between false positive and false negative rates. In order to alleviate these issues, this paper proposes a novel anomaly detection architecture. The designed system applies the previously developed network security cyber-sensor method to individual selected communication streams, allowing accurate normal network behavior models to be learned. Furthermore, the developed system dynamically adjusts the sensitivity threshold of each anomaly detection algorithm based on domain knowledge about the specific network system. It is proposed to model this domain knowledge using Interval Type-2 Fuzzy Logic rules, which linguistically describe the relationship between various features of the network communication and the possibility of a cyber attack. The proposed method was tested on an experimental smart grid system, demonstrating enhanced cyber-security.
Date: August 1, 2012
Creator: Linda, Ondrej; Vollmer, Todd & Manic, Milos
Partner: UNT Libraries Government Documents Department
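A minimal sketch of the dynamic threshold adjustment: given an attack-possibility degree produced by the (not shown) fuzzy rule base, scale the detector's sensitivity threshold. The linear scaling is an assumption for illustration, not the paper's rule base:

```python
# Higher linguistic attack possibility -> more sensitive (lower) threshold.
# The 0.5 scaling factor is an arbitrary illustrative choice.
def adjust_threshold(base, attack_possibility):
    """attack_possibility in [0, 1], e.g. a defuzzified IT2-FL output."""
    assert 0.0 <= attack_possibility <= 1.0
    return base * (1.0 - 0.5 * attack_possibility)
```

This is the mechanism that lets domain knowledge trade false positives against false negatives per communication stream.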

The use of fuzzy control system methods for characterizing expert judgment uncertainty distributions

Description: Fuzzy logic methods permit experts to assess parameters affecting the performance of components and systems in natural language terms more familiar to them (e.g., high, good, etc.). Recognizing that there is a cost associated with obtaining more precise information, the authors' particular interest is in cases where the relationship between the condition of the system and its performance is not well understood, especially for some sets of possible operating conditions, and where developing a better understanding is very difficult and/or expensive. The methods allow the experts to make use of the level of precision with which they understand the underlying process. The authors consider and compare various methods of formulating the process just described, with an application in reliability analysis, where expert information forms a significant (if not sole) source of data. The flow of information through the fuzzy-control-systems based analysis is studied using a simple, hypothetical problem which mimics the structure used to elicit expert information in Parse. They also characterize the effect of using progressively more refined information and examine the use of fuzzy-based methods as data pooling/fusion mechanisms.
Date: December 1998
Creator: Smith, R. E.; Booker, J. M.; Bement, T. R.; Parkinson, W. J.; Meyer, M. A. & Jamshidi, M.
Partner: UNT Libraries Government Documents Department
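The elicitation idea can be sketched by mapping linguistic terms to fuzzy sets on a normalized performance scale and pooling several experts by averaging memberships. The term definitions below are invented for illustration:

```python
# Illustrative fuzzy sets for expert terms on a normalized [0, 1] scale,
# plus naive averaging as a data pooling/fusion mechanism.
def tri(x, a, b, c):
    """Triangular membership with shoulder support (a == b or b == c)."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

TERMS = {"low": (0.0, 0.0, 0.5),
         "medium": (0.25, 0.5, 0.75),
         "high": (0.5, 1.0, 1.0)}

def membership(term, x):
    a, b, c = TERMS[term]
    return tri(x, a, b, c)

def pooled(terms, x):
    """Average the memberships of several experts' chosen terms at x."""
    return sum(membership(t, x) for t in terms) / len(terms)
```

Refining the elicitation (narrower fuzzy sets) directly reflects the "level of precision" point the abstract makes.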

Towards a formal taxonomy of hybrid uncertainty representations

Description: Recent years have seen a proliferation of methods in addition to probability theory to represent information and uncertainty, including fuzzy sets and systems, fuzzy measures, rough sets, random sets, possibility distributions, imprecise probabilities, etc. We can identify these fields collectively as General Information Theory. The components of GIT represent information according to different axiomatic bases, and are thus capable of capturing different semantic aspects of uncertainty. Traditionally, these semantic criteria include such categories as fuzziness, vagueness, nonspecificity, conflict, and randomness. So it is clear that there is a pressing need for the GIT community to synthesize these methods, searching out larger formal frameworks within which to place these various components with respect to each other. Ideally, syntactic (mathematical) generalization can both aid and be aided by the semantic analysis available in terms of the conceptual categories outlined above. In this paper we present some preliminary ideas about how to formally relate various uncertainty representations together in a taxonomic lattice, capturing both syntactic and semantic generalization. Some partial and provisional results are shown. Assume a simple finite universe of discourse Ω = {a, b, c}. We want to describe a situation in which we ask a question of the sort "what is the value of a variable x which takes values in Ω?". When there is no uncertainty, we have a single alternative, say x = a. In logical terms, we would say that the proposition p: "the value of x is a" is TRUE. Our approach begins with two primitive concepts which can change our knowledge of x, each of which represents a different form of uncertainty: nonspecificity and fuzziness.
Date: February 1, 1997
Creator: Joslyn, C. & Rocha, L.
Partner: UNT Libraries Government Documents Department
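Two of the semantic categories above have standard numeric measures that are easy to sketch: Hartley nonspecificity for a set of remaining alternatives, and a simple fuzziness index for a membership vector:

```python
import math

# Hartley measure of nonspecificity over a finite set of alternatives:
# N(A) = log2 |A|; zero when a single alternative remains (x = a).
def hartley(alternatives):
    return math.log2(len(alternatives))

# A common fuzziness index: distance of each membership degree from crispness.
# Crisp sets (all degrees 0 or 1) score zero; degree 0.5 is maximally fuzzy.
def fuzziness(memberships):
    return sum(min(m, 1.0 - m) for m in memberships)
```

With the universe Ω = {a, b, c} from the abstract, full ignorance scores log2 3 of nonspecificity, while the TRUE proposition "x is a" scores zero on both measures.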

System reliability assessment with an approximate reasoning model

Description: The projected service life of weapons in the US nuclear stockpile will exceed the original design life of their critical components. Interim metrics are needed to describe weapon states for use in simulation models of the nuclear weapons complex. The authors present an approach to this problem based upon the theory of approximate reasoning (AR) that allows meaningful assessments to be made in an environment where reliability models are incomplete. AR models are designed to emulate the inference process used by subject matter experts. The emulation is based upon a formal logic structure that relates evidence about components. This evidence is translated using natural language expressions into linguistic variables that describe membership in fuzzy sets. The authors introduce a metric that measures the acceptability of a weapon to nuclear deterrence planners. Implication rule bases are used to draw a series of forward chaining inferences about the acceptability of components, subsystems and individual weapons. They describe each component in the AR model in some detail and illustrate its behavior with a small example. The integration of the acceptability metric into a prototype model to simulate the weapons complex is also described.
Date: December 1998
Creator: Eisenhawer, S. W.; Bott, T. F.; Helm, T. M. & Boerigter, S. T.
Partner: UNT Libraries Government Documents Department
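The forward-chaining inference the abstract describes can be sketched Mamdani-style, with min for rule conjunction and max for rule aggregation. The rule and fact names below are hypothetical, and the paper's acceptability metric is far richer:

```python
# Tiny approximate-reasoning step: each rule fires with the strength of its
# weakest antecedent (min = fuzzy AND); competing rules for the same
# consequent are aggregated with max.
def infer(rules, facts):
    """rules: list of (antecedent names, consequent name);
    facts: name -> degree of membership."""
    out = {}
    for antecedents, consequent in rules:
        strength = min(facts.get(a, 0.0) for a in antecedents)
        out[consequent] = max(out.get(consequent, 0.0), strength)
    return out
```

Chaining is just feeding one call's output dict back in as facts for the next layer of rules (component -> subsystem -> weapon, in the paper's structure).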

Automatic tool path generation for finish machining

Description: A system for automatic tool path generation was developed at Sandia National Laboratories for finish machining operations. The system consists of a commercially available 5-axis milling machine controlled by Sandia developed software. This system was used to remove overspray on cast turbine blades. A laser-based, structured-light sensor, mounted on a tool holder, is used to collect 3D data points around the surface of the turbine blade. Using the digitized model of the blade, a tool path is generated which will drive a 0.375 inch diameter CBN grinding pin around the tip of the blade. A fuzzified digital filter was developed to properly eliminate false sensor readings caused by burrs, holes and overspray. The digital filter was found to successfully generate the correct tool path for a blade with intentionally scanned holes and defects. The fuzzified filter improved the computation efficiency by a factor of 25. For application to general parts, an adaptive scanning algorithm was developed and presented with simulation results. A right pyramid and an ellipsoid were scanned successfully with the adaptive algorithm.
Date: March 1, 1997
Creator: Kwok, Kwan S.; Loucks, C.S. & Driessen, B.J.
Partner: UNT Libraries Government Documents Department
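The fuzzified filtering idea can be sketched on a one-dimensional scan line: points far from a local median get low membership in "valid surface" and are dropped. The window size and cutoff are assumptions, not Sandia's filter:

```python
import statistics

# Fuzzified outlier filter for scan data: membership in "valid surface"
# falls off linearly with deviation from the local median, so burrs and
# holes (large deviations) are rejected rather than smoothed into the path.
def filter_scan(points, window=5, max_dev=1.0):
    kept = []
    for i, p in enumerate(points):
        lo = max(0, i - window // 2)
        local = points[lo:lo + window]
        dev = abs(p - statistics.median(local))
        membership = max(0.0, 1.0 - dev / max_dev)  # fuzzy "valid" degree
        if membership > 0.5:
            kept.append(p)
    return kept
```

A hard threshold on the fuzzy degree plays the role of defuzzification here; the real tool path generator would interpolate over the rejected samples.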

Intelligent controllers for battlefield simulations

Description: This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project involved research in the area of warfighting simulation technology and methods. In the first year our emphasis was to produce a prototype repository for simulation software objects and simulation execution executives in the context of an integrated theater missile defense problem. This provided a necessary precursor to the detailed development. In the second year more detailed and complete development of composable simulation software was undertaken. An object description language was developed, an object composition architecture was developed and implemented, and a prototyping test bed was produced to assess the technical concepts and serve as a demonstration tool for programmatic activities. In the third and final year it became apparent that the representation of intelligent entities, specifically those that fulfill command and control functions in warfighting systems, was a challenging and serious technical shortcoming of all existing approaches to simulation of warfare. 10 refs.
Date: August 1, 1997
Creator: Barrett, C.L.; Stroud, P. & Reidys, C.
Partner: UNT Libraries Government Documents Department

Cooperative target convergence using multiple agents

Description: This work considers the problem of causing multiple (hundreds of) autonomous mobile robots to converge to a target and provides a follow-the-leader approach to the problem. Each robot has only a limited-range sensor for sensing the target and a larger, but still limited-range, robot-to-robot communication capability. Because of the small amount of information available to the robots, a practical approach to improve convergence to the target is to have a robot follow the robot with the best quality of information. Specifically, each robot emits a signal that informs in-range robots of its status. A robot has a status value of 0 if it is itself in range of the target. A robot has a status of 1 if it is not in range of the target but is in communication range of a robot that is in range of the target. A robot has a status of 2 if it is not in range of the target but is within range of another robot that has status 1, and so on. Of all the mobile robots that any given robot is in range of, it follows the one with the best status. The emergent behavior is ant-like trails of robots following each other toward the target. If a robot is not in range of another robot that is either in range of the target or following another robot, the robot will assign -1 to its quality of information and will execute an exhaustive search. The exhaustive search will continue until it encounters either the target or another robot with a nonnegative quality of information. The quality-of-information approach was extended to the case where each robot only has two-bit signals informing it of distance to in-range robots.
Date: October 1, 1997
Creator: Kwok, K.S. & Driessen, B.J.
Partner: UNT Libraries Government Documents Department
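The status-assignment scheme above is effectively a breadth-first propagation over the robot-to-robot communication graph, which can be sketched directly (a centralized view of what the robots compute locally):

```python
from collections import deque

# Status from the follow-the-leader scheme: 0 = senses the target,
# k = k communication hops from a robot that does, -1 = disconnected
# (the robot would fall back to exhaustive search).
def assign_status(n, in_target_range, comm_edges):
    adj = [[] for _ in range(n)]
    for a, b in comm_edges:
        adj[a].append(b)
        adj[b].append(a)
    status = [-1] * n
    q = deque(i for i in range(n) if in_target_range[i])
    for i in q:
        status[i] = 0
    while q:
        i = q.popleft()
        for j in adj[i]:
            if status[j] == -1:
                status[j] = status[i] + 1   # follow the best-status neighbor
                q.append(j)
    return status
```

Each robot following its lowest-status neighbor then yields exactly the ant-like trails the abstract describes.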


Description: Incomplete or sparse data such as geologic or formation characteristics introduce a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results when working with sparse data. State-of-the-art expert exploration tools, relying on a database, and computer maps generated by neural networks and user inputs, have been developed through the use of ''fuzzy'' logic, a mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk has been reduced with the use of these properly verified and validated ''Fuzzy Expert Exploration (FEE) Tools.'' Through the course of this project, FEE Tools and supporting software were developed for two producing formations in southeast New Mexico. Tools of this type can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In today's oil industry environment, many smaller exploration companies lack the resources of a pool of expert exploration personnel. Downsizing, volatile oil prices, and scarcity of domestic exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tools benefit a diverse group in the U.S., allowing a more efficient use of scarce funds, and potentially reducing dependence on foreign oil and providing lower product prices for consumers.
Date: March 1, 2005
Creator: Balch, Robert S. & Broadhead, Ron
Partner: UNT Libraries Government Documents Department


Description: Incomplete or sparse information on types of data such as geologic or formation characteristics introduces a high level of risk for oil exploration and development projects. ''Expert'' systems developed and used in several disciplines and industries have demonstrated beneficial results. A state-of-the-art exploration ''expert'' tool, relying on a computerized database and computer maps generated by neural networks, is being developed through the use of ''fuzzy'' logic, a relatively new mathematical treatment of imprecise or non-explicit parameters and values. Oil prospecting risk can be reduced with the use of a properly developed and validated ''Fuzzy Expert Exploration (FEE) Tool.'' This FEE Tool can be beneficial in many regions of the U.S. by enabling risk reduction in oil and gas prospecting as well as decreased prospecting and development costs. In the 1998-1999 oil industry environment, many smaller exploration companies lacked the resources of a pool of expert exploration personnel. Downsizing, low oil prices, and scarcity of exploration funds have also affected larger companies, and will, with time, affect the end users of oil industry products in the U.S. as reserves are depleted. The FEE Tool will benefit a diverse group in the U.S., leading to a more efficient use of scarce funds, and possibly decreasing dependence on foreign oil and lower product prices for consumers. This ninth of ten semi-annual reports contains a summary of progress to date, problems encountered, plans for the next year, and an assessment of the prospects for future progress. The emphasis during the March 2003 through September 2003 period was directed toward Silurian-Devonian geology, development of rules for the fuzzy system, and on-line software.
Date: October 15, 2003
Creator: Balch, Robert
Partner: UNT Libraries Government Documents Department

Fuzzy Set Theory Applied to Measurement Data for Exposure Control in Beryllium Part Manufacturing.

Description: Fuzzy set theory has been applied to some exposure control problems encountered in the machining and manufacturing of beryllium parts at Los Alamos National Laboratory. A portion of that work is presented here. The major driving force for using fuzzy techniques in this case, rather than classical statistical process control, is that beryllium exposure is very task dependent and this manufacturing plant is quite atypical. It is feared that standard techniques produce too many false alarms. Our beryllium plant produces parts on a daily basis, but every day is different. Some days many parts are produced and some days only a few. Sometimes the parts are large and sometimes the parts are small. Some machining cuts are rough and some are fine. These factors and others make it hard to define a typical day. The problem of concern for this study is worker beryllium exposure. Even though the plant is new and very modern and the exposure levels are expected to be well below the required levels, the Department of Energy (DOE), which is our major customer, has demanded that the levels for this plant be well below required levels. The control charts used to monitor this process are expected to answer two questions: (1) Is the process out of control? Do we need to instigate special controls such as requiring workers to use respirators? (2) Are new, previously untested, controls making a difference? The standard Shewhart-type control charts, based on consistent plant operating conditions, do not adequately answer these questions. The approach described here is based upon a fuzzy modification to the Shewhart Xbar-R chart. This approach is expected to yield better results than work based upon the classical probabilistic control chart.
Date: January 1, 2002
Creator: Parkinson, W. J. (William Jerry); Abeln, S. P. (Stephen Patrick); Creek, K. L. (Kathryn L.); Mortensen, F. N. (Fred N.); Wantuck, P. J. (Paul J.); Ross, Timothy J. et al.
Partner: UNT Libraries Government Documents Department
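The fuzzy modification to the Shewhart chart can be sketched by replacing the hard 3-sigma verdict with a degree of "out of control" that ramps between the warning and action limits. The 2-to-3-sigma ramp is an assumption for illustration, not the LANL chart:

```python
# Fuzzy twist on a Shewhart Xbar chart: instead of a binary alarm at the
# 3-sigma action limit, return a membership degree in "out of control"
# that rises linearly from the 2-sigma warning limit to the action limit.
def out_of_control_degree(xbar, center, sigma_xbar):
    z = abs(xbar - center) / sigma_xbar
    if z <= 2.0:
        return 0.0
    if z >= 3.0:
        return 1.0
    return z - 2.0   # linear ramp between warning and action limits
```

Graded alarms like this are one way to cut the false-alarm rate the abstract worries about on atypical production days.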

Effect of noise on chaotic fuzzy mappings

Description: Chaotic mappings in the space of fuzzy sets induced by mappings of the underlying reference set are investigated. Different fuzzification schemes are considered and their impact on the resultant iterated fuzzy set, under a quadratic mapping, is studied numerically. The fuzzy set mapping is described in terms of the mapping of level cuts, resulting from the resolution theorem for fuzzy sets. In the two-dimensional case, a generalized notion, given as a fuzzy set, of the Hausdorff dimension is formulated. An example based on the Hénon map is provided.
Date: March 1, 1996
Creator: Zardecki, A.
Partner: UNT Libraries Government Documents Department
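Iterating a level cut of a fuzzy set under a quadratic map can be sketched with interval images. The logistic form and parameter below are illustrative assumptions (the paper's maps, including the Hénon example, differ):

```python
# Map a level cut (an interval) through the logistic map x -> r*x*(1-x).
# The map is not monotone on [0, 1], so the image must account for the
# interior maximum at x = 0.5, not just the endpoints.
def logistic(x, r=3.8):
    return r * x * (1 - x)

def map_interval(lo, hi, r=3.8):
    vals = [logistic(lo, r), logistic(hi, r)]
    if lo < 0.5 < hi:
        vals.append(logistic(0.5, r))   # fold point inside the cut
    return min(vals), max(vals)

def iterate_cut(lo, hi, steps, r=3.8):
    """Iterate one level cut of the fuzzy set for a number of steps."""
    for _ in range(steps):
        lo, hi = map_interval(lo, hi, r)
    return lo, hi
```

Applying this to every alpha-level cut and reassembling by the resolution theorem gives the iterated fuzzy set the abstract studies.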

Grey analysis

Description: Grey logic is not another name for fuzzy logic. Grey logic--also called grey analysis or grey system theory--is a new technology, a group of techniques for system analysis and modeling. Like fuzzy logic, grey logic is useful in situations with incomplete and uncertain information. Grey analysis is particularly applicable in instances with very limited data and in cases with little system knowledge or understanding. In this paper, a summary of the basic concepts of grey analysis is provided, with descriptions of its application to several classes of problems. Calculation methods are provided for grey relational analysis, and for modeling and prediction using grey methods.
Date: December 1, 1996
Creator: Cable, G.D.
Partner: UNT Libraries Government Documents Department
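The grey relational calculation mentioned above can be sketched with Deng's relational grade (distinguishing coefficient rho = 0.5). Note one simplification: the min and max deltas are normally taken over all comparison series, while this sketch normalizes within a single series:

```python
# Grey relational grade of a comparison series against a reference series.
# Higher grade = the series tracks the reference more closely.
def grey_relational_grade(reference, series, rho=0.5):
    deltas = [abs(r - s) for r, s in zip(reference, series)]
    d_min, d_max = min(deltas), max(deltas)
    if d_max == 0:
        return 1.0   # identical series
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coeffs) / len(coeffs)
```

Ranking candidate factors by this grade against an outcome series is the core of grey relational analysis with very limited data.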

Hybrid processing of stochastic and subjective uncertainty data

Description: Uncertainty analyses typically recognize separate stochastic and subjective sources of uncertainty, but do not systematically combine the two, although a large amount of data used in analyses is partly stochastic and partly subjective. We have developed methodology for mathematically combining stochastic and subjective data uncertainty, based on new "hybrid number" approaches. The methodology can be utilized in conjunction with various traditional techniques, such as PRA (probabilistic risk assessment) and risk analysis decision support. Hybrid numbers have been previously examined as a potential method to represent combinations of stochastic and subjective information, but mathematical processing has been impeded by the requirements inherent in the structure of the numbers; e.g., there was no known way to multiply hybrids. In this paper, we will demonstrate methods for calculating with hybrid numbers that avoid the difficulties. By formulating a hybrid number as a probability distribution that is only fuzzily known, or alternatively as a random distribution of fuzzy numbers, methods are demonstrated for the full suite of arithmetic operations, permitting complex mathematical calculations. It will be shown how information about relative subjectivity (the ratio of subjective to stochastic knowledge about a particular datum) can be incorporated. Techniques are also developed for conveying uncertainty information visually, so that the stochastic and subjective constituents of the uncertainty, as well as the ratio of knowledge about the two, are readily apparent. The techniques demonstrated have the capability to process uncertainty information for independent, uncorrelated data, and for some types of dependent and correlated data. Example applications are suggested, illustrative problems are worked, and graphical results are given.
Date: November 1, 1995
Creator: Cooper, J.A.; Ferson, S. & Ginzburg, L.
Partner: UNT Libraries Government Documents Department
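One workable reading of hybrid-number arithmetic, purely as a sketch: sample the stochastic part by Monte Carlo and carry the fuzzy part as an interval (one alpha-cut of a fuzzy number), combining the two with interval arithmetic. This illustrates the "random distribution of fuzzy numbers" idea, not the authors' formalism:

```python
import random

def interval_mul(a, b):
    """Exact interval multiplication (handles sign changes)."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Multiply two hybrid numbers, each given as a stochastic sampler plus a
# fuzzy half-width around each sampled value; return the interval-valued
# mean of the product.
def hybrid_mul_mean(sample_a, half_width_a, sample_b, half_width_b,
                    n=2000, seed=0):
    rng = random.Random(seed)
    lows, highs = [], []
    for _ in range(n):
        x, y = sample_a(rng), sample_b(rng)
        lo, hi = interval_mul((x - half_width_a, x + half_width_a),
                              (y - half_width_b, y + half_width_b))
        lows.append(lo)
        highs.append(hi)
    return sum(lows) / n, sum(highs) / n
```

With zero half-widths the fuzzy part vanishes and the result collapses to an ordinary Monte Carlo expectation; growing half-widths widen the interval, mirroring the relative-subjectivity ratio the abstract describes.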

Possibilistic systems within a general information theory

Description: The author surveys possibilistic systems theory and places it in the context of Imprecise Probabilities and General Information Theory (GIT). In particular, he argues that possibilistic systems hold a distinct position within a broadly conceived, synthetic GIT. The focus is on systems and applications which are semantically grounded by empirical measurement methods (statistical counting), rather than epistemic or subjective knowledge elicitation or assessment methods. Regarding fuzzy measures as special provisions, and evidence measures (belief and plausibility measures) as special fuzzy measures, he can thereby measure imprecise probabilities directly and empirically from set-valued frequencies (random set measurement). More specifically, measurements of random intervals yield empirical fuzzy intervals. In the random set (Dempster-Shafer) context, probability and possibility measures stand as special plausibility measures in that their distributionality (decomposability) maps directly to an aggregable structure of the focal classes of their random sets. Further, possibility measures share with imprecise probabilities the ability to better handle open world problems where the universe of discourse is not specified in advance. In addition to empirically grounded measurement methods, possibility theory also provides another crucial component of a full systems theory, namely prediction methods in the form of finite (Markov) processes which are also strictly analogous to the probabilistic forms.
Date: June 1, 1999
Creator: Joslyn, C.
Partner: UNT Libraries Government Documents Department
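The empirical grounding the author stresses can be sketched directly: belief and plausibility of an event computed as frequencies of set-valued observations that are contained in, or merely intersect, the event:

```python
# Belief and plausibility from set-valued frequencies (random set
# measurement): each observation is a set of outcomes that could not be
# distinguished; counting subset vs. intersection containment bounds the
# imprecise probability of an event.
def bel_pl(observations, event):
    """observations: list of sets (imprecise outcomes); event: set."""
    n = len(observations)
    bel = sum(1 for o in observations if o <= event) / n  # o inside event
    pl = sum(1 for o in observations if o & event) / n    # o touches event
    return bel, pl
```

The interval [bel, pl] is exactly the empirically measured imprecise probability; it collapses to an ordinary frequency when every observation is a singleton.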

Sensor fusion and nonlinear prediction for anomalous event detection

Description: The authors consider the problem of using the information from various time series, each one characterizing a different physical quantity, to predict the future state of the system and, based on that information, to detect and classify anomalous events. They stress the application of principal components analysis (PCA) to analyze and combine data from different sensors. They construct both linear and nonlinear predictors. In particular, for linear prediction the authors use the least-mean-square (LMS) algorithm and for nonlinear prediction they use both backpropagation (BP) networks and fuzzy predictors (FP). As an application, they consider the prediction of gamma counts from past values of electron and gamma counts recorded by the instruments of a high altitude satellite.
Date: March 7, 1995
Creator: Hernandez, J.V.; Moore, K.R. & Elphic, R.C.
Partner: UNT Libraries Government Documents Department
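The LMS linear predictor used above for comparison can be sketched in a few lines: a sliding window of past samples, a one-step prediction, and the standard stochastic-gradient weight update (the order and step size here are illustrative):

```python
# One-step-ahead LMS linear prediction over a sliding window of past values.
def lms_predict(series, order=2, mu=0.01):
    w = [0.0] * order
    preds = []
    for t in range(order, len(series)):
        x = series[t - order:t]
        y_hat = sum(wi * xi for wi, xi in zip(w, x))     # predict next value
        e = series[t] - y_hat                            # prediction error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # LMS weight update
        preds.append(y_hat)
    return w, preds
```

In the sensor-fusion setting, the inputs would be PCA-combined features from the electron and gamma channels rather than a single raw series.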

A fuzzy control system for a three-phase oil field centrifuge

Description: The three-phase centrifuge discussed here is an excellent device for cleaning up oil field and refinery wastes. These wastes are typically composed of hydrocarbons, water, and solids. This technology converts waste, which is often classified as hazardous, into salable oil, reusable water, and solids that can be placed in landfills. No secondary waste is produced. A major problem is that only one person can set up and run the equipment well enough to provide an optimal cleanup. Demand for this technology has far exceeded a one-man operation. The solution to this problem is an intelligent control system that can replace a highly skilled operator so that several centrifuges can be operated at different locations at the same time.
Date: December 31, 1998
Creator: Parkinson, W.J.; Smith, R.E.; Wantuck, P.J. & Miller, N.
Partner: UNT Libraries Government Documents Department

Continued fractions: Yet another tool to overcome the curse of dimensionality

Description: The authors provide a rapid prediction method in which a larger number of antecedents than currently considered is accounted for. To this end, they encode the successive (possibly rescaled) values of a time series as the partial quotients of a continued fraction, resulting in a number from the unit interval. The accuracy of a rule-based system utilizing this coding is investigated to some extent. Qualitative criteria for the applicability of the algorithm are formulated.
Date: December 31, 1998
Creator: Zardecki, A.
Partner: UNT Libraries Government Documents Department
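The encoding itself is easy to sketch: fold positive-integer values of the series into a continued fraction evaluated inside-out, and recover them by repeated inversion (a small epsilon guards against floating-point drift):

```python
# Encode successive positive-integer time-series values as the partial
# quotients of a continued fraction [0; q1, q2, ...], mapping the whole
# window of antecedents into a single number in (0, 1].
def encode(quotients):
    value = 0.0
    for q in reversed(quotients):
        value = 1.0 / (q + value)   # evaluate inside-out
    return value

def decode(x, n):
    """Recover up to n partial quotients from an encoded value."""
    quotients = []
    for _ in range(n):
        inv = 1.0 / x
        q = int(inv + 1e-9)         # epsilon absorbs floating-point drift
        quotients.append(q)
        x = inv - q
        if x < 1e-12:
            break
    return quotients
```

A rule-based predictor can then condition on this single scalar instead of a high-dimensional window, which is precisely how the coding sidesteps the curse of dimensionality.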