Computational Complexity of Hopfield Networks

Date: August 1998
Creator: Tseng, Hung-Li
Description: This dissertation presents three main results: the PLS-completeness of discrete Hopfield network convergence under eight different restrictions (degree 3; bipartite and degree 3; 8-neighbor mesh; dual of the knight's graph; hypercube; butterfly; cube-connected cycles; and shuffle-exchange), the exponential convergence behavior of discrete Hopfield networks, and the simulation of Turing machines by discrete Hopfield networks.
Contributing Partner: UNT Libraries
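The convergence notion in this record refers to discrete Hopfield networks, whose defining property is that asynchronous updates never increase the network energy when the weight matrix is symmetric with zero diagonal. A minimal sketch of that property (not from the dissertation; the network size, weights, and update count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric weights with zero diagonal -- the standard discrete Hopfield setup.
n = 8
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

def energy(s):
    # E(s) = -1/2 * s^T W s  (threshold terms omitted for brevity)
    return -0.5 * s @ W @ s

s = rng.choice([-1.0, 1.0], size=n)
energies = [energy(s)]
for _ in range(50):                       # asynchronous updates: one unit at a time
    i = rng.integers(n)
    s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    energies.append(energy(s))

# Energy is non-increasing, so the dynamics must reach a stable state.
assert all(b <= a + 1e-9 for a, b in zip(energies, energies[1:]))
```

The monotone energy makes convergence a local-search problem, which is why PLS (polynomial local search) is the natural complexity class for it.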
Classification by Neural Network and Statistical Models in Tandem: Does Integration Enhance Performance?

Date: December 1998
Creator: Mitchell, David
Description: The purpose of the current research is twofold. The first purpose is to present a composite approach to the general classification problem that combines outputs from various parametric statistical procedures and neural networks. The second purpose is to compare several parametric and neural network models on a classification problem related to transportation planning and on five simulated classification problems.
Contributing Partner: UNT Libraries
Hopfield Networks as an Error Correcting Technique for Speech Recognition

Access: Use of this item is restricted to the UNT Community.
Date: May 2004
Creator: Bireddy, Chakradhar
Description: I experimented with Hopfield networks in the context of a voice-based query-answering system. Hopfield networks are used to store and retrieve patterns. I used this technique to store queries represented as natural-language sentences, and I evaluated the accuracy of the technique for error correction in a spoken question-answering dialog between a computer and a user. I show that the use of an auto-associative Hopfield network helps make the speech recognition system more fault-tolerant. I also examined the available encoding schemes for converting a natural-language sentence into a pattern of zeroes and ones that can be stored reliably in the Hopfield network, and I suggest scalable data representations that allow storing a large number of queries.
Contributing Partner: UNT Libraries
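As background for the error-correction idea in this record: an auto-associative Hopfield network stores ±1 patterns with the Hebbian outer-product rule and recalls a stored pattern from a corrupted probe. A minimal sketch, where the pattern count, pattern length, and noise level are chosen for illustration and are not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Store binary patterns (as +/-1 vectors) via the Hebbian outer-product rule.
n = 64
patterns = rng.choice([-1.0, 1.0], size=(3, n))      # three stored "queries"
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0.0)

def recall(probe, steps=20):
    """Iterate the sign update rule from a (possibly noisy) probe."""
    s = probe.copy()
    for _ in range(steps):                           # synchronous updates
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

# Corrupt 6 of 64 bits and let the network try to correct them.
noisy = patterns[0].copy()
flip = rng.choice(n, size=6, replace=False)
noisy[flip] *= -1
restored = recall(noisy)
print(int((restored == patterns[0]).sum()), "of", n, "bits match the stored pattern")
```

With few stored patterns relative to network size, retrieval typically snaps the noisy probe back to the nearest stored pattern, which is the fault-tolerance property the thesis exploits for misrecognized words.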
A Multi-Time Scale Learning Mechanism for Neuromimic Processing

Date: August 1994
Creator: Mobus, George E. (George Edward)
Description: Learning, representing, and reasoning about temporal relations, particularly causal relations, is a deep problem in artificial intelligence (AI). Learning such representations in the real world is complicated by the fact that phenomena are subject to multiple time-scale influences and may operate with a strange-attractor dynamic. This dissertation proposes a new computational learning mechanism, the adaptrode, which, used in a neuromimic processing architecture, may help to solve some of these problems. The adaptrode is shown to emulate the dynamics of real biological synapses and represents a significant departure from the classical weighted-input scheme of conventional artificial neural networks. Indeed, the adaptrode is shown, by analysis of the deep structure of real synapses, to have a strong structural correspondence with them in terms of multi-time-scale biophysical processes. Simulations of an adaptrode-based neuron and a small network of neurons are shown to have the same learning capabilities as invertebrate animals in classical conditioning. Classical conditioning is considered a fundamental learning task in animals. Furthermore, it is subject to temporal-ordering constraints that fulfill the criteria of causal relations in natural systems. It may offer clues to the learning of causal relations and mechanisms for causal reasoning. The ...
Contributing Partner: UNT Libraries
A Heuristic Procedure for Specifying Parameters in Neural Network Models for Shewhart X-bar Control Chart Applications

Date: December 1993
Creator: Nam, Kyungdoo T.
Description: This study develops a heuristic procedure for specifying the parameters of a neural network configuration (learning rate, momentum, and the number of neurons in a single hidden layer) in Shewhart X-bar control chart applications. This study also examines the replicability of the neural network solution when the network is retrained several times with different initial weights.
Contributing Partner: UNT Libraries
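The parameters named in this record (learning rate, momentum, hidden-layer size) and the replicability check can be illustrated generically; the heuristic itself is the dissertation's contribution and is not reproduced here. A hedged sketch assuming a toy one-dimensional regression task, with all parameter values chosen arbitrarily:

```python
import numpy as np

def train(seed, lr=0.05, momentum=0.8, hidden=4, steps=300):
    """Train a one-hidden-layer net with momentum SGD on a toy task."""
    rng = np.random.default_rng(seed)
    X = np.linspace(-1, 1, 20).reshape(-1, 1)
    y = 0.5 * X                                   # simple target function
    W1 = rng.standard_normal((1, hidden)) * 0.5   # different seeds give
    W2 = rng.standard_normal((hidden, 1)) * 0.5   # different initial weights
    v1 = np.zeros_like(W1)
    v2 = np.zeros_like(W2)
    losses = []
    for _ in range(steps):
        H = np.tanh(X @ W1)                       # forward pass
        out = H @ W2
        err = out - y
        losses.append(float((err ** 2).mean()))
        g2 = H.T @ err / len(X)                   # backprop gradients
        gH = err @ W2.T * (1 - H ** 2)
        g1 = X.T @ gH / len(X)
        v2 = momentum * v2 - lr * g2              # momentum updates
        W2 += v2
        v1 = momentum * v1 - lr * g1
        W1 += v1
    return losses

# Replicability check in the spirit of the study: retrain with
# different initial weights and confirm each run actually learns.
for seed in (0, 1, 2):
    losses = train(seed)
    assert losses[-1] < losses[0]
```

A fuller replicability analysis would also compare the final solutions (e.g., predictions or classification decisions) across seeds, not just confirm that each run reduces its loss.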