208 Matching Results

Search Results


Test of the generalizability of "KBIT" (an artificial intelligence-derived assessment instrument) across medical problems

Description: The purpose of this study was to determine whether KBIT's (Knowledge Base Inference Tool) psychometric properties and the relationships between coarse and fine cognitive constructs could be shown to be generalizable across medical problems. This study therefore represents the initial test of KBIT's generalizability, using the problem space of neurological "weakness".
Date: May 1991
Creator: Papa, Frank J.
Partner: UNT Libraries

Large neighborhood search for the double traveling salesman problem with multiple stacks

Description: This paper considers a complex real-life short-haul/long-haul pickup and delivery application. The problem can be modeled as a double traveling salesman problem (TSP) in which the pickups and the deliveries happen in the first and second TSPs respectively. Moreover, the application features multiple stacks in which the items must be stored, and the pickups and deliveries must take place in reverse (LIFO) order for each stack. The goal is to minimize the total travel time while satisfying these constraints. This paper presents a large neighborhood search (LNS) algorithm that improves the best-known results on 65% of the available instances and is always within 2% of the best-known solutions.
Date: January 1, 2009
Creator: Bent, Russell W & Van Hentenryck, Pascal
Partner: UNT Libraries Government Documents Department
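
The LIFO stacking constraint in the abstract above can be made concrete with a small sketch. This is a minimal, hypothetical feasibility check, not the paper's LNS algorithm; the item names and plan format are assumptions for illustration.

```python
def lifo_feasible(pickup_order, delivery_order, stack_of):
    """Check that deliveries pop items in reverse of their pickup order,
    independently for each stack.

    pickup_order, delivery_order: lists of item ids in visiting order
    stack_of: dict mapping item id -> stack index
    """
    stacks = {}
    for item in pickup_order:          # first TSP: push each pickup
        stacks.setdefault(stack_of[item], []).append(item)
    for item in delivery_order:        # second TSP: each delivery must
        s = stacks[stack_of[item]]     # come from the top of its stack
        if not s or s[-1] != item:
            return False
        s.pop()
    return True

# Stack 0 holds [a, c] after pickups, stack 1 holds [b]:
# delivering c before a respects LIFO; delivering a first does not.
print(lifo_feasible(["a", "b", "c"], ["c", "b", "a"],
                    {"a": 0, "b": 1, "c": 0}))  # True
print(lifo_feasible(["a", "b", "c"], ["a", "b", "c"],
                    {"a": 0, "b": 1, "c": 0}))  # False
```

A neighborhood search over pickup/delivery orders would call a check like this to reject infeasible moves.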

Expert Systems and Emergency Management: An Annotated Bibliography

Description: Abstract: This report is the result of an in-depth review of the recent technical literature on expert systems. The material contained in this report provided a basis for assessing the potential for using expert systems in emergency management operations. In choosing the material for inclusion in this report, special emphasis was placed on those aspects of expert systems which addressed the types of problems encountered in emergency management operations. The report is designed for use as a resource document and as a tutorial on expert systems and emergency management. Each chapter consists of a brief topic essay followed by a set of references which expand on the main themes of the essay.
Date: November 1986
Creator: Gass, Saul I.; Bhasker, Suneel & Chapman, Robert E.
Partner: UNT Libraries Government Documents Department

Faith in the algorithm, part 1: beyond the Turing test

Description: Since the Turing test was first proposed by Alan Turing in 1950, the goal of artificial intelligence has been predicated on the ability for computers to imitate human intelligence. However, the majority of uses for the computer can be said to fall outside the domain of human abilities and it is exactly outside of this domain where computers have demonstrated their greatest contribution. Another definition for artificial intelligence is one that is not predicated on human mimicry, but instead, on human amplification, where the algorithms that are best at accomplishing this are deemed the most intelligent. This article surveys various systems that augment human and social intelligence.
Date: January 1, 2009
Creator: Rodriguez, Marko A & Pepe, Alberto
Partner: UNT Libraries Government Documents Department

Orbit-product representation and correction of Gaussian belief propagation

Description: We present a new interpretation of Gaussian belief propagation (GaBP) based on the 'zeta function' representation of the determinant as a product over orbits of a graph. We show that GaBP captures back-tracking orbits of the graph and consider how to correct this estimate by accounting for non-backtracking orbits. We show that the product over non-backtracking orbits may be interpreted as the determinant of the non-backtracking adjacency matrix of the graph with edge weights based on the solution of GaBP. An efficient method is proposed to compute a truncated correction factor including all non-backtracking orbits up to a specified length.
Date: January 1, 2009
Creator: Johnson, Jason K; Chertkov, Michael & Chernyak, Vladimir
Partner: UNT Libraries Government Documents Department
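
As background to the abstract above, Gaussian belief propagation can be viewed as an iterative solver for A x = b; on a tree-structured precision matrix its estimate is exact (no non-backtracking orbits remain to correct). The sketch below is a minimal, assumed illustration of plain GaBP message passing on a 3-node chain, not the paper's orbit-product correction.

```python
def gabp_solve(A, b, iters=30):
    """Gaussian belief propagation for A x = b (exact on trees).

    Messages carry a precision P and a precision-weighted mean z.
    """
    n = len(b)
    nbrs = [[j for j in range(n) if j != i and A[i][j] != 0.0]
            for i in range(n)]
    P = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}
    z = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}
    for _ in range(iters):
        newP, newz = {}, {}
        for i in range(n):
            for j in nbrs[i]:
                # combine local potential with all incoming messages except j's
                Ph = A[i][i] + sum(P[(k, i)] for k in nbrs[i] if k != j)
                zh = b[i] + sum(z[(k, i)] for k in nbrs[i] if k != j)
                newP[(i, j)] = -A[i][j] ** 2 / Ph
                newz[(i, j)] = -A[i][j] * zh / Ph
        P, z = newP, newz
    # marginal means = GaBP estimate of the solution x
    return [(b[i] + sum(z[(k, i)] for k in nbrs[i]))
            / (A[i][i] + sum(P[(k, i)] for k in nbrs[i]))
            for i in range(n)]

# Assumed toy example: a chain (tree), so GaBP is exact.
A = [[2.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
print(gabp_solve(A, b))  # ~ [0.5, 0.0, 1.5], the exact solution
```

On loopy graphs the same iteration gives only an estimate, which is where the paper's non-backtracking-orbit correction enters.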

Training SVMs without offset

Description: We develop, analyze, and test a training algorithm for support vector machine classifiers without offset. Key features of this algorithm are a new stopping criterion and a set of working set selection strategies that, although inexpensive, do not lead to substantially more iterations than the optimal working set selection strategy. For these working set strategies, we establish convergence rates that coincide with the best known rates for SVMs with offset. We further conduct various experiments that investigate both the run time behavior and the performed iterations of the new training algorithm. It turns out that the new algorithm needs fewer iterations and less run time than standard training algorithms for SVMs with offset.
Date: January 1, 2009
Creator: Steinwart, Ingo; Hush, Don & Scovel, Clint
Partner: UNT Libraries Government Documents Department
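
"Without offset" above means the decision function is f(x) = ⟨w, x⟩ with no bias term b, so the equality constraint disappears from the dual and working set selection simplifies. The sketch below is not the paper's algorithm: it is a plain subgradient descent on the regularized hinge loss for an offset-free linear SVM, with toy data, step size, and regularization chosen only for illustration.

```python
def train_svm_no_offset(X, y, lam=0.01, lr=0.1, epochs=200):
    """Subgradient descent on lam/2*|w|^2 + hinge loss, no bias term."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for x, label in zip(X, y):
            margin = label * sum(wi * xi for wi, xi in zip(w, x))
            if margin < 1:
                # subgradient includes the hinge term -y*x
                w = [wi - lr * (lam * wi - label * xi)
                     for wi, xi in zip(w, x)]
            else:
                # only the regularizer contributes
                w = [wi - lr * lam * wi for wi in w]
    return w

# Assumed toy data, separable by a hyperplane through the origin.
X = [[2.0, 1.0], [1.5, 2.0], [-1.0, -2.0], [-2.0, -1.5]]
y = [1, 1, -1, -1]
w = train_svm_no_offset(X, y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1 for x in X]
print(preds)  # [1, 1, -1, -1]
```

Note that dropping the offset restricts separating hyperplanes to those through the origin, which is exactly the trade-off the paper's analysis addresses.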

Computer Realization of Human Music Cognition

Description: This study models the human process of music cognition on the digital computer. The definition of music cognition is derived from the work in music cognition done by the researchers Carol Krumhansl and Edward Kessler, and by Mari Jones, as well as from the music theories of Heinrich Schenker. The computer implementation functions in three stages. First, it translates a musical "performance" in the form of MIDI (Musical Instrument Digital Interface) messages into LISP structures. Second, the various parameters of the performance are examined separately a la Jones's joint accent structure, quantified according to psychological findings, and adjusted to a common scale. The findings of Krumhansl and Kessler are used to evaluate the consonance of each note with respect to the key of the piece and with respect to the immediately sounding harmony. This process yields a multidimensional set of points, each of which is a cognitive evaluation of a single musical event within the context of the piece of music within which it occurred. This set of points forms a metric space in multi-dimensional Euclidean space. The third phase of the analysis maps the set of points into a topology-preserving data structure for a Schenkerian-like middleground structural analysis. This process yields a hierarchical stratification of all the musical events (notes) in a piece of music. It has been applied to several pieces of music with surprising results. In each case, the analysis obtained very closely resembles a structural analysis which would be supplied by a human theorist. The results obtained invite us to take another look at the representation of knowledge and perception from another perspective, that of a set of points in a topological space, and to ask if such a representation might not be useful in other domains. It also leads us to ask if such a ...
Date: August 1988
Creator: Albright, Larry E. (Larry Eugene)
Partner: UNT Libraries

A Comparative Analysis of Guided vs. Query-Based Intelligent Tutoring Systems (ITS) Using a Class-Entity-Relationship-Attribute (CERA) Knowledge Base

Description: One of the greatest problems facing researchers in the subfield of Artificial Intelligence known as Intelligent Tutoring Systems (ITS) is the selection of a knowledge base design that will facilitate the modification of the knowledge base. The Class-Entity-Relationship-Attribute (CERA) design, proposed by R. P. Brazile, holds certain promise as a more generic knowledge base design framework upon which robust and efficient ITS can be built. This study has a twofold purpose. The first is to demonstrate that a CERA knowledge base can be constructed for an ITS on a subset of the domain of Cretaceous paleontology and function as the "expert module" of the ITS. The second is to test the validity of the ideas that students guided through a lesson learn more factual knowledge, while those who explore the knowledge base that underlies the lesson through query at their own pace will be able to formulate their own integrative knowledge from the knowledge gained in their explorations and spend more time on the system. This study concludes that a CERA-based system can be constructed as an effective teaching tool. However, while an ITS treatment provides for statistically significant gains in achievement test scores, the type of treatment seems not to matter as much as time spent on task. This would seem to indicate that a query-based system which allows users to progress at their own pace would be a better type of system for the presentation of material, due to the greater amount of on-line computer time exhibited by the users.
Date: August 1987
Creator: Hall, Douglas Lee
Partner: UNT Libraries

A Tutorial on the Construction of High-Performance Resolution/Paramodulation Systems

Description: Over the past 25 years, researchers have written numerous deduction systems based on resolution and paramodulation. Of these systems, a very few have been capable of generating and maintaining a formula database containing more than just a few thousand clauses. These few systems were used to explore mechanisms for rapidly extracting limited subsets of relevant clauses. We have written this tutorial to reflect some of the best ideas that have emerged and to cast them in a form that makes them easily accessible to students wishing to write their own high-performance systems.
Date: September 1990
Creator: Butler, R. & Overbeek, R.
Partner: UNT Libraries Government Documents Department

Theory Institute in Automated Reasoning : Held at Argonne National Laboratory, August 6-10, 1990

Description: On August 6--10, 1990, Argonne National Laboratory hosted a Theory Institute in Automated Reasoning. The institute was organized by the Mathematics and Computer Science Division and was supported by special funding from Argonne's Physical Research Program Administration. The focus of the Institute was on the obstacles confronting the effective automation of reasoning. The objective was to lay the groundwork for formulating a theory governing the interrelationship of representation, inference rule, and strategy. Here we summarize the activities that took place during the week-long Institute. We also present an evaluation of the progress achieved, progress that includes the solution of challenge questions, the increasing use of both our database of problems and our automated reasoning program OTTER, and the discovery of new uses for OTTER.
Date: August 1990
Creator: Wos, Larry
Partner: UNT Libraries Government Documents Department

An ITP Workbook

Description: This collection of exercises has been prepared to teach the use of the automated reasoning system ITP. Previous knowledge of automated reasoning is not presumed. The exercises are designed for use with the UNIX operating system.
Date: December 1986
Creator: Kalman, John A.
Partner: UNT Libraries Government Documents Department

A Design Philosophy for Reliable Systems, Including Control

Description: This report develops a framework for a universe of discourse usable by non-human experts. It is based on the idea that a design has many features of a contract and may be described as a contract between humans and a machine, defining what each must do to attain a goal. Several points are discussed: the use of techniques in analytical redundancy and their place as analogues in administrative control for conventional techniques in physical control; the use of redundant computer systems to protect against hardware faults; the necessity to prove properties of software used in redundant hardware, because software faults are common modes across redundant hardware; and some issues in choosing a programming language for provable control software. Because proof of correctness is costly, it should be used only where necessary. This report concludes that the plant model used in analytic redundancy protection need not be nearly as reliable as the mechanism that detects discrepancies between plant and model.
Date: April 1984
Creator: Gabriel, John R.
Partner: UNT Libraries Government Documents Department

Tutorial on the Warren Abstract Machine for Computational Logic

Description: Tutorial description of the Warren machine with a basic introduction to the motivation of the machine and the instructions that define it. Discussion of the fairly limited extensions required to extend the machine for more general use outside of implementations of logic programming. Substantial speedups will occur due to improvements in the implementation of the basic algorithms.
Date: June 1985
Creator: Gabriel, John R.; Lindholm, Tim; Lusk, E. L. & Overbeek, R. A.
Partner: UNT Libraries Government Documents Department

Oracle inequalities for SVMs that are based on random entropy numbers

Description: In this paper we present a new technique for bounding local Rademacher averages of function classes induced by a loss function and a reproducing kernel Hilbert space (RKHS). At the heart of this technique lies the observation that certain expectations of random entropy numbers can be bounded by the eigenvalues of the integral operator associated to the RKHS. We then work out the details of the new technique by establishing two new oracle inequalities for SVMs, which complement and generalize previous results.
Date: January 1, 2009
Creator: Steinwart, Ingo
Partner: UNT Libraries Government Documents Department

Hybrid Human-Artificial Intelligence Approach for Pavement Distress Assessment (PICUCHA)

Description: This article proposes a new method named PICture Unsupervised Classification with Human Analysis (PICUCHA) to circumvent many of the limitations of existing approaches, based on a combination of human and artificial intelligence.
Date: July 2017
Creator: Salini, Reus; Xu, Bugao & Carvalho, Regis
Partner: UNT College of Merchandising, Hospitality and Tourism

The Emerging Intelligence and Its Critical Look at Us

Description: Abstract: In response to Susan Gunn's editorial, I offer a less comforting but more utilitarian perspective on the life and death of artificial consciousness. Admittedly an unpopular view, it suggests that concurrence with Gunn's message represents the seeds of our own destruction, as an emerging synthetic intelligence begins to extinguish us.
Date: Autumn 1998
Creator: Thaler, Stephen L.
Partner: UNT Libraries

Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm

Description: Representing time has been considered a general problem for artificial intelligence research for many years. More recently, the question of representing time has become increasingly important in representing the human decision-making process through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can often lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems, has been introduced. The neural network model has a multi-layer structure, i.e. the number of layers is not limited. Also, the model has the flexibility of defining output nodes in any layer. This is especially important for connectionist expert system applications. A temporal backpropagation algorithm which supports the model has been developed. The model, along with the temporal backpropagation algorithm, makes it extremely practical to define any artificial neural network application. Also, an approach that can be followed to decrease the memory space used by the weight matrix has been introduced. The algorithm was tested using a medical connectionist expert system to show how best to describe not only the disease but also its entire course. The system was first trained using a pattern that was encoded from the expert system knowledge base rules. Then, a series of experiments was carried out using the temporal model and the temporal backpropagation algorithm. The first series of experiments was done to determine if the training process worked as predicted. In the second series of experiments, the weight matrix in the trained system was defined as a function of time intervals before presenting the system with the learned patterns. The results of the two experiments indicate that both approaches produce correct results.
The only difference between the two results ...
Date: December 1993
Creator: Civelek, Ferda N. (Ferda Nur)
Partner: UNT Libraries