Simulating the Spread of Infectious Diseases in Heterogeneous Populations with Diverse Interactions Characteristics
The spread of infectious diseases has been a public concern throughout human history. Historical records document the severity of infectious disease epidemics in different ages. The ancient Greek physician Hippocrates was the first to analyze the correlation between diseases and their environment. Today, health authorities are in charge of planning strategies that safeguard the welfare of citizens. Simulating contagion scenarios contributes to the understanding of the epidemic behavior of diseases. Computational models facilitate the study of epidemics by integrating disease and population data into the simulation. Detailed demographic and geographic characteristics allow researchers to construct complex models that better resemble reality, and integrating these attributes helps us understand the rules of interaction. The interaction of individuals with similar characteristics forms synthetic structures that depict clusters of interaction, and these synthetic environments facilitate the study of the spread of infectious diseases in diverse scenarios. The characteristics of the population and the disease jointly affect the local and global epidemic progression: the epidemic behavior of every cluster together constitutes the global epidemic for a clustered population. By understanding the correlation between structured populations and the spread of a disease, this dissertation research makes it possible to identify risk groups with specific characteristics and to devise containment strategies that help health authorities improve mitigation efforts. digital.library.unt.edu/ark:/67531/metadc407831/
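A minimal sketch of the kind of clustered contagion simulation the abstract describes, assuming a simple stochastic SIR (susceptible-infected-recovered) process in which within-cluster contacts are more infectious than between-cluster contacts; the cluster sizes and the parameter values below are illustrative assumptions, not taken from the dissertation.

    import random

    def simulate_sir(clusters, beta_in=0.06, beta_out=0.005, gamma=0.1, steps=100, seed=1):
        """Stochastic SIR on a clustered population.

        clusters: list of cluster sizes; within-cluster contacts infect with
        probability beta_in, between-cluster contacts with beta_out.
        """
        random.seed(seed)
        # state per individual: (cluster, 'S'/'I'/'R'); seed one infection in cluster 0
        people = [(c, 'S') for c, size in enumerate(clusters) for _ in range(size)]
        people[0] = (people[0][0], 'I')
        history = []
        for _ in range(steps):
            infected = [i for i, (_, s) in enumerate(people) if s == 'I']
            new_state = list(people)
            for i in infected:
                ci, _ = people[i]
                for j, (cj, sj) in enumerate(people):
                    if sj == 'S':
                        p = beta_in if cj == ci else beta_out
                        if random.random() < p:
                            new_state[j] = (cj, 'I')
                if random.random() < gamma:
                    new_state[i] = (ci, 'R')      # recovery
            people = new_state
            history.append(sum(1 for _, s in people if s == 'I'))
        return history

    print(simulate_sir([30, 30, 40])[:20])   # infected count over the first 20 steps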
A Smooth-turn Mobility Model for Airborne Networks
In this article, I introduce a novel airborne network mobility model, called the Smooth Turn Mobility Model, that captures the correlation of acceleration for airborne vehicles across time and spatial coordinates. Effective routing in airborne networks (ANs) relies on suitable mobility models that capture the random movement patterns of airborne vehicles. Because airborne vehicles cannot make sharp turns as easily as ground vehicles, the mobility models widely used for Mobile Ad Hoc Networks, such as the Random Waypoint and Random Direction models, fail. The proposed model is realistic in capturing the tendency of airborne vehicles toward straight trajectories and smooth turns with large radii, while remaining simple enough for tractable connectivity analysis and routing design. digital.library.unt.edu/ark:/67531/metadc149603/
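A rough sketch of a Smooth-Turn-style trajectory generator, assuming the commonly described mechanics (the vehicle circles a turning centre placed perpendicular to its heading, holding each centre for a random duration); the Gaussian inverse-radius and exponential holding-time choices and all parameter values are illustrative assumptions, not the thesis's specification.

    import math, random

    def smooth_turn_trajectory(duration=300.0, dt=0.1, speed=50.0,
                               mean_hold=10.0, sigma_inv_r=1.0 / 500.0, seed=2):
        """Generate (x, y) samples of a smooth-turn trajectory: the turn rate is
        speed / R for a randomly drawn signed radius R, redrawn after an
        exponentially distributed holding time."""
        random.seed(seed)
        x, y, heading = 0.0, 0.0, 0.0
        points, t, hold, inv_r = [(x, y)], 0.0, 0.0, 0.0
        while t < duration:
            if hold <= 0.0:                                # draw a new turning centre
                inv_r = random.gauss(0.0, sigma_inv_r)     # signed inverse radius
                hold = random.expovariate(1.0 / mean_hold)
            heading += speed * inv_r * dt                  # turn rate = v / R
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
            points.append((x, y))
            t += dt
            hold -= dt
        return points

    print(smooth_turn_trajectory()[:5])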
Socioscope: Human Relationship and Behavior Analysis in Mobile Social Networks
Access: Use of this item is restricted to the UNT Community.
The widely used mobile phone, as well as its related technologies, has opened opportunities for a complete change in how people interact and build relationships across geographic and time constraints. The convenience of instant communication by mobile phone, which broke the barriers of space and time, is evidently the key reason why such technologies are so important in people's lives and daily activities. Mobile phones have become the most popular communication tools, and mobile phone technology is apparently changing our relationships to each other in our work and lives. The impact of new technologies on people's lives in social spaces gives us the chance to rethink the possibilities of technology in social interaction. Accordingly, mobile phones are fundamentally changing social relations in ways that are difficult to measure with any precision. In this dissertation I propose a socioscope model for social network, relationship, and human behavior analysis based on mobile phone call detail records. Because of the diversity and complexity of human social behavior, a single technique cannot detect all the different features of human social behavior. Therefore I use multiple probabilistic and statistical methods for quantifying social groups, relationships, and communication patterns, for predicting social tie strength, and for detecting behavior changes and unusual consumption events. I propose a new reciprocity index to measure the level of reciprocity between users and their communication partners. The experimental results show that this approach is effective. Among other applications, this work is useful for homeland security, detection of unwanted calls (e.g., spam), telecommunication presence, and marketing. In future work I plan to analyze and study social network dynamics and evolution. digital.library.unt.edu/ark:/67531/metadc30533/
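A minimal sketch of one plausible reciprocity measure over call detail records. The abstract does not define its reciprocity index, so the ratio used here (the balance between calls placed in each direction of a pair) is an illustrative assumption only.

    from collections import Counter

    def reciprocity_index(call_records):
        """Per-pair reciprocity in [0, 1]: 1.0 means perfectly balanced calling
        in both directions, 0.0 means entirely one-sided.

        call_records: iterable of (caller, callee) tuples from CDR data.
        """
        counts = Counter(call_records)
        scores = {}
        for (a, b), ab in counts.items():
            pair = tuple(sorted((a, b)))
            if pair in scores:
                continue
            ba = counts.get((b, a), 0)
            scores[pair] = min(ab, ba) / max(ab, ba) if max(ab, ba) else 0.0
        return scores

    calls = [("alice", "bob")] * 5 + [("bob", "alice")] * 3 + [("carol", "alice")] * 4
    print(reciprocity_index(calls))   # {('alice', 'bob'): 0.6, ('alice', 'carol'): 0.0}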
Split array and scalar data cache: A comprehensive study of data cache organization.
Existing cache organizations suffer from the inability to distinguish different types of locality, and non-selectively cache all data rather than making any attempt to take special advantage of the locality type. This causes unnecessary movement of data among the levels of the memory hierarchy and increases the miss ratio. In this dissertation I propose a split data cache architecture that groups memory accesses as scalar or array references according to their inherent locality and subsequently maps each group to a dedicated cache partition. In this system, because scalar and array references no longer negatively affect each other, cache interference is diminished, delivering better performance. Further improvement is achieved by introducing a victim cache, prefetching, data flattening, and reconfigurability to tune the array and scalar caches for specific applications. The most significant contribution of my work is the introduction of a novel cache architecture for embedded microprocessor platforms. My proposed cache architecture uses reconfigurability coupled with split data caches to reduce the area and power consumed by cache memories while retaining performance gains. My results show excellent reductions in both memory size and memory access times, translating into reduced power consumption. Since there was a large reduction in miss rates at the L-1 caches, further power reduction is achieved by partially or completely shutting down the L-2 data or L-2 instruction caches. The savings in cache size resulting from these designs can be used for other processor activities, including instruction and data prefetching and branch-prediction buffers. The potential benefits of such techniques for embedded applications have been evaluated in my work. I also explore how my cache organization performs for non-numeric data structures. I propose a novel idea called "data flattening," a profile-based memory allocation technique that compresses sparsely scattered pointer data into regular contiguous memory locations, and explore the potential of my proposed split cache organization for data treated with the data flattening method. digital.library.unt.edu/ark:/67531/metadc3932/
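A toy sketch of the split-cache idea, assuming each reference arrives pre-tagged as "scalar" or "array" (e.g., by a compiler or profiler); the partition sizes and line size are arbitrary illustration values, and the model is far simpler than the dissertation's architecture.

    class SplitDataCache:
        """Toy split scalar/array data cache: each reference is looked up only
        in its own direct-mapped partition, so the two reference classes never
        evict each other."""

        def __init__(self, scalar_sets=64, array_sets=256, line_bytes=32):
            self.line = line_bytes
            self.parts = {"scalar": [None] * scalar_sets, "array": [None] * array_sets}
            self.hits = self.misses = 0

        def access(self, address, kind):
            part = self.parts[kind]
            block = address // self.line
            idx = block % len(part)
            if part[idx] == block:
                self.hits += 1
            else:
                self.misses += 1
                part[idx] = block          # direct-mapped: unconditional replacement

    cache = SplitDataCache()
    for i in range(1024):                  # streaming array references
        cache.access(0x8000 + 4 * i, "array")
        cache.access(0x100, "scalar")      # the hot scalar is never displaced
    print(cache.hits, cache.misses)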
Study and Sample Implementation of the Secure Shell Protocol (SSH)
Access: Use of this item is restricted to the UNT Community.
Security is one of the main concerns of users who need to connect to a remote computer for various purposes, such as checking e-mail or viewing files. However, in today's computer networks, privacy and delivery to the intended client are not guaranteed. If data is transmitted over the Internet or a local network as plain text, it may be captured and viewed by anyone with little technical knowledge. This may include sensitive data such as passwords. Large businesses counter this with firewalls, virtual private networks, and encrypted transmissions, at high cost. The Secure Shell protocol (SSH) provides an answer to this. SSH is a software protocol for secure communication over an insecure network. SSH not only offers authentication of hosts but also encrypts the sessions between the client and the server, and it is transparent to the end user. This Problem in Lieu of Thesis studies SSH, creates a sample secure client and server that follow SSH, and examines their performance. digital.library.unt.edu/ark:/67531/metadc4143/
A Study of Perceptually Tuned, Wavelet Based, Rate Scalable, Image and Video Compression
Access: Use of this item is restricted to the UNT Community.
In this dissertation, first, we have proposed and implemented a new perceptually tuned, wavelet based, rate scalable color image encoding/decoding system based on the human perceptual model. It builds on state-of-the-art research on embedded wavelet image compression techniques and the Contrast Sensitivity Function (CSF) for the Human Visual System (HVS), and extends this scheme to handle optimal bit allocation among multiple bands, such as Y, Cb, and Cr. Our experimental image codec shows very exciting results in compression performance and visual quality compared to the new wavelet based international still image compression standard, JPEG 2000. Our codec also shows significantly better speed performance and comparable visual quality in comparison to the best available codec for rate scalable color image compression, CSPIHT, which is based on Set Partitioning In Hierarchical Trees (SPIHT) and the Karhunen-Loeve Transform (KLT). Secondly, a novel wavelet based interframe compression scheme has been developed and put into practice. It is based on the Flexible Block Wavelet Transform (FBWT) that we have developed. FBWT based interframe compression is very efficient in both compression and speed performance. The compression performance of our video codec is compared with H.263+. At the same bit rate, our encoder, while comparable to the H.263+ scheme with a slightly lower Peak Signal-to-Noise Ratio (PSNR) value, produces a more visually pleasing result. This implementation also preserves the scalability of the wavelet embedded coding technique. Thirdly, the scheme for optimal bit allocation among color bands for still imagery has been modified and extended to accommodate the spatio-temporal sensitivity of the HVS model. The bit allocation among color bands based on Kelly's spatio-temporal CSF model is designed to achieve the perceptual optimum for human eyes. A perceptually tuned, wavelet based, rate scalable video encoding/decoding system has been designed and implemented based on this new bit allocation scheme. Finally, to present the potential applications of our rate scalable video codec, a prototype system for rate scalable video streaming over the Internet has been designed and implemented to deal with the bandwidth unpredictability of the Internet. digital.library.unt.edu/ark:/67531/metadc3074/
Survey of Approximation Algorithms for Set Cover Problem
In this thesis, I survey 11 approximation algorithms for the unweighted set cover problem. I have also implemented three of the algorithms and created a software library that stores the code I have written. The algorithms I survey are: 1. Johnson's standard greedy; 2. f-frequency greedy; 3. Goldschmidt, Hochbaum and Yu's modified greedy; 4. Halldorsson's local optimization; 5. Duh and Furer's semi-local optimization; 6. Asaf Levin's improvement to Duh and Furer; 7. Simple rounding; 8. Randomized rounding; 9. LP duality; 10. Primal-dual schema; and 11. Network flow technique. Most of the algorithms surveyed are refinements of the standard greedy algorithm. digital.library.unt.edu/ark:/67531/metadc12118/
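For reference, a minimal sketch of the first algorithm in the survey, the standard greedy heuristic; the example universe and subsets are made up for illustration.

    def greedy_set_cover(universe, subsets):
        """Johnson's standard greedy: repeatedly pick the subset covering the
        most still-uncovered elements. Gives an H(n) ~ ln(n) approximation."""
        uncovered = set(universe)
        cover = []
        while uncovered:
            best = max(subsets, key=lambda s: len(uncovered & s))
            gained = uncovered & best
            if not gained:
                raise ValueError("subsets do not cover the universe")
            cover.append(best)
            uncovered -= gained
        return cover

    universe = range(1, 11)
    subsets = [{1, 2, 3, 8}, {4, 5, 6, 7}, {8, 9, 10}, {1, 4, 7, 10}, {2, 5, 9}]
    print(greedy_set_cover(universe, subsets))   # a cover using 3 subsets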
Symplectic Integration of Nonseparable Hamiltonian Systems
Numerical methods are usually necessary to solve Hamiltonian systems since there is often no closed-form solution. By exploiting a general property of Hamiltonians, namely the symplectic property, the qualitative properties of the system may be preserved for indefinitely long integration times, because all of the integral (Poincaré) invariants are conserved. This allows for more reliable results and frequently leads to significantly shorter execution times compared with conventional methods. The resonant triad Hamiltonian with one degree of freedom is the focus of most of the numerical tests because of its difficult nature and because analytical results exist against which useful comparisons can be made. digital.library.unt.edu/ark:/67531/metadc278485/
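One standard symplectic scheme that works for nonseparable Hamiltonians is the implicit midpoint rule; the sketch below is a generic illustration (not the thesis's integrator), applied to an arbitrarily chosen nonseparable Hamiltonian H(q, p) = (1/2)(q² + 1)(p² + 1).

    def implicit_midpoint(f, y, dt, iters=8):
        """One step of the implicit midpoint rule, solved by fixed-point
        iteration; the scheme is symplectic for any (also nonseparable)
        Hamiltonian vector field f(y) = (dH/dp, -dH/dq)."""
        mid = list(y)
        for _ in range(iters):
            fm = f(mid)
            mid = [yi + 0.5 * dt * fi for yi, fi in zip(y, fm)]
        fm = f(mid)
        return [yi + dt * fi for yi, fi in zip(y, fm)]

    def field(y):                       # Hamilton's equations for the example H
        q, p = y
        return (p * (q**2 + 1), -q * (p**2 + 1))

    def H(y):
        q, p = y
        return 0.5 * (q**2 + 1) * (p**2 + 1)

    y = [1.0, 0.0]
    for _ in range(10000):
        y = implicit_midpoint(field, y, dt=0.01)
    print(H([1.0, 0.0]), H(y))          # energy error stays bounded over long runs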
System and Methods for Detecting Unwanted Voice Calls
Voice over IP (VoIP) is a key enabling technology for the migration of circuit-switched PSTN architectures to packet-based IP networks. However, this migration can succeed only if the present problems in IP networks are addressed before deploying VoIP infrastructure on a large scale. One of the important issues facing present VoIP networks is the problem of unwanted calls, commonly referred to as SPIT (spam over Internet telephony). Mostly, these SPIT calls come from unknown callers who broadcast unwanted calls, but there may be unwanted calls from legitimate and known people too; in that case, the unwantedness depends on the social proximity of the communicating parties. For detecting these unwanted calls, I propose a framework that analyzes incoming calls for unwanted behavior. The framework includes a VoIP spam detector (VSD) that analyzes incoming VoIP calls for spam behavior using trust and reputation techniques. The framework also includes a nuisance detector (ND) that proactively infers the nuisance (or reluctance of the end user) in receiving incoming calls. This inference is based on the past mutual behavior of the calling and the called party (i.e., caller and callee), the callee's presence (mood or state of mind) and tolerance for receiving voice calls from the caller, and the social closeness between the caller and the callee. The VSD and ND learn the behavior of callers over time and estimate the likelihood that a call is unwanted based on predetermined thresholds configured by the callee (or the filter administrators). These threshold values have to be updated automatically to integrate dynamic behavioral changes of the communicating parties. For updating these threshold values, I propose an automatic calibration mechanism using receiver operating characteristic (ROC) curves. The VSD and ND use this mechanism to dynamically update thresholds and optimize their detection accuracy. In addition to unwanted calls to the callees in a VoIP network, there can be unwanted traffic coming into a VoIP network that attempts to compromise VoIP network devices. Intelligent hackers can create malicious VoIP traffic to disrupt network activities. Hence, there is a need to frequently monitor the risk levels of critical network infrastructure. Towards realizing this objective, I describe a network-level risk management mechanism that prioritizes resources in a VoIP network. The prioritization scheme involves adaptive re-computation of risk levels using attack graphs and Bayesian inference techniques. All the above techniques collectively constitute a domain-level VoIP security solution. digital.library.unt.edu/ark:/67531/metadc5155/
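A minimal sketch of ROC-based threshold calibration of the kind the abstract mentions, assuming the filter outputs a spam score per call and the callee's feedback provides labels; the selection criterion (maximizing TPR minus FPR, i.e., Youden's J) is an illustrative choice, not necessarily the dissertation's.

    def calibrate_threshold(scores, labels):
        """Pick the spam-score threshold that maximizes TPR - FPR over an ROC
        sweep of candidate thresholds.

        scores: filter output per call (higher = more likely unwanted)
        labels: 1 if the callee marked the call unwanted, else 0
        """
        positives = sum(labels)
        negatives = len(labels) - positives
        best_t, best_j = None, -1.0
        for t in sorted(set(scores)):
            tp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 1)
            fp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 0)
            j = tp / positives - fp / negatives
            if j > best_j:
                best_t, best_j = t, j
        return best_t

    scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2, 0.1]
    labels = [1,   1,   0,    1,   0,   0,   1,   0]
    print(calibrate_threshold(scores, labels))   # -> 0.6 on this toy data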
Techniques for Improving Uniformity in Direct Mapped Caches
Direct-mapped caches are an attractive option for processor designers, as they combine fast lookup times with reduced complexity and area. However, direct-mapped caches are prone to higher miss rates because there is no choice of replacement candidate on a cache miss: the data residing in the indexed cache set must be evicted to the next-level cache. Another issue that inhibits cache performance is the non-uniformity of accesses exhibited by most applications: some sets are under-utilized while others receive the majority of accesses. This implies that increasing the size of caches may not lead to proportionally improved cache hit rates. Several solutions that address cache non-uniformity have been proposed in the literature over the past decade, and each proposal independently claims the benefit of reduced conflict misses. However, because the published results use different benchmarks and different experimental setups, there is no established frame of reference for comparing them. In this work we report a side-by-side comparison of these techniques. Finally, we propose an Adaptive-Partitioned cache for multi-threaded applications. This design limits inter-thread thrashing while dynamically reducing traffic to heavily accessed sets. digital.library.unt.edu/ark:/67531/metadc68025/
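A small sketch of how access non-uniformity across direct-mapped sets can be measured from an address trace; the cache geometry and the strided trace below are illustrative assumptions only.

    from collections import Counter

    def set_access_profile(addresses, num_sets=256, line_bytes=64):
        """Count how many references fall into each direct-mapped set; a skewed
        profile means a few hot sets take most of the traffic."""
        counts = Counter((addr // line_bytes) % num_sets for addr in addresses)
        return len(counts), counts.most_common(3)

    # A strided access pattern maps many addresses onto the same few sets.
    trace = [base + 16384 * i for base in (0, 64, 128) for i in range(100)]
    used, hottest = set_access_profile(trace)
    print(f"{used} of 256 sets used; hottest sets: {hottest}")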
Temporally Correct Algorithms for Transaction Concurrency Control in Distributed Databases
Access: Use of this item is restricted to the UNT Community.
Many activities are composed of temporally dependent events that must be executed in a specific chronological order. Supportive software applications must preserve these temporal dependencies. Whenever the processing of such an application includes transactions submitted to a database that is shared with other such applications, the transaction concurrency control mechanisms within the database must also preserve the temporal dependencies. A basis for preserving temporal dependencies is established by using (within the applications and databases) real-time timestamps to identify and order events and transactions. Optimistic approaches to transaction concurrency control can be undesirable in such situations, as they allow incorrect results for database read operations. Although the incorrectness is detected prior to transaction commit and the corresponding transaction(s) restarted, the impact on the application or entity that submitted the transaction can be too costly. Three transaction concurrency control algorithms are proposed in this dissertation. These algorithms are based on timestamp ordering and are designed to preserve temporal dependencies existing among data-dependent transactions. The algorithms produce execution schedules that are equivalent to temporally ordered serial schedules, where the temporal order is established by the transactions' start times. The algorithms provide this equivalence while supporting concurrency to the extent of out-of-order commits and reads. With respect to the stated concern with optimistic approaches, two of the proposed algorithms are risk-free and return only committed data-item values to read operations. Risk with the third algorithm is greatly reduced by its conservative bias. All three algorithms avoid deadlock while providing risk-free or reduced-risk operation. The performance of the algorithms is determined analytically and through experimentation. Experiments are performed using functional database management system models that implement the proposed algorithms and the well-known Conservative Multiversion Timestamp Ordering algorithm. digital.library.unt.edu/ark:/67531/metadc2743/
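For context, a minimal sketch of textbook basic timestamp ordering, the family of techniques the dissertation's algorithms build on; this is the generic scheme, not the proposed algorithms themselves.

    class TimestampOrderingManager:
        """Basic timestamp-ordering check: each data item remembers the largest
        read/write timestamps seen; an operation that arrives 'too late' aborts
        its transaction instead of violating the timestamp order."""

        def __init__(self):
            self.read_ts = {}    # item -> largest timestamp that read it
            self.write_ts = {}   # item -> largest timestamp that wrote it

        def read(self, ts, item):
            if ts < self.write_ts.get(item, 0):
                return "abort"                      # would read an overwritten value
            self.read_ts[item] = max(self.read_ts.get(item, 0), ts)
            return "ok"

        def write(self, ts, item):
            if ts < self.read_ts.get(item, 0) or ts < self.write_ts.get(item, 0):
                return "abort"                      # a younger transaction got there first
            self.write_ts[item] = ts
            return "ok"

    m = TimestampOrderingManager()
    print(m.read(10, "x"), m.write(12, "x"), m.write(11, "x"))   # ok ok abort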
A Theoretical Network Model and the Incremental Hypercube-Based Networks
The study of multicomputer interconnection networks is an important area of research in parallel processing. We introduce vertex-symmetric Hamming-group graphs as a model to design a wide variety of network topologies including the hypercube network. digital.library.unt.edu/ark:/67531/metadc277860/
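Since the hypercube is the best-known special case of the topologies the abstract mentions, here is a tiny sketch generating its edge set (nodes adjacent iff their labels differ in one bit); the Hamming-group construction itself is not shown.

    def hypercube_edges(dim):
        """Edges of the dim-dimensional hypercube: nodes are bit strings and two
        nodes are adjacent iff they differ in exactly one bit."""
        return [(v, v ^ (1 << b))
                for v in range(2 ** dim)
                for b in range(dim)
                if v < v ^ (1 << b)]

    edges = hypercube_edges(3)
    print(len(edges), edges[:4])   # 12 edges for the 3-cube; every node has degree 3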
Toward a Data-Type-Based Real Time Geospatial Data Stream Management System
The advent of sensing and communication technologies enables the generation and consumption of large volumes of streaming data. Many of these data streams are geo-referenced. Existing spatio-temporal databases and data stream management systems are not capable of handling real-time queries on spatial extents. In this thesis, I investigate several fundamental research issues toward building a data-type-based real-time geospatial data stream management system. The thesis makes contributions in the following areas: geo-stream data models, aggregation, window-based nearest neighbor operators, and query optimization strategies. The proposed geo-stream data model is based on second-order logic and multi-typed algebra. Both abstract and discrete data models are proposed and exemplified. I further propose two useful geo-stream operators, namely Region By and WNN, which abstract common aggregation and nearest neighbor queries as generalized data model constructs. Finally, I propose three query optimization algorithms based on spatial, temporal, and spatio-temporal constraints of geo-streams. I show the effectiveness of the data model through many query examples. The effectiveness and efficiency of the algorithms are validated through extensive experiments on both synthetic and real data sets. This work establishes the fundamental building blocks toward a full-fledged geo-stream database management system and has potential impact in many applications such as hazardous weather alerting and monitoring, traffic analysis, and environmental modeling. digital.library.unt.edu/ark:/67531/metadc68070/
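A minimal sketch of what a window-based nearest-neighbor operator over a geo-stream might look like; the class name, tuple layout, and time-based expiry policy are illustrative assumptions, not the thesis's WNN definition.

    import math
    from collections import deque

    class WindowNN:
        """Keep only readings whose timestamps fall inside a sliding window and
        answer nearest-neighbor queries against that window."""

        def __init__(self, window_seconds):
            self.window = window_seconds
            self.buffer = deque()              # (timestamp, x, y, payload)

        def insert(self, ts, x, y, payload):
            self.buffer.append((ts, x, y, payload))
            while self.buffer and self.buffer[0][0] < ts - self.window:
                self.buffer.popleft()          # expire tuples that left the window

        def nearest(self, qx, qy):
            return min(self.buffer,
                       key=lambda r: math.hypot(r[1] - qx, r[2] - qy),
                       default=None)

    w = WindowNN(window_seconds=60)
    w.insert(0, 1.0, 1.0, "sensor-a")
    w.insert(30, 5.0, 5.0, "sensor-b")
    w.insert(90, 2.0, 2.0, "sensor-c")         # this insert expires sensor-a
    print(w.nearest(0.0, 0.0))                 # -> the sensor-c tuple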
Towards Communicating Simple Sentence using Pictorial Representations
Language can sometimes be an impediment in communication. Whether we are talking about people who speak different languages, students who are learning a new language, or people with language disorders, the understanding of linguistic representations in a given language requires a certain amount of knowledge that not everybody has. In this thesis, we propose "translation through pictures" as a means for conveying simple pieces of information across language barriers, and describe a system that can automatically generate pictorial representations for simple sentences. Comparative experiments conducted on visual and linguistic representations of information show that a considerable amount of understanding can be achieved through pictorial descriptions, with results within a comparable range of those obtained with current machine translation techniques. Moreover, a user study conducted around the pictorial translation system reveals that users found the system to generally produce correct word/image associations, and rate the system as interactive and intelligent. digital.library.unt.edu/ark:/67531/metadc5263/
A Unifying Version Model for Objects and Schema in Object-Oriented Database System
There have been a number of different versioning models proposed. The research in this area can be divided into two categories: object versioning and schema versioning. In this dissertation, both problem domains are considered as a single unit. This dissertation describes a unifying version model (UVM) for maintaining changes to both objects and schema. UVM handles schema versioning operations by using object versioning techniques. The result is that the UVM allows the OODBMS to be much smaller than previous systems. Also, programmers need know only one set of versioning operations; thus, reducing the learning time by half. This dissertation shows that UVM is a simple but semantically sound and powerful version model for both objects and schema. digital.library.unt.edu/ark:/67531/metadc279222/
Urban surface characterization using LiDAR and aerial imagery.
Many calamities in history, such as hurricanes, tornadoes, and flooding, are proof of the large-scale impact such events have on life and the economy. Computer simulation and GIS help in modeling real-world scenarios, which assists in evacuation planning, damage assessment, assistance, and reconstruction. Achieving such simulation and modeling requires accurate classification of ground objects. One of the most significant aspects of this research is that it achieves improved classification for regions within which light detection and ranging (LiDAR) has low spatial resolution. This thesis describes a method for accurate classification of bare ground, water bodies, roads, vegetation, and structures using LiDAR data and aerial infrared imagery. The most basic step for any terrain modeling application is filtering, that is, the classification of ground and non-ground points. We present an integrated, systematic method that makes classification of terrain and non-terrain points effective. Our filtering method uses the geometric features of the triangle meshes created from LiDAR samples and calculates a confidence value for every point. Geometrically homogeneous blocks and confidence values are derived from the TIN model and the gridded LiDAR samples, and the results from the two representations are used in a classifier to determine whether a block belongs to the ground or not. Another important step is detection of water bodies, which is based on the LiDAR sample density of the region. Objects like trees and bare ground are characterized by the geometric features present in the LiDAR data and the color features in the infrared imagery. These features are fed into an SVM classifier that detects bare ground in the given region; trees are similarly extracted using another trained SVM classifier. Once we obtain bare ground and trees, roads are extracted by removing the bare ground, and structures are identified by the properties of the non-ground segments. Experiments were conducted using LiDAR samples and infrared imagery from the city of New Orleans, and we evaluated the influence of different parameters on the classification. Water bodies were extracted successfully using density measures. Experiments showed that the fusion of geometric properties and confidence levels resulted in efficient classification of ground and non-ground regions. Classification of vegetation using SVM was promising and effective using features such as height variation, HSV, and angle. It is demonstrated that our methods successfully classify a complex urban area with high-rise buildings using LiDAR data. digital.library.unt.edu/ark:/67531/metadc12196/
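A minimal sketch of the SVM classification step, assuming scikit-learn is available; the feature vector (height variation, hue, saturation, value, surface-normal angle), the class labels, and the tiny training set are illustrative stand-ins for the features described in the abstract, not the thesis's data or model.

    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    X_train = [
        [0.1, 0.30, 0.2, 0.8, 5.0],    # bare ground: flat, low height variation
        [0.2, 0.28, 0.3, 0.7, 8.0],
        [2.5, 0.25, 0.6, 0.4, 40.0],   # vegetation: tall, rough surface normals
        [3.1, 0.27, 0.7, 0.3, 55.0],
    ]
    y_train = ["ground", "ground", "tree", "tree"]

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    clf.fit(X_train, y_train)
    print(clf.predict([[2.8, 0.26, 0.65, 0.35, 50.0]]))   # expected: ['tree']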
User Modeling Tools for Virtual Architecture
As the use of virtual environments (VEs) becomes more widespread, user needs are becoming a more significant consideration in those environments. In order to adapt to the needs of the user, a system should be able to infer user interests and goals. I developed an architecture for user modeling that infers users' interests in a VE by monitoring their actions. In this paper, I discuss the architecture and the virtual environment that was created to test it. The architecture employs sensors to keep track of all the users' actions, data structures that store a record of significant events that have occurred in the environment, and a rule base. The rule base continually monitors the data collected from the sensors, the world state, and the event history in order to update the user goal inferences. These inferences can then be used to modify the flow of events within a VE. digital.library.unt.edu/ark:/67531/metadc4169/
Using Extended Logic Programs to Formalize Commonsense Reasoning
In this dissertation, we investigate how commonsense reasoning can be formalized by using extended logic programs. We first use extended logic programs to formalize inheritance hierarchies with exceptions, adopting McCarthy's simple abnormality formalism to express uncertain knowledge. In our representation, not only can credulous reasoning be performed, but the ambiguity-blocking and ambiguity-propagating forms of skeptical inheritance reasoning are also simulated. In response to the anomalous extension problem, we explore and discover that the intuition underlying commonsense reasoning is a kind of forward reasoning. This unidirectional nature of the reasoning is exploited, as in many reformulations of the Yale shooting problem, to exclude the undesired conclusion. We then identify defeasible conclusions in our representation based on the syntax of extended logic programs. A similar idea is also applied to other formalizations of commonsense reasoning to achieve the same purpose. digital.library.unt.edu/ark:/67531/metadc278054/
Using Normal Deduction Graphs in Common Sense Reasoning
This investigation proposes a powerful formalization of common sense knowledge based on function-free normal deduction graphs (NDGs) which form a powerful tool for deriving Horn and non-Horn clauses without functions. Such formalization allows common sense reasoning since it has the ability to handle not only negative but also incomplete information. digital.library.unt.edu/ark:/67531/metadc277922/
Using Reinforcement Learning in Partial Order Plan Space
Access: Use of this item is restricted to the UNT Community.
Partial order planning is an important approach that solves planning problems without completely specifying the orderings between the actions in the plan. This property provides greater flexibility in executing plans, making partial order planners a preferred choice over other planning methodologies. However, in order to find partially ordered plans, partial order planners search in plan space rather than in the space of world states, and an uninformed search in plan space leads to poor efficiency. In this thesis, I discuss applying a reinforcement learning method, called the First-visit Monte Carlo method, to partial order planning in order to design agents that need neither training data nor heuristics but are still able to make informed decisions in plan space based on experience. Communicating effectively with the agent is crucial in reinforcement learning. I address how this task was accomplished in plan space and present the results from an evaluation on a blocks world test bed. digital.library.unt.edu/ark:/67531/metadc5232/
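A generic sketch of first-visit Monte Carlo evaluation, the learning rule named in the abstract; the episode representation (states as partial-plan labels, a reward on completion) is an illustrative assumption, not the thesis's formulation of plan space.

    from collections import defaultdict

    def first_visit_mc(episodes, gamma=0.9):
        """First-visit Monte Carlo evaluation: for each episode, only the first
        occurrence of a state contributes that episode's return to its estimate.
        An episode is a list of (state, reward) pairs."""
        returns = defaultdict(list)
        for episode in episodes:
            G, rets = 0.0, []
            for state, reward in reversed(episode):      # accumulate returns back-to-front
                G = reward + gamma * G
                rets.append((state, G))
            seen = set()
            for state, G in reversed(rets):              # forward pass: first visits only
                if state not in seen:
                    seen.add(state)
                    returns[state].append(G)
        return {s: sum(v) / len(v) for s, v in returns.items()}

    episodes = [
        [("empty-plan", 0), ("partial-plan", 0), ("complete-plan", 1)],
        [("empty-plan", 0), ("dead-end", -1)],
    ]
    print(first_visit_mc(episodes))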
Visualization of Surfaces and 3D Vector Fields
Visualization of trivariate functions and vector fields with three components in scientific computation is still a hard problem in computer graphics. Researchers often build their own visualization packages for special purposes, and although some general-purpose packages exist (MATLAB, Vis5D), they all require extensive user experience in setting parameters in order to generate images. We present a simple package that produces simplified but productive images of 3-D vector fields. We used this method to render the magnetic field and current as solutions of the Ginzburg-Landau equations on a 3-D domain. digital.library.unt.edu/ark:/67531/metadc3235/
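A minimal sketch (assuming NumPy and Matplotlib are available) of drawing a 3-D vector field with arrow glyphs; the circulating field below is an arbitrary illustration, not the Ginzburg-Landau solution rendered in the thesis.

    import numpy as np
    import matplotlib.pyplot as plt

    x, y, z = np.meshgrid(np.linspace(-1, 1, 6),
                          np.linspace(-1, 1, 6),
                          np.linspace(-1, 1, 6))
    u, v, w = -y, x, 0.2 * np.ones_like(z)        # a simple circulating field

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.quiver(x, y, z, u, v, w, length=0.15, normalize=True)
    ax.set_title("3-D vector field glyphs")
    plt.show()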
Voting Operating System (VOS)
Access: Use of this item is restricted to the UNT Community.
The electronic voting machine (EVM) plays a very important role in countries where government officials are elected into office. Throughout the world, no operating system exists that is tailored to the specific requirements of the EVM. Existing EVM technology depends upon the various operating systems currently available, thus ignoring the basic needs of the system. Basic requirements are compromised in order to develop the system on the basis of an already available operating system, leaving considerable scope for error. It is necessary to know the specific details of the particular device for which the operating system is being developed. In this document, I evaluate existing EVMs and identify flaws and shortcomings. I propose a solution for a new operating system that meets the specific requirements of the EVM, calling it the Voting Operating System (VOS, pronounced 'voice'). The identification technique can be simplified by using fingerprint technology that determines the identity of a person based on two fingerprints. I also discuss the various parts of the operating system that have to be implemented to meet the basic requirements of an EVM, including the memory manager, process manager, and file system of the proposed operating system. digital.library.unt.edu/ark:/67531/metadc4648/
Web-based airline ticket booking system.
An online airline ticket booking system is one of the essential applications of e-commerce. With the development of the Internet and security technology, more and more people are beginning to shop online, which is more convenient and personal than traditional methods. The goal of this system is to let people purchase airline tickets easily. The system is written in Java. Chapter 1 introduces some basic concepts of the technologies used in this system. Chapter 2 shows how the database and the system are designed. Chapter 3 presents the logic of the Web site. Chapter 4 presents the interface of the system. Chapter 5 describes the platform of the system. digital.library.unt.edu/ark:/67531/metadc4556/
Web Services for Libraries
Access: Use of this item is restricted to the UNT Community.
Library information systems use different software applications and automated systems to gain access to distributed information. Rapid application development, changes made to existing software applications, and development of new software on different platforms can make it difficult for library information systems to interoperate. Web services are used to offer better information access and retrieval solutions, making them more cost effective for libraries. This research focuses on how web services are implemented with standard protocols such as SOAP, WSDL, and UDDI, using different programming languages and platforms to achieve interoperability for libraries. It also shows how libraries can make use of this new technology. Since web services built on different platforms can interact with each other, libraries can access information with more efficiency and flexibility. digital.library.unt.edu/ark:/67531/metadc4361/
A Wireless Traffic Surveillance System Using Video Analytics
Video surveillance systems have been commonly used in transportation systems to support traffic monitoring, speed estimation, and incident detection. However, there are several challenges in developing and deploying such systems, including high development and maintenance costs, bandwidth bottlenecks on long-range links, and a lack of advanced analytics. In this thesis, I leverage current wireless, video camera, and analytics technologies and present a wireless traffic monitoring system. I first present an overview of the system, then describe the site investigation and several test links with different hardware/software configurations to demonstrate the effectiveness of the system. The system development process was documented to provide guidelines for future development. Furthermore, I propose a novel speed-estimation analytics algorithm that takes into consideration roads with slope angles. I prove the correctness of the algorithm theoretically and validate its effectiveness experimentally. The experimental results on both synthetic and real data sets show that the algorithm is more accurate than the baseline algorithm 80% of the time, and on average the accuracy improvement of speed estimation is over 3.7%, even for very small slope angles. digital.library.unt.edu/ark:/67531/metadc68005/
XML-Based Agent Scripts and Inference Mechanisms
Natural language understanding has been a persistent challenge to researchers in various computer science fields, in a number of applications ranging from user support systems to entertainment and online teaching. A long-term goal of the Artificial Intelligence field is to implement mechanisms that enable computers to emulate human dialogue. The ALICEbots recently developed by the A.L.I.C.E. foundation are virtual agents that use AIML scripts, a subset of XML, as the underlying pattern database for question answering. Their goal is to enable pattern-based, stimulus-response knowledge content to be served, received, and processed over the Web, or offline, in a manner similar to HTML and XML. In this thesis, we describe a system that converts AIML scripts to Prolog clauses and reuses them as part of a knowledge processor. The inference mechanism developed in this thesis is able to successfully match the input pattern with our clause database even if words are missing. We also emulate the pattern deduction algorithm of the original logic deduction mechanism. Our rules, compatible with Semantic Web standards, bring structure to the meaningful content of Web pages and support interactive content retrieval using natural language. digital.library.unt.edu/ark:/67531/metadc4288/
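A minimal sketch of turning AIML category elements into Prolog-style facts; the sample category and the emitted clause format are illustrative assumptions, not the thesis's actual translation scheme.

    import xml.etree.ElementTree as ET

    AIML = """
    <aiml>
      <category>
        <pattern>WHAT IS YOUR NAME</pattern>
        <template>My name is Alice.</template>
      </category>
    </aiml>
    """

    def aiml_to_clauses(aiml_text):
        """Emit one Prolog-style fact per AIML category, e.g.
        pattern([what, is, your, name], 'My name is Alice.')."""
        root = ET.fromstring(aiml_text)
        clauses = []
        for cat in root.iter("category"):
            words = cat.findtext("pattern", "").strip().lower().split()
            template = cat.findtext("template", "").strip()
            clauses.append("pattern([%s], '%s')." % (", ".join(words), template))
        return clauses

    for clause in aiml_to_clauses(AIML):
        print(clause)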