Partner: UNT Libraries
Department: Department of Computer Science and Engineering
Collection: UNT Theses and Dissertations
A Dual Dielectric Approach for Performance Aware Reduction of Gate Leakage in Combinational Circuits

Date: May 2006
Creator: Mukherjee, Valmiki
Description: Design of systems in the low-end nanometer domain has introduced new dimensions in power consumption and dissipation in CMOS devices. With continued and aggressive scaling using low-thickness SiO2 for the transistor gates, gate leakage due to gate-oxide direct tunneling current has emerged as the major component of leakage in CMOS circuits. Providing a solution to gate oxide leakage has therefore become one of the key concerns in achieving low-power, high-performance CMOS VLSI circuits. In this thesis, a new approach involving dual dielectrics of dual thicknesses (DKDT) is proposed for reducing both ON- and OFF-state gate leakage. It is claimed that the simultaneous use of SiON and SiO2, each with multiple thicknesses, is a better approach for gate leakage reduction than the conventional use of a single gate dielectric (SiO2), possibly with multiple thicknesses. An algorithm is developed for DKDT assignment that minimizes the overall leakage of a circuit without compromising performance. Extensive experiments on ISCAS'85 benchmarks using 45nm technology show that the proposed approach can reduce gate leakage by as much as 98% (89.5% on average) without degrading performance.
Contributing Partner: UNT Libraries
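
The DKDT assignment algorithm itself is not spelled out in the abstract, so the following Python sketch is only a hypothetical greedy illustration of the idea: gates with timing slack trade a delay penalty for a lower-leakage dielectric/thickness option, while critical-path gates keep the fastest SiO2 option. All names and numbers are invented.

    # Hypothetical greedy sketch of dual-dielectric, dual-thickness (DKDT)
    # assignment; leakage/delay numbers below are invented for illustration.

    # (dielectric, thickness) -> (leakage units, delay units)
    OPTIONS = {
        ("SiO2", "thin"):  (100.0, 1.00),  # fastest, leakiest
        ("SiO2", "thick"): (40.0,  1.10),
        ("SiON", "thin"):  (15.0,  1.15),
        ("SiON", "thick"): (5.0,   1.30),  # slowest, least leaky
    }
    BASE_DELAY = OPTIONS[("SiO2", "thin")][1]

    def assign_dkdt(gates, slack):
        """Give each gate the lowest-leakage option whose delay penalty
        still fits within that gate's timing slack."""
        assignment = {}
        for g in gates:
            feasible = [(leak, opt) for opt, (leak, delay) in OPTIONS.items()
                        if delay - BASE_DELAY <= slack[g]]
            assignment[g] = min(feasible)[1]
        return assignment

    # g1 sits on the critical path (no slack); g2 and g3 have slack to spend
    print(assign_dkdt(["g1", "g2", "g3"], {"g1": 0.0, "g2": 0.12, "g3": 0.35}))
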
Using Reinforcement Learning in Partial Order Plan Space

Access: Use of this item is restricted to the UNT Community.
Date: May 2006
Creator: Ceylan, Hakan
Description: Partial order planning is an important approach that solves planning problems without completely specifying the orderings between the actions in the plan. This property provides greater flexibility in executing plans, making partial order planners a preferred choice over other planning methodologies. However, to find partially ordered plans, partial order planners search in plan space rather than in the space of world states, and an uninformed search in plan space leads to poor efficiency. In this thesis, I discuss applying a reinforcement learning method, the first-visit Monte Carlo method, to partial order planning in order to design agents that need no training data or heuristics yet can still make informed decisions in plan space based on experience. Communicating effectively with the agent is crucial in reinforcement learning; I address how this was accomplished in plan space and present results from an evaluation on a blocks world test bed.
Contributing Partner: UNT Libraries
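
The first-visit Monte Carlo method named above is a standard RL algorithm and can be sketched generically. The episode encoding below (lists of (state, reward) pairs) is an illustrative assumption, not the thesis's plan-space representation.

    # First-visit Monte Carlo policy evaluation: estimate V(s) by
    # averaging the returns that follow the first visit to s per episode.
    from collections import defaultdict

    def first_visit_mc(episodes, gamma=0.9):
        returns = defaultdict(list)
        for episode in episodes:
            first_visit = {}
            for t, (s, _) in enumerate(episode):
                first_visit.setdefault(s, t)
            g = 0.0
            for t in reversed(range(len(episode))):
                s, r = episode[t]      # r = reward after acting in s
                g = r + gamma * g      # discounted return from step t
                if first_visit[s] == t:
                    returns[s].append(g)
        return {s: sum(v) / len(v) for s, v in returns.items()}

    episodes = [[("a", 0), ("b", 0), ("goal", 1)],
                [("a", 0), ("goal", 1)]]
    print(first_visit_mc(episodes))   # V("a") averages the two returns
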
VLSI Architecture and FPGA Prototyping of a Secure Digital Camera for Biometric Application

Date: August 2006
Creator: Adamo, Oluwayomi Bamidele
Description: This thesis presents a secure digital camera (SDC) that inserts biometric data into images found in forms of identification such as the newly proposed electronic passport. Putting biometric data in passports, however, makes the data vulnerable to theft and raises privacy concerns. An effective way to combat unauthorized access such as skimming (obtaining data from a passport whose owner did not willingly submit it) or eavesdropping (intercepting information as it moves from the chip to the reader) is the judicious use of watermarking and encryption at the source end of the biometric process, in hardware such as digital cameras or scanners. To address these issues, a novel approach and its architecture in the framework of a digital camera, conceptualized as an SDC, are presented. The SDC inserts biometric data into the passport image with the aid of watermarking and encryption processes. The VLSI (very large scale integration) architecture of the SDC's functional units, such as the watermarking and encryption units, is presented, along with results of the hardware implementation of the Rijndael advanced encryption standard (AES) and a discrete cosine transform (DCT) based visible and invisible watermarking algorithm. The prototype chip can carry out simultaneous encryption and watermarking, which ...
Contributing Partner: UNT Libraries
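
As a rough illustration of DCT-domain invisible watermarking of the kind the SDC's watermarking unit performs, the sketch below nudges one mid-frequency DCT coefficient per 8x8 block. The block size, coefficient position, and strength alpha are assumptions made for this sketch, not the thesis's hardware algorithm.

    # Toy DCT-domain watermark: one bit per 8x8 block, embedded by
    # shifting a mid-frequency coefficient up (bit 1) or down (bit 0).
    import numpy as np
    from scipy.fft import dctn, idctn

    def embed_bits(image, bits, alpha=8.0):
        out = image.astype(float).copy()
        h, w = out.shape
        blocks = [(r, c) for r in range(0, h - 7, 8)
                         for c in range(0, w - 7, 8)]
        for bit, (r, c) in zip(bits, blocks):
            coeffs = dctn(out[r:r+8, c:c+8], norm="ortho")
            coeffs[3, 4] += alpha if bit else -alpha  # mid-frequency slot
            out[r:r+8, c:c+8] = idctn(coeffs, norm="ortho")
        return out

    img = np.random.randint(0, 256, (32, 32))
    marked = embed_bits(img, [1, 0, 1, 1])
    print(np.max(np.abs(marked - img)))   # perturbation stays small
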
Resource Management in Wireless Networks

Date: August 2006
Creator: Arepally, Anurag
Description: A local call admission control (CAC) algorithm for third-generation wireless networks was designed and implemented, allowing network throughput to be simulated for different spreading factors and various mobility scenarios. A global CAC algorithm is also implemented and used as a benchmark, since it is inherently optimized: it yields the best possible performance but at intensive computational complexity. The optimized local CAC algorithm achieves performance similar to the global CAC algorithm at a fraction of the computational cost. The design of a dynamic channel assignment algorithm for IEEE 802.11 wireless systems is also presented; channels are assigned dynamically to minimize the interference generated by neighboring access points on a reference access point. Analysis of the dynamic channel assignment algorithm shows an improvement by a factor of 4 over the default setting of all access points using the same channel, resulting in significantly higher network throughput.
Contributing Partner: UNT Libraries
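
A minimal sketch of the minimal-interference idea, assuming a simplified model in which each access point sums the interference power of already-assigned neighbors on each channel; the topology and powers are invented.

    # Greedy minimal-interference channel assignment for 802.11 APs.
    CHANNELS = [1, 6, 11]   # non-overlapping 2.4 GHz channels

    def assign_channels(aps, interference):
        """interference[a][b] = power AP b induces at AP a."""
        assignment = {}
        for ap in aps:
            def load(ch):
                return sum(interference[ap][other]
                           for other, c in assignment.items() if c == ch)
            assignment[ap] = min(CHANNELS, key=load)
        return assignment

    aps = ["AP1", "AP2", "AP3", "AP4"]
    uniform = {a: {b: 0.0 if a == b else 1.0 for b in aps} for a in aps}
    print(assign_channels(aps, uniform))
    # APs spread across 1, 6, 11 before any channel is reused
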
Modeling Infectious Disease Spread Using Global Stochastic Field Simulation

Date: August 2006
Creator: Venkatachalam, Sangeeta
Description: The susceptible-infective-removed (SIR) model and its derivatives are the classic mathematical models for the study of infectious diseases in epidemiology. To model and simulate epidemics of an infectious disease, a global stochastic field simulation paradigm (GSFS) is proposed, which incorporates geographically and demographically based interactions. The interaction measure between regions is a function of population density and geographical distance, and has been extended to include demographic and migratory constraints. The progression of disease under GSFS is analyzed and, using the geographic information systems (GIS) gravity model for interactions, exhibits behavior similar to the SIR model. GSFS addresses the limitation of SIR-like models, which assume a homogeneous population with uniform mixing: the GSFS model is oriented to heterogeneous populations and can incorporate interactions based on geography, demography, environment, and migration patterns. Disease progression can thus be modeled at higher levels of fidelity, facilitating optimal deployment of public health resources for the prevention, control, and surveillance of infectious diseases.
Contributing Partner: UNT Libraries
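
For reference, the classical SIR baseline that GSFS is compared against can be stepped in a few lines; beta, gamma, and the population below are arbitrary illustrative values.

    # Discrete-time SIR with homogeneous mixing:
    #   dS = -beta*S*I/N,  dI = beta*S*I/N - gamma*I,  dR = gamma*I
    def sir(s, i, r, beta=0.3, gamma=0.1, steps=100):
        n = s + i + r
        history = [(s, i, r)]
        for _ in range(steps):
            new_inf = beta * s * i / n
            new_rec = gamma * i
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            history.append((s, i, r))
        return history

    peak = max(sir(999, 1, 0), key=lambda state: state[1])
    print(f"peak infectives: {peak[1]:.0f}")

GSFS replaces the uniform-mixing assumption implicit in this loop with region-to-region coupling, for example a gravity-style interaction that grows with the two populations and falls off with distance.
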
Group-EDF: A New Approach and an Efficient Non-Preemptive Algorithm for Soft Real-Time Systems

Date: August 2006
Creator: Li, Wenming
Description: Hard real-time systems in robotics, space and military missions, and control devices are specified with stringent and critical time constraints. Soft real-time applications arising from multimedia, telecommunications, Internet web services, and games, on the other hand, are specified with more lenient constraints. Real-time systems can also be distinguished by implementation into preemptive and non-preemptive systems. In preemptive systems, tasks are often preempted by higher-priority tasks; non-preemptive systems are gaining interest for implementing soft real-time applications on multithreaded platforms. In this dissertation, I propose a new algorithm that uses a two-level scheduling strategy for scheduling non-preemptive soft real-time tasks. The goal is to improve the success ratio of the well-known earliest deadline first (EDF) approach when the load on the system is very high, and to improve overall performance in both underloaded and overloaded conditions. The approach, known as group-EDF (gEDF), is based on dynamically grouping tasks whose deadlines are very close to each other and using a shortest job first (SJF) technique to schedule tasks within a group. I believe that grouping tasks with similar deadlines dynamically and utilizing secondary criteria, such as minimizing the total execution time, can lead to new and more ...
Contributing Partner: UNT Libraries
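
A hedged sketch of the gEDF dispatch rule described above: order ready tasks by deadline, group those whose deadlines are close to the earliest one, and run the shortest job in that group first. The specific grouping window used here is an assumption; the dissertation defines its own group boundary.

    # Pick the next task under group-EDF (gEDF) on a non-preemptive CPU.
    def gedf_pick(tasks, window=0.1):
        """tasks: list of (deadline, exec_time, name) tuples."""
        ready = sorted(tasks)                    # EDF order by deadline
        d_min = ready[0][0]
        group = [t for t in ready if t[0] <= d_min * (1 + window)]
        return min(group, key=lambda t: t[1])    # SJF within the group

    tasks = [(10.0, 5.0, "A"), (10.5, 1.0, "B"), (20.0, 0.5, "C")]
    print(gedf_pick(tasks))  # B: deadline close to A's, but much shorter
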
Natural Language Interfaces to Databases

Date: December 2006
Creator: Chandra, Yohan
Description: Natural language interfaces to databases (NLIDB) are systems that aim to bridge the gap between the languages used by humans and computers, automatically translating natural language sentences into database queries. This thesis proposes a novel graph-based approach to NLIDB. The system starts by collecting as much information as possible from existing databases and sentences, transforming this information into a knowledge base. Given a new question, the system uses this knowledge to analyze and translate the sentence into its corresponding database query statement. The graph-based NLIDB system uses English as the natural language, a relational database model, and SQL as the formal query language. In experiments with natural language questions run against a large database of U.S. geography, the system showed good performance compared to the state of the art in the field.
Contributing Partner: UNT Libraries
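
The abstract stays at a high level, so the following is only a toy keyword-to-schema mapping showing the flavor of translating a question into SQL; the lexicon and schema are invented here, and the thesis's graph-based matching is far more general.

    # Toy NL-to-SQL translation by matching question words to a schema.
    SCHEMA = {"state": ["name", "population", "area"]}
    LEXICON = {"population": "population", "area": "area",
               "state": "state", "states": "state"}

    def to_sql(question):
        words = [w.strip("?").lower() for w in question.split()]
        table = next(LEXICON[w] for w in words if LEXICON.get(w) in SCHEMA)
        cols = [LEXICON[w] for w in words
                if LEXICON.get(w) in SCHEMA.get(table, [])]
        return f"SELECT {', '.join(cols) or '*'} FROM {table}"

    print(to_sql("What is the population of each state?"))
    # SELECT population FROM state
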
An Approach Towards Self-Supervised Classification Using Cyc

Date: December 2006
Creator: Coursey, Kino High
Description: Because manual knowledge entry by human knowledge engineers takes so long, it is desirable to find methods for automatically acquiring knowledge about the world from online information. In this work I examine using the Cyc ontology to guide the creation of naïve Bayes classifiers that provide knowledge about items described in Wikipedia articles. Given an initial set of Wikipedia articles, the system uses the ontology to create positive and negative training sets for the classifiers in each category; the order in which classifiers are generated and used to test articles is also guided by the ontology. The research shows that a system can be built that uses statistical text classification to extract information from an ad hoc information source like Wikipedia for use in a formal semantic ontology like Cyc. Benefits and limitations of the system are discussed, along with future work.
Contributing Partner: UNT Libraries
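
A minimal naïve Bayes text classifier of the kind the system trains per category; the tiny training snippets below are invented stand-ins for the ontology-selected positive and negative Wikipedia article sets.

    # Naive Bayes with add-one smoothing over bag-of-words counts.
    import math
    from collections import Counter

    def train(pos_docs, neg_docs):
        model = {}
        for label, docs in (("pos", pos_docs), ("neg", neg_docs)):
            words = Counter(w for d in docs for w in d.lower().split())
            model[label] = (words, sum(words.values()), len(words), len(docs))
        return model

    def classify(model, doc):
        n_docs = sum(m[3] for m in model.values())
        scores = {}
        for label, (words, total, vocab, count) in model.items():
            score = math.log(count / n_docs)          # class prior
            for w in doc.lower().split():
                score += math.log((words[w] + 1) / (total + vocab + 1))
            scores[label] = score
        return max(scores, key=scores.get)

    model = train(["dogs are mammals", "cats are mammals"],
                  ["python is a language", "java is a language"])
    print(classify(model, "are dogs animals"))        # -> pos
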
Comparative Study of RSS-Based Collaborative Localization Methods in Wireless Sensor Networks

Date: December 2006
Creator: Koneru, Avanthi
Description: In this thesis, two collaborative localization techniques are studied: multidimensional scaling (MDS) and the maximum likelihood estimator (MLE). The two are integrated serially into a new location estimation method: an estimate is first obtained using MDS, and MLE is then employed to fine-tune the MDS solution. The method was investigated through various simulation and experimental studies. The simulations address important issues including the effects of sensor node density, reference node density, and different deployment strategies for reference nodes. In the experimental study, a path loss model for indoor environments is developed by determining environment-specific parameters from measurement data; this empirical path loss model is then employed in the analysis and simulation of the performance of the collaborative localization techniques.
Contributing Partner: UNT Libraries
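
A sketch of the serial two-stage idea: classical MDS produces initial coordinates from pairwise distances, and a least-squares gradient refinement (standing in here for the MLE stage) fine-tunes them. Noise-free distances are assumed purely for illustration; in the thesis the distances derive from RSS via the empirical path loss model.

    # Stage 1: classical MDS from a distance matrix via double centering.
    # Stage 2: gradient descent on squared stress to refine coordinates.
    import numpy as np

    def classical_mds(d, dims=2):
        n = d.shape[0]
        j = np.eye(n) - np.ones((n, n)) / n
        b = -0.5 * j @ (d ** 2) @ j
        vals, vecs = np.linalg.eigh(b)
        idx = np.argsort(vals)[::-1][:dims]
        return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

    def refine(x, d, iters=100, lr=0.01):
        for _ in range(iters):
            diff = x[:, None, :] - x[None, :, :]
            dist = np.linalg.norm(diff, axis=-1) + np.eye(len(x))
            err = dist - d - np.eye(len(x))   # zero on the diagonal
            x = x - lr * (err[..., None] * diff / dist[..., None]).sum(axis=1)
        return x

    pts = np.array([[0.0, 0], [1, 0], [0, 1], [1, 1]])
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    est = refine(classical_mds(d), d)
    # recovered pairwise distances match the input (near-zero residual)
    print(np.round(np.linalg.norm(est[:, None] - est[None, :], axis=-1) - d, 3))
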
CLUE: A Cluster Evaluation Tool

Date: December 2006
Creator: Parker, Brandon S.
Description: Modern high-performance computing depends on parallel processing systems. Most current benchmarks report only high-level computational throughput metrics, which may suffice for single-processor systems but can misrepresent the true capability of parallel systems. A new benchmark is therefore proposed: CLUE (Cluster Evaluator) uses a cellular automata algorithm to evaluate the scalability of parallel processing machines, and uses algorithmic variations to evaluate individual system components' impact on the overall serial fraction and efficiency. CLUE is not a replacement for other performance-centric benchmarks; rather, it shows the scalability of a system, provides metrics that reveal where overall performance can be improved, enables better comparison among different parallel systems than existing benchmarks, and can diagnose where a particular parallel system can be optimized.
Contributing Partner: UNT Libraries
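
One standard way for a scalability benchmark to report an experimentally determined serial fraction is the Karp-Flatt metric, sketched below; the timings are invented for illustration.

    # Karp-Flatt: e = (1/speedup - 1/p) / (1 - 1/p), from Amdahl's law.
    def serial_fraction(t1, tp, p):
        speedup = t1 / tp
        return (1 / speedup - 1 / p) / (1 - 1 / p)

    t1 = 100.0                              # single-processor time (s)
    for p, tp in [(2, 55.0), (4, 32.5), (8, 21.3)]:
        print(p, round(serial_fraction(t1, tp, p), 3))
    # A roughly constant e (~0.1 here) points to inherently serial work;
    # e growing with p would point to communication overhead instead.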