You limited your search to:

  Partner: UNT Libraries
  Decade: 2010-2019
  Degree Discipline: Computer Science
  Degree Level: Doctoral
  Collection: UNT Theses and Dissertations
3D Reconstruction Using Lidar and Visual Images

Date: December 2012
Creator: Duraisamy, Prakash
Description: In this research, multi-perspective image registration using LiDAR and visual images was considered. 2D-3D image registration is a difficult task because it requires the extraction of different semantic features from each modality. This problem is solved in three parts. The first step involves detection and extraction of common features from each of the data sets. The second step consists of associating the common features between the two different modalities. Traditional methods use lines or orthogonal corners as common features. The third step consists of building the projection matrix. Many existing methods use the global positioning system (GPS) or an inertial navigation system (INS) for an initial estimate of the camera pose. However, the approach discussed herein does not use GPS, INS, or any such devices for the initial estimate; hence the model can be used in places like the lunar surface or Mars where GPS and INS are not available. A variation of the method is also described, which does not require strong features from both images but rather uses intensity gradients in the image. This can be useful when one image does not have strong features (such as lines) or there are too many extraneous features.
Contributing Partner: UNT Libraries
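The final registration step mentioned above, building a projection matrix once 2D-3D correspondences are available, is commonly solved with the Direct Linear Transform. Below is a minimal sketch of that generic step in Python (using NumPy), offered only as an illustration; it is not the author's feature-matching pipeline, and the function names are my own.

    # Minimal sketch: estimate a 3x4 camera projection matrix from matched
    # image (2D) / LiDAR (3D) points with the Direct Linear Transform (DLT).
    # Generic illustration only; not the dissertation's registration method.
    import numpy as np

    def dlt_projection_matrix(points_3d, points_2d):
        """points_3d: (n, 3) LiDAR points; points_2d: (n, 2) pixels; n >= 6."""
        A = []
        for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
            A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
            A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
        # The solution is the right singular vector with the smallest
        # singular value, reshaped into a 3x4 matrix.
        _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
        P = vt[-1].reshape(3, 4)
        return P / P[-1, -1]

    def project(P, point_3d):
        """Reproject a LiDAR point through the estimated matrix."""
        x = P @ np.append(point_3d, 1.0)
        return x[:2] / x[2]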
Autonomic Failure Identification and Diagnosis for Building Dependable Cloud Computing Systems

Date: May 2014
Creator: Guan, Qiang
Description: The increasingly popular cloud-computing paradigm provides on-demand access to computing and storage with the appearance of unlimited resources. Users are given access to a variety of data and software utilities to manage their work. Users rent virtual resources and pay for only what they use. In spite of the many benefits that cloud computing promises, the lack of dependability in shared virtualized infrastructures is a major obstacle to its wider adoption, especially for mission-critical applications. Virtualization and multi-tenancy increase system complexity and dynamicity, and they introduce new sources of failure that degrade the dependability of cloud computing systems. To assure cloud dependability, in my dissertation research, I develop autonomic failure identification and diagnosis techniques that are crucial for understanding emergent, cloud-wide phenomena and for self-managing resource burdens, enhancing cloud availability and productivity. We study runtime cloud performance data collected from a cloud test-bed as well as traces from production cloud systems. We define cloud signatures that include the metrics most relevant to failure instances. We exploit profiled cloud performance data in both the time and frequency domains to identify anomalous cloud behaviors and leverage cloud metric subspace analysis to automate the diagnosis of observed failures. We implement a prototype of the ...
Contributing Partner: UNT Libraries
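The abstract above describes identifying anomalous behavior from profiled performance metrics and diagnosing failures with metric subspace analysis. As a hedged illustration of that general idea only, here is a small PCA residual (subspace) test in Python; the metrics, threshold, and frequency-domain analysis used in the dissertation are not reproduced.

    # Hedged sketch: flag anomalous samples of cloud performance metrics by
    # their energy outside a "normal" PCA subspace learned from failure-free
    # runs. Illustrative only; not the dissertation's actual signatures.
    import numpy as np

    def fit_normal_subspace(train_metrics, variance_kept=0.95):
        """train_metrics: (samples, metrics) gathered during failure-free runs."""
        mean = train_metrics.mean(axis=0)
        centered = train_metrics - mean
        _, s, vt = np.linalg.svd(centered, full_matrices=False)
        explained = np.cumsum(s ** 2) / np.sum(s ** 2)
        k = int(np.searchsorted(explained, variance_kept)) + 1
        return mean, vt[:k]                      # basis of the normal subspace

    def residual_score(sample, mean, basis):
        """Squared prediction error: variation not captured by the subspace."""
        centered = sample - mean
        projected = basis.T @ (basis @ centered)
        return float(np.sum((centered - projected) ** 2))

Samples whose residual score exceeds a threshold learned from failure-free data would be treated as candidate failure instances for further diagnosis.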
Boosting for Learning From Imbalanced, Multiclass Data Sets

Access: Use of this item is restricted to the UNT Community.
Date: December 2013
Creator: Abouelenien, Mohamed
Description: In many real-world applications, it is common to have an uneven number of examples among multiple classes. This data imbalance usually complicates the learning process, especially for the minority classes, and results in deteriorated performance. Boosting methods have been proposed to handle the imbalance problem, but they require extended training time and diversity among the classifiers of the ensemble to achieve improved performance. Additionally, extending the boosting method to handle multi-class data sets is not straightforward. Examples of applications that suffer from imbalanced multi-class data can be found in face recognition, where tens of classes exist, and in capsule endoscopy, which suffers from massive imbalance between the classes. This dissertation introduces RegBoost, a new boosting framework to address imbalanced, multi-class problems. The method applies a weighted stratified sampling technique and incorporates a regularization term that accommodates multi-class data sets and automatically determines the error bound of each base classifier. The regularization parameter penalizes the classifier when it misclassifies instances that were correctly classified in the previous iteration, and it additionally reduces the bias towards the majority classes. Experiments are conducted using 12 diverse data sets with moderate to high imbalance ratios. The results demonstrate superior performance of the proposed method compared ...
Contributing Partner: UNT Libraries
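RegBoost itself is only summarized above, but the two ingredients it names, boosting and weighted stratified sampling, can be illustrated with a generic SAMME-style multi-class boosting loop. The sketch below (Python, scikit-learn decision trees as base learners) is a baseline of that family; the dissertation's regularization term and error bound are not reproduced.

    # Hedged sketch: multi-class boosting (SAMME-style) with per-class
    # weighted stratified sampling. A generic baseline for illustration;
    # this is NOT RegBoost and omits its regularization term.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def stratified_boost(X, y, rounds=20, seed=0):
        rng = np.random.default_rng(seed)
        n, classes = len(y), np.unique(y)
        w = np.full(n, 1.0 / n)
        models, alphas = [], []
        for _ in range(rounds):
            # Stratified sampling: an equal share per class, drawn according
            # to the boosting weights within that class.
            idx = np.concatenate([
                rng.choice(np.where(y == c)[0], size=max(n // len(classes), 1),
                           p=w[y == c] / w[y == c].sum())
                for c in classes])
            clf = DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx])
            miss = clf.predict(X) != y
            err = np.clip(np.dot(w, miss), 1e-10, 1 - 1e-10)
            alpha = np.log((1 - err) / err) + np.log(len(classes) - 1)
            w = w * np.exp(alpha * miss)
            w /= w.sum()
            models.append(clf)
            alphas.append(alpha)
        return models, alphas, classes

    def boosted_predict(models, alphas, classes, X):
        votes = np.zeros((len(X), len(classes)))
        for clf, a in zip(models, alphas):
            votes[np.arange(len(X)), np.searchsorted(classes, clf.predict(X))] += a
        return classes[np.argmax(votes, axis=1)]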
Design and Implementation of Large-Scale Wireless Sensor Networks for Environmental Monitoring Applications

Date: May 2010
Creator: Yang, Jue
Description: Environmental monitoring represents a major application domain for wireless sensor networks (WSN). However, despite significant advances in recent years, many challenging issues still have to be addressed to exploit the full potential of the emerging WSN technology. In this dissertation, we introduce the design and implementation of low-power wireless sensor networks for long-term, autonomous, and near-real-time environmental monitoring applications. We have developed an out-of-the-box solution consisting of a suite of software, protocols, and algorithms that provides reliable data collection with extremely low power consumption. Two wireless sensor networks based on the proposed solution have been deployed in remote field stations to monitor soil moisture along with other environmental parameters. As part of the ever-growing environmental monitoring cyberinfrastructure, these networks have been integrated into the Texas Environmental Observatory system for long-term operation. Environmental measurement and network performance results are presented to demonstrate the capability, reliability, and energy efficiency of the network.
Contributing Partner: UNT Libraries
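The abstract stays at the system level, but the core low-power idea, duty-cycled sensing with batched radio uplinks, can be sketched in a few lines. The loop below is a hypothetical node program with the sensor read and radio send stubbed out; it is not the dissertation's protocol suite or its Texas Environmental Observatory integration.

    # Hedged sketch: a duty-cycled soil-moisture node loop. Hardware calls are
    # stubbed; intervals are illustrative placeholders, not deployed values.
    import random
    import time

    SAMPLE_INTERVAL_S = 15 * 60        # wake every 15 minutes (illustrative)
    BATCH_SIZE = 4                     # batch uplinks to amortize radio cost

    def read_soil_moisture():
        return round(random.uniform(5.0, 45.0), 2)      # stand-in for an ADC read

    def radio_send(batch):
        print(f"uplink {len(batch)} samples: {batch}")  # stand-in for the radio

    def node_main(cycles=8, sleep_s=0.01):
        buffer = []
        for _ in range(cycles):
            buffer.append(read_soil_moisture())
            if len(buffer) >= BATCH_SIZE:
                radio_send(buffer)      # radio is on only briefly
                buffer.clear()
            time.sleep(sleep_s)         # stands in for SAMPLE_INTERVAL_S of deep sleep

    if __name__ == "__main__":
        node_main()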
A Framework for Analyzing and Optimizing Regional Bio-Emergency Response Plans

Date: December 2010
Creator: Schneider, Tamara
Description: The presence of naturally occurring and man-made public health threats necessitates the design and implementation of mitigation strategies so that an adequate response can be provided in a timely manner. Since multiple variables, such as geographic properties, resource constraints, and government-mandated time-frames, must be accounted for, computational methods provide the necessary tools to develop contingency response plans while respecting the underlying data and assumptions. A typical response scenario involves the placement of points of dispensing (PODs) in the affected geographic region to supply vaccines or medications to the general public. Computational tools aid in the analysis of such response plans, as well as in the strategic placement of PODs, so that feasible response scenarios can be developed. Due to the sensitivity of bio-emergency response plans, geographic information, such as POD locations, must be kept confidential. The generation of synthetic geographic regions allows for the development of emergency response plans on non-sensitive data, as well as for the study of the effects of individual geographic parameters. Further, synthetic representations of geographic regions allow results to be published and evaluated by the scientific community. This dissertation presents methodology for the analysis of bio-emergency response plans, methods for plan optimization, as well as methodology ...
Contributing Partner: UNT Libraries
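One concrete analysis such a framework automates is checking how well a set of POD locations covers the surrounding population. As a hedged illustration only, the sketch below greedily places PODs on a tiny synthetic region so that as much population as possible falls within a travel radius; the data and the heuristic are placeholders, not the dissertation's optimization methods.

    # Hedged sketch: greedy POD placement on a synthetic region. Coordinates,
    # populations, and the radius are invented for illustration.
    import math

    def covers(site, block, radius_km):
        return math.dist(site, block["xy"]) <= radius_km

    def greedy_pod_placement(candidate_sites, population_blocks, num_pods, radius_km):
        chosen, uncovered = [], list(population_blocks)
        for _ in range(num_pods):
            best = max(candidate_sites,
                       key=lambda s: sum(b["pop"] for b in uncovered
                                         if covers(s, b, radius_km)))
            chosen.append(best)
            uncovered = [b for b in uncovered if not covers(best, b, radius_km)]
        return chosen, sum(b["pop"] for b in uncovered)   # sites + uncovered population

    blocks = [{"xy": (0, 0), "pop": 500}, {"xy": (6, 1), "pop": 800},
              {"xy": (2, 5), "pop": 300}, {"xy": (7, 7), "pop": 400}]
    sites = [(1, 1), (6, 2), (7, 6)]
    print(greedy_pod_placement(sites, blocks, num_pods=2, radius_km=3.0))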
Framework for Evaluating Dynamic Memory Allocators Including a New Equivalence Class Based Cache-conscious Allocator

Date: August 2013
Creator: Janjusic, Tomislav
Description: Software applications' performance is hindered by a variety of factors, but most notably by the well-known CPU-memory speed gap (often called the memory wall). This results in the CPU sitting idle, waiting for data to be brought from memory into the processor caches. The addressing used by caches causes non-uniform accesses to the various cache sets. The non-uniformity has several causes, including how different objects are accessed by the code and where the data objects are located in memory. Memory allocators determine where dynamically created objects are placed, thus defining their addresses and their mapping to cache locations. It is therefore important to evaluate how different allocators behave with respect to the localities of the created objects. Most allocators use a single attribute of an object, its size, in making allocation decisions. Additional attributes, such as the placement with respect to other objects or to a specific cache area, may lead to better use of cache memories. In this dissertation, we propose and implement a framework that allows for the development and evaluation of new memory allocation techniques. At the root of the framework is a memory tracing tool called Gleipnir, which provides very detailed information about every memory access, and relates it ...
Contributing Partner: UNT Libraries
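The non-uniform mapping of addresses to cache sets that the abstract refers to is easy to make concrete. The sketch below groups object addresses into equivalence classes by the cache set they map to; the cache geometry is an illustrative assumption, not a parameter taken from the dissertation or from Gleipnir.

    # Hedged sketch: map addresses to cache sets and group objects into
    # equivalence classes by set. Geometry (32 KiB, 8-way, 64-byte lines)
    # is assumed for illustration.
    from collections import defaultdict

    LINE_SIZE = 64
    NUM_SETS = 32 * 1024 // (LINE_SIZE * 8)      # 8-way cache: 64 sets

    def cache_set(address):
        return (address // LINE_SIZE) % NUM_SETS

    def equivalence_classes(object_addresses):
        """Group object start addresses by the cache set they map to."""
        classes = defaultdict(list)
        for addr in object_addresses:
            classes[cache_set(addr)].append(hex(addr))
        return dict(classes)

    # Objects spaced a multiple of NUM_SETS * LINE_SIZE apart all collide in
    # one set; that is the non-uniformity a cache-conscious allocator avoids.
    addrs = [0x10000 + i * NUM_SETS * LINE_SIZE for i in range(4)] + [0x10040]
    print(equivalence_classes(addrs))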
GPS CaPPture: a System for GPS Trajectory Collection, Processing, and Destination Prediction

Date: May 2012
Creator: Griffin, Terry W.
Description: In the United States, smartphone ownership surpassed 69.5 million in February 2011, with a large portion of those users (20%) downloading applications (apps) that enhance the usability of a device by adding functionality. A large percentage of apps are written specifically to utilize the geographical position of a mobile device. One of the prime factors in developing location prediction models is the use of historical data to train such a model. With larger sets of training data, prediction algorithms become more accurate; however, the use of historical data can quickly become a downfall if the GPS stream is not collected or processed correctly. Inaccurate, incomplete, or even improperly interpreted historical data can lead to the inability to develop accurately performing prediction algorithms. As GPS chipsets become the standard in the ever-increasing number of mobile devices, the opportunity for the collection of GPS data increases remarkably. The goal of this study is to build a comprehensive system that addresses the following challenges: (1) collection of GPS data streams in a manner such that the data is highly usable and contains fewer errors; (2) processing and reduction of the collected data in order to prepare it and ...
Contributing Partner: UNT Libraries
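A common starting point for destination prediction from processed trajectories is a first-order Markov model over visited places. The sketch below shows that baseline only; the place labels are hypothetical and the CaPPture system's own models are not reproduced.

    # Hedged sketch: first-order Markov destination prediction from
    # (origin, destination) pairs mined from GPS trajectories. Illustrative
    # baseline; not the CaPPture system's prediction model.
    from collections import Counter, defaultdict

    def train_transitions(trips):
        counts = defaultdict(Counter)
        for origin, dest in trips:
            counts[origin][dest] += 1
        return counts

    def predict_destination(counts, current_place):
        options = counts.get(current_place)
        if not options:
            return None
        dest, n = options.most_common(1)[0]
        return dest, n / sum(options.values())   # most likely destination, probability

    trips = [("home", "work"), ("home", "work"), ("home", "gym"), ("work", "home")]
    model = train_transitions(trips)
    print(predict_destination(model, "home"))    # ('work', 0.666...)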
Investigating the Extractive Summarization of Literary Novels

Date: December 2011
Creator: Ceylan, Hakan
Description: Due to the vast amount of information we are faced with, summarization has become a critical necessity of everyday human life. Given that a large fraction of the electronic documents available online and elsewhere consist of short texts such as Web pages, news articles, scientific reports, and others, the focus of natural language processing techniques to date has been on the automation of methods targeting short documents. We are witnessing a change, however: an increasingly large number of books are becoming available in electronic format, which means that the need for language processing techniques able to handle very large documents such as books is becoming increasingly important. This thesis addresses the problem of summarization of novels, which are long and complex literary narratives. While there is a significant body of research on the task of automatic text summarization, most of this work has been concerned with the summarization of short documents, with a particular focus on news stories. Novels, however, are different in both length and genre, and consequently different summarization techniques are required. This thesis attempts to close this gap by analyzing a new domain for summarization, and by building unsupervised and supervised systems ...
Contributing Partner: UNT Libraries
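For contrast with the novel-length problem the thesis studies, a typical unsupervised extractive baseline for short documents scores sentences by the frequency of their content words and keeps the top-ranked ones. The sketch below shows only that generic baseline, not the thesis's systems for novels.

    # Hedged sketch: frequency-based extractive summarization, a standard
    # unsupervised baseline for short documents. The stopword list is
    # minimal and illustrative.
    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "that"}

    def extractive_summary(text, num_sentences=2):
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        words = [w for w in re.findall(r"[a-z']+", text.lower())
                 if w not in STOPWORDS]
        freq = Counter(words)

        def score(sentence):
            tokens = [w for w in re.findall(r"[a-z']+", sentence.lower())
                      if w not in STOPWORDS]
            return sum(freq[t] for t in tokens) / max(len(tokens), 1)

        ranked = set(sorted(sentences, key=score, reverse=True)[:num_sentences])
        # Present the extracted sentences in their original order.
        return " ".join(s for s in sentences if s in ranked)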
Measuring Semantic Relatedness Using Salient Encyclopedic Concepts

Date: August 2011
Creator: Hassan, Samer
Description: While pragmatics, through its integration of situational awareness and real-world knowledge, offers a high level of analysis suitable for the genuine interpretation of natural dialogue, semantics, on the other hand, represents a lower yet more tractable and affordable linguistic level of analysis given current technologies. Generally, the understanding of semantic meaning in the literature has revolved around the famous quote "You shall know a word by the company it keeps." In this thesis we investigate the role of context constituents in decoding the semantic meaning of the surrounding context; specifically, we probe the role of salient concepts, defined as content-bearing expressions that afford encyclopedic definitions, as a suitable source of semantic clues for an unambiguous interpretation of context. Furthermore, we integrate this world knowledge in building a new and robust unsupervised semantic model and apply it to measure semantic relatedness between textual pairs, whether they are words, sentences, or paragraphs. Moreover, we explore the abstraction of semantics across languages and utilize our findings to build a novel multi-lingual semantic relatedness model exploiting information acquired from various languages. We demonstrate the effectiveness and the superiority of our mono-lingual and multi-lingual models through a comprehensive set of evaluations on specialized ...
Contributing Partner: UNT Libraries
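The core operation of a concept-based relatedness model can be illustrated by representing each text as a weighted vector over encyclopedic concepts and comparing the vectors with cosine similarity. The tiny word-to-concept index below is invented for illustration; it stands in for the encyclopedia-derived mapping the dissertation builds.

    # Hedged sketch: concept-vector relatedness with cosine similarity.
    # The WORD_TO_CONCEPTS index is a made-up stand-in for a real
    # encyclopedia-derived mapping of words to salient concepts.
    import math
    from collections import Counter

    WORD_TO_CONCEPTS = {
        "bank":  {"Bank_(finance)": 0.8, "River_bank": 0.4},
        "money": {"Bank_(finance)": 0.9, "Currency": 0.7},
        "river": {"River_bank": 0.9, "Hydrology": 0.6},
    }

    def concept_vector(text):
        vec = Counter()
        for word in text.lower().split():
            for concept, weight in WORD_TO_CONCEPTS.get(word, {}).items():
                vec[concept] += weight
        return vec

    def relatedness(text_a, text_b):
        a, b = concept_vector(text_a), concept_vector(text_b)
        dot = sum(a[c] * b[c] for c in a)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    print(relatedness("bank money", "river bank"))   # related, but not identical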
Modeling and Analysis of Next Generation 9-1-1 Emergency Medical Dispatch Protocols

Date: August 2013
Creator: Gupta, Neeraj Kant
Description: Emergency Medical Dispatch Protocols are guidelines that a 9-1-1 dispatcher uses to evaluate the nature of an emergency, the resources to send, and the help to be provided to the 9-1-1 caller. The current dispatch protocols are based on voice-only calls, but the Next Generation 9-1-1 (NG9-1-1) architecture will allow multimedia emergency calls. In this thesis I analyze and model the Emergency Medical Dispatch Protocols for the NG9-1-1 architecture and identify various technical aspects that can improve them. The devices (smartphones) at the caller end have advanced to a point where they can be used to send and receive video, pictures, and text, and they contain embedded sensors that can be used for an initial diagnosis of the injured person. There is a need to improve the human-computer (smartphone) interface to take advantage of this technology so that callers can easily make use of the various features available to them. The dispatchers at the 9-1-1 call center can use these new protocols to improve the quality and the response time of the dispatch. They will have the capability to use multiple media streams to interact with the caller and the first responders. The specific contributions in this thesis include developing applications that use smartphone ...
Contributing Partner: UNT Libraries
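To make the multimedia-call idea concrete, the sketch below assembles a hypothetical caller-side payload that bundles location, media references, and on-device sensor readings for dispatcher triage. The field names are invented for illustration and do not follow any NG9-1-1 standard or the thesis's applications.

    # Hedged sketch: a hypothetical NG9-1-1 caller payload. All field names
    # and values are illustrative assumptions, not a standardized format.
    import json
    import time

    def build_emergency_payload(location, media_urls, sensor_readings, note=""):
        return {
            "timestamp": int(time.time()),
            "location": {"lat": location[0], "lon": location[1]},
            "media": media_urls,            # e.g. live video or picture references
            "sensors": sensor_readings,     # e.g. heart rate, fall detection
            "caller_note": note,
        }

    payload = build_emergency_payload(
        location=(33.2075, -97.1526),
        media_urls=["rtsp://example.invalid/live/cam0"],
        sensor_readings={"heart_rate_bpm": 118, "fall_detected": True},
        note="Person collapsed, conscious but dizzy.",
    )
    print(json.dumps(payload, indent=2))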