431 Matching Results

2-D image segmentation using minimum spanning trees

Description: This paper presents a new algorithm for partitioning a gray-level image into connected homogeneous regions. The novelty of this algorithm lies in the fact that by constructing a minimum spanning tree representation of a gray-level image, it reduces a region partitioning problem to a minimum spanning tree partitioning problem, and hence reduces the computational complexity of the region partitioning problem. The tree-partitioning algorithm, in essence, partitions a minimum spanning tree into subtrees, representing different homogeneous regions, by minimizing the sum of variations of gray levels over all subtrees under the constraints that each subtree should have at least a specified number of nodes, and two adjacent subtrees should have significantly different average gray levels. Two (faster) heuristic implementations are also given for large-scale region partitioning problems. Test results show that the segmentations are satisfactory and insensitive to noise.
Date: September 1995
Creator: Xu, Y. & Uberbacher, E. C.
Partner: UNT Libraries Government Documents Department
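
The MST-partitioning idea in the entry above can be sketched compactly. The following is a minimal illustration, not the authors' code: build a 4-connected grid graph weighted by gray-level differences, extract the minimum spanning tree with SciPy, then sweep its edges in increasing weight order, keeping two subtrees separate only when both are large enough and their average gray levels differ significantly. Function and parameter names are hypothetical.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_segment(img, min_size=20, mean_diff=10.0):
    """Sketch: segment a gray-level image by partitioning its MST."""
    h, w = img.shape
    idx = np.arange(h * w).reshape(h, w)
    # horizontal and vertical grid edges, weighted by |gray difference|
    rows = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
    cols = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
    g = img.ravel().astype(float)
    wts = np.abs(g[rows] - g[cols]) + 1e-9   # keep zero-weight edges visible
    mst = minimum_spanning_tree(
        coo_matrix((wts, (rows, cols)), shape=(h * w, h * w))).tocoo()

    parent = np.arange(h * w)                # union-find over pixels
    size = np.ones(h * w, int)
    total = g.copy()                         # per-region gray-level sum

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]    # path halving
            a = parent[a]
        return a

    for e in np.argsort(mst.data):           # Kruskal-style sweep of MST edges
        a, b = find(mst.row[e]), find(mst.col[e])
        if a == b:
            continue
        big = size[a] >= min_size and size[b] >= min_size
        distinct = abs(total[a] / size[a] - total[b] / size[b]) > mean_diff
        if not (big and distinct):            # keep edge: merge the subtrees
            parent[b] = a
            size[a] += size[b]
            total[a] += total[b]
    return np.array([find(i) for i in range(h * w)]).reshape(h, w)
```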

3GPP Long Term Evolution LTE Scheduling

Description: Future generation cellular networks are expected to deliver an omnipresent broadband access network for an endlessly increasing number of subscribers. Long Term Evolution (LTE) represents a significant milestone on the path to 4G cellular networks. A key feature of LTE is its enhanced Radio Resource Management (RRM) mechanism, implemented to improve system performance. The structure of LTE networks was simplified by reducing the number of nodes in the core network, and the design of the radio protocol architecture is quite unique. To achieve high data rates in LTE, the 3rd Generation Partnership Project (3GPP) selected Orthogonal Frequency Division Multiplexing (OFDM) as the downlink scheme; for the uplink, the appropriate scheme is Single-Carrier Frequency Division Multiple Access (SC-FDMA), owing to the peak-to-average power ratio (PAPR) constraint. LTE packet scheduling plays a primary role within RRM in improving the system's data rate and supporting the various QoS requirements of mobile services. The major function of the LTE packet scheduler is to assign Physical Resource Blocks (PRBs) to mobile User Equipment (UE). In our work, we propose a packet scheduler algorithm that acts based on the number of UEs attached to the eNodeB. To evaluate the proposed algorithm, we assumed two different scenarios based on the number of UEs. When the number of UEs is lower than the number of PRBs, the UEs with the highest Channel Quality Indicator (CQI) are assigned PRBs; otherwise, the scheduler assigns PRBs based on a given proportional fairness metric. The eNodeB's throughput increased when the proposed algorithm was implemented.
Date: December 2013
Creator: Alotaibi, Sultan
Partner: UNT Libraries
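
The two-regime rule described in this entry is simple enough to sketch. The following is an illustration under stated assumptions (the CQI matrix and average-throughput vector are hypothetical inputs; the thesis's exact fairness metric may differ):

```python
import numpy as np

def schedule_prbs(cqi, avg_tput, n_prbs):
    """Sketch of the two-scenario scheduler: cqi is (n_ue, n_prbs),
    avg_tput is (n_ue,) historical throughput. Returns PRB -> UE."""
    n_ue = cqi.shape[0]
    if n_ue < n_prbs:
        # few UEs: give each PRB to the UE with the highest CQI on it
        return np.argmax(cqi, axis=0)
    # many UEs: proportional-fairness metric, instantaneous rate
    # (proxied by CQI) divided by past average throughput
    pf = cqi / avg_tput[:, None]
    return np.argmax(pf, axis=0)

# toy example: 3 UEs competing for 2 PRBs -> proportional-fairness regime
assignment = schedule_prbs(np.array([[12, 7], [9, 11], [10, 10]]),
                           np.array([5.0, 2.0, 8.0]), n_prbs=2)
```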

Accurate Joint Detection from Depth Videos towards Pose Analysis

Description: Joint detection is vital for characterizing human pose and serves as a foundation for a wide range of computer vision applications such as physical training, health care, and entertainment. This dissertation proposes two methods to detect joints in the human body for pose analysis. The first method detects joints by combining a body model with automatic feature point detection. The body model maps the detected extreme points to the corresponding body parts and locates the positions of implicit joints; the dominant joints are then detected after the implicit joints and extreme points are located by a shortest-path-based method. The main contribution of this work is a hybrid framework for detecting joints on the human body that is robust to different body shapes and proportions, pose variations, and occlusions. Another contribution is the idea of using geodesic features of the human body to build a model that guides human pose detection and estimation. The second method detects joints by first segmenting the human body into parts and then focusing the detection algorithm on each limb. The advantage of applying body part segmentation first is that it narrows the search area for each joint, so the joint detection method can provide more stable and accurate results.
Date: May 2018
Creator: Kong, Longbo
Partner: UNT Libraries
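
The geodesic, shortest-path-based extreme-point idea mentioned above admits a small sketch. This is an illustration only (the graph construction and helper name are assumptions, not the dissertation's code): given a graph over the body surface built from a depth frame, repeatedly pick the node geodesically farthest from those chosen so far; head, hands, and feet tend to emerge first.

```python
import numpy as np
from scipy.sparse.csgraph import dijkstra

def farthest_extreme_points(adj, source, k=5):
    """adj: sparse adjacency matrix of the body-surface graph (edges
    between neighboring 3-D points); source: e.g. the body centroid."""
    dist = dijkstra(adj, directed=False, indices=source)
    extremes = []
    for _ in range(k):
        nxt = int(np.argmax(dist))            # farthest remaining node
        extremes.append(nxt)
        # shrink distances so the next pick is far from ALL chosen points
        dist = np.minimum(dist, dijkstra(adj, directed=False, indices=nxt))
    return extremes
```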

Adaptive dimension reduction for clustering high dimensional data

Description: It is well known that for high-dimensional data clustering, standard algorithms such as EM and K-means are often trapped in a local minimum. Many initialization methods have been proposed to tackle this problem, but with only limited success. In this paper the authors propose a new approach: repeated dimension reductions such that K-means or EM is performed only in very low dimensions. Cluster membership is utilized as a bridge between the reduced-dimensional subspace and the original space, providing flexibility and ease of implementation. Clustering analysis performed on highly overlapped Gaussians, DNA gene expression profiles, and internet newsgroups demonstrates the effectiveness of the proposed algorithm.
Date: October 1, 2002
Creator: Ding, Chris; He, Xiaofeng; Zha, Hongyuan & Simon, Horst
Partner: UNT Libraries Government Documents Department
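
The alternation described above, with cluster membership bridging the subspace and the original space, can be sketched as follows. This is an illustration, not the authors' exact algorithm: start with PCA, cluster with K-means in the low-dimensional space, then use the resulting labels to fit a supervised projection (LDA here) and repeat until the memberships stabilize.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def adaptive_dr_kmeans(X, k, n_iter=5):
    """Sketch: repeated dimension reduction + K-means, bridged by labels."""
    Z = PCA(n_components=k - 1).fit_transform(X)
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(Z)
    for _ in range(n_iter):
        # cluster memberships guide the next low-dimensional subspace
        Z = LinearDiscriminantAnalysis(n_components=k - 1).fit_transform(X, labels)
        new = KMeans(n_clusters=k, n_init=10).fit_predict(Z)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels
```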

Adaptive Power Management for Autonomic Resource Configuration in Large-scale Computer Systems

Description: In order to run and manage resource-intensive high-performance applications, large-scale computing and storage platforms have been evolving rapidly in various domains in both academia and industry. The energy expended to operate and maintain these cloud computing infrastructures is a major factor in the overall profit and efficiency of most cloud service providers. Moreover, to mitigate the environmental damage of excessive carbon dioxide emissions, the amount of power consumed by enterprise-scale data centers should be constrained. Generally speaking, there exists a trade-off between power consumption and application performance in large-scale computing systems, and how to balance these two factors has become an important topic for researchers and engineers in the cloud and HPC communities. Minimizing power usage while satisfying Service Level Agreements has therefore become one of the most desirable objectives in cloud computing research and implementation. Since the fundamental feature of a cloud computing platform is hosting workloads with a variety of characteristics in a consolidated and on-demand manner, it is essential to explore the inherent relationship between power usage and machine configurations. With an understanding of these relationships, researchers can develop effective power management policies that optimize productivity by balancing power usage and system performance. In this dissertation, we develop an autonomic power-aware system management framework for large-scale computer systems. We propose a series of techniques including coarse-grain power profiling, VM power modeling, power-aware resource auto-configuration, and a full-system power usage simulator. These techniques help us understand the characteristics of power consumption of various system components. Based on these techniques, we are able to test various job scheduling strategies and develop resource management approaches that enhance the systems' power efficiency.
Date: August 2015
Creator: Zhang, Ziming
Partner: UNT Libraries
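
As a small illustration of the power-modeling ingredient mentioned above, the sketch below fits the classic linear server power model from coarse-grain profiling samples. It is illustrative only; the dissertation's models are richer, and the sample data is made up.

```python
import numpy as np

def fit_power_model(util, power):
    """Least-squares fit of P(u) = P_idle + (P_peak - P_idle) * u,
    from utilization samples in [0, 1] and measured power in watts."""
    A = np.column_stack([np.ones_like(util), util])
    (p_idle, slope), *_ = np.linalg.lstsq(A, power, rcond=None)
    return p_idle, p_idle + slope            # P_idle, P_peak

# toy profiling data: power grows roughly linearly with CPU utilization
u = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
p = np.array([95.0, 130.0, 168.0, 205.0, 240.0])
p_idle, p_peak = fit_power_model(u, p)
```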

Advanced Power Amplifiers Design for Modern Wireless Communication

Description: Modern wireless communication systems use spectrally efficient modulation schemes to achieve high data rates. These schemes generally produce signals with a high peak-to-average power ratio (PAPR). Moreover, the development of next-generation wireless communication systems requires power amplifiers that operate over a wide frequency band or multiple frequency bands to support different applications. These wide-band and multi-band solutions lead to reductions in both the size and cost of the whole system. This dissertation presents several advanced power amplifier solutions that provide wide-band and multi-band operation with efficiency improvement at power back-off.
Date: August 2015
Creator: Shao, Jin
Partner: UNT Libraries
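
The PAPR figure that drives the amplifier requirements above is easy to compute. A minimal sketch (the OFDM-like test signal is illustrative):

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# OFDM-like example: an IFFT of many random QPSK subcarriers exhibits a
# high PAPR, which back-off-efficient amplifiers must accommodate
sym = np.exp(1j * np.pi / 2 * np.random.randint(0, 4, 1024))
papr = papr_db(np.fft.ifft(sym))
```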

Agent 2003 Conference on Challenges in Social Simulation

Description: Welcome to the Proceedings of the fourth in a series of agent simulation conferences cosponsored by Argonne National Laboratory and The University of Chicago. Agent 2003 is the second conference in which three Special Interest Groups from the North American Association for Computational Social and Organizational Science (NAACSOS) have been involved in planning the program--Computational Social Theory; Simulation Applications; and Methods, Toolkits and Techniques. The theme of Agent 2003, Challenges in Social Simulation, is especially relevant, as there seems to be no shortage of such challenges. Agent simulation has been applied with increasing frequency to social domains for several decades, and its promise is clear and increasingly visible. Like any nascent scientific methodology, however, it faces a number of problems or issues that must be addressed in order to progress. These challenges include: (1) Validating models relative to the social settings they are designed to represent; (2) Developing agents and interactions simple enough to understand but sufficiently complex to do justice to the social processes of interest; (3) Bridging the gap between empirically spare artificial societies and naturally occurring social phenomena; (4) Building multi-level models that span processes across domains; (5) Promoting a dialog among theoretical, qualitative, and empirical social scientists and area experts, on the one hand, and mathematical and computational modelers and engineers, on the other; (6) Using that dialog to facilitate substantive progress in the social sciences; and (7) Fulfilling the aspirations of users in business, government, and other application areas, while recognizing and addressing the preceding challenges. Although this list hardly exhausts the challenges the field faces, it does identify topics addressed throughout the presentations of Agent 2003. Agent 2003 is part of a much larger process in which new methods and techniques are applied to difficult social issues. Among the resources that give us the ...
Date: January 1, 2003
Creator: Clemmons, Margaret
Partner: UNT Libraries Government Documents Department

Algorithm and simulation development in support of response strategies for contamination events in air and water systems.

Description: Chemical/Biological/Radiological (CBR) contamination events pose a considerable threat to our nation's infrastructure, especially in large internal facilities, external flows, and water distribution systems. Because physical security can only be enforced to a limited degree, deployment of early warning systems is being considered. However, to achieve reliable and efficient functionality, several complex questions must be answered: (1) where should sensors be placed, (2) how can sparse sensor information be efficiently used to determine the location of the original intrusion, (3) what are the model and data uncertainties, (4) how should these uncertainties be handled, and (5) how can our algorithms and forward simulations be sufficiently improved to achieve real-time performance? This report presents the results of a three-year algorithmic and application development effort to support the identification, mitigation, and risk assessment of CBR contamination events. The main thrust of this investigation was to develop (1) computationally efficient algorithms for strategically placing sensors, (2) methods for identifying contamination events from sparse observations, (3) characterization of uncertainty through accurate demand forecasts and investigation of uncertain simulation model parameters, (4) risk assessment capabilities, and (5) reduced-order modeling methods. The development effort focused on water distribution systems, large internal facilities, and outdoor areas.
Date: January 1, 2006
Creator: Waanders, Bart Van Bloemen
Partner: UNT Libraries Government Documents Department
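
Question (1) above, where to place sensors, is often attacked greedily. The following is a hedged sketch of that idea, not the report's algorithm; the coverage input (which candidate location detects which simulated events) is a hypothetical stand-in for the report's simulations.

```python
def greedy_sensor_placement(coverage, n_sensors):
    """Repeatedly add the candidate location that covers the most
    not-yet-covered events. coverage: dict location -> set of event ids."""
    chosen, covered = [], set()
    for _ in range(n_sensors):
        best = max(coverage, key=lambda s: len(coverage[s] - covered))
        if not coverage[best] - covered:
            break                        # nothing new can be covered
        chosen.append(best)
        covered |= coverage[best]
    return chosen

# toy example: 4 candidate junctions, 6 simulated intrusion events
cov = {"J1": {1, 2, 3}, "J2": {3, 4}, "J3": {4, 5, 6}, "J4": {2, 3}}
print(greedy_sensor_placement(cov, n_sensors=2))   # e.g. ['J1', 'J3']
```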

Algorithm Optimizations in Genomic Analysis Using Entropic Dissection

Description: In recent years, the collection of genomic data has skyrocketed, and databases of genomic data are growing at a faster rate than ever before. Although many computational methods have been developed to interpret these data, they tend to struggle with the ever-increasing file sizes being produced and fail to take advantage of the advances in multi-core processors through parallel processing. In some instances, loss of accuracy has been a necessary trade-off to allow faster computation. This thesis discusses one such algorithm and how it was changed to allow larger input file sizes and to reduce the time required to achieve a result without sacrificing accuracy. An information-entropy-based algorithm was used as a basis to demonstrate these techniques. The algorithm dissects the distinctive patterns underlying genomic data efficiently, requiring no a priori knowledge, and thus is applicable in a variety of biological research applications. This research describes how parallel processing and object-oriented programming techniques were used to process larger files in less time and achieve a more accurate result from the algorithm. Through object-oriented techniques, the maximum allowable input file size was significantly increased from 200 MB to 2,000 MB. Parallel processing allowed the program to finish processing data in less than half the time of the sequential version. The accuracy of the algorithm was improved by reducing data loss throughout the algorithm. Finally, adding user-friendly options enabled the program to handle user requests more effectively and further customize the logic used within the algorithm.
Date: August 2015
Creator: Danks, Jacob R.
Partner: UNT Libraries
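
A minimal sketch of the two ingredients named above, windowed information entropy plus multi-core parallelism, assuming a simple string input (names and window size are illustrative, not the thesis's implementation):

```python
import numpy as np
from collections import Counter
from multiprocessing import Pool

def window_entropy(seq):
    """Shannon entropy (bits) of the symbol distribution in one window."""
    counts = np.array(list(Counter(seq).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def parallel_entropies(genome, window=1000, workers=4):
    """Split a long genomic string into windows and score them across
    worker processes instead of sequentially."""
    windows = [genome[i:i + window] for i in range(0, len(genome), window)]
    with Pool(workers) as pool:
        return pool.map(window_entropy, windows)

if __name__ == "__main__":      # guard required when using multiprocessing
    print(parallel_entropies("ACGT" * 5000)[:3])
```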

Amesos 1.0 reference guide.

Description: This document describes the main functionalities of the Amesos package, version 1.0. Amesos, available as part of Trilinos 4.0, provides an object-oriented interface to several serial and parallel sparse direct solver libraries for the solution of linear systems of equations A X = B, where A is a real sparse distributed matrix defined as an Epetra_RowMatrix object, and X and B are defined as Epetra_MultiVector objects. Amesos provides a common look-and-feel across several direct solvers, insulating the user from each package's details, such as matrix and vector formats and data distribution.
Date: May 1, 2004
Creator: Sala, Marzio & Stanley, Ken D. (Oberlin, OH)
Partner: UNT Libraries Government Documents Department
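
The design idea above, one interface over heterogeneous direct-solver backends, can be illustrated in a language-neutral way. This Python sketch with a SciPy stand-in is purely illustrative of the facade pattern; it is not the Amesos C++ API, which operates on Epetra objects.

```python
from scipy.sparse.linalg import splu

class DirectSolver:
    """Common look-and-feel: factor once, solve many, regardless of
    which underlying direct-solver package does the work."""
    def __init__(self, A):
        self._lu = splu(A.tocsc())    # backend-specific factorization
    def solve(self, B):
        return self._lu.solve(B)      # X such that A X = B
```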

Analysis and Optimization of Graphene FET based Nanoelectronic Integrated Circuits

Description: Like cells to the human body, transistors are the basic building blocks of any electronic circuit. Silicon has been the industry's obvious choice for making transistors. Large transistors occupy a large chip area, consume a lot of power, and limit the number of functionalities due to area constraints. Thus, to make devices smaller, smarter, and faster, transistors are aggressively scaled down in each generation. Moore's law states that the transistor count in an electronic circuit doubles roughly every 18 months. Following Moore's law, transistors have already been scaled down to 14 nm. However, there are limits to how much further these transistors can be scaled. Particularly below 10 nm, silicon-based transistors hit fundamental limits such as loss of gate control, high leakage, and various other short-channel effects, so silicon transistors cannot be counted on for future electronics applications. As a result, research has shifted to new device concepts and device materials alternative to silicon. Carbon is another abundant element found on Earth, and one such carbon-based nanomaterial is graphene. Graphene, extracted from graphite, the same material used as pencil lead, has tremendous potential to take future electronic devices to new heights in terms of size, cost, and efficiency. Thus, after graphene's first experimental discovery in 2004, it has been a leading research area in both academia and industry. This dissertation is focused on the analysis and optimization of graphene-based circuits for future electronics. The first part of this dissertation considers graphene-based transistors for analog/radio frequency (RF) circuits. In this section, a dual-gate Graphene Field Effect Transistor (GFET) is considered to build case study circuits such as a voltage controlled oscillator (VCO) and low ...
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: May 2016
Creator: Joshi, Shital
Partner: UNT Libraries

Analysis and Performance of a Cyber-Human System and Protocols for Geographically Separated Collaborators

Description: This dissertation provides an innovative mechanism that enables two geographically separated people to collaborate on a physical task, along with a novel method to measure the Complexity Index (CI) and calculate the Minimal Complexity Index (MCI) of a collaboration protocol. The protocol is represented as a structure, and its information content is measured in bits to understand the complex nature of the protocol. Using these complexity metrics, one can analyze the performance of a collaborative system and a collaboration protocol. Security and privacy of consumers are vital while seeking remote help, so this dissertation also provides a novel authorization framework for dynamic access control of resources on an input-constrained appliance used for completing the physical task. Using the innovative Collaborative Appliance for REmote-help (CARE) and with the support of a remotely located expert, fifty-nine subjects with minimal or no prior mechanical knowledge were able to elevate a car for replacing a tire in an average time of six minutes and 53 seconds and with an average protocol complexity of 171.6 bits. Moreover, thirty subjects with minimal or no prior plumbing knowledge were able to change the cartridge of a faucet in an average time of ten minutes and with an average protocol complexity of 250.6 bits. Our experiments and results show that the developed mechanism and methods can be extended to protocols for a variety of home, vehicle, and appliance repairs and installations.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: December 2017
Creator: Jonnada, Srikanth
Partner: UNT Libraries

Analyzing Microwave Spectra Collected by the Solar Radio Burst Locator

Description: Modern communication systems rely heavily upon microwave, radio, and other electromagnetic frequency bands as a means of providing wireless communication links. Although convenient, wireless communication is susceptible to electromagnetic interference. Solar activity causes both direct interference through electromagnetic radiation as well as indirect interference caused by charged particles interacting with Earth's magnetic field. The Solar Radio Burst Locator (SRBL) is a United States Air Force radio telescope designed to detect and locate solar microwave bursts as they occur on the Sun. By analyzing these events, the Air Force hopes to gain a better understanding of the root causes of solar interference and improve interference forecasts. This thesis presents methods of searching and analyzing events found in the previously unstudied SRBL data archive. A new web-based application aids in the searching and visualization of the data. Comparative analysis is performed amongst data collected by SRBL and several other instruments. This thesis also analyzes events across the time, intensity, and frequency domains. These analysis methods can be used to aid in the detection and understanding of solar events so as to provide improved forecasts of solar-induced electromagnetic interference.
Date: May 2007
Creator: Kincaid, Cheryl-Annette
Partner: UNT Libraries

Analyzing PICL trace data with MEDEA

Description: Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic characteristics of the collected performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.
Date: November 1, 1993
Creator: Merlo, A. P. & Worley, P. H.
Partner: UNT Libraries Government Documents Department

Anchor Nodes Placement for Effective Passive Localization

Description: Wireless sensor networks are composed of sensor nodes, which can monitor an environment and observe events of interest. These networks are applied in various fields including, but not limited to, environmental, industrial, and habitat monitoring. In many applications, the exact location of the sensor nodes is unknown after deployment. Localization is the process used to find the sensor nodes' positional coordinates, which is vital information. Localization is generally assisted by anchor nodes, which are also sensor nodes but with known locations. Anchor nodes are generally expensive and need to be optimally placed for effective localization. Passive localization is a localization technique in which the sensor nodes silently listen to global events like thunder sounds, seismic waves, lightning, etc. According to previous studies, the ideal location to place anchor nodes was on the perimeter of the sensor network. This may not be the case in passive localization, since the function of anchor nodes here differs from that of anchor nodes used in other localization systems. I conduct extensive studies on positioning anchor nodes for effective localization. Several simulations are run in dense and sparse networks to determine proper positioning of anchor nodes. I show that, for effective passive localization, the optimal placement of the anchor nodes is at the center of the network in such a way that no three anchor nodes share linearity. The more the non-linearity, the better the localization. Localization for our network design proves better when anchor nodes are placed at right angles.
Date: August 2010
Creator: Pasupathy, Karthikeyan
Partner: UNT Libraries
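
The placement constraint above, no three anchor nodes sharing linearity, reduces to a collinearity test over anchor triples. A minimal sketch (coordinates and tolerance are illustrative):

```python
from itertools import combinations

def no_three_collinear(anchors, tol=1e-9):
    """True if no three (x, y) anchor positions are collinear, tested
    via the signed doubled-triangle-area of every triple."""
    for (x1, y1), (x2, y2), (x3, y3) in combinations(anchors, 3):
        area2 = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
        if abs(area2) <= tol:
            return False
    return True

# anchors clustered near the network center, mutually non-collinear
print(no_three_collinear([(50, 50), (55, 48), (52, 56)]))   # True
```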

Anchor Nodes Placement for Effective Passive Localization

Description: This paper discusses anchor nodes placement for effective passive localization. The authors show that, for effective passive localization, the optimal placement of the anchor nodes is at the center of the network in such a way that no three anchor nodes share linearity.
Date: 2011
Creator: Akl, Robert G.; Pasupathy, Karthikeyan & Haidar, Mohamad
Partner: UNT College of Engineering

An Annotated Bibliography of Mobile Agents in Networks

Description: The purpose of this thesis is to present a comprehensive colligation of applications of mobile agents in networks, and provide a baseline association of these systems. This work has been motivated by the fact that mobile agent systems have been deemed proficuous alternatives in system applications. Several mobile agent systems have been developed to provide scalable and cogent solutions in network-centric applications. This thesis examines some existing mobile agent systems in core networking areas, in particular, those of network and resource management, routing, and the provision of fault tolerance and security. The inherent features of these systems are discussed with respect to their specific functionalities. The applicability and efficacy of mobile agents are further considered in the specific areas mentioned above. Although an initial foray into a collation of this nature, the goal of this annotated bibliography is to provide a generic referential view of mobile agent systems in network applications.
Date: December 2002
Creator: Sriraman, Sandhya
Partner: UNT Libraries

Application-Specific Things Architectures for IoT-Based Smart Healthcare Solutions

Description: The human body is a complex system organized at different levels, such as cells, tissues, and organs, which make up 11 major organ systems. The functional efficiency of this complex system is evaluated as health. Traditional healthcare is unable to accommodate everyone's needs due to the ever-increasing population and medical costs. With advancements in technology and medical research, traditional healthcare applications are shaping into smart healthcare solutions. Smart healthcare helps in continuously monitoring our body parameters, which helps keep people health-aware. It provides the ability for remote assistance, which helps utilize the available resources to their maximum potential. The backbone of smart healthcare solutions is the Internet of Things (IoT), which increases the computing capacity of real-world components by using cloud-based solutions. The basic elements of these IoT-based smart healthcare solutions are called "things." Things are simple sensors or actuators that have the capacity to wirelessly connect with each other and to the internet. The research for this dissertation aims at developing architectures for these things, focusing on IoT-based smart healthcare solutions. The core of this dissertation is to contribute to research in smart healthcare by identifying applications that can be monitored remotely. For this, application-specific thing architectures were proposed based on monitoring a specific body parameter; monitoring physical health for family and friends; and optimizing the power budget of an IoT body sensor network using human body communications. The experimental results show promising scope for improving quality of life through needle-less and cost-effective smart healthcare solutions.
Date: May 2018
Creator: Sundaravadivel, Prabha
Partner: UNT Libraries

An Approach Towards Self-Supervised Classification Using Cyc

Description: Due to the long duration required for manual knowledge entry by human knowledge engineers, it is desirable to find methods to automatically acquire knowledge about the world by accessing online information. In this work I examine using the Cyc ontology to guide the creation of Naïve Bayes classifiers to provide knowledge about items described in Wikipedia articles. Given an initial set of Wikipedia articles, the system uses the ontology to create positive and negative training sets for the classifiers in each category. The order in which classifiers are generated and used to test articles is also guided by the ontology. The research conducted shows that a system can be created that utilizes statistical text classification methods to extract information from an ad hoc generated information source like Wikipedia for use in a formal semantic ontology like Cyc. Benefits and limitations of the system are discussed, along with future work.
Date: December 2006
Creator: Coursey, Kino High
Partner: UNT Libraries
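
A minimal sketch of the per-category classifier setup described above, assuming positive examples are articles the ontology places in a category and negative examples come from other categories (the example texts are placeholders, not the thesis's data):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

positives = ["city in texas with large university", "county seat city"]
negatives = ["species of flowering plant", "genus of beetles"]

# Naive Bayes over bag-of-words features, one classifier per category
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(positives + negatives, [1] * len(positives) + [0] * len(negatives))
print(clf.predict(["a city and county seat in the united states"]))  # [1]
```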

Automated Classification of Emotions Using Song Lyrics

Description: This thesis explores the classification of emotions in song lyrics, using automatic approaches applied to a novel corpus of 100 popular songs. I use crowdsourcing via Amazon Mechanical Turk to collect line-level emotion annotations for this collection of song lyrics. I then build classifiers that rely on textual features to automatically identify the presence of one or more of the following six Ekman emotions: anger, disgust, fear, joy, sadness, and surprise. I compare different classification systems and evaluate the performance of the automatic systems against the manual annotations. I also introduce a system that uses data collected from the social network Twitter. I use the Twitter API to collect a large corpus of tweets manually labeled by their authors for one of the six emotions of interest. I then compare the classification of emotions obtained when training on data automatically collected from Twitter versus data obtained through crowdsourced annotations.
Date: December 2012
Creator: Schellenberg, Rajitha
Partner: UNT Libraries
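
The distant-supervision setup described above, training on author-labeled tweets to classify lyric lines, can be sketched briefly. The tiny corpus and model choice below are illustrative assumptions, not the thesis's pipeline:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

# tweets whose authors labeled them with an emotion serve as training
# data for classifying song-lyric lines (texts here are made up)
tweets = ["so furious right now", "best day of my life", "i miss you"]
labels = ["anger", "joy", "sadness"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(tweets, labels)
print(clf.predict(["tears on my pillow"]))   # likely ['sadness']
```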

Automated Defense Against Worm Propagation.

Description: Worms have caused significant destruction over the last few years. Network security elements such as firewalls, IDS, etc., have been ineffective against worms. Some worms are so fast that manual intervention is not possible. This creates the need for a stronger security architecture that can automatically react to stop worm propagation. The method has to be signature-independent so that it can stop new worms. In this thesis, an automated defense system (ADS) is developed to automate defense against worms and contain a worm to a level where manual intervention is possible. This is accomplished with a two-level architecture with feedback at each level. The inner loop is based on control system theory and uses the properties of a PID (proportional-integral-derivative) controller. The outer loop works at the network level and stops the worm from reaching its spread saturation point. In our lab setup, we verified that with only the inner loop active the worm was delayed, and with both loops active we were able to restrict the propagation to 10% of the targeted hosts. One concern for deployment of a worm containment mechanism was degradation of throughput for legitimate traffic. We found that with a proper intelligent algorithm we can minimize the degradation to an acceptable level.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: December 2005
Creator: Patwardhan, Sudeep
Partner: UNT Libraries
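
The PID-based inner loop described above can be sketched as a rate throttle. This is an illustration under stated assumptions (gains, setpoint, and variable names are hypothetical, not from the thesis):

```python
class PIDThrottle:
    """Drive the observed rate of new outbound connections toward a
    setpoint by computing a correction to the allowed rate limit."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, measured_rate, dt=1.0):
        err = measured_rate - self.setpoint    # positive when too fast
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        # correction to subtract from the allowed connection rate
        return self.kp * err + self.ki * self.integral + self.kd * deriv

throttle = PIDThrottle(kp=0.6, ki=0.1, kd=0.05, setpoint=20.0)
correction = throttle.update(measured_rate=180.0)   # worm-like burst
```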

Automated GUI Tests Generation for Android Apps Using Q-learning

Description: Mobile applications are growing in popularity and pose new problems in the area of software testing. In particular, mobile applications heavily depend upon user interactions and a dynamically changing environment of system events. In this thesis, we focus on user-driven events and use Q-learning, a reinforcement machine learning algorithm, to generate tests for Android applications under test (AUT). We implement a framework that automates the generation of GUI test cases using our Q-learning approach and compare it to a uniform random (UR) implementation. A novel feature of our approach is that we generate user-driven event sequences through the GUI without the source code or a model of the AUT; hence, a considerable amount of cost and time is saved by avoiding model generation for the tests. Our results show that the systematic path exploration used by Q-learning results in higher average code coverage in comparison to the uniform random approach.
Date: May 2017
Creator: Koppula, Sreedevi
Partner: UNT Libraries
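
The Q-learning core described above can be sketched in a few lines. This is an illustration, not the thesis's framework: states abstract GUI screens, actions are user events (tap, swipe, text entry), and the reward function and environment hook are assumed to exist elsewhere.

```python
import random
from collections import defaultdict

class GuiQLearner:
    """Epsilon-greedy Q-learning over (GUI state, user event) pairs."""
    def __init__(self, alpha=0.5, gamma=0.9, epsilon=0.2):
        self.q = defaultdict(float)           # (state, action) -> value
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def choose(self, state, actions):
        if random.random() < self.epsilon:    # explore
            return random.choice(actions)
        return max(actions, key=lambda a: self.q[(state, a)])  # exploit

    def update(self, s, a, reward, s2, next_actions):
        best_next = max((self.q[(s2, a2)] for a2 in next_actions), default=0.0)
        # standard one-step Q-learning update rule
        self.q[(s, a)] += self.alpha * (reward + self.gamma * best_next
                                        - self.q[(s, a)])
```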

Automated Real-time Objects Detection in Colonoscopy Videos for Quality Measurements

Description: The effectiveness of colonoscopy depends on the quality of the inspection of the colon. There was no automated measurement method to evaluate the quality of the inspection. This thesis addresses this issue by investigating an automated post-procedure quality measurement technique and proposing a novel approach that automatically determines the percentage of stool areas in images of digitized colonoscopy video files. It involves the classification of image pixels based on their color features, using a new method of planes in RGB (red, green, and blue) color space. The limitation of post-procedure quality measurement is that the measurements become available long after the procedure is done and the patient has been released. A better approach is to flag any sub-optimal inspection immediately so that the endoscopist can improve the quality in real time during the procedure. This thesis therefore also proposes an extension of the post-procedure method to detect stool, bite-block, and blood regions in real time using color features in HSV color space. These three objects play a major role in quality measurements in colonoscopy. The proposed method partitions a very large set of positive examples of each of these objects into a number of groups, formed by taking the intersection of the positive examples with a hyperplane, named the 'positive plane'. Convex hulls are used to model the positive planes. Comparisons with traditional classifiers such as K-nearest neighbor (K-NN) and support vector machines (SVM) prove the soundness of the proposed method in terms of accuracy and speed, which are critical in the targeted real-time quality measurement system.
Date: August 2013
Creator: Kumara, Muthukudage Jayantha
Partner: UNT Libraries
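
The convex-hull modeling mentioned above admits a compact sketch: wrap the positive training pixels on one plane in a convex hull and classify a new pixel as positive if it falls inside. This is an illustration only; the sample values and two-dimensional simplification are assumptions, not the thesis's implementation.

```python
import numpy as np
from scipy.spatial import Delaunay

def make_hull_classifier(positive_pixels):
    """Build a point-in-convex-hull test from positive color samples
    (e.g. hue-saturation pairs sampled from stool regions)."""
    hull = Delaunay(np.asarray(positive_pixels, dtype=float))
    def classify(pixels):
        # find_simplex returns -1 for points outside the hull
        return hull.find_simplex(np.asarray(pixels, dtype=float)) >= 0
    return classify

# toy 2-D example on a single plane: hue-saturation pairs
inside = make_hull_classifier([(10, 80), (25, 200), (30, 120), (15, 160)])
print(inside([(20, 150), (90, 40)]))   # [ True False]
```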