A Wireless Traffic Surveillance System Using Video Analytics

Date: May 2011
Creator: Luo, Ning
Description: Video surveillance systems have been commonly used in transportation systems to support traffic monitoring, speed estimation, and incident detection. However, there are several challenges in developing and deploying such systems, including high development and maintenance costs, bandwidth bottlenecks on long-range links, and a lack of advanced analytics. In this thesis, I leverage current wireless, video camera, and analytics technologies, and present a wireless traffic monitoring system. I first present an overview of the system. Then I describe the site investigation and several test links with different hardware/software configurations to demonstrate the effectiveness of the system. The system development process was documented to provide guidelines for future development. Furthermore, I propose a novel speed-estimation analytics algorithm that takes into consideration roads with slope angles. I prove the correctness of the algorithm theoretically and validate its effectiveness experimentally. The experimental results on both synthetic and real datasets show that the algorithm is more accurate than the baseline algorithm 80% of the time. On average, the accuracy improvement of speed estimation is over 3.7% even for very small slope angles.
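
The core idea of slope-aware speed estimation can be illustrated with a minimal sketch. The geometry below (horizontal displacement recovered from the camera, then corrected by the slope angle to obtain the along-road distance) is an assumption made for illustration only; it is not the algorithm derived in the thesis.

```python
import math

def estimate_speed(horizontal_displacement_m, dt_s, slope_deg=0.0):
    """Estimate vehicle speed (m/s) from the horizontal (map-view)
    displacement measured between two frames dt_s seconds apart.

    Illustrative assumption: on a road with slope angle `slope_deg`, the
    true along-road distance equals the horizontal distance / cos(slope),
    so ignoring the slope underestimates the speed.
    """
    slope_rad = math.radians(slope_deg)
    along_road_m = horizontal_displacement_m / math.cos(slope_rad)
    return along_road_m / dt_s

print(estimate_speed(20.0, 1.0))                 # flat-road baseline: 20.0 m/s
print(estimate_speed(20.0, 1.0, slope_deg=6.0))  # slope-corrected: slightly higher
```
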
Contributing Partner: UNT Libraries
Anchor Nodes Placement for Effective Passive Localization

Access: Use of this item is restricted to the UNT Community.
Date: August 2010
Creator: Pasupathy, Karthikeyan
Description: Wireless sensor networks are composed of sensor nodes, which can monitor an environment and observe events of interest. These networks are applied in various fields including, but not limited to, environmental, industrial, and habitat monitoring. In many applications, the exact location of the sensor nodes is unknown after deployment. Localization is a process used to find the sensor nodes' positional coordinates, which is vital information. Localization is generally assisted by anchor nodes, which are also sensor nodes but with known locations. Anchor nodes are generally expensive and need to be optimally placed for effective localization. Passive localization is a localization technique in which the sensor nodes silently listen to global events like thunder sounds, seismic waves, lightning, etc. According to previous studies, the ideal location to place anchor nodes was on the perimeter of the sensor network. This may not be the case in passive localization, since the function of anchor nodes here is different from that of the anchor nodes used in other localization systems. I conduct extensive studies on positioning anchor nodes for effective localization. Several simulations are run in dense and sparse networks for proper positioning of anchor nodes. I show that, for effective passive localization, the ...
Contributing Partner: UNT Libraries
Effective and Accelerated Informative Frame Filtering in Colonoscopy Videos Using Graphic Processing Units

Date: August 2010
Creator: Karri, Venkata Praveen
Description: Colonoscopy is an endoscopic technique that allows a physician to inspect the mucosa of the human colon. Previous methods and software solutions to detect informative frames in a colonoscopy video (a process called informative frame filtering or IFF) have been largely ineffective in (1) covering the proper definition of an informative frame in the broadest sense and (2) striking an optimal balance between accuracy and speed of classification in both real-time and non-real-time medical procedures. In my thesis, I propose a more effective method and faster software solutions for IFF. The method is more effective due to the introduction of a heuristic classification algorithm derived from experimental analysis of typical colon features, which contributed a 5-10% boost in various performance metrics for IFF. The software modules are faster due to the incorporation of sophisticated parallel-processing-oriented coding techniques on modern microprocessors. Two IFF modules were created, one for post-procedure use and the other for real-time use. Code optimizations through NVIDIA CUDA for GPU processing and/or CPU multi-threading concepts embedded in two significant microprocessor design philosophies (multi-core design and many-core design) resulted in a 5-fold acceleration for the post-procedure module and a 40-fold acceleration for the real-time module. Some innovative software modules, ...
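
As a rough illustration of parallel informative-frame filtering, the sketch below scores frames with a simple edge-density heuristic and evaluates them in a process pool. Both the heuristic and its threshold are stand-ins chosen for illustration; the thesis' actual heuristic and its CUDA/multi-threaded implementations are not reproduced here.

```python
from concurrent.futures import ProcessPoolExecutor

import numpy as np

def is_informative(frame, edge_threshold=0.02):
    """Call a frame informative if enough pixels show strong gradients.
    The heuristic and `edge_threshold` are illustrative, not from the thesis."""
    gray = frame.mean(axis=2) if frame.ndim == 3 else frame
    gx, gy = np.gradient(gray.astype(float))
    edge_fraction = ((gx ** 2 + gy ** 2) > 100.0).mean()
    return edge_fraction > edge_threshold

def filter_frames(frames):
    """Score frames in parallel across CPU cores (a stand-in for the CUDA /
    multi-threaded modules) and keep the informative ones."""
    with ProcessPoolExecutor() as pool:
        flags = list(pool.map(is_informative, frames))
    return [f for f, keep in zip(frames, flags) if keep]

if __name__ == "__main__":
    frames = [np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(8)]
    print(len(filter_frames(frames)), "of", len(frames), "frames kept")
```
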
Contributing Partner: UNT Libraries
Arithmetic Computations and Memory Management Using a Binary Tree Encoding of Natural Numbers

Date: December 2011
Creator: Haraburda, David
Description: Two applications of a binary tree data type based on a simple pairing function (a bijection between natural numbers and pairs of natural numbers) are explored. First, the tree is used to encode natural numbers, and algorithms that perform basic arithmetic computations are presented along with formal proofs of their correctness. Second, using this "canonical" representation as a base type, algorithms for encoding and decoding additional isomorphic data types of other mathematical constructs (sets, sequences, etc.) are also developed. An experimental application to a memory management system is constructed and explored using these isomorphic types. A practical analysis of this system's runtime complexity and space savings is provided, along with a proof-of-concept framework for both applications of the binary tree type, in the Java programming language.
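
The thesis implements this in Java; as a compact illustration of one standard pairing-based bijection between natural numbers and binary trees (not necessarily the thesis' exact encoding), here is a sketch using the factorization n = 2^x * (2y + 1):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    left: "Optional[Node]"    # None is the empty tree and encodes 0
    right: "Optional[Node]"

def pair(x, y):
    """Pairing function: (x, y) -> 2**x * (2*y + 1), a bijection N x N -> N>=1."""
    return (2 ** x) * (2 * y + 1)

def unpair(n):
    """Inverse of pair() for n >= 1."""
    x = 0
    while n % 2 == 0:
        n //= 2
        x += 1
    return x, (n - 1) // 2

def encode(n):
    """Natural number -> binary tree."""
    if n == 0:
        return None
    x, y = unpair(n)
    return Node(encode(x), encode(y))

def decode(t):
    """Binary tree -> natural number."""
    return 0 if t is None else pair(decode(t.left), decode(t.right))

assert all(decode(encode(n)) == n for n in range(10000))
print(encode(6))   # 6 = 2**1 * (2*1 + 1) -> Node(encode(1), encode(1))
```
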
Contributing Partner: UNT Libraries
Kalman Filtering Approach to Optimize OFDM Data Rate

Date: August 2011
Creator: Wunnava, Sashi Prabha
Description: This study applies a non-linear mapping method, the unscented Kalman filter, to estimate and optimize the data rate resulting from an arrival rate with a Poisson distribution in an orthogonal frequency division multiplexing (OFDM) transmission system. OFDM is an emerging multi-carrier modulation scheme. With the growing need for quality of service in wireless communications, it is necessary to optimize resources so that the overall performance of the system improves while achieving high data rates and efficient use of spectrum. In this study, the results from the OFDM-TDMA transmission system have been used to apply cross-layer optimization, so that resources in different layers are treated simultaneously. The main controller manages the transmission of data between layers using multicarrier modulation techniques. The unscented Kalman filter is used here to perform nonlinear mapping by estimating and optimizing the data rate, which results from an arrival rate having a Poisson distribution.
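
As a simplified stand-in for the estimation step, the sketch below tracks a slowly varying Poisson arrival rate from per-slot packet counts with a scalar (linear) Kalman filter; the thesis uses an unscented Kalman filter and couples it to OFDM rate optimization, which is not reproduced here. The process-noise and variance settings are illustrative.

```python
import numpy as np

def track_arrival_rate(counts, q=0.5):
    """Scalar Kalman filter: hidden state = arrival rate (random walk),
    measurement = packets observed in a slot, variance ~ rate (Poisson)."""
    rate, var = max(float(counts[0]), 1.0), 10.0
    estimates = []
    for z in counts:
        var += q                       # predict
        r = max(rate, 1.0)             # approximate measurement variance
        k = var / (var + r)            # Kalman gain
        rate += k * (z - rate)         # update
        var *= (1.0 - k)
        estimates.append(rate)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_rate = np.concatenate([np.full(200, 20.0), np.full(200, 35.0)])
counts = rng.poisson(true_rate)
est = track_arrival_rate(counts)
print(round(est[150], 1), round(est[-1], 1))   # close to 20 and 35, respectively
```
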
Contributing Partner: UNT Libraries
End of Insertion Detection in Colonoscopy Videos

Date: August 2009
Creator: Malik, Avnish Rajbal
Description: Colorectal cancer is the second leading cause of cancer-related deaths in the United States, behind lung cancer. Colonoscopy is the preferred screening method for the detection of diseases like colorectal cancer. In 2006, the American Society for Gastrointestinal Endoscopy (ASGE) and the American College of Gastroenterology (ACG) issued guidelines for quality colonoscopy. The guidelines suggest that, on average, the withdrawal phase during a screening colonoscopy should last a minimum of 6 minutes. My aim is to classify the colonoscopy video into insertion and withdrawal phases. The problem is that existing shot-detection techniques cannot be applied, because a colonoscopy is a single camera shot from start to end. An algorithm to detect the phase boundary has already been developed by the MIGLAB team. The existing method has acceptable accuracy, but its main issue is its dependency on MPEG (Moving Pictures Expert Group) 1/2. I implemented exhaustive search for motion estimation to reduce the execution time and improve the accuracy. I took advantage of the C/C++ programming languages with multithreading, which further improved performance in terms of execution time. I propose a method for improving the current method of colonoscopy video analysis and also an extension for the same to ...
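
A minimal NumPy sketch of exhaustive-search block matching (full search over a +/- range with a sum-of-absolute-differences cost) is shown below; the block size, search range, and test frames are illustrative, and the thesis' optimized multithreaded C/C++ implementation is not reproduced here.

```python
import numpy as np

def full_search(prev, curr, block=16, search=8):
    """Exhaustive-search block matching: for each block of `curr`, scan every
    candidate offset within +/- `search` pixels in `prev` and keep the offset
    with the smallest sum of absolute differences (SAD)."""
    h, w = curr.shape
    mvs = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block].astype(int)
            best_sad, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    sad = np.abs(target - prev[y:y + block, x:x + block]).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_mv = sad, (dy, dx)
            mvs[by // block, bx // block] = best_mv
    return mvs

prev = np.random.randint(0, 256, (64, 64))
curr = np.roll(prev, shift=(2, 3), axis=(0, 1))       # shift the frame content
print(full_search(prev, curr)[1, 1])                  # interior block: (-2, -3)
```
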
Contributing Partner: UNT Libraries
Study of the effects of background and motion camera on the efficacy of Kalman and particle filter algorithms.

Date: August 2009
Creator: Morita, Yasuhiro
Description: This study compares the independent use of two known algorithms (the Kalman filter with background subtraction and the particle filter) that are commonly deployed in object-tracking applications. Object tracking in general is very challenging; it presents numerous problems that need to be addressed by the application in order to facilitate its successful deployment. Such problems range from abrupt object motion during tracking, to changes in appearance of the scene and the object, as well as object-to-scene occlusions and camera motion, among others. It is important to take into consideration issues such as accounting for noise associated with the image in question and the ability to predict, to an acceptable statistical accuracy, the position of the object at a particular time given its current position. This study tackles some of the issues raised above before addressing how the use of either of the aforementioned algorithms minimizes, or in some cases eliminates, the negative effects.
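
For reference, a minimal constant-velocity Kalman filter for 2D position tracking, the first of the two approaches compared, is sketched below; the background-subtraction front end and the particle filter are not shown, and the noise settings are illustrative.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)     # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)     # only position is measured
Q = 0.01 * np.eye(4)                          # process noise (illustrative)
R = 4.0 * np.eye(2)                           # measurement noise (illustrative)

def kalman_track(measurements):
    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = 100.0 * np.eye(4)
    track = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                 # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        x = x + K @ (np.asarray(z, float) - H @ x)    # update with measurement
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)

zs = [(t + np.random.randn(), 0.5 * t + np.random.randn()) for t in range(30)]
print(kalman_track(zs)[-1])    # close to the true final position (29, 14.5)
```
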
Contributing Partner: UNT Libraries
Classifying Pairwise Object Interactions: A Trajectory Analytics Approach

Date: May 2015
Creator: Janmohammadi, Siamak
Description: We have a huge amount of video data from widely available surveillance cameras, and the technology for recording the motion of a moving object in the form of trajectory data keeps growing. With the proliferation of location-enabled devices and the ongoing growth in smartphone penetration, as well as advancements in image processing techniques, tracking moving objects is more seamlessly achievable. In this work, we explore some domain-independent qualitative and quantitative features in raw trajectory (spatio-temporal) data in videos captured by a fixed, single, wide-angle-view camera sensor in outdoor areas. We study the efficacy of those features in classifying four basic high-level actions by employing two supervised learning algorithms, and we show how each of the features affects the learning algorithms’ overall accuracy as a single factor or confounded with others.
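
To make the notion of trajectory features concrete, the sketch below computes a few simple pairwise features (relative distance, relative speed, heading difference) from two synchronized trajectories; the exact feature set used in the thesis is not reproduced here.

```python
import numpy as np

def pairwise_features(traj_a, traj_b):
    """traj_a, traj_b: (T, 2) arrays of synchronized (x, y) positions of two objects."""
    a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    dist = np.linalg.norm(a - b, axis=1)                 # relative distance per frame
    va, vb = np.diff(a, axis=0), np.diff(b, axis=0)      # per-frame displacement vectors
    diff = np.arctan2(va[:, 1], va[:, 0]) - np.arctan2(vb[:, 1], vb[:, 0])
    return {
        "mean_distance": dist.mean(),
        "distance_trend": dist[-1] - dist[0],            # >0 moving apart, <0 approaching
        "mean_rel_speed": np.abs(np.linalg.norm(va, axis=1)
                                 - np.linalg.norm(vb, axis=1)).mean(),
        "mean_heading_diff": np.abs(np.arctan2(np.sin(diff), np.cos(diff))).mean(),
    }

t = np.arange(20.0)[:, None]
print(pairwise_features(np.hstack([t, t]), np.hstack([t, 20 - t])))  # two crossing walkers
```
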
Contributing Partner: UNT Libraries
Effects of UE Speed on MIMO Channel Capacity in LTE

Access: Use of this item is restricted to the UNT Community.
Date: August 2016
Creator: Shukla, Rahul
Description: The introduction of 4G LTE brought multiple new technologies, MIMO being one of the most important. The main MIMO modes used in LTE are the open-loop and closed-loop spatial multiplexing modes. This thesis develops an algorithm to calculate the threshold values of UE speed and SNR that are required to implement a switching algorithm which can switch between different MIMO modes for a UE based on its speed and channel conditions (CSI). Specifically, this thesis provides the values of UE speed and SNR at which we can get better results by switching between open-loop and closed-loop MIMO modes and then scheduling in sub-channels accordingly. Thus, the results can be used effectively to get better channel capacity with less ISI. The main objectives of this thesis are: to determine the type of MIMO mode suitable for a UE with a certain speed, to determine the effects of SNR on the selection of MIMO modes, and to design and implement a scheduling algorithm to enhance channel capacity.
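
The general shape of such a switching rule can be sketched as below; the threshold values are placeholders, not the values derived in the thesis.

```python
SPEED_THRESHOLD_KMH = 30.0   # placeholder, not the value derived in the thesis
SNR_THRESHOLD_DB = 10.0      # placeholder, not the value derived in the thesis

def select_mimo_mode(ue_speed_kmh, snr_db):
    """Pick closed-loop spatial multiplexing only while the CSI feedback can be
    trusted (slow UE, adequate SNR); otherwise fall back to open loop."""
    if ue_speed_kmh <= SPEED_THRESHOLD_KMH and snr_db >= SNR_THRESHOLD_DB:
        return "closed-loop spatial multiplexing"
    return "open-loop spatial multiplexing"

print(select_mimo_mode(10, 15))   # slow pedestrian-speed UE, good SNR -> closed loop
print(select_mimo_mode(90, 15))   # vehicular UE: feedback outdated -> open loop
```
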
Contributing Partner: UNT Libraries
QoS Aware Service Oriented Architecture

Date: August 2013
Creator: Adepu, Sagarika
Description: Service-oriented architecture enables web services to operate in a loosely-coupled setting and provides an environment for dynamic discovery and use of services over a network using standards such as WSDL, SOAP, and UDDI. A web service has both functional and non-functional characteristics. This thesis work proposes to add QoS descriptions (non-functional properties) to WSDL and to compose various services to form a business process. This composition of web services considers QoS properties along with functional properties, and the composed services can again be published as a new web service and can be part of any other composition using the Composed WSDL.
Contributing Partner: UNT Libraries
Integrity Verification of Applications on Radium Architecture

Date: August 2015
Creator: Tarigopula, Mohan Krishna
Description: Trusted Computing capability has become ubiquitous these days, and it is being widely deployed into consumer devices as well as enterprise platforms. As the number of threats increases at an exponential rate, securing systems against them has become a daunting task. In this context, software integrity measurement at runtime with the support of trusted platforms can be a better security strategy. Trusted Computing devices like the TPM secure the evidence of a breach or an attack. These devices remain tamper-proof if the hardware platform is physically secured. This type of trusted security is crucial for forensic analysis in the aftermath of a breach. The advantages of trusted platforms can be further leveraged if they are used wisely. RADIUM (Race-free on-demand Integrity Measurement Architecture) is one such architecture, which is built on the strength of the TPM. RADIUM provides an asynchronous root of trust to overcome the TOC condition of DRTM. Even though the underlying architecture is trusted, attacks can still compromise applications during runtime by exploiting their vulnerabilities. I propose an application-level integrity measurement solution that fits into RADIUM, to expand the trusted computing capability to the application layer. This is based on the concept ...
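
At its simplest, integrity measurement amounts to hashing an application and comparing the digest against a known-good reference; the sketch below illustrates only that basic step, not the RADIUM-specific runtime measurement or TPM interaction.

```python
import hashlib

def measure(path):
    """Measurement step: SHA-256 digest of the application binary."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, golden_digest):
    """Compare the fresh measurement against a known-good reference value
    (in a real deployment the reference would be protected, e.g., TPM-sealed)."""
    return measure(path) == golden_digest
```
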
Contributing Partner: UNT Libraries
The Design Of A Benchmark For Geo-stream Management Systems

Date: December 2011
Creator: Shen, Chao
Description: The recent growth in sensor technology allows easier information gathering in real time, as sensors have grown smaller, more accurate, and less expensive. The resulting data is often in a geo-stream format: continuously changing input with a spatial extent. Researchers developing geo-stream management systems (GSMS) require a benchmark system for evaluation, which is currently lacking. This thesis presents GSMark, a benchmark for evaluating GSMSs. GSMark provides a data generator that creates a combination of synthetic and real geo-streaming data, a workload simulator to present the data to the GSMS as a data stream, and a set of benchmark queries that evaluate typical GSMS functionality and query performance. In particular, GSMark generates both moving points and evolving spatial regions, two fundamental data types for a broad range of geo-stream applications, and the geo-streaming queries on this data.
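
A toy version of the moving-point part of such a data generator might look like the sketch below; the field names, coordinate ranges, and motion model are illustrative, not GSMark's.

```python
import random
import time

def moving_point_stream(n_objects=5, steps=10, dt=1.0, step_size=0.001):
    """Yield (id, t, lon, lat) tuples for a few objects drifting roughly eastward."""
    objs = [{"id": i,
             "lon": random.uniform(-97.2, -97.0),
             "lat": random.uniform(33.1, 33.3)} for i in range(n_objects)]
    t = time.time()
    for _ in range(steps):
        for o in objs:
            o["lon"] += step_size * random.uniform(0.5, 1.5)
            o["lat"] += step_size * random.uniform(-0.5, 0.5)
            yield (o["id"], t, round(o["lon"], 5), round(o["lat"], 5))
        t += dt

for tup in moving_point_stream(n_objects=2, steps=3):
    print(tup)
```
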
Contributing Partner: UNT Libraries
Radium: Secure Policy Engine in Hypervisor

Date: August 2015
Creator: Shah, Tawfiq M
Description: The basis of today’s security systems is the trust and confidence that the system will behave as expected and is in a known good trusted state. The trust is built from hardware and software elements that generate a chain of trust originating from a known trusted entity. Leveraging hardware, software, and mandatory access control policy technology is needed to create a trusted measurement environment. Employing a control layer (hypervisor or microkernel) with the ability to enforce a fine-grained access control policy at hypercall granularity across multiple guest virtual domains can ensure that any malicious environment is contained. In my research, I propose the use of RADIUM's Asynchronous Root of Trust Measurement (ARTM) capability incorporated with a secure mandatory access control policy engine that would mitigate the limitations of current hardware TPM solutions. By employing ARTM we can leverage asynchronous measurement of boot, launch, and use, with the hypervisor proving its state and the integrity of the secure policy. My solution uses the RADIUM (Race-free On-demand Integrity Architecture) architecture, which allows a more detailed measurement of applications at run time with greater semantic knowledge of the measured environments. RADIUM's incorporation of a ...
Contributing Partner: UNT Libraries
Maintaining Web Applications Integrity Running on Radium

Date: August 2015
Creator: Ur-Rehman, Wasi
Description: Computer security attacks take place due to the presence of vulnerabilities and bugs in software applications. Bugs and vulnerabilities are the result of weak software architecture and a lack of standard software development practices. Despite the fact that software companies are investing millions of dollars in the research and development of software designs, security risks are still at large. In some cases software applications are found to carry vulnerabilities for many years before being identified. A recent example is the well-known Heartbleed bug in OpenSSL/TLS. In today’s world, where new software applications are continuously being developed for a varied community of users, it is highly unlikely to have software applications running without flaws. Attackers exploit these vulnerabilities and bugs and threaten privacy without leaving any trace. The most critical vulnerabilities are those related to the integrity of the software applications, because integrity is directly linked to the credibility of a software application and the data it contains. Here I present a solution for maintaining the integrity of web applications running on RADIUM by using Daikon. Daikon generates invariants, and these invariants are used to maintain the integrity of the web application and also check the ...
Contributing Partner: UNT Libraries
Unique Channel Email System

Date: August 2015
Creator: Balakchiev, Milko
Description: Email connects 85% of the world. This paper explores the pattern of information overload encountered by the majority of email users and examines what steps key email providers are taking to combat the problem. Besides fighting spam, popular email providers offer very limited tools to reduce the amount of unwanted incoming email. Rather, there has been a trend to expand storage space and aid the organization of email. Storing email is very costly and harmful to the environment. Additionally, information overload can be detrimental to productivity. We propose a simple solution that results in a drastic reduction of unwanted mail, also known as graymail.
Contributing Partner: UNT Libraries
Baseband Noise Suppression in OFDM Using Kalman Filter

Date: May 2012
Creator: Rodda, Lasya
Description: As technology advances, the reduced size of hardware gives rise to additive 1/f baseband noise. This additive 1/f noise is system noise generated by the miniaturization of hardware, and it affects the lower frequencies. Though 1/f noise does not show much effect in wideband channels, because of its nature of affecting only certain frequencies, it becomes prominent in OFDM communication systems, where narrowband channels are used. In this thesis, I study the effects of 1/f noise on OFDM systems and implement algorithms for estimation and suppression of the noise using a Kalman filter. Suppression of the noise is achieved by subtracting the estimated noise from the received signal. I show that the performance of the system is considerably improved by applying the 1/f noise suppression.
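
A heavily simplified stand-in for the estimate-and-subtract idea is sketched below: the low-frequency (1/f-like) disturbance is modeled as a slowly drifting offset, tracked by a scalar Kalman filter on known pilot samples, and subtracted from the received signal. The noise model, pilot layout, and parameters are illustrative; the thesis' actual 1/f model and OFDM processing are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
pilot_idx = set(range(0, n, 8))                      # every 8th sample carries a known pilot
tx = rng.choice([-1.0, 1.0], size=n)                 # BPSK-like transmit samples
for i in pilot_idx:
    tx[i] = 1.0
drift = np.cumsum(rng.normal(0.0, 0.02, n))          # slowly varying low-frequency disturbance
rx = tx + drift + rng.normal(0.0, 0.1, n)            # received = signal + drift + AWGN

est, var = 0.0, 1.0
q, r = 0.02 ** 2, 0.1 ** 2                           # random-walk and measurement variances
noise_est = np.zeros(n)
for i in range(n):
    var += q                                         # predict: drift follows a random walk
    if i in pilot_idx:                               # update only where tx is known
        k = var / (var + r)
        est += k * ((rx[i] - tx[i]) - est)
        var *= 1.0 - k
    noise_est[i] = est

cleaned = rx - noise_est                             # subtract the estimated noise
print(np.mean((rx - tx) ** 2), np.mean((cleaned - tx) ** 2))   # error drops after suppression
```
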
Contributing Partner: UNT Libraries
Toward a Data-Type-Based Real Time Geospatial Data Stream Management System

Date: May 2011
Creator: Zhang, Chengyang
Description: The advent of sensing and communication technologies enables the generation and consumption of large volumes of streaming data. Many of these data streams are geo-referenced. Existing spatio-temporal databases and data stream management systems are not capable of handling real-time queries on spatial extents. In this thesis, I investigated several fundamental research issues toward building a data-type-based real-time geospatial data stream management system. The thesis makes contributions in the following areas: geo-stream data models, aggregation, window-based nearest neighbor operators, and query optimization strategies. The proposed geo-stream data model is based on second-order logic and multi-typed algebra. Both abstract and discrete data models are proposed and exemplified. I further propose two useful geo-stream operators, namely Region By and WNN, which abstract common aggregation and nearest neighbor queries as generalized data model constructs. Finally, I propose three query optimization algorithms based on spatial, temporal, and spatio-temporal constraints of geo-streams. I show the effectiveness of the data model through many query examples. The effectiveness and the efficiency of the algorithms are validated through extensive experiments on both synthetic and real data sets. This work established the fundamental building blocks toward a full-fledged geo-stream database management system and has potential impact in many ...
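
To give a flavor of a window-based nearest neighbor (WNN) style operator, the sketch below keeps a sliding time window over incoming positioned tuples and answers nearest-neighbor probes against it; the semantics here are illustrative, not the formal operator definition from the thesis.

```python
from collections import deque
import math

class WindowNN:
    """Keep only the tuples that arrived within the last `window_seconds`
    and answer nearest-neighbor probes against that sliding window."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.buffer = deque()                       # (timestamp, obj_id, x, y)

    def insert(self, t, obj_id, x, y):
        self.buffer.append((t, obj_id, x, y))
        while self.buffer and self.buffer[0][0] < t - self.window:
            self.buffer.popleft()                   # expire out-of-window tuples

    def nearest(self, qx, qy):
        best = None
        for _, obj_id, x, y in self.buffer:
            d = math.hypot(x - qx, y - qy)
            if best is None or d < best[0]:
                best = (d, obj_id)
        return best                                 # (distance, obj_id) or None

wnn = WindowNN(window_seconds=10)
wnn.insert(0, "a", 0.0, 0.0)
wnn.insert(5, "b", 3.0, 4.0)
wnn.insert(20, "c", 1.0, 1.0)                       # arriving at t=20 expires "a" and "b"
print(wnn.nearest(0.0, 0.0))                        # (1.414..., 'c')
```
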
Contributing Partner: UNT Libraries
Scene Analysis Using Scale Invariant Feature Extraction and Probabilistic Modeling

Access: Use of this item is restricted to the UNT Community.
Date: August 2011
Creator: Shen, Yao
Description: Conventional pattern recognition systems have two components: feature analysis and pattern classification. For any object in an image, features can be considered the major characteristic of the object, for either object recognition or object tracking purposes. Features extracted from a training image can be used to identify the object when attempting to locate it in a test image containing many other objects. To perform reliable scene analysis, it is important that the features extracted from the training image are detectable even under changes in image scale, noise, and illumination. Scale-invariant features have wide applications such as image classification, object recognition, and object tracking in the image processing area. In this thesis, color features and SIFT (scale-invariant feature transform) are considered as scale-invariant features. The classification, recognition, and tracking results were evaluated with a novel evaluation criterion and compared with some existing methods. I also studied different types of scale-invariant features for the purpose of solving scene analysis problems. I propose probabilistic models as the foundation for analyzing scene scenarios in images. In order to differentiate the content of images, I develop novel algorithms for the adaptive combination of multiple features extracted from images. I ...
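
A minimal SIFT feature extraction and matching pass with OpenCV might look like the sketch below (assuming opencv-python 4.4 or newer, where SIFT lives in the main module); the image file names are placeholders, and the probabilistic models and adaptive feature combination from the thesis are not shown.

```python
import cv2

train = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE)   # placeholder training image
scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)    # placeholder test image

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(train, None)
kp2, des2 = sift.detectAndCompute(scene, None)

# Lowe's ratio test keeps only distinctive matches.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

print(f"{len(good)} good matches out of {len(kp1)} training keypoints")
```
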
Contributing Partner: UNT Libraries
Occlusion Tolerant Object Recognition Methods for Video Surveillance and Tracking of Moving Civilian Vehicles

Date: December 2007
Creator: Pati, Nishikanta
Description: Recently, there has been great interest in moving-object tracking in the fields of security and surveillance. Object recognition under partial occlusion is the core of any object tracking system. This thesis presents an automatic, real-time color object-recognition system which is not only robust but also occlusion tolerant. The intended use of the system is to recognize and track external vehicles entering a secured area such as a school campus or an army base. A statistical morphological skeleton is used to represent the visible shape of the vehicle. Simple curve matching and different feature-based matching techniques are used to recognize the segmented vehicle. Features of the vehicle are extracted upon entering the secured area. The vehicle is recognized from either a digital video frame or a static digital image when needed. The recognition engine will help the design of a high-performance tracking system meant for remote video surveillance.
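
For context, a plain (non-statistical) morphological skeleton of a binary vehicle mask can be computed with the classic erosion/opening iteration shown below, here using OpenCV; the statistical variant and the curve/feature matching stages from the thesis are not reproduced.

```python
import cv2
import numpy as np

def morphological_skeleton(mask):
    """Classic (Lantuejoul) morphological skeleton of a binary 0/255 mask:
    repeatedly erode, and collect what each erosion's opening misses."""
    kernel = cv2.getStructuringElement(cv2.MORPH_CROSS, (3, 3))
    skeleton = np.zeros_like(mask)
    img = mask.copy()
    while cv2.countNonZero(img) > 0:
        eroded = cv2.erode(img, kernel)
        opened = cv2.dilate(eroded, kernel)
        skeleton = cv2.bitwise_or(skeleton, cv2.subtract(img, opened))
        img = eroded
    return skeleton

mask = np.zeros((60, 100), np.uint8)
cv2.rectangle(mask, (10, 20), (90, 40), 255, thickness=-1)   # toy "vehicle" blob
print(int(cv2.countNonZero(morphological_skeleton(mask))), "skeleton pixels")
```
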
Contributing Partner: UNT Libraries
Video Analytics with Spatio-Temporal Characteristics of Activities

Date: May 2015
Creator: Cheng, Guangchun
Description: As video capturing devices become more ubiquitous, from surveillance cameras to smart phones, the demand for automated video analysis is increasing as never before. One obstacle in this process is to efficiently locate where a human operator’s attention should be, and another is to determine the specific types of activities or actions without ambiguity. It is the special interest of this dissertation to locate spatial and temporal regions of interest in videos and to develop a better action representation for video-based activity analysis. This dissertation follows the scheme of “locating then recognizing” activities of interest in videos, i.e., locations of potentially interesting activities are estimated before performing in-depth analysis. Theoretical properties of regions of interest in videos are first exploited, based on which a unifying framework is proposed to locate both spatial and temporal regions of interest with the same settings of parameters. The approach estimates the distribution of motion based on 3D structure tensors, and locates regions of interest according to persistent occurrences of low probability. Two contributions are further made to better represent the actions. The first is to construct a unifying model of spatio-temporal relationships between reusable mid-level actions which bridge low-level pixels and high-level activities. Dense ...
Contributing Partner: UNT Libraries
An Empirical Study of How Novice Programmers Use the Web

Date: May 2016
Creator: Tula, Naveen
Description: Students often use the web as a source of help for problems that they encounter on programming assignments. In this work, we seek to understand how students use the web to search for help on their assignments. We used a mixed-methods approach with 344 students who completed a survey and 41 students who participated in focus group meetings and helped record data about their search habits. The survey reveals data about student-reported search habits, while the focus group uses a web browser plug-in to record actual search patterns. We examine the results collectively and as broken down by class year. Survey results show that at least 2/3 of the students from each class year rely on search engines to locate resources for help with their programming bugs in at least half of their assignments; search habits vary by class year; and the value of different types of resources, such as tutorials and forums, varies by class year. Focus group results expose the high-frequency web sites used by the students in solving their programming assignments.
Contributing Partner: UNT Libraries
Simulink(R) Based Design and Implementation of a Solar Power Based Mobile Charger

Date: May 2016
Creator: Mukka, Manoj Kumar
Description: Electrical energy is used worldwide at a rate of approximately 15 terawatts. Generating this much energy has become a primary concern for all nations. There are many ways of generating energy, but the most commonly used are non-renewable and will run out much sooner than expected. Very active research is going on both to increase the use of renewable energy sources and to use the available energy more efficiently. Among these sources, solar energy is considered the most abundant and has received much attention. The mobile phone has become one of the basic needs of modern life, with almost every human being having one. Individually a mobile phone consumes little power, but collectively this consumption becomes very large. This consideration motivated the research undertaken in this master's thesis. The objective of this thesis is to design a model for solar power based charging circuits for mobile phones using Simulink(R). This thesis explains a design procedure for a solar power based mobile charger circuit using Simulink(R), which includes models for the photo-voltaic array, maximum power point tracker, pulse width modulator, DC-DC converter, and battery. The first part of the thesis concentrates on electron level behavior of a solar cell, its ...
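
The thesis builds its models in Simulink(R); purely as a language-neutral illustration of the maximum power point tracking step it includes, the sketch below runs a perturb-and-observe loop against a toy photovoltaic power curve. Both the curve and the step size are invented for illustration.

```python
def pv_power(v):
    """Toy photovoltaic power-voltage curve with a single interior maximum."""
    i_sc, v_oc = 5.0, 21.0                      # short-circuit current, open-circuit voltage
    current = max(0.0, i_sc * (1.0 - (v / v_oc) ** 8))
    return v * current

def perturb_and_observe(v0=12.0, step=0.2, iters=200):
    """Perturb the operating voltage; keep going while power rises, reverse when it falls."""
    v, p_prev, direction = v0, pv_power(v0), +1
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:
            direction = -direction              # power dropped: reverse the perturbation
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(round(v_mpp, 2), "V,", round(p_mpp, 1), "W")   # oscillates near the maximum power point
```
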
Contributing Partner: UNT Libraries
Evaluation Techniques and Graph-Based Algorithms for Automatic Summarization and Keyphrase Extraction

Date: August 2016
Creator: Hamid, Fahmida
Description: Automatic text summarization and keyphrase extraction are two interesting areas of research which extend along natural language processing and information retrieval. They have recently become very popular because of their wide applicability. Devising generic techniques for these tasks is challenging due to several issues. Yet we have a good number of intelligent systems performing the tasks. As different systems are designed with different perspectives, evaluating their performances with a generic strategy is crucial. It has also become immensely important to evaluate the performances with minimal human effort. In our work, we focus on designing a relativized scale for evaluating different algorithms. This is our major contribution which challenges the traditional approach of working with an absolute scale. We consider the impact of some of the environment variables (length of the document, references, and system-generated outputs) on the performance. Instead of defining some rigid lengths, we show how to adjust to their variations. We prove a mathematically sound baseline that should work for all kinds of documents. We emphasize automatically determining the syntactic well-formedness of the structures (sentences). We also propose defining an equivalence class for each unit (e.g. word) instead of the exact string matching strategy. We show an evaluation ...
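
Since the thesis evaluates graph-based summarization and keyphrase systems, a toy example of the kind of system being evaluated, a co-occurrence graph scored with a PageRank-style iteration in the TextRank family, is sketched below; the thesis' relativized evaluation scale itself is not reproduced here.

```python
import re
from collections import defaultdict

def keyword_scores(text, window=2, damping=0.85, iters=30):
    """Build a word co-occurrence graph and score words with a PageRank-style
    iteration; stop-word removal and phrase assembly are omitted for brevity."""
    words = re.findall(r"[a-z]+", text.lower())
    graph = defaultdict(set)
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + window + 1, len(words))):
            if words[j] != w:
                graph[w].add(words[j])
                graph[words[j]].add(w)
    scores = {w: 1.0 for w in graph}
    for _ in range(iters):
        scores = {w: (1 - damping) + damping * sum(scores[u] / len(graph[u])
                                                   for u in graph[w])
                  for w in graph}
    return sorted(scores.items(), key=lambda kv: -kv[1])

text = ("graph based ranking algorithms score words using the structure of a "
        "cooccurrence graph built from the words of the document")
print(keyword_scores(text)[:5])
```
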
Contributing Partner: UNT Libraries
Data-Driven Decision-Making Framework for Large-Scale Dynamical Systems under Uncertainty

Access: Use of this item is restricted to the UNT Community.
Date: August 2016
Creator: Xie, Junfei
Description: Managing large-scale dynamical systems (e.g., transportation systems, complex information systems, and power networks) in real time is very challenging considering their complicated system dynamics, intricate network interactions, large scale, and especially the existence of various uncertainties. To address this issue, intelligent techniques which can quickly design decision-making strategies that are robust to uncertainties are needed. This dissertation aims to conquer these challenges by exploring a data-driven decision-making framework, which leverages big-data techniques and scalable uncertainty evaluation approaches to quickly solve optimal control problems. In particular, the following techniques have been developed along this direction: 1) system modeling approaches to simplify the system analysis and design procedures for multiple applications; 2) effective simulation-based and analytical approaches to efficiently evaluate system performance and design control strategies under uncertainty; and 3) big-data techniques that allow some computations of control strategies to be completed offline. These techniques and tools for analysis, design, and control contribute to a wide range of applications including air traffic flow management, complex information systems, and airborne networks.
Contributing Partner: UNT Libraries