UNT Libraries - Browse


Infusing Automatic Question Generation with Natural Language Understanding

Description: Automatically generating questions from text for educational purposes is an active research area in natural language processing. The automatic question generation system accompanying this dissertation is MARGE, which is a recursive acronym for: MARGE automatically reads, generates, and evaluates. MARGE generates questions from both individual sentences and the passage as a whole, and is the first question generation system to successfully generate meaningful questions from textual units larger than a sentence. Prior work in automatic question generation from text treats a sentence as a string of constituents to be rearranged into as many questions as English grammar rules allow. Consequently, such systems overgenerate and create mainly trivial questions. Further, none of these systems to date has been able to automatically determine which questions are meaningful and which are trivial, because the research focus has been placed on natural language generation (NLG) at the expense of natural language understanding (NLU). In contrast, the work presented here infuses the question generation process with natural language understanding. From the input text, MARGE creates a meaning analysis representation for each sentence in a passage via the DeconStructure algorithm presented in this work. Questions are generated from sentence meaning analysis representations using templates (a small illustrative sketch follows this entry). The generated questions are automatically evaluated for question quality and importance via a ranking algorithm.
Date: December 2016
Creator: Mazidi, Karen
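
The abstract above says MARGE generates questions from per-sentence meaning analysis representations using templates. The sketch below is only a minimal illustration of template-based generation over a hypothetical shallow analysis with made-up field names (subject, predicate, object); it is not MARGE's DeconStructure representation or its templates.

```python
# Minimal illustrative sketch, not MARGE: map a hypothetical shallow sentence
# analysis (subject / predicate / object) to questions through fixed templates.
def generate_questions(analysis):
    """Return template-based questions for one analyzed sentence."""
    subj, pred, obj = analysis["subject"], analysis["predicate"], analysis["object"]
    return [
        f"What does {subj} {pred}?",      # asks about the object
        f"Who or what {pred}s {obj}?",    # asks about the subject
    ]

if __name__ == "__main__":
    sample = {"subject": "the enzyme", "predicate": "catalyze", "object": "the reaction"}
    for question in generate_questions(sample):
        print(question)
```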

Simulink Based Modeling of a Multi Global Navigation Satellite System

Description: The objective of this thesis is to design a model for a multi global navigation satellite system (multi-GNSS) using Simulink. It explains a design procedure that includes transmitter and receiver models for two different navigation systems. Because the performance of any positioning system degrades significantly when only a small number of satellites is visible for determining location, this research was done to make use of multi-GNSS satellite signals in a single navigation receiver.
Date: December 2016
Creator: Mukka, Nagaraju

Privacy Preserving EEG-based Authentication Using Perceptual Hashing

Description: The use of electroencephalogram (EEG), an electrophysiological monitoring method for recording brain activity, for authentication has attracted the interest of researchers for over a decade. In addition to exhibiting the qualities of biometric-based authentication, EEG credentials are revocable, impossible to mimic, and resistant to coercion attacks. However, EEG signals carry a wealth of information about an individual and can reveal private information about the user. This raises significant privacy issues for EEG-based authentication systems, since they have access to raw EEG signals. This thesis proposes a privacy-preserving EEG-based authentication system that protects the privacy of the user by not revealing the raw EEG signals while still allowing the system to authenticate the user accurately. To this end, perceptual hashing is utilized: instead of raw EEG signals, their perceptually hashed values are used in the authentication process (sketched after this entry). In addition to describing the authentication process, algorithms to compute the perceptual hash are developed based on two feature extraction techniques. Experimental results show that an authentication system using perceptual hashing can achieve performance comparable to a system that has access to raw EEG signals if enough EEG channels are used in the process. This thesis also presents a security analysis to show that perceptual hashing can prevent information leakage.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: December 2016
Creator: Koppikar, Samir Dilip
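
The thesis above replaces raw EEG signals with perceptually hashed values during authentication. As a generic illustration only (the thesis builds its hashes on two specific feature extraction techniques that are not reproduced here), the sketch below binarizes a feature vector against its median and authenticates with a Hamming-distance threshold; the threshold and feature dimension are placeholders.

```python
# Generic illustration only: a median-based perceptual hash over EEG feature
# vectors, compared by Hamming distance. Not the thesis's hash algorithms.
import numpy as np

def perceptual_hash(features: np.ndarray) -> np.ndarray:
    """Binarize a feature vector against its own median."""
    return (features > np.median(features)).astype(np.uint8)

def authenticate(enrolled_hash: np.ndarray, probe_features: np.ndarray, max_distance: int = 4) -> bool:
    """Accept the probe if its hash is within a Hamming-distance threshold."""
    probe_hash = perceptual_hash(probe_features)
    return int(np.sum(enrolled_hash != probe_hash)) <= max_distance

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    enrolled = rng.normal(size=64)                       # stand-in for extracted EEG features
    probe = enrolled + rng.normal(scale=0.1, size=64)    # same user, slight session variation
    print(authenticate(perceptual_hash(enrolled), probe))
```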

Influence of Underlying Random Walk Types in Population Models on Resulting Social Network Types and Epidemiological Dynamics

Description: Epidemiologists rely on human interaction networks to determine the states and dynamics of disease propagation in populations. However, such networks are empirical snapshots of the past. It would be of great benefit if human interaction networks could be statistically predicted and dynamically created while an epidemic is in progress. We develop an application framework for generating human interaction networks and running epidemiological processes, drawing on research on human mobility patterns and agent-based modeling. The interaction networks are dynamically constructed by incorporating different types of random walks and human rules of engagement (see the sketch after this entry). We explore the characteristics of the created networks and compare them with known theoretical and empirical graphs. The dependencies of epidemic dynamics and their outcomes on the patterns and parameters of human motion and motives are examined and presented through this research. This work specifically describes how the types and parameters of random walks define the properties of the generated graphs. We show that some configurations of the system of agents in random walk can produce network topologies with properties similar to small-world networks. Our goal is to find sets of mobility patterns that lead to empirical-like networks. The possibility of phase transitions in the graphs due to changes in the parameterization of agent walks is a focus of this research, as this knowledge can lead to the possibility of disrupting disease diffusion in populations. This research shall facilitate the work of public health researchers to predict the magnitude of an epidemic and estimate the resources required for mitigation.
Date: December 2016
Creator: Kolgushev, Oleg Mikhailovich
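
The dissertation above builds interaction networks from agents performing different types of random walks under rules of engagement. The sketch below is a deliberately simple stand-in: agents walk on a lattice and an edge is added whenever two agents share a cell. The walk type, grid size, and contact rule are placeholders, not the dissertation's configurations.

```python
# Simple stand-in for the dissertation's framework: lattice random walks where
# co-located agents become connected. Walk types and engagement rules differ
# in the actual work; everything here is a placeholder configuration.
import random
from itertools import combinations

import networkx as nx

def contact_network(n_agents=50, steps=500, grid=20, seed=1):
    random.seed(seed)
    pos = [(random.randrange(grid), random.randrange(grid)) for _ in range(n_agents)]
    g = nx.Graph()
    g.add_nodes_from(range(n_agents))
    for _ in range(steps):
        for i, (x, y) in enumerate(pos):                     # one random-walk step per agent
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            pos[i] = ((x + dx) % grid, (y + dy) % grid)
        cells = {}
        for i, p in enumerate(pos):                          # group agents by cell
            cells.setdefault(p, []).append(i)
        for agents in cells.values():                        # co-located agents interact
            g.add_edges_from(combinations(agents, 2))
    return g

if __name__ == "__main__":
    g = contact_network()
    print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges,",
          "clustering", round(nx.average_clustering(g), 3))
```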

Real Time Assessment of a Video Game Player's State of Mind Using Off-the-Shelf Electroencephalography

Description: The focus of this research is on the development of a real-time application that uses a low-cost EEG headset to measure a player's state of mind while they play a video game. Using data collected with the Emotiv EPOC headset, various EEG processing techniques are tested to find ways of measuring a person's engagement and arousal levels. The ability to measure a person's engagement and arousal levels provides an opportunity to develop a model that monitors a person's flow while playing video games. Identifying when certain events occur, such as when the player dies, makes it easier to identify when a player has left a state of flow. The real-time application Brainwave captures data from the wireless Emotiv EPOC headset and converts the raw EEG data into more meaningful brainwave band frequencies (a sketch follows this entry). Utilizing the brainwave frequencies, the program trains multiple machine learning algorithms with data designed to identify when the player dies. Brainwave runs while the player plays through a video game, monitoring their engagement and arousal levels for changes that cause the player to leave a state of flow. Brainwave reports to researchers and developers when the player dies, along with identification of the player's exit from the state of flow.
Date: December 2016
Creator: McMahan, Timothy
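
The description above mentions converting raw EEG into band frequencies and tracking engagement and arousal. One engagement index commonly cited in the EEG literature is beta / (alpha + theta); the sketch below computes it from Welch band powers as an assumed example of what such a metric could look like, not necessarily the metric Brainwave uses.

```python
# Assumed example metric: the commonly cited beta/(alpha+theta) engagement
# index computed from Welch band powers. Brainwave's actual processing of
# Emotiv EPOC data may differ.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, lo, hi):
    """Approximate power in [lo, hi) Hz from the Welch spectrum."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs < hi)
    return float(np.sum(psd[mask]) * (freqs[1] - freqs[0]))

def engagement_index(signal, fs=128):
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / (alpha + theta)

if __name__ == "__main__":
    fs = 128                                   # Emotiv EPOC samples at 128 Hz
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
    print(round(engagement_index(eeg, fs), 3))
```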

Evaluation Techniques and Graph-Based Algorithms for Automatic Summarization and Keyphrase Extraction

Description: Automatic text summarization and keyphrase extraction are two interesting areas of research that span natural language processing and information retrieval. They have recently become very popular because of their wide applicability. Devising generic techniques for these tasks is challenging due to several issues, yet we have a good number of intelligent systems performing the tasks. As different systems are designed with different perspectives, evaluating their performance with a generic strategy is crucial. It has also become immensely important to perform the evaluation with minimal human effort. In our work, we focus on designing a relativized scale for evaluating different algorithms. This is our major contribution, which challenges the traditional approach of working with an absolute scale. We consider the impact of some of the environment variables (length of the document, references, and system-generated outputs) on the performance. Instead of defining some rigid lengths, we show how to adjust to their variations. We establish a mathematically sound baseline that should work for all kinds of documents. We emphasize automatically determining the syntactic well-formedness of the structures (sentences). We also propose defining an equivalence class for each unit (e.g., word) instead of the exact string-matching strategy. We show an evaluation approach that considers the weighted relatedness of multiple references to adjust to the degree of disagreement between the gold standards (illustrated after this entry). We publish the proposed approach as a free tool so that other systems can use it. We have also accumulated a dataset (scientific articles) with a reference summary and keyphrases for each document. Our approach is applicable not only for evaluating single-document based tasks but also for evaluating multiple-document based tasks. We have tested our evaluation method for three intrinsic tasks (taken from the DUC 2004 conference), and in all three cases, it correlates positively with ROUGE. Based on our experiments ...
Date: August 2016
Creator: Hamid, Fahmida
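
The dissertation above weights the relatedness of multiple references to absorb disagreement between gold standards. Its actual relativized scale is not reproduced here; the sketch below only illustrates the general idea with keyphrase sets, weighting each reference by how strongly the other references agree with it (Jaccard overlap is an assumed similarity measure).

```python
# Generic illustration only: score a system's keyphrase set against several
# references, weighting each reference by its agreement with the others.
# The dissertation defines its own relativized scale; Jaccard is an assumption.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

def weighted_score(system, references):
    weights = []
    for i, ref in enumerate(references):
        others = [r for j, r in enumerate(references) if j != i]
        weights.append(sum(jaccard(ref, o) for o in others) / max(len(others), 1))
    total = sum(weights) or 1.0
    return sum(w * jaccard(system, ref) for w, ref in zip(weights, references)) / total

if __name__ == "__main__":
    refs = [{"graph", "summarization", "rouge"},
            {"graph", "summarization", "evaluation"},
            {"keyphrase", "summarization", "graph"}]
    print(round(weighted_score({"graph", "summarization", "keyphrase"}, refs), 3))
```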

Effects of UE Speed on MIMO Channel Capacity in LTE

Description: With the introduction of 4G LTE, multiple new technologies were introduced; MIMO is one of the important technologies introduced with the fourth generation. The main MIMO modes used in LTE are the open-loop and closed-loop spatial multiplexing modes. This thesis develops an algorithm to calculate the threshold values of UE speed and SNR that are required to implement a switching algorithm which can switch between different MIMO modes for a UE based on its speed and channel conditions (CSI); see the sketch after this entry. Specifically, this thesis provides the values of UE speed and SNR at which we can get better results by switching between open-loop and closed-loop MIMO modes and then scheduling the UE in sub-channels accordingly. Thus, the results can be used effectively to obtain better channel capacity with less ISI. The main objectives of this thesis are: to determine the type of MIMO mode suitable for a UE with a certain speed, to determine the effects of SNR on the selection of MIMO modes, and to design and implement a scheduling algorithm to enhance channel capacity.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: August 2016
Creator: Shukla, Rahul
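
The thesis above switches between open-loop and closed-loop spatial multiplexing based on UE speed and SNR. The sketch below pairs the standard MIMO capacity expression C = log2 det(I + (SNR/Nt) H H^H) with a toy threshold rule; the speed and SNR thresholds shown are placeholders, not the values derived in the thesis.

```python
# Standard MIMO capacity formula plus a toy mode-selection rule. The speed and
# SNR thresholds below are placeholders; the thesis computes its own values.
import numpy as np

def mimo_capacity(H, snr_linear):
    """C = log2 det(I + (SNR/Nt) * H H^H), in bps/Hz."""
    nr, nt = H.shape
    m = np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T)
    return float(np.log2(np.real(np.linalg.det(m))))

def select_mode(ue_speed_kmh, snr_db, speed_threshold=30.0, snr_threshold=10.0):
    """Toy rule: closed loop needs timely CSI (low speed) and adequate SNR."""
    if ue_speed_kmh < speed_threshold and snr_db > snr_threshold:
        return "closed-loop spatial multiplexing"
    return "open-loop spatial multiplexing"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
    print(round(mimo_capacity(H, 10 ** (15 / 10)), 2), "bps/Hz at 15 dB")
    print(select_mode(ue_speed_kmh=60, snr_db=15))
```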

Data-Driven Decision-Making Framework for Large-Scale Dynamical Systems under Uncertainty

Description: Managing large-scale dynamical systems (e.g., transportation systems, complex information systems, and power networks) in real time is very challenging considering their complicated system dynamics, intricate network interactions, large scale, and especially the existence of various uncertainties. To address this issue, intelligent techniques which can quickly design decision-making strategies that are robust to uncertainties are needed. This dissertation aims to conquer these challenges by exploring a data-driven decision-making framework, which leverages big-data techniques and scalable uncertainty evaluation approaches to quickly solve optimal control problems. In particular, the following techniques have been developed along this direction: 1) system modeling approaches that simplify the system analysis and design procedures for multiple applications; 2) effective simulation-based and analytical approaches to efficiently evaluate system performance and design control strategies under uncertainty; and 3) big-data techniques that allow some computations of control strategies to be completed offline. These techniques and tools for analysis, design, and control contribute to a wide range of applications including air traffic flow management, complex information systems, and airborne networks.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: August 2016
Creator: Xie, Junfei

Sensing and Decoding Brain States for Predicting and Enhancing Human Behavior, Health, and Security

Description: The human brain acts as an intelligent sensor by helping in effective signal communication and execution of logical functions and instructions, thus coordinating all functions of the human body. More importantly, it shows the potential to combine prior knowledge with adaptive learning, thus ensuring constant improvement. These qualities help the brain interact efficiently with both the body (brain-body) and the environment (brain-environment). This dissertation attempts to apply these brain-body-environment interactions (BBEI) to elevate human existence and enhance our day-to-day experiences. For instance, when one stepped out of the house in the past, one had to carry keys (for unlocking), money (for purchasing), and a phone (for communication). With the advent of smartphones, this scenario changed completely, and today it is often enough to carry just one's smartphone because all of the above activities can be performed with a single device. In the future, with advanced research and progress in BBEI interactions, one will be able to perform many activities by dictating them in one's mind without any physical involvement. This dissertation aims to shift the paradigm of existing brain-computer interfaces from just 'control' to 'monitor, control, enhance, and restore' in three main areas: healthcare, transportation safety, and cryptography. In healthcare, measures were developed for understanding brain-body interactions by correlating cerebral autoregulation with brain signals. The variation in estimated cerebral blood flow (obtained through EEG) was detected alongside evoked changes in blood pressure, enabling EEG metrics to be used as a first-line screening tool to check for impaired cerebral autoregulation. To enhance road safety, distracted drivers' behavior in various multitasking scenarios while driving was identified by significant changes in the time-frequency spectrum of the EEG signals. A distraction metric was calculated to rank the severity of a distraction task that can be used as an intuitive measure ...
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: August 2016
Creator: Bajwa, Garima

Network Security Tool for a Novice

Description: Network security is a complex field that is handled by security professionals who need certain expertise and experience to configure security systems. With the ever-increasing size of networks, managing them is going to be a daunting task. What kind of solution can be used to generate effective security configurations by both security professionals and nonprofessionals alike? In this thesis, a web tool is developed to simplify the process of configuring security systems by translating direct human language input into meaningful, working security rules. These human language inputs yield the security rules that the individual wants to implement in their network. The human language input can be as simple as, "Block Facebook to my son's PC" (a toy translation is sketched after this entry). This tool translates these inputs into specific security rules and installs the translated rules into security equipment such as a virtualized Cisco FWSM network firewall, the Netfilter host-based firewall, and the Snort network intrusion detection system. The tool is implemented and tested in both a traditional network and a cloud environment. One thousand input policies were collected from various users, such as staff from UNT departments and health science, including individuals with a network security background as well as students with a non-computer-science background, to analyze the tool's performance. The tool is tested for its accuracy (91%) in generating a security rule. It is also tested for the accuracy of the translated rule (86%) compared to a standard rule written by security professionals. Nevertheless, the network security tool built has shown promise to both experienced and inexperienced people in the network security field by simplifying the provisioning process to produce accurate and effective network security rules.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: August 2016
Creator: Ganduri, Rajasekhar
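
The thesis above translates natural-language policies such as "Block Facebook to my son's PC" into rules for Cisco FWSM, Netfilter, and Snort. As a toy illustration only, the sketch below maps a few keywords to an iptables-style rule string; the device inventory, service map, and rule template are all hypothetical, not the tool's grammar or output.

```python
# Toy illustration only: keyword-based translation of a natural-language
# policy into an iptables-style rule. The thesis's tool performs much richer
# language analysis; the maps and rule template below are hypothetical.
DEVICE_MAP = {"my son's pc": "192.168.1.20"}     # hypothetical device inventory
SERVICE_MAP = {"facebook": "facebook.com"}       # hypothetical service map

def translate(policy: str) -> str:
    text = policy.lower()
    action = "DROP" if "block" in text else "ACCEPT"
    service = next((dest for key, dest in SERVICE_MAP.items() if key in text), None)
    host = next((ip for key, ip in DEVICE_MAP.items() if key in text), None)
    if service is None or host is None:
        raise ValueError("could not resolve policy: " + policy)
    return f"iptables -A FORWARD -s {host} -d {service} -j {action}"

if __name__ == "__main__":
    print(translate("Block Facebook to my son's PC"))
```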

New Frameworks for Secure Image Communication in the Internet of Things (IoT)

Description: The continuous expansion of technology, broadband connectivity, and the wide range of new devices in the IoT cause serious concerns regarding privacy and security. In addition, a key challenge in the IoT is the storage and management of massive data streams; for example, there is always a demand for an acceptable size with the highest quality possible for images to meet the rapidly increasing number of multimedia applications. Given the fast evolution of the IoT, the effort in this dissertation contributes to resolving concerns related to the security and compression functions in image communications in the Internet of Things (IoT). This dissertation proposes frameworks for a secure digital camera (SDC) in the IoT. The objectives of this dissertation are twofold. On the one hand, the proposed framework architecture offers a double layer of protection, encryption and watermarking, that addresses issues related to security, privacy, and digital rights management (DRM) by applying a hardware architecture of the state-of-the-art image compression technique Better Portable Graphics (BPG), which achieves a high compression ratio with small file size. On the other hand, the proposed SBPG framework is integrated with the digital camera. Thus, the proposed SBPG framework integrated with the SDC is suitable for high-performance imaging in the IoT, such as Intelligent Traffic Surveillance (ITS) and telemedicine. Because power consumption has become a major concern in any portable application, a low-power design of SBPG is proposed to achieve an energy-efficient SBPG design. As the visual quality of the watermarked and compressed images improves with larger values of PSNR (the metric is sketched after this entry), the results show that the proposed SBPG substantially increases the quality of the watermarked compressed images. Higher values of PSNR also show how robust the algorithm is to different types of attack. From the results obtained for the energy-efficient SBPG ...
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: August 2016
Creator: Albalawi, Umar Abdalah S
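
The abstract above judges watermarked, compressed images by PSNR. For reference, the standard definition PSNR = 10 * log10(MAX^2 / MSE) is sketched below; this is the textbook formula, not anything specific to the SBPG design.

```python
# The standard PSNR definition, PSNR = 10 * log10(MAX^2 / MSE); higher values
# indicate that the processed image is closer to the original.
import numpy as np

def psnr(original: np.ndarray, processed: np.ndarray, max_value: float = 255.0) -> float:
    mse = np.mean((original.astype(np.float64) - processed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")        # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    noisy = np.clip(img + rng.normal(scale=3, size=img.shape), 0, 255).astype(np.uint8)
    print(round(psnr(img, noisy), 2), "dB")
```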

Modeling and Simulation of the Vector-Borne Dengue Disease and the Effects of Regional Variation of Temperature in the Disease Prevalence in Homogenous and Heterogeneous Human Populations

Description: The history of mitigation programs to contain vector-borne diseases is a story of successes and failures. Due to the complex interplay among the multiple factors that determine disease dynamics, the general principles for timely and specific intervention to reduce incidence or eradicate life-threatening diseases have yet to be determined. This research discusses computational methods developed to assist in the understanding of the complex relationships affecting vector-borne disease dynamics. A computational framework to assist public health practitioners with exploring the dynamics of vector-borne diseases, such as malaria and dengue, in homogenous and heterogeneous populations has been conceived, designed, and implemented. The framework integrates a stochastic computational model of interactions to simulate horizontal disease transmission (a minimal sketch follows this entry). The intent of the computational modeling has been to integrate stochasticity during simulation of the disease progression while reducing the number of interactions necessary to simulate a disease outbreak. While reducing the number of interactions needed for simulating disease dynamics improves computational time, the realization of interactions can remain computationally expensive. Multi-threading was therefore used to improve performance over the original computational model, and the multi-threading experimental results have been tested and reported. In addition to the contact model, the modeling of biological processes specific to the corresponding pathogen-carrying vector has been integrated to increase the specificity of the vector-borne disease model. Last, automation for requesting, retrieving, parsing, and storing specific weather data and geospatial information from federal agencies has been implemented to study the differences between homogenous and heterogeneous populations.
Date: August 2016
Creator: Bravo-Salgado, Angel D
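
The framework above simulates horizontal transmission stochastically over realized interactions. The sketch below shows one minimal SIR-style stochastic step over a contact list, with a toy temperature factor scaling the transmission probability; the dissertation's vector biology, calibrated parameters, and multi-threaded realization of interactions are not reproduced here.

```python
# Minimal stand-in: one stochastic SIR-style step over a list of contacts,
# with a toy temperature factor on the transmission probability. The actual
# vector-borne model and its weather-driven parameters differ.
import random

def step(states, contacts, base_p=0.05, temp_factor=1.0, recovery_p=0.1, rng=random):
    new_states = dict(states)
    for a, b in contacts:
        if {states[a], states[b]} == {"S", "I"} and rng.random() < base_p * temp_factor:
            target = a if states[a] == "S" else b
            new_states[target] = "I"                 # susceptible contact gets infected
    for node, s in states.items():
        if s == "I" and rng.random() < recovery_p:
            new_states[node] = "R"                   # infectious node recovers
    return new_states

if __name__ == "__main__":
    random.seed(0)
    states = {0: "I", 1: "S", 2: "S", 3: "S"}
    contacts = [(0, 1), (1, 2), (2, 3)]
    for _ in range(30):
        states = step(states, contacts, temp_factor=1.4)
    print(states)
```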

Exploring Analog and Digital Design Using the Open-Source Electric VLSI Design System

Description: The design of VLSI electronic circuits can be carried out at many different abstraction levels, from system behavior down to the most detailed, physical layout level. As the number of transistors in VLSI circuits increases, the complexity of the design also increases, and it is now beyond human ability to manage unaided. Hence CAD (Computer-Aided Design) or EDA (Electronic Design Automation) tools are involved in the design. EDA or CAD tools automate the design, verification, and testing of these VLSI circuits. In today's market there are many EDA tools available; however, they are very expensive and require high-performance platforms. One of the key challenges today is to select an appropriate CAD or EDA tool that is open source for academic purposes. This thesis provides a detailed examination of an open-source EDA tool called the Electric VLSI Design System. Electric fulfills these requirements: it is an excellent and efficient CAD tool that students and teachers can use to implement their ideas, including by modifying its source code. The primary objective of this thesis is to explain the Electric software's features and architecture and to provide various digital and analog designs implemented with this software for educational purposes. Since the choice of an EDA tool is based on its efficiency and the functions it provides, this thesis explains all of the analysis and synthesis tools that Electric provides and how efficient they are. Hence, this thesis is of benefit to students and teachers who choose Electric as their open-source EDA tool for educational purposes.
Date: May 2016
Creator: Aluru, Gunasekhar

An Empirical Study of How Novice Programmers Use the Web

Description: Students often use the web as a source of help for problems that they encounter on programming assignments. In this work, we seek to understand how students use the web to search for help on their assignments. We used a mixed-methods approach with 344 students who completed a survey and 41 students who participated in focus group meetings and helped in recording data about their search habits. The survey reveals data about students' reported search habits, while the focus group uses a web browser plug-in to record actual search patterns. We examine the results collectively and as broken down by class year. Survey results show that at least two-thirds of the students from each class year rely on search engines to locate resources for help with their programming bugs in at least half of their assignments; search habits vary by class year; and the value of different types of resources, such as tutorials and forums, varies by class year. Focus group results expose the high-frequency web sites used by the students in solving their programming assignments.
Date: May 2016
Creator: Tula, Naveen

Analysis and Optimization of Graphene FET based Nanoelectronic Integrated Circuits

Description: Like cells to the human body, transistors are the basic building blocks of any electronic circuit. Silicon has been the industry's obvious choice for making transistors. Transistors with large size occupy a large chip area, consume a lot of power, and limit the number of functionalities due to area constraints. Thus, to make devices smaller, smarter, and faster, transistors are aggressively scaled down in each generation. Moore's law states that the transistor count in an electronic circuit doubles every 18 months. Following Moore's law, the transistor has already been scaled down to 14 nm. However, there are limits to how much further these transistors can be scaled down. Particularly below 10 nm, silicon-based transistors hit fundamental limits such as loss of gate control, high leakage, and various other short-channel effects. Thus it is not possible to favor silicon transistors for future electronics applications. As a result, research has shifted to new device concepts and device materials alternative to silicon. Carbon is among the most abundant elements found on Earth, and one such carbon-based nanomaterial is graphene. Graphene, extracted from graphite, the same material used as the lead in pencils, has tremendous potential to take future electronic devices to new heights in terms of size, cost, and efficiency. Thus, after its first experimental isolation in 2004, graphene has been a leading research area for both academia and industry. This dissertation is focused on the analysis and optimization of graphene-based circuits for future electronics. The first part of this dissertation considers graphene-based transistors for analog/radio frequency (RF) circuits. In this section, a dual-gate Graphene Field Effect Transistor (GFET) is considered to build case-study circuits such as a voltage-controlled oscillator (VCO) and low ...
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: May 2016
Creator: Joshi, Shital

Learning from small data set for object recognition in mobile platforms.

Description: Have you ever stood at a door with a bunch of keys and tried to find the right one to unlock it? Have you ever held a flower and wondered its name? A need for object recognition can arise anytime and anywhere in our daily lives. With the development of mobile devices, object recognition applications have become able to provide immediate assistance. However, performing complex tasks on even the most advanced mobile platforms still faces great challenges due to limited computing resources and computing power. In this thesis, we present an object recognition system that resides and executes within a mobile device and can efficiently extract image features and perform learning and classification. To account for the computing constraint, a novel feature extraction method that minimizes the data size and maintains data consistency is proposed. The system leverages the principal component analysis method and is able to update the trained classifier when new examples become available (sketched after this entry). Our system relieves users from creating a large number of examples and makes it user friendly. The experimental results demonstrate that a learning method trained with a very small number of examples can achieve recognition accuracy above 90% in various acquisition conditions. In addition, the system is able to perform learning efficiently.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: May 2016
Creator: Liu, Siyuan
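
The thesis above extracts compact features with principal component analysis and updates the trained classifier as new examples arrive. The sketch below is a generic stand-in, pairing scikit-learn's PCA with a nearest-centroid classifier whose class means absorb new labeled examples; it is not the thesis's feature extraction or update rule.

```python
# Generic stand-in: PCA-compressed features plus a nearest-centroid classifier
# whose class means can be updated incrementally with new labeled examples.
import numpy as np
from sklearn.decomposition import PCA

class IncrementalCentroidClassifier:
    def __init__(self, n_components=4):
        self.pca = PCA(n_components=n_components)
        self.sums, self.counts = {}, {}

    def fit(self, X, y):
        for z, label in zip(self.pca.fit_transform(X), y):
            self._absorb(z, label)
        return self

    def update(self, x, label):
        """Absorb one new labeled example without refitting the PCA."""
        self._absorb(self.pca.transform(np.asarray(x).reshape(1, -1))[0], label)

    def _absorb(self, z, label):
        self.sums[label] = self.sums.get(label, 0) + z
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, X):
        labels = list(self.sums)
        centroids = np.stack([self.sums[l] / self.counts[l] for l in labels])
        return [labels[int(np.argmin(np.linalg.norm(centroids - z, axis=1)))]
                for z in self.pca.transform(X)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (10, 32)), rng.normal(3, 1, (10, 32))])
    clf = IncrementalCentroidClassifier().fit(X, [0] * 10 + [1] * 10)
    clf.update(rng.normal(3, 1, 32), 1)              # learn from one new example
    print(clf.predict(rng.normal(3, 1, (2, 32))))
```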

Simulink(R) Based Design and Implementation of a Solar Power Based Mobile Charger

Description: Electrical energy is used world-wide at a rate of approximately 15 terawatts. Generating this much energy has become a primary concern for all nations. There are many ways of generating energy, among which the most commonly used are non-renewable and will be exhausted much sooner than expected. Very active research is going on both to increase the use of renewable energy sources and to use the available energy more efficiently. Among these sources, solar energy is considered the most abundant and has received high attention. The mobile phone has become one of the basic needs of modern life, with almost every human being having one. Individually a mobile phone consumes little power, but collectively this becomes very large. This consideration motivated the research undertaken in this master's thesis. The objective of this thesis is to design a model of a solar-power-based charging circuit for mobile phones using Simulink(R). This thesis explains the design procedure of a solar-power-based mobile charger circuit using Simulink(R), which includes models for the photovoltaic array, maximum power point tracker (a sketch of one common tracking technique follows this entry), pulse width modulator, DC-DC converter, and a battery. The first part of the thesis concentrates on the electron-level behavior of a solar cell, its structure, and its electrical model. The second part is to design an array of solar cells to generate the desired output. Finally, the third part is to design a DC-DC converter which can stabilize and provide the required input to the battery with the help of the maximum power point tracker and pulse width modulation. The obtained DC-DC converter is adjustable to meet the requirements of the battery. This design is aimed at charging a lithium-ion battery with a nominal voltage of 3.7 V, which can be taken as a baseline to charge different types of batteries with different nominal voltages.
Date: May 2016
Creator: Mukka, Manoj Kumar
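
The thesis above models a maximum power point tracker in Simulink(R). As a language-neutral illustration of what such a tracker does, the sketch below implements the common perturb-and-observe algorithm against a toy photovoltaic power curve in Python; the curve, step size, and the choice of perturb-and-observe itself are assumptions, not necessarily the thesis's MPPT design.

```python
# Illustration under assumptions: the common perturb-and-observe MPPT loop on
# a toy PV power curve. The thesis's Simulink MPPT model may use a different
# algorithm and a physically derived panel model.
def pv_power(voltage):
    """Toy single-peak PV curve (peak near 17.5 V); not a real panel model."""
    if voltage <= 0 or voltage >= 21:
        return 0.0
    return voltage * (2.2 - 0.13 * max(voltage - 17.0, 0.0) ** 2)

def perturb_and_observe(v=12.0, step=0.2, iterations=200):
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:              # power dropped, so reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

if __name__ == "__main__":
    v_mpp, p_mpp = perturb_and_observe()
    print(f"operating near V = {v_mpp:.1f} V, P = {p_mpp:.1f} W")
```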

Detection of Ulcerative Colitis Severity and Enhancement of Informative Frame Filtering Using Texture Analysis in Colonoscopy Videos

Description: There are several types of disorders that affect our colon's ability to function properly, such as colorectal cancer, ulcerative colitis, diverticulitis, irritable bowel syndrome, and colonic polyps. Automatic detection of these diseases would inform the endoscopist of possible sub-optimal inspection during the colonoscopy procedure as well as save time during post-procedure evaluation. However, existing systems detect only a few of those disorders, such as colonic polyps. In this dissertation, we address the automatic detection of another important disorder, ulcerative colitis. We propose a novel texture feature extraction technique to detect the severity of ulcerative colitis at the block, image, and video levels. We also enhance current informative frame filtering methods by detecting water and bubble frames using our proposed technique. Our feature extraction algorithm, based on the accumulation of pixel value differences (a rough sketch follows this entry), provides better accuracy at faster speed than existing methods, making it highly suitable for real-time systems. We also propose a hybrid approach in which our feature method is combined with existing feature method(s) to provide even better accuracy. We extend the block- and image-level detection method to video-level severity score calculation and shot segmentation. Also, the proposed novel feature extraction method can detect water and bubble frames in colonoscopy videos with very high accuracy in significantly less processing time, even when clustering is used to reduce the training size by a factor of ten.
Date: December 2015
Creator: Dahal, Ashok
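
The dissertation's feature above is based on accumulating pixel value differences; the exact accumulation scheme is defined in the dissertation and not reproduced here. The sketch below only illustrates the flavor of such a feature by summing absolute differences between neighboring pixels within each block.

```python
# Rough illustration only: a block-level texture score that accumulates
# absolute differences between neighboring pixels. The dissertation's actual
# accumulation scheme and its severity classifiers are not reproduced here.
import numpy as np

def block_texture_score(block: np.ndarray) -> float:
    block = block.astype(np.float64)
    dx = np.abs(np.diff(block, axis=1)).sum()    # horizontal neighbor differences
    dy = np.abs(np.diff(block, axis=0)).sum()    # vertical neighbor differences
    return (dx + dy) / block.size

def image_scores(image: np.ndarray, block: int = 32):
    h, w = image.shape
    return [block_texture_score(image[r:r + block, c:c + block])
            for r in range(0, h - block + 1, block)
            for c in range(0, w - block + 1, block)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
    scores = image_scores(frame)
    print(len(scores), "blocks, mean score", round(float(np.mean(scores)), 2))
```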

Integrity Verification of Applications on Radium Architecture

Description: Trusted Computing capability has become ubiquitous these days, and it is being widely deployed into consumer devices as well as enterprise platforms. As the number of threats is increasing at an exponential rate, it is becoming a daunting task to secure systems against them. In this context, software integrity measurement at runtime with the support of trusted platforms can be a better security strategy. Trusted Computing devices like the TPM secure the evidence of a breach or an attack, and they remain tamper-proof if the hardware platform is physically secured. This type of trusted security is crucial for forensic analysis in the aftermath of a breach. The advantages of trusted platforms can be further leveraged if they are used wisely. RADIUM (Race-free on-demand Integrity Measurement Architecture) is one such architecture, which is built on the strength of the TPM. RADIUM provides an asynchronous root of trust to overcome the TOC condition of DRTM. Even though the underlying architecture is trusted, attacks can still compromise applications during runtime by exploiting their vulnerabilities. I propose an application-level integrity measurement solution that fits into RADIUM, to expand the trusted computing capability to the application layer. This is based on the concept of program invariants that can be used to learn the correct behavior of an application. I used Daikon, a tool to obtain dynamic likely invariants, and developed a method of observing these properties at runtime to verify the integrity (a sketch of the general idea follows this entry). The integrity measurement component was implemented as a Python module on top of Volatility, a virtual machine introspection tool. My approach is a first step towards integrity attestation using hypervisor-based introspection on RADIUM and a proof of concept of application-level measurement capability.
Date: August 2015
Creator: Tarigopula, Mohan Krishna
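
The solution above observes Daikon-derived likely invariants at runtime to verify application integrity. The sketch below only conveys the general idea of checking a set of invariants against observed program state; the invariants and state shown are made up, and the thesis itself observes state through Volatility-based virtual machine introspection rather than in-process checks.

```python
# General idea only: check a set of Daikon-style likely invariants against an
# observed program state. The invariants and state here are made-up examples;
# the thesis observes state via Volatility-based VM introspection.
INVARIANTS = {
    "balance is non-negative": lambda s: s["balance"] >= 0,
    "items equals length of cart": lambda s: s["items"] == len(s["cart"]),
}

def verify_integrity(state):
    """Return the names of invariants violated by the observed state."""
    return [name for name, check in INVARIANTS.items() if not check(state)]

if __name__ == "__main__":
    good = {"balance": 10, "items": 2, "cart": ["a", "b"]}
    tampered = {"balance": -5, "items": 3, "cart": ["a", "b"]}
    print(verify_integrity(good))        # expected: []
    print(verify_integrity(tampered))    # expected: both invariants violated
```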

Automatic Removal of Complex Shadows From Indoor Videos

Description: Shadows in indoor scenarios are usually characterized by multiple light sources that produce complex shadow patterns of a single object. Without removing shadows, the foreground object tends to be erroneously segmented. The inconsistent hue and intensity of shadows make automatic removal a challenging task. In this thesis, a dynamic thresholding and transfer learning-based method for removing shadows is proposed. The method suppresses light shadows with a dynamically computed threshold (one generic approach is sketched after this entry) and removes dark shadows using an online learning strategy that is built upon a base classifier trained with manually annotated examples and refined with automatically identified examples in the new videos. Experimental results demonstrate that, despite variation in lighting conditions across videos, the proposed method is able to adapt to the videos and remove shadows effectively. The sensitivity of shadow detection changes slightly with the different confidence levels used in example selection for classifier retraining, and a high confidence level usually yields better performance with fewer retraining iterations.
Date: August 2015
Creator: Mohapatra, Deepankar
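
The method above suppresses light shadows with a dynamically computed threshold. One generic way to express that idea is sketched below: pixels whose intensity ratio to a background model falls inside a dynamically chosen band are flagged as light shadow. The ratio band and percentile used here are placeholders, not the thesis's threshold computation.

```python
# One generic expression of dynamic thresholding for light shadows: flag pixels
# whose frame/background intensity ratio falls inside a dynamically chosen
# band. The percentile-based bound here is a placeholder.
import numpy as np

def light_shadow_mask(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    ratio = frame.astype(np.float64) / np.maximum(background.astype(np.float64), 1.0)
    darkened = ratio[ratio < 1.0]
    low = np.percentile(darkened, 25) if darkened.size else 0.5   # dynamic lower bound
    return (ratio >= low) & (ratio < 1.0)

if __name__ == "__main__":
    bg = np.full((64, 64), 180, dtype=np.uint8)
    frame = bg.copy()
    frame[20:40, 20:40] = 120                    # a softly shadowed region
    print(int(light_shadow_mask(frame, bg).sum()), "pixels flagged as light shadow")
```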

Algorithm Optimizations in Genomic Analysis Using Entropic Dissection

Description: In recent years, the collection of genomic data has skyrocketed, and databases of genomic data are growing at a faster rate than ever before. Although many computational methods have been developed to interpret these data, they tend to struggle to process the ever-increasing file sizes being produced and fail to take advantage of the advances in multi-core processors by using parallel processing. In some instances, loss of accuracy has been a necessary trade-off to allow faster computation of the data. This thesis discusses one such algorithm that has been developed and how changes were made to allow larger input file sizes and reduce the time required to achieve a result without sacrificing accuracy. An information entropy based algorithm was used as a basis to demonstrate these techniques (a sketch follows this entry). The algorithm dissects the distinctive patterns underlying genomic data efficiently, requiring no a priori knowledge, and thus is applicable in a variety of biological research applications. This research describes how parallel processing and object-oriented programming techniques were used to process larger files in less time and achieve a more accurate result from the algorithm. Through object-oriented techniques, the maximum allowable input file size was significantly increased from 200 MB to 2,000 MB. Using parallel processing techniques allowed the program to finish processing data in less than half the time of the sequential version. The accuracy of the algorithm was improved by reducing data loss throughout the algorithm. Finally, adding user-friendly options enabled the program to handle requests more effectively and further customize the logic used within the algorithm.
Date: August 2015
Creator: Danks, Jacob R.
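
The thesis above parallelizes an information-entropy-based dissection of genomic data. The sketch below is a generic illustration of that combination, computing Shannon entropy of k-mer frequencies for sequence windows with a multiprocessing pool; it is not the dissertation's dissection algorithm or its object-oriented file streaming.

```python
# Generic illustration: Shannon entropy of k-mer frequencies per sequence
# window, computed in parallel with multiprocessing. Not the dissertation's
# entropic dissection algorithm.
import math
from collections import Counter
from multiprocessing import Pool

def window_entropy(window: str, k: int = 3) -> float:
    counts = Counter(window[i:i + k] for i in range(len(window) - k + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def entropies(sequence: str, window: int = 1000, workers: int = 4):
    chunks = [sequence[i:i + window] for i in range(0, len(sequence) - window + 1, window)]
    with Pool(workers) as pool:
        return pool.map(window_entropy, chunks)

if __name__ == "__main__":
    import random
    random.seed(0)
    seq = "".join(random.choice("ACGT") for _ in range(10_000))
    print([round(e, 2) for e in entropies(seq)[:5]])
```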

Radium: Secure Policy Engine in Hypervisor

Description: The basis of today's security systems is the trust and confidence that the system will behave as expected and is in a known good, trusted state. The trust is built from hardware and software elements that generate a chain of trust originating from a trusted, known entity. Leveraging hardware, software, and mandatory access control policy technology is needed to create a trusted measurement environment. Employing a control layer (hypervisor or microkernel) with the ability to enforce a fine-grained access control policy with hypercall granularity across multiple guest virtual domains can ensure that any malicious environment is contained. In my research, I propose the use of RADIUM's Asynchronous Root of Trust Measurement (ARTM) capability incorporated with a secure mandatory access control policy engine that would mitigate the limitations of current hardware TPM solutions. By employing ARTM we can leverage asynchronous measurement at boot, launch, and use, with the hypervisor proving its state and the integrity of the secure policy. My solution uses the RADIUM (Race-free on-demand integrity measurement) architecture, which will allow a more detailed measurement of applications at run time with greater semantic knowledge of the measured environments. RADIUM's incorporation of a secure access control policy engine will give it the ability to limit or empower a virtual domain system. It can also enable the creation of a service-oriented model of guest virtual domains that have the ability to perform certain operations, such as introspecting other virtual domain systems to determine their integrity or system state and report it to a remote entity.
Date: August 2015
Creator: Shah, Tawfiq M

Maintaining Web Applications Integrity Running on Radium

Description: Computer security attacks take place due to the presence of vulnerabilities and bugs in software applications. Bugs and vulnerabilities are the result of weak software architecture and a lack of standard software development practices. Despite the fact that software companies are investing millions of dollars in the research and development of software designs, security risks are still at large. In some cases software applications are found to carry vulnerabilities for many years before being identified; a recent example is the Heartbleed bug in OpenSSL/TLS. In today's world, where new software applications are continuously being developed for a varied community of users, it is highly unlikely to have software applications running without flaws. Attackers exploit these vulnerabilities and bugs and threaten privacy without leaving any trace. The most critical vulnerabilities are those related to the integrity of the software applications, because integrity is directly linked to the credibility of a software application and the data it contains. Here I present a solution for maintaining the integrity of web applications running on RADIUM by using Daikon. Daikon generates invariants; these invariants are used to maintain the integrity of the web application and also to check the correct behavior of the web application at run time on the RADIUM architecture in case of any attack or malware. I use data invariants and program flow invariants in my solution to maintain the integrity of the web application against such attacks or malware. I check the behavior of my proposed invariants at run time using the LibVMI/Volatility memory introspection tools. This is a novel approach and a proof of concept toward maintaining web application integrity on RADIUM.
Date: August 2015
Creator: Ur-Rehman, Wasi

Towards Resistance Detection in Health Behavior Change Dialogue Systems

Description: One of the challenges fairly common in motivational interviewing is patient resistance to health behavior change. Hence, automated dialog systems aimed at counseling patients need to be capable of detecting resistance and appropriately altering the dialog. This thesis focuses primarily on the development of such a system for automatic identification of patient resistance to behavioral change. This enables the dialogue system to direct the discourse towards more agreeable ground and to help the patient overcome the obstacles in his or her way to change. This thesis also proposes a dialogue system framework for health behavior change via natural language analysis and generation. The proposed framework facilitates automated motivational interviewing from clinical psychology and involves three broad stages: rapport building and health topic identification, assessment of the patient's opinion about making a change, and developing a plan. Using this framework, patients can be encouraged to reflect on the options available and choose the best for a healthier life.
Date: August 2015
Creator: Sarma, Bandita