Search Results

open access

Accuracy and Interpretability Testing of Text Mining Methods

Description: Extracting meaningful information from large collections of text data is problematic because of the sheer size of the database. However, automated analytic methods capable of processing such data have emerged. These methods, collectively called text mining, first began to appear in 1988. A number of additional text mining methods quickly developed in independent research silos, with each based on a unique mathematical algorithm. How well each of these methods analyzes text is unclear. … more
Date: August 2013
Creator: Ashton, Triss A.
Partner: UNT Libraries
open access

Application of Spectral Analysis to the Cycle Regression Algorithm

Description: Many techniques have been developed to analyze time series. Spectral analysis and cycle regression analysis represent two such techniques. This study combines these two powerful tools to produce two new algorithms: the spectral algorithm and the one-pass algorithm. This research encompasses four objectives. The first objective is to link spectral analysis with cycle regression analysis to determine an initial estimate of the sinusoidal period. The second objective is to determine the best spect… more
Date: August 1984
Creator: Shah, Vivek
Partner: UNT Libraries
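
The first objective described above, obtaining an initial estimate of the sinusoidal period from spectral analysis, can be illustrated with a periodogram. A minimal sketch in Python; the simulated series, its period, and the noise level are illustrative assumptions, not the dissertation's data or algorithm.

```python
import numpy as np

def dominant_period(y, dt=1.0):
    """Estimate the period of the strongest sinusoidal component in y
    from the peak of its periodogram (squared FFT magnitude)."""
    y = np.asarray(y, dtype=float) - np.mean(y)   # remove the mean
    freqs = np.fft.rfftfreq(len(y), d=dt)         # nonnegative frequencies
    power = np.abs(np.fft.rfft(y)) ** 2           # periodogram ordinates
    k = np.argmax(power[1:]) + 1                  # skip the zero frequency
    return 1.0 / freqs[k]                         # period = 1 / frequency

# Example: a noisy sinusoid with period 25
t = np.arange(200)
y = np.sin(2 * np.pi * t / 25) + 0.3 * np.random.randn(200)
print(dominant_period(y))   # should be close to 25
```

The estimated period from the periodogram peak would then serve as the starting value handed to the cycle regression step.
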
open access

Call Option Premium Dynamics

Description: This study has a twofold purpose: to demonstrate the use of the Marquardt compromise method in estimating the unknown parameters contained in the probability call-option pricing models and to test empirically the following models: the Boness, the Black-Scholes, the Merton proportional dividend, the Ingersoll differential tax, and the Ingersoll proportional dividend and differential tax.
Date: December 1982
Creator: Chen, Jim
Partner: UNT Libraries
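
Of the models listed above, the Black-Scholes formula is the most widely known: a European call is priced as C = S·N(d1) − K·e^(−rT)·N(d2). A sketch of that formula in Python follows; the parameter values are illustrative, and the Marquardt compromise estimation itself is not reproduced here.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """European call price under Black-Scholes.
    S: spot, K: strike, T: years to expiry, r: risk-free rate, sigma: volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

print(round(black_scholes_call(S=100, K=95, T=0.5, r=0.05, sigma=0.2), 2))
```
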
open access

The Chi Square Approximation to the Hypergeometric Probability Distribution

Description: This study compared the results of the chi square test of independence and the corrected chi square statistic against Fisher's exact probability test (the hypergeometric distribution) in connection with sampling from a finite population. Data were collected by advancing the minimum cell size from zero to a maximum which resulted in a tail area probability of 20 percent for sample sizes from 10 to 100 by varying increments. Analysis of the data supported the rejection of the null hypotheses rega… more
Date: August 1982
Creator: Anderson, Randy J. (Randy Jay)
Partner: UNT Libraries
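
The comparison described above can be reproduced in miniature with SciPy: the chi square test of independence (with and without the continuity correction) against Fisher's exact test on a single 2×2 table. The cell counts below are illustrative, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

table = np.array([[8, 2],
                  [1, 9]])            # illustrative 2x2 counts

chi2_raw, p_raw, _, _ = chi2_contingency(table, correction=False)
chi2_corr, p_corr, _, _ = chi2_contingency(table, correction=True)   # Yates' correction
_, p_exact = fisher_exact(table)      # exact p-value from the hypergeometric distribution

print(f"chi square            p = {p_raw:.4f}")
print(f"corrected chi square  p = {p_corr:.4f}")
print(f"Fisher's exact        p = {p_exact:.4f}")
```

For small expected cell counts the approximate and exact p-values can differ noticeably, which is the behavior the study examined systematically.
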
open access

Classification by Neural Network and Statistical Models in Tandem: Does Integration Enhance Performance?

Description: The major purposes of the current research are twofold. The first purpose is to present a composite approach to the general classification problem by using outputs from various parametric statistical procedures and neural networks. The second purpose is to compare several parametric and neural network models on a transportation planning related classification problem and five simulated classification problems.
Date: December 1998
Creator: Mitchell, David
Partner: UNT Libraries
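
One common way to use statistical and neural network classifiers in tandem is stacking, where a second-stage model learns from the first-stage outputs. The sketch below shows that general idea with scikit-learn; the simulated data, the choice of base models, and the logistic regression combiner are assumptions for illustration, not the composite approach developed in the dissertation.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=8, random_state=0)

# First stage: a parametric classifier and a neural network.
base = [("lda", LinearDiscriminantAnalysis()),
        ("mlp", MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))]

# Second stage: logistic regression combines the base-model outputs.
stack = StackingClassifier(estimators=base, final_estimator=LogisticRegression())

print(cross_val_score(stack, X, y, cv=5).mean())
```

Cross-validated accuracy of the stacked model can then be compared against each base model fitted on its own.
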
open access

The Comparative Effects of Varying Cell Sizes on McNemar's Test with the χ² Test of Independence and t Test for Related Samples

Description: This study compared the results for McNemar's test, the t test for related measures, and the chi-square test of independence as cell sizes varied in a two-by-two frequency table. In this study, the probability results for McNemar's test, the t test for related measures, and the chi-square test of independence were compared for 13,310 different combinations of cell sizes in a two-by-two design. Several conclusions were reached: With very few exceptions, the t test for related measures and McNema… more
Date: August 1980
Creator: Black, Kenneth U.
Partner: UNT Libraries
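
For paired yes/no outcomes arranged in a 2×2 table, McNemar's test uses only the discordant cells, while the chi-square test of independence and the t test for related measures use the table differently. A small sketch of all three on one illustrative table (the counts are made up, not the study's 13,310 combinations):

```python
import numpy as np
from scipy.stats import chi2, chi2_contingency, ttest_rel

# Illustrative 2x2 table of paired yes/no outcomes:
#                 condition 2: yes   no
# condition 1: yes             a     b
#              no              c     d
a, b, c, d = 30, 12, 5, 20

# McNemar's statistic uses only the discordant cells b and c.
mcnemar_stat = (b - c) ** 2 / (b + c)
p_mcnemar = chi2.sf(mcnemar_stat, df=1)

# Chi-square test of independence on the same table (a different hypothesis).
_, p_indep, _, _ = chi2_contingency([[a, b], [c, d]], correction=False)

# t test for related measures on the underlying paired 0/1 responses.
x1 = np.repeat([1, 1, 0, 0], [a, b, c, d])   # condition 1 outcomes
x2 = np.repeat([1, 0, 1, 0], [a, b, c, d])   # condition 2 outcomes
_, p_paired_t = ttest_rel(x1, x2)

print(p_mcnemar, p_indep, p_paired_t)
```
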
open access

Comparing Latent Dirichlet Allocation and Latent Semantic Analysis as Classifiers

Description: In the Information Age, a proliferation of unstructured electronic text documents exists. Processing these documents manually is a daunting task, as humans have limited cognitive capacity for large volumes of documents that can often be extremely lengthy. To address this problem, computer algorithms for processing text data are being developed. Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) are two such algorithms that have received much attention individually… more
Date: December 2011
Creator: Anaya, Leticia H.
Partner: UNT Libraries
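
Both algorithms reduce a term-document matrix to a small number of latent topics whose weights can then be fed to a classifier. A hedged sketch with scikit-learn; the toy corpus and the two-topic setting are placeholders, not the dissertation's experimental design.

```python
from sklearn.decomposition import LatentDirichletAllocation, TruncatedSVD
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the market rallied on strong earnings",
        "quarterly earnings beat analyst forecasts",
        "the team won the championship game",
        "the coach praised the players after the game"]

# LSA: truncated SVD of a TF-IDF weighted term-document matrix.
tfidf = TfidfVectorizer().fit_transform(docs)
lsa_features = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# LDA: per-document topic mixtures estimated from raw term counts.
counts = CountVectorizer().fit_transform(docs)
lda_features = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(counts)

# Either documents-by-topics matrix can be passed to a downstream classifier.
print(lsa_features.shape, lda_features.shape)
```
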
open access

Comparing the Powers of Several Proposed Tests for Testing the Equality of the Means of Two Populations When Some Data Are Missing

Description: In comparing the means of two normally distributed populations with unknown variance, two tests very often used are the two independent sample t test and the paired sample t test. There is a possible gain in the power of the significance test by using the paired sample design instead of the two independent samples design.
Date: May 1994
Creator: Dunu, Emeka Samuel
Partner: UNT Libraries
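
The power gain mentioned above comes from the positive correlation between paired observations, which reduces the variance of the mean difference. A small simulation sketch with SciPy: paired data are generated, then analyzed both by the paired t test and by the independent-samples t test that ignores the pairing. Sample size, correlation, shift, and replication count are arbitrary choices.

```python
import numpy as np
from scipy.stats import ttest_ind, ttest_rel

rng = np.random.default_rng(0)
n, rho, shift, reps, alpha = 20, 0.7, 0.5, 2000, 0.05
rej_ind = rej_rel = 0

for _ in range(reps):
    subject = rng.normal(size=n)                     # shared subject effect induces correlation
    x = rho * subject + np.sqrt(1 - rho**2) * rng.normal(size=n)
    y = rho * subject + np.sqrt(1 - rho**2) * rng.normal(size=n) + shift
    rej_ind += ttest_ind(x, y).pvalue < alpha        # ignores the pairing
    rej_rel += ttest_rel(x, y).pvalue < alpha        # exploits the pairing

print("independent-samples power:", rej_ind / reps)
print("paired-samples power:     ", rej_rel / reps)
```
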
open access

Derivation of Probability Density Functions for the Relative Differences in the Standard and Poor's 100 Stock Index Over Various Intervals of Time

Description: In this study a two-part mixed probability density function was derived which described the relative changes in the Standard and Poor's 100 Stock Index over various intervals of time. The density function is a mixture of two different halves of normal distributions. Optimal values for the standard deviations for the two halves and the mean are given. Also, a general form of the function is given which uses linear regression models to estimate the standard deviations and the means. The density … more
Date: August 1988
Creator: Bunger, R. C. (Robert Charles)
Partner: UNT Libraries
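
A density formed from two different halves of normal distributions (a two-piece normal) can be written in closed form: each half is a normal kernel with its own spread, joined at the mode and rescaled so the whole curve integrates to one. The sketch below implements that general form; the mode and the two spreads are illustrative values, not the optimal parameters reported in the dissertation.

```python
import numpy as np

def two_piece_normal_pdf(x, m, s1, s2):
    """Density made of two half-normals joined at m:
    the left half uses spread s1, the right half uses spread s2.
    The common scale factor makes the two pieces integrate to 1 overall."""
    x = np.asarray(x, dtype=float)
    scale = 2.0 / (np.sqrt(2.0 * np.pi) * (s1 + s2))
    s = np.where(x < m, s1, s2)
    return scale * np.exp(-((x - m) ** 2) / (2.0 * s ** 2))

# Sanity check: the density should integrate to approximately 1.
grid = np.linspace(-0.2, 0.2, 40001)
dx = grid[1] - grid[0]
print(two_piece_normal_pdf(grid, m=0.001, s1=0.02, s2=0.015).sum() * dx)
```
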
open access

Developing Criteria for Extracting Principal Components and Assessing Multiple Significance Tests in Knowledge Discovery Applications

Description: With advances in computer technology, organizations are able to store large amounts of data in data warehouses. There are two fundamental issues researchers must address: the dimensionality of data and the interpretation of multiple statistical tests. The first issue addressed by this research is the determination of the number of components to retain in principal components analysis. This research establishes regression, asymptotic theory, and neural network approaches for estimating mean and … more
Date: August 1999
Creator: Keeling, Kellie Bliss
Partner: UNT Libraries
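
A common baseline for the retention question raised above is to keep the smallest number of principal components whose cumulative explained variance crosses a threshold. The sketch below shows that baseline with scikit-learn; the simulated data and the 90 percent threshold are illustrative, and the regression, asymptotic theory, and neural network criteria developed in the dissertation are not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
X[:, :3] *= 5.0                               # give three variables most of the variance

pca = PCA().fit(X)
cum_var = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(cum_var, 0.90) + 1)   # components needed for 90% of the variance

print("cumulative variance:", np.round(cum_var, 3))
print("retain", k, "components")
```
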
open access

The Development and Evaluation of a Forecasting System that Incorporates ARIMA Modeling with Autoregression and Exponential Smoothing

Description: This research was designed to develop and evaluate an automated alternative to the Box-Jenkins method of forecasting. The study involved two major phases. The first phase was the formulation of an automated ARIMA method; the second was the combination of forecasts from the automated ARIMA with forecasts from two other automated methods, the Holt-Winters method and the Stepwise Autoregressive method. The development of the automated ARIMA, based on a decision criterion suggested by Akaike, borro… more
Date: May 1985
Creator: Simmons, Laurette Poulos
Partner: UNT Libraries
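
The decision criterion suggested by Akaike (AIC) can be illustrated with a simple autoregressive order search: fit AR(p) models of increasing order and keep the one with the smallest AIC. The sketch below uses plain least squares on a simulated series rather than the full automated ARIMA procedure described above.

```python
import numpy as np

def ar_aic(y, p):
    """Fit an AR(p) model with intercept by least squares and return its AIC,
    using n * log(residual variance) + 2 * (p + 1)."""
    y = np.asarray(y, dtype=float)
    Y = y[p:]
    X = np.column_stack([y[p - k:-k] for k in range(1, p + 1)] + [np.ones(len(Y))])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return len(Y) * np.log(resid.var()) + 2 * (p + 1)

rng = np.random.default_rng(1)
y = np.zeros(400)
for t in range(2, 400):                        # simulate an AR(2) series
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

best_p = min(range(1, 7), key=lambda p: ar_aic(y, p))
print("order chosen by AIC:", best_p)          # typically 2
```
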
open access

Economic Statistical Design of Inverse Gaussian Distribution Control Charts

Description: Statistical quality control (SQC) is one technique companies are using in the development of a Total Quality Management (TQM) culture. Shewhart control charts, a widely used SQC tool, rely on an underlying normal distribution of the data. Often data are skewed. The inverse Gaussian distribution is a probability distribution that is well-suited to handling skewed data. This analysis develops models and a set of tools usable by practitioners for the constrained economic statistical design of contr… more
Date: August 1990
Creator: Grayson, James M. (James Morris)
Partner: UNT Libraries
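
Probability control limits for inverse Gaussian data can be taken as extreme quantiles of that distribution. The sketch below codes the standard inverse Gaussian density and distribution function and inverts the latter numerically; the mean, shape, and tail probability are illustrative, and none of the constrained economic statistical design appears here.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def ig_pdf(x, mu, lam):
    """Inverse Gaussian density with mean mu and shape lam."""
    return np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mu)**2 / (2 * mu**2 * x))

def ig_cdf(x, mu, lam):
    """Inverse Gaussian distribution function (standard closed form)."""
    a = np.sqrt(lam / x)
    return norm.cdf(a * (x / mu - 1)) + np.exp(2 * lam / mu) * norm.cdf(-a * (x / mu + 1))

def ig_quantile(p, mu, lam):
    """Invert the CDF numerically on a wide bracket."""
    return brentq(lambda x: ig_cdf(x, mu, lam) - p, 1e-9, 100 * mu)

mu, lam, alpha = 1.0, 4.0, 0.0027          # illustrative parameters, 3-sigma-like tail area
lcl = ig_quantile(alpha / 2, mu, lam)
ucl = ig_quantile(1 - alpha / 2, mu, lam)
print(f"LCL = {lcl:.4f}, UCL = {ucl:.4f}")
```
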
open access

The Effect of Certain Modifications to Mathematical Programming Models for the Two-Group Classification Problem

Description: This research examines certain modifications of the mathematical programming models to improve their classificatory performance. These modifications involve the inclusion of second-order terms and secondary goals in mathematical programming models. A Monte Carlo simulation study is conducted to investigate the performance of two standard parametric models and various mathematical programming models, including the MSD (minimize sum of deviations) model, the MIP (mixed integer programming) model … more
Date: May 1994
Creator: Wanarat, Pradit
Partner: UNT Libraries
open access

The Effect of Value Co-creation and Service Quality on Customer Satisfaction and Commitment in Healthcare Management

Description: Despite much interest in service quality and various other service quality measures, scholars appear to have overlooked the overall concept of quality. More specifically, previous research has yet to integrate the effect of the customer network and customer knowledge into the measurement of quality. In this work, it is posited that the evaluation of quality is based on both the delivered value from the provider and the value developed from the relationships among customers and between cu… more
Date: August 2015
Creator: Kwon, Junhyuk
Partner: UNT Libraries
open access

The Establishment of Helicopter Subsystem Design-to-Cost Estimates by Use of Parametric Cost Estimating Models

Description: The purpose of this research was to develop parametric Design-to-Cost models for selected major subsystems of certain helicopters. This was accomplished by analyzing the relationships between historical production costs and certain design parameters which are available during the preliminary design phase of the life cycle. Several potential contributions are identified in the areas of academia, government, and industry. Application of the cost models will provide estimates beneficial to the gov… more
Date: August 1979
Creator: Gilliland, Johnny J.
Partner: UNT Libraries
open access

The Evaluation and Control of the Changes in Basic Statistics Encountered in Grouped Data

Description: This dissertation describes the effect that the construction of frequency tables has on basic statistics computed from those frequency tables. It is directly applicable only to normally distributed data summarized by Sturges' Rule. The purpose of this research was to identify factors tending to bias sample statistics when data are summarized, and thus to allow researchers to avoid such bias. The methodology employed was a large scale simulation where 1000 replications of samples of size n = 2ᵏ… more
Date: May 1979
Creator: Scott, James P.
Partner: UNT Libraries
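
The grouping effect studied above is easy to reproduce: summarize a normal sample into the number of classes given by Sturges' Rule (k = 1 + log2 n) and compare statistics computed from class midpoints with those computed from the raw data. The sample size and seed below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=50, scale=10, size=128)

k = int(np.ceil(1 + np.log2(len(data))))      # Sturges' Rule: number of classes
counts, edges = np.histogram(data, bins=k)
midpoints = (edges[:-1] + edges[1:]) / 2

grouped_mean = np.sum(counts * midpoints) / counts.sum()
print("raw mean:    ", data.mean())
print("grouped mean:", grouped_mean)          # close, but not identical
```
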
open access

Financial Leverage and the Cost of Capital

Description: The objective of the research reported in this dissertation is to conduct an empirical test of the hypothesis that, excluding income tax effects, the cost of capital to a firm is independent of the degree of financial leverage employed by the firm. This hypothesis, set forth by Franco Modigliani and Merton Miller in 1958, represents a challenge to the traditional view on the subject, a challenge which carries implications of considerable importance in the field of finance. The challenge has led… more
Date: December 1977
Creator: Brust, Melvin F.
Partner: UNT Libraries
open access

The Fixed v. Variable Sampling Interval Shewhart X-Bar Control Chart in the Presence of Positively Autocorrelated Data

Description: This study uses simulation to examine differences between fixed sampling interval (FSI) and variable sampling interval (VSI) Shewhart X-bar control charts for processes that produce positively autocorrelated data. The influence of sample size (1 and 5), autocorrelation parameter, shift in process mean, and length of time between samples is investigated by comparing the average time to signal (ATS) and the average number of samples to signal (ANSS) for FSI and VSI Shewhart X-bar charts. … more
Date: May 1993
Creator: Harvey, Martha M. (Martha Mattern)
Partner: UNT Libraries
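
A stripped-down version of the fixed-interval side of that simulation: generate positively autocorrelated AR(1) observations, form samples of size 5, and record how long a Shewhart X-bar chart with conventional 3-sigma limits takes to signal after a shift in the process mean. The autocorrelation parameter, shift size, and limits below are illustrative settings, not the study's design.

```python
import numpy as np

def time_to_signal(phi=0.5, shift=1.0, n=5, h=1.0, rng=None):
    """Simulate AR(1) data with a shifted mean and return the elapsed time until
    an X-bar sample falls outside 3-sigma limits computed as if the data were iid N(0,1)."""
    if rng is None:
        rng = np.random.default_rng()
    limit = 3.0 / np.sqrt(n)                 # control limits for in-control mean 0, sigma 1
    x, elapsed = 0.0, 0.0
    while True:
        sample = []
        for _ in range(n):
            x = phi * x + rng.normal()       # AR(1) recursion
            sample.append(x + shift)         # shifted process mean
        elapsed += h                         # fixed sampling interval of length h
        if abs(np.mean(sample)) > limit:
            return elapsed

rng = np.random.default_rng(2)
ats = np.mean([time_to_signal(rng=rng) for _ in range(2000)])
print("estimated ATS:", ats)
```

A VSI version would shorten the interval h after a sample mean falls near a limit and lengthen it otherwise, which is the comparison the study makes.
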
open access

A Goal Programming Safety and Health Standards Compliance Model

Description: The purpose of this dissertation was to create a safety compliance model which would advance the state of the art of safety compliance models and provide management with a practical tool which can be used in making safety decisions in an environment where multiple objectives exist. A goal programming safety compliance model (OSHA Model) was developed to fulfill this purpose. The objective function of the OSHA Model was designed to minimize the total deviation from the established goals of the m… more
Date: August 1976
Creator: Ryan, Lanny J.
Partner: UNT Libraries
open access

A Heuristic Procedure for Specifying Parameters in Neural Network Models for Shewhart X-bar Control Chart Applications

Description: This study develops a heuristic procedure for specifying parameters for a neural network configuration (learning rate, momentum, and the number of neurons in a single hidden layer) in Shewhart X-bar control chart applications. Also, this study examines the replicability of the neural network solution when the neural network is retrained several times with different initial weights.
Date: December 1993
Creator: Nam, Kyungdoo T.
Partner: UNT Libraries
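
The three parameters targeted by the heuristic (learning rate, momentum, and the number of neurons in a single hidden layer) map directly onto arguments of a standard multilayer perceptron. The sketch below wires them up with scikit-learn on a placeholder shift-detection data set; the specific values shown are not the heuristic's recommendations.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Placeholder training data: windows of standardized X-bar values,
# labeled 1 if the window contains a mean shift and 0 otherwise.
in_control = rng.normal(size=(200, 10))
shifted = rng.normal(loc=1.5, size=(200, 10))
X = np.vstack([in_control, shifted])
y = np.array([0] * 200 + [1] * 200)

model = MLPClassifier(hidden_layer_sizes=(6,),   # neurons in the single hidden layer
                      solver="sgd",
                      learning_rate_init=0.1,    # learning rate
                      momentum=0.8,              # momentum term
                      max_iter=2000,
                      random_state=0)
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```
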
open access

The Impact of Culture on the Decision Making Process in Restaurants

Description: Understanding how consumers behave at key purchasing decision points is the margin between success and failure for any business. Cultural differences in the factors that affect consumers in their decision-making process are the motivation for this research. The purpose of this research is to extend the current body of knowledge about decision-making factors by developing and testing a new theoretical model to measure how culture may affect the attitudes and behaviors of consumers… more
Date: August 2015
Creator: Boonme, Kittipong
Partner: UNT Libraries
open access

Impact of Forecasting Method Selection and Information Sharing on Supply Chain Performance.

Description: Effective supply chain management receives much attention from industry and academia because it helps firms across a supply chain reduce costs and improve customer service levels efficiently. Focusing on one of the key challenges of supply chains, namely demand uncertainty, this dissertation extends the work of Zhao, Xie, and Leung to examine the effects of forecasting method selection coupled with information sharing on supply chain performance in a dynamic business environment. The r… more
Date: December 2009
Creator: Pan, Youqin
Partner: UNT Libraries
open access

The Impact of Quality on Customer Behavioral Intentions Based on the Consumer Decision Making Process As Applied in E-commerce

Description: Perceived quality in the context of e-commerce has been defined and examined in numerous studies, but, to date, there are no consistent definitions or measurement scales. Instruments that measure quality in e-commerce industries primarily focus on website quality or service quality during the transaction and delivery phases. Even though some scholars have proposed instruments from different perspectives, these scales do not fully evaluate the level of quality perceived by customers during the entir… more
Date: August 2012
Creator: Wen, Chao
Partner: UNT Libraries
open access

The Impact of Water Pollution Abatement Costs on Financing of Municipal Services in North Central Texas

Description: The purpose of this study is to determine the effects of water pollution control on financing municipal water pollution control facilities in selected cities in North Central Texas. This objective is accomplished by addressing the following topics: (1) the cost to municipalities of meeting federally mandated water pollution control, (2) the sources of funds for financing sewage treatment, and (3) the financial implications of employing these financing tools to satisfy water quality regulations.… more
Date: May 1976
Creator: Rucks, Andrew C.
Partner: UNT Libraries