Search Results

The Impact of Medicare on the Distribution of Public Health Care Expenditures in Oklahoma

Description: The purpose of the study is to determine what effect Medicare has had on the distribution of public health care expenditures in the state of Oklahoma. The study tests whether there is a significant correlation between medical vendor payments and indigency in Oklahoma, that is, whether pre-Medicare public health care dollars in Oklahoma were distributed to indigents.
Date: December 1972
Creator: Coffey, Vernon Eugene
Partner: UNT Libraries

An Analysis of the Alexander Cooper Report on Housing for the Central Business District of Dallas

Description: The questions arising over the deterioration of the CBD have brought this topic to the attention of the public. This thesis will discuss the Central Business District of Dallas and its decline. In order to study alternatives to these downward trends, the Central Business District Association of Dallas commissioned Alexander Cooper to make an analysis of the possibilities for one alternative, namely, housing. The purpose of this study is to examine the Alexander Cooper Report on housing. The facts presented in this thesis will provide an analytical base of urban theory from which a discussion of housing prospects will be initiated. The feasibility of downtown housing construction will be examined as it is presented in the Cooper Report.
Date: August 1976
Creator: Armstrong, Sonia V.
Partner: UNT Libraries

FORTRAN Optimizations at the Source Code Level

Description: This paper discusses FORTRAN optimizations that the user can perform manually at the source code level to improve object code performance. It makes use of descriptive examples within the text of the paper for explanatory purposes. The paper defines key areas in writing a FORTRAN program and recommends ways to improve efficiency in these areas.
Date: August 1977
Creator: Barber, Willie D.
Partner: UNT Libraries
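
As a rough, hypothetical illustration of the kind of source-level rewrite the paper above discusses, the sketch below hoists a loop-invariant computation out of a loop. It is written in Python rather than FORTRAN purely for illustration, and the function and variable names are invented; the paper's own recommendations are not reproduced here.

    # Hypothetical illustration of a source-level optimization: hoisting a
    # loop-invariant computation out of the loop body. The same idea applies
    # to a FORTRAN DO loop; names here are invented for the example.
    import math

    def scale_naive(values, base, exponent):
        # The factor math.pow(base, exponent) is recomputed for every element.
        return [v * math.pow(base, exponent) for v in values]

    def scale_hoisted(values, base, exponent):
        # Computing the invariant factor once does the same work with fewer
        # repeated operations, which is the effect source-level tuning aims for.
        factor = math.pow(base, exponent)
        return [v * factor for v in values]

    print(scale_naive([1.0, 2.0, 3.0], 2.0, 10))
    print(scale_hoisted([1.0, 2.0, 3.0], 2.0, 10))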

Michal Kalecki: A Dynamic Analysis of Capitalism

Description: Michal Kalecki was not a mere precursor of Keynes, but a contemporary whose analysis provides insights into the nature of capitalism. His contribution to the understanding of the capitalist economy is central to this four chapter thesis. Chapter one develops a biographical sketch of Kalecki. Chapter two examines the components of his General Theory. Chapter three considers the differences between Kalecki and Keynes. Kalecki's contributions to the Keynesian revolution are presented along with the hopelessness he foresaw in incorporating any basic reforms into a capitalist economy. The final chapter looks to the present fruit of Kalecki's dynamic analysis--Post-Keynesian economics. The Post-Keynesian synthesis reflects the Kaleckian framework and the Keynesian optimism out of which policy may arise to affect the structural problems plaguing capitalism today.
Date: August 1980
Creator: Groves, Miles Edmund
Partner: UNT Libraries

A Comparison of File Organization Techniques

Description: This thesis compares the file organization techniques that are implemented on two different types of computer systems, the large-scale and the small-scale. File organizations from representative computers in each class are examined in detail: the IBM System/370 (OS/370) and the Harris 1600 Distributed Processing System with the Extended Communications Operating System (ECOS). In order to establish the basic framework for comparison, an introduction to file organizations is presented. Additionally, the functional requirements for file organizations are described by their characteristics and user demands. Concluding remarks compare file organization techniques and discuss likely future developments of file systems.
Date: August 1977
Creator: Rogers, Roy Lee
Partner: UNT Libraries

The Comparative Effects of Varying Cell Sizes on McNemar's Test with the χ² Test of Independence and t Test for Related Samples

Description: This study compared the results for McNemar's test, the t test for related measures, and the chi-square test of independence as cell sizes varied in a two-by-two frequency table. In this study, the probability results for McNemar's test, the t test for related measures, and the chi-square test of independence were compared for 13,310 different combinations of cell sizes in a two-by-two design. Several conclusions were reached: With very few exceptions, the t test for related measures and McNemar's test yielded probability results within .002 of each other. The chi-square test seemed to equal the other two tests consistently only when low probabilities less than or equal to .001 were attained. It is recommended that the researcher consider using the t test for related measures as a viable option for McNemar's test except when the researcher is certain he/she is only interested in 'changes'. The chi-square test of independence not only tests a different hypothesis than McNemar's test, but it often yields greatly differing results from McNemar's test.
Date: August 1980
Creator: Black, Kenneth U.
Partner: UNT Libraries
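
To make the three-way comparison described above concrete, here is a minimal sketch (not the author's code; the 2x2 counts are invented) that computes McNemar's test, the chi-square test of independence, and the t test for related measures on the same two-by-two table:

    # Hypothetical 2x2 table of paired yes/no outcomes (counts are invented):
    # rows = result under condition 1, columns = result under condition 2.
    import numpy as np
    from scipy import stats

    a, b, c, d = 30, 12, 5, 40   # (yes,yes), (yes,no), (no,yes), (no,no)

    # McNemar's test uses only the discordant cells b and c.
    mcnemar_chi2 = (b - c) ** 2 / (b + c)
    mcnemar_p = stats.chi2.sf(mcnemar_chi2, df=1)

    # The chi-square test of independence tests a different hypothesis,
    # as the study notes: independence of the row and column classifications.
    chi2, chi2_p, dof, expected = stats.chi2_contingency([[a, b], [c, d]])

    # t test for related measures, applied to the underlying paired 0/1 scores.
    x = np.array([1] * (a + b) + [0] * (c + d))            # condition 1
    y = np.array([1] * a + [0] * b + [1] * c + [0] * d)    # condition 2
    t_stat, t_p = stats.ttest_rel(x, y)

    print(f"McNemar p = {mcnemar_p:.4f}, chi-square p = {chi2_p:.4f}, paired t p = {t_p:.4f}")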

The Establishment of Helicopter Subsystem Design-to-Cost Estimates by Use of Parametric Cost Estimating Models

Description: The purpose of this research was to develop parametric Design-to-Cost models for selected major subsystems of certain helicopters. This was accomplished by analyzing the relationships between historical production costs and certain design parameters which are available during the preliminary design phase of the life cycle. Several potential contributions are identified in the areas of academia, government, and industry. Application of the cost models will provide estimates beneficial to the government and DoD by allowing derivation of realistic Design-to-Cost estimates. In addition, companies in the helicopter industry will benefit by using the models for two key purposes: (1) optimizing helicopter design through cost-effective tradeoffs, and (2) justifying a proposal estimate.
Date: August 1979
Creator: Gilliland, Johnny J.
Partner: UNT Libraries
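
The parametric approach described above rests on fitting cost estimating relationships (CERs) between historical cost and design parameters. A minimal sketch of one common form, a log-linear CER fit by ordinary least squares, is shown below; the data points, units, and parameter choice are invented for illustration and are not taken from the thesis.

    # Hypothetical log-linear cost estimating relationship: cost = a * weight^b.
    # The weight/cost pairs are invented; a real model would use the historical
    # production costs and preliminary-design parameters the study describes.
    import numpy as np

    weight = np.array([1200.0, 1800.0, 2500.0, 3300.0, 4100.0])   # subsystem weight (lb), invented
    cost = np.array([0.9, 1.3, 1.9, 2.4, 3.0])                    # production cost ($M), invented

    # Fit log(cost) = log(a) + b * log(weight) by least squares.
    b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
    a = np.exp(log_a)

    def estimate_cost(w):
        # Design-to-Cost style point estimate for a proposed subsystem weight w.
        return a * w ** b

    print(f"cost ~= {a:.4f} * weight^{b:.3f}; estimate at 2000 lb: {estimate_cost(2000.0):.2f}")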

An Examination of the Feasibility of Measuring National Income from Monetary Data

Description: The purpose of the paper is to explore, more fully, one particular aspect of economic accounting, measurement of national income. Since data problems often inhibit attempts to measure national income by conventional methods, particularly in less developed regions, the paper focuses attention on alternative techniques of measurement with major emphasis on procedures employing monetary data.
Date: August 1972
Creator: Repass, William F.
Partner: UNT Libraries

Conceptual Foundation for Human Resource Accounting

Description: With the current strain on the world's material resources and the increase in their cost, a constant pressure is building to increase the productivity of human resources. Adding to the strain is the increasing demand of society for a higher quality of life through more meaningful work. Responding to both of these pressures requires decisions that simultaneously meet the goals of organizations and the needs and values of employees. To make the kind of decisions demanded by this dual priority of human effectiveness and improved quality of life, information is needed to: 1. Improve understanding of the nature and scope of human resource expenditures; 2. Improve selection, retention, and motivation of employees; 3. Allocate money spent on human resources; 4. Overhaul the approach to communication among managers, between managers and other employees, and between the organization as a whole and outside parties; 5. Expand the scope of internal and external reports to deal with social as well as financial accomplishments. The ultimate objective of this research is to develop a human resource model and a heuristic for developing empirical support which can be useful to businessmen seeking to increase human effectiveness and improve the quality of life. The model merges several previously unrelated theories dealing with human resources and in the process contributes some new concepts.
Date: May 1974
Creator: Flowers, Vincent S.
Partner: UNT Libraries

Mexican Americans: An Economically Significant Ethnic Market Segment

Description: The area of ethnic market segmentation has received little attention from practitioners or academicians of marketing since most minority groups immigrating to the United States have gradually assimilated the cultural norms and values, and thus the market behavior, of the American society as a whole. Preliminary investigation, however, indicates that Mexican Americans are an exception. To discover whether Mexican Americans represent a true ethnic market segment of economic significance, this study examines and analyzes several aspects. First, to determine whether Mexican Americans represent a true ethnic segment, the following aspects of their cultural norms, perceptions, and values are investigated: their distinct and unique identity, the continuity and consistency of their adoption and use, and the degree of their influence. Second, to determine whether Mexican Americans constitute an ethnic market segment, grocery shopping behavioral patterns are examined. Third, to ascertain whether Mexican Americans represent a substantial ethnic market segment in terms of the number of consumers and the amount of money spent, relevant demographic and socioeconomic characteristics are presented and analyzed. Fourth, the impact of an economically significant ethnic market segment on marketers and marketing institutions is assessed. Due to the nature of this study, emphasis is placed on the collection of primary data, which has been obtained through personal interviews with 115 consumer respondents and eighteen grocery store owners and managers. Secondary data, gathered from reports of the Bureau of the Census, various periodicals, journals, and books, are used to establish cultural, demographic, and socioeconomic trends.
Date: December 1972
Creator: Ferguson, Richard Wayne, 1934-
Partner: UNT Libraries

A Comparison of Traditional Norming and Rasch Quick Norming Methods

Description: The simplicity and ease of use of the Rasch procedure are decided advantages. The test user needs only two numbers: the frequency of persons who answered each item correctly and the Rasch-calibrated item difficulty, usually a part of an existing item bank. Norms can be computed quickly for any specific group of interest. In addition, once the selected items from the calibrated bank are normed, any test built from the item bank is automatically norm-referenced. Thus, it was concluded that the Rasch quick norm procedure is a meaningful alternative to traditional classical true score norming for test users who desire normative data.
Date: August 1993
Creator: Bush, Joan Spooner
Partner: UNT Libraries
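
As a hedged sketch of how a Rasch-based quick norm can proceed (a generic illustration of the Rasch model, not the thesis's exact procedure; the item difficulties and raw score are invented), the code below converts a number-correct score into an ability estimate using calibrated item difficulties; percentile norms for a group of interest could then be built from such estimates.

    # Generic Rasch model: P(correct | theta, b_i) = 1 / (1 + exp(-(theta - b_i))).
    # Given item difficulties from a calibrated item bank, the ability estimate for
    # a raw number-correct score is the theta whose expected score equals that score.
    # Item difficulties and the raw score below are invented for illustration.
    import math

    item_difficulties = [-1.2, -0.5, 0.0, 0.4, 0.9, 1.5]   # Rasch-calibrated logits

    def expected_score(theta, difficulties):
        return sum(1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties)

    def theta_for_raw_score(raw, difficulties, lo=-6.0, hi=6.0, tol=1e-6):
        # Bisection works because expected_score is increasing in theta.
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if expected_score(mid, difficulties) < raw:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    print(theta_for_raw_score(4, item_difficulties))   # ability estimate for 4 of 6 items correct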

Antecedents of Power in the Distribution Channel : A Transaction-cost Perspective

Description: A discussion of reward, coercive, expert, legitimate, and referent power bases was the initial focus of this research. A review of the power sources literature suggested that vertical integration within a channel of distribution was a crucial precursor to developing a structure that facilitates the use of power without creating significant conflict among channel participants. Elements of transaction cost analysis (TCA) were offered as being suitable for determining the existing level of vertical integration among respondent firms. Accordingly, the purpose of this study was to develop a tentative model to determine proper use of power within varying levels of vertical integration.
Date: August 1991
Creator: Erdem, S. Altan (Selim Altan)
Partner: UNT Libraries

The Industrial Representative's Perception of the Impact of Managerial Control Systems on Performance

Description: The objective of this study was to examine whether the factors which constitute the manufacturer/industrial-representative relationship influence performance as predicted by control theory. In addition, the study evaluated the contribution of selected demographic factors, such as the size of the firm and the representative's experience, to performance.
Date: August 1995
Creator: Dunipace, Richard A. (Richard Alan)
Partner: UNT Libraries

A Comparison of Three Criteria Employed in the Selection of Regression Models Using Simulated and Real Data

Description: Researchers who make predictions from educational data are interested in choosing the best regression model possible. Many criteria have been devised for choosing a full or restricted model, and also for selecting the best subset from an all-possible-subsets regression. The relative practical usefulness of three of the criteria used in selecting a regression model was compared in this study: (a) Mallows' C_p, (b) Amemiya's prediction criterion, and (c) Hagerty and Srinivasan's method involving predictive power. Target correlation matrices with 10,000 cases were simulated so that the matrices had varying degrees of effect sizes. The amount of power for each matrix was calculated after one or two predictors were dropped from the full regression model, for sample sizes ranging from n = 25 to n = 150. Also, the null case, when one predictor was uncorrelated with the other predictors, was considered. In addition, comparisons for regression models selected using C_p and prediction criterion were performed using data from the National Educational Longitudinal Study of 1988.
Date: December 1994
Creator: Graham, D. Scott
Partner: UNT Libraries
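
For reference, two of the three criteria compared in the study above have standard textbook forms, which may differ in detail from the exact expressions used in the thesis. With SSE_p the error sum of squares of a candidate model with p parameters, n the sample size, and MSE_full the mean squared error of the full model, Mallows' C_p and Amemiya's prediction criterion (PC) are commonly written as:

    C_p = \frac{\mathrm{SSE}_p}{\mathrm{MSE}_{\mathrm{full}}} - (n - 2p),
    \qquad
    \mathrm{PC} = \frac{\mathrm{SSE}_p \,(n + p)}{n\,(n - p)}

Lower values of PC favor a candidate subset, while C_p is typically judged against p (subsets with C_p near or below p are preferred); Hagerty and Srinivasan's predictive-power method is not reduced to a single formula here.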

The Effect of Psychometric Parallelism among Predictors on the Efficiency of Equal Weights and Least Squares Weights in Multiple Regression

Description: There are several conditions for applying equal weights as an alternative to least squares weights. Psychometric parallelism, one of the conditions, has been suggested as a necessary and sufficient condition for equal-weights aggregation. The purpose of this study is to investigate the effect of psychometric parallelism among predictors on the efficiency of equal weights and least squares weights. Target correlation matrices with 10,000 cases were simulated so that the matrices had varying degrees of psychometric parallelism. Five hundred samples were drawn from each population at each of six observation-to-predictor ratios: 5/1, 10/1, 20/1, 30/1, 40/1, and 50/1. The efficiency is interpreted as the accuracy and the predictive power estimated by the weighting methods. The accuracy is defined by the deviation between the population R² and the sample R². The predictive power is referred to as the population cross-validated R² and the population mean square error of prediction. The findings indicate there is no statistically significant relationship between the level of psychometric parallelism and the accuracy of least squares weights. In contrast, the correlation between the level of psychometric parallelism and the accuracy of equal weights is significantly negative. Under different conditions, the minimum p value of χ² for testing psychometric parallelism among predictors is also different in order to prove equal weights more powerful than least squares weights. The higher the number of predictors is, the higher the minimum p value. The higher the ratio of observation to predictor is, the higher the minimum p value. The higher the magnitude of intercorrelations among predictors is, the lower the minimum p value. This study demonstrates that the most frequently used levels of significance, 0.05 and 0.01, are no longer the only p values for testing the null hypotheses of psychometric parallelism among predictors when replacing least squares weights ...
Date: May 1996
Creator: Zhang, Desheng
Partner: UNT Libraries
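
A minimal simulation sketch of the comparison described above, under an invented compound-symmetric correlation structure rather than the study's target matrices: least squares weights are estimated in a sample, equal weights are applied to sample-standardized predictors, and each set of weights is evaluated against the population criterion.

    # Hypothetical comparison of least squares weights vs. equal (unit) weights.
    # The population correlation structure is invented; the study used simulated
    # target matrices with varying degrees of psychometric parallelism.
    import numpy as np

    rng = np.random.default_rng(0)
    n_pop, n_sample, k = 10000, 60, 4                      # population, sample, predictors

    # Population with equally intercorrelated predictors and criterion (r = .3).
    cov = np.full((k + 1, k + 1), 0.3) + np.diag([0.7] * (k + 1))
    pop = rng.multivariate_normal(np.zeros(k + 1), cov, size=n_pop)
    X_pop, y_pop = pop[:, :k], pop[:, k]

    idx = rng.choice(n_pop, size=n_sample, replace=False)
    X, y = X_pop[idx], y_pop[idx]

    def r2(y_true, y_hat):
        return np.corrcoef(y_true, y_hat)[0, 1] ** 2

    # Least squares weights estimated in the sample, applied to the population.
    beta = np.linalg.lstsq(np.column_stack([np.ones(n_sample), X]), y, rcond=None)[0]
    y_hat_ls = np.column_stack([np.ones(n_pop), X_pop]) @ beta

    # Equal weights: unit-weighted sum of predictors standardized on the sample.
    z_pop = (X_pop - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    y_hat_eq = z_pop.sum(axis=1)

    print(f"population R^2, least squares weights: {r2(y_pop, y_hat_ls):.3f}")
    print(f"population R^2, equal weights:         {r2(y_pop, y_hat_eq):.3f}")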

Derivation of Probability Density Functions for the Relative Differences in the Standard and Poor's 100 Stock Index Over Various Intervals of Time

Description: In this study a two-part mixed probability density function was derived which described the relative changes in the Standard and Poor's 100 Stock Index over various intervals of time. The density function is a mixture of two different halves of normal distributions. Optimal values for the standard deviations for the two halves and the mean are given. Also, a general form of the function is given which uses linear regression models to estimate the standard deviations and the means. The density functions allow stock market participants trading index options and futures contracts on the S & P 100 Stock Index to determine probabilities of success or failure of trades involving price movements of certain magnitudes in given lengths of time.
Date: August 1988
Creator: Bunger, R. C. (Robert Charles)
Partner: UNT Libraries
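
A standard way to write a density of the kind described above (two half-normal pieces joined at a common center μ, with separate left and right standard deviations σ₁ and σ₂) is the two-piece normal below; this is the generic textbook form, not necessarily the exact parameterization or the fitted values reported in the study:

    f(x) = \frac{2}{\sqrt{2\pi}\,(\sigma_1 + \sigma_2)} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma_1^2}\right) \quad \text{for } x \le \mu,
    \qquad
    f(x) = \frac{2}{\sqrt{2\pi}\,(\sigma_1 + \sigma_2)} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma_2^2}\right) \quad \text{for } x > \mu.

The shared normalizing constant makes the two halves integrate to one while allowing the left and right spreads to differ, which matches the asymmetry in index changes that the study models.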

A Study of Factors Influencing Plant Location Decisions in Texas as Viewed by Texas Community Leaders and Out-of-State Manufacturing Executives

Description: This dissertation has two major sections. The first section focuses on analyzing objective data gathered from public sources to investigate factors influencing industrial location to Texas. Areas of investigation include (1) where Texas stands--on economic, demographic, sociologic, climatic, and technological terms--in relation to the remaining forty-seven contiguous states; (2) what are the locational characteristics of Texas compared to other states; and (3) what types of industry move to Texas and from where. Regional and state comparisons are also made in terms of factors that can influence business success. The second section is concerned with analyzing survey data gathered from three test groups. The three groups are (1) civic interest groups consisting of Texas mayors, city managers, and chamber of commerce executives; (2) manufacturing executives who have located a new plant in Texas from outside the state since 1978; and (3) out-of-state manufacturing executives who have considered Texas as a possible location but decided not to locate within the state during the period 1978-1983. The major purposes of this section are to determine (1) whether manufacturing executives and Texas community leaders possess different views concerning the relative importance of location factors and factors that are specifically advantageous to the state of Texas, (2) what factors motivate out-of-state manufacturers to select Texas as a location for their plant, and (3) what factors they see as disadvantages. A comparison is made between the findings of the survey data and the objective data. A variety of nonparametric statistical tests are used in testing the hypotheses.
Date: December 1984
Creator: Mekhaimer, Abdelaziz G. (Abdelaziz Gamil)
Partner: UNT Libraries

Impact of Tax Complexity on Taxpayer Understanding

Description: The purpose of this study is to determine the effect tax complexity has on taxpayers' understanding of the tax law. The individual income tax system in the United States is based on self assessment by the taxpayer. A self assessing system requires a high level of voluntary compliance by the participants. Taxpayers who file returns on time and file correctly are considered to be in compliance with the tax law. A taxpayer who cannot understand the rules for tax reporting logically does not have the ability to comply with the law. A tax system predicated on the presumption that the average taxpayer can understand and comply with the tax rules must not become so complex that the taxpayer is forced into either seeking external help or not fully complying. The question arises, does complexity affect the ability of taxpayers to understand, and thus comply with, the tax system?
Date: December 1989
Creator: Martindale, Bobbie Cook
Partner: UNT Libraries

Alternative Social Security Taxing Schemes: an Analysis of Vertical and Horizontal Equity in the Federal Tax System

Description: The objectives of this study were twofold. One objective was to analyze the effects of growth in the social security tax, when combined with recent changes in U.S. income tax law, on the distribution of the combined income and social security tax burden during the 1980s. The second objective was to estimate the effects of certain proposals for social security tax reform upon that distribution. The above analyses were performed using simulation techniques applied to the 1984 IRS Individual Tax Model File. The data from this file were used to estimate the income and social security tax liabilities for sample taxpayers under tax law in effect in 1980, 1984 and 1988 and under fourteen proposals for social security reform (under 1988 law). The results indicated that the income tax distribution was almost 25 percent more progressive under 1988 tax law than under 1980 tax law. In contrast, the combined distribution of income and social security taxes was almost 25 percent less progressive under 1988 income and social security tax law relative to 1980. Two types of social security tax reform were analyzed. One type consisted of reforms to the basic social security tax structure, such as removal of the earnings ceiling, provision of exemptions and replacement of the current single tax rate with a two-tiered graduated rate structure. The second type of reform consisted of proposals to expand the theoretical tax base subject to the social security levy. The results suggested that these reforms could generate substantial increases in progressivity in the combined tax distribution. In general, it would appear that changes in the social security tax structure could generate greater improvements in progressivity than expansion of the theoretical tax base, although the greatest improvement was associated with a combination of these two reforms. With regard to horizontal equity, expansion ...
Date: December 1988
Creator: Ricketts, Robert C. (Robert Carlton)
Partner: UNT Libraries

An Exploratory Field Study of Adolescent Consumer Behavior: The Family Purchasing Agent

Description: An exploratory field study was conducted to examine internal and external factors that influence adolescents' consumer behavior when serving as the family purchasing agents. Demographic, lifestyle, and marketing activities were examined to determine the influences that affect whether the adolescent will purchase the preferred family brands or other brands. Participating adolescents were sent by their parents to the grocery store on two separate occasions to purchase four preselected grocery items. The brands purchased were recorded and compared to the preferred brand names provided by the parents. While no statistical significance was found, occasional trends were observed. The analysis indicated that adolescents who experience a pluralistic family communication style will purchase products other than the preferred household brands. Adolescents who are exposed to television and radio tend to deviate from the preferred family brands more often than do adolescents with less media exposure. Adolescents who work go to the grocery store for their families more often than do nonworking adolescents. Also, adolescents seem to possess a price sensitivity to both high- and low-involvement grocery items.
Date: August 1989
Creator: Hardy, Jane P.
Partner: UNT Libraries

Foreign Exchange Risk Management in U.S. Multinationals Under SFAS no. 52: Change in Management Decision Making in Response to Accounting Policy Change

Description: SFAS No. 52, Foreign Currency Translation, was issued in December 1981, replacing SFAS No. 8, Accounting for the Translation of Foreign Currency Transactions and Foreign Currency Financial Statements. SFAS No. 52 has shifted the impact of translation gains and losses from the income statement to the balance sheet. It was expected that SFAS No. 52 would eliminate the incentive for multinationals to engage in various hedging activities to reduce the effect of the translation process on reported earnings. It was also expected that multinationals would change their foreign exchange risk management practices. The major purpose of this study was to investigate the effect of SFAS No. 52 on foreign exchange risk management practices of U.S.-based multinationals.
Date: August 1986
Creator: El-Refadi, Idris Abdulsalam
Partner: UNT Libraries

An Empirical Investigation of the Discriminant and Predictive Ability of the SFAS No. 69 Signals for Business Failure in the Oil and Gas Industry

Description: In 1982, the Financial Accounting Standards Board (FASB) issued Statement of Financial Accounting Standards No. 69 (SFAS No. 69), which required oil and gas producing companies to disclose supplementary information to the basic financial statements. These disclosures include costs incurred, capitalized costs, reserve quantities, and a standardized measure of discounted cash flows. The FASB considered these disclosures to be necessary to compensate for the deficiencies in historical cost financial statements. The usefulness of the new signals created by SFAS No. 69, however, is an empirical question, and research regarding that objective is lacking. The objective of the study is to test the usefulness of SFAS No. 69. The research strategy used to achieve that objective is to compare the discriminant and predictive power of SFAS No. 69 signals, or SFAS No. 69 signals combined with financial signals, to that of financial signals alone. The research hypothesized that SFAS No. 69 signals by themselves or as supplementary to financial signals have more discriminant and predictive ability for business failure in the oil and gas industry than do financial signals alone. In order to test that hypothesis, the study used the multiple discriminant analysis technique (MDA) to develop three equations. The first is based on SFAS No. 69 signals, the second on financial statement signals, and the third on joint financial and SFAS No. 69 signals. Data were collected from the 10-K's and the annual reports of 28 oil and gas companies (14 failed and 14 nonfailed). The analysis was repeated for four time bases: one year before failure, two years before failure, three years before failure, and the average of the three years immediately before failure. After assessing the discriminant and predictive ability of each equation in the four time bases, a t-test was used to determine a significant difference in the discriminant ...
Date: December 1985
Creator: Eldahrawy, Kamal
Partner: UNT Libraries
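
A hedged sketch of the multiple discriminant analysis step described above, using scikit-learn's linear discriminant analysis as a stand-in for the MDA procedure; the signal matrix and failure labels below are invented placeholders, not SFAS No. 69 or 10-K data.

    # Hypothetical MDA-style classification of failed vs. nonfailed firms.
    # Rows are firms, columns are signals (e.g. financial ratios and/or
    # SFAS No. 69 measures); the numbers are invented placeholders.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    failed = rng.normal(loc=-0.5, scale=1.0, size=(14, 4))      # 14 failed firms, 4 signals
    nonfailed = rng.normal(loc=0.5, scale=1.0, size=(14, 4))    # 14 nonfailed firms, 4 signals
    X = np.vstack([failed, nonfailed])
    y = np.array([1] * 14 + [0] * 14)                           # 1 = failed

    model = LinearDiscriminantAnalysis().fit(X, y)

    # In-sample classification accuracy: how well the discriminant function
    # separates the two groups on the estimation sample.
    print(f"classification accuracy: {model.score(X, y):.2f}")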

The Applicability of Conjoint Measurement to the Selection Process of Professional Sales Personnel

Description: The study examines the potential of conjoint analysis to provide and apply quantitative data to situations previously limited to non-quantitative analysis within the selection process. Chapter I presents a brief introduction to the sales force selection process. A discussion of the importance of effective selection to the organization as well as an explanation of the objectives, methodology, research questions, and limitations complete the chapter. Chapter II provides a detailed description of the contemporary sales force selection process. The chapter explains the objective and subjective activities and techniques utilized by management in selection decisions. Chapter III describes the steps involved in conjoint analysis and the specific conjoint measurement technique employed in the study. The questionnaire employed and the source of data are described in Chapter IV. An analysis of the results of the research completes the chapter. Chapter V presents the summary, conclusions, and recommendations of the study.
Date: August 1984
Creator: Light, C. David
Partner: UNT Libraries

The Normal Curve Approximation to the Hypergeometric Probability Distribution

Description: The classical normal curve approximation to cumulative hypergeometric probabilities requires that the standard deviation of the hypergeometric distribution be larger than three, which limits the usefulness of the approximation for small populations. The purposes of this study are to develop clearly defined rules which specify when the normal curve approximation to the cumulative hypergeometric probability distribution may be successfully utilized and to determine where maximum absolute differences between the cumulative hypergeometric and normal curve approximation of 0.01 and 0.05 occur in relation to the proportion of the population sampled.
Date: December 1981
Creator: Willman, Edward N. (Edward Nicholas)
Partner: UNT Libraries
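
The comparison underlying the study above can be sketched directly: for a hypergeometric distribution with population size N, K successes in the population, and n draws, the approximating normal has mean nK/N and variance n(K/N)(1 - K/N)(N - n)/(N - 1), and the maximum absolute difference between the two cumulative distributions can be computed over all possible counts. The population values below are invented; this illustrates the comparison, not the study's rule-finding procedure.

    # Compare the cumulative hypergeometric distribution with its normal curve
    # approximation (with continuity correction). N, K, n below are invented.
    import numpy as np
    from scipy.stats import hypergeom, norm

    N, K, n = 200, 60, 25          # population size, successes in population, draws
    mu = n * K / N
    sigma = np.sqrt(n * (K / N) * (1 - K / N) * (N - n) / (N - 1))

    k = np.arange(0, n + 1)
    exact = hypergeom(N, K, n).cdf(k)            # scipy argument order: (M, n, N) = (N, K, n) here
    approx = norm.cdf((k + 0.5 - mu) / sigma)    # continuity-corrected normal approximation

    print(f"hypergeometric standard deviation = {sigma:.3f}")
    print(f"max |exact - approx| = {np.max(np.abs(exact - approx)):.4f}")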