40 Matching Results

Search Results

FORTRAN Optimizations at the Source Code Level

Description: This paper discusses FORTRAN optimizations that the user can perform manually at the source code level to improve object code performance. It makes use of descriptive examples within the text of the paper for explanatory purposes. The paper defines key areas in writing a FORTRAN program and recommends ways to improve efficiency in these areas.
Date: August 1977
Creator: Barber, Willie D.
Partner: UNT Libraries
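
The abstract above names no specific techniques, so as a purely illustrative aside, here is one classic source-level rewrite of the kind such papers discuss, loop-invariant code motion, shown in Python rather than FORTRAN for brevity; it is not taken from the paper itself.

```python
# Illustrative only: loop-invariant code motion, one common source-level
# optimization; the specific techniques the paper covers are not listed
# in its abstract, and this example is not drawn from it.

def before(values, a, b):
    # The subexpression a * b does not depend on the loop variable,
    # yet it is recomputed on every iteration.
    total = 0.0
    for v in values:
        total += v * (a * b)
    return total

def after(values, a, b):
    # Hoisting the invariant subexpression out of the loop removes
    # the redundant multiplications without changing the result.
    scale = a * b
    total = 0.0
    for v in values:
        total += v * scale
    return total

if __name__ == "__main__":
    data = [1.0, 2.0, 3.0]
    assert before(data, 2.0, 5.0) == after(data, 2.0, 5.0)
```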

Execution Time Analysis through Software Monitors

Description: The analysis of an executing program and the isolation of critical code has been a problem since the first program was written. This thesis examines the process of program analysis through the use of a software monitoring system. Since there is a trend toward structured languages, a subset of PL/I was developed to exhibit source statement monitoring and costing techniques. By filtering a PL/I program through a preprocessor which determines the cost of source statements and inserts monitoring code, a post-execution analysis of the program can be obtained. This analysis displays an estimated time cost for each source statement, the number of times the statement was executed, and the product of these values. Additionally, a bar graph is printed in order to quickly locate very active code.
Date: December 1977
Creator: Whistler, Wayne C.
Partner: UNT Libraries
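
A minimal sketch, not the thesis's PL/I preprocessor, of the kind of post-execution report it describes: an estimated cost per source line, the line's execution count, their product, and a bar graph to flag active code. The unit cost, the traced sample function, and the use of Python's tracing hook are all assumptions made for illustration.

```python
# Sketch only: per-line execution counts gathered with sys.settrace, then
# combined with an assumed unit cost per statement and printed as a bar graph,
# mimicking the style of report the abstract describes.
import sys
from collections import Counter

counts = Counter()

def tracer(frame, event, arg):
    if event == "line":
        counts[frame.f_lineno] += 1
    return tracer

def sample(n):
    total = 0            # executed once
    for i in range(n):   # loop header, executed n + 1 times
        total += i * i   # hot statement
    return total

sys.settrace(tracer)
sample(50)
sys.settrace(None)

UNIT_COST = 1  # assumed per-statement cost estimate
print(f"{'line':>4} {'cost':>4} {'count':>6} {'total':>6}  activity")
for lineno in sorted(counts):
    count = counts[lineno]
    total = UNIT_COST * count
    print(f"{lineno:>4} {UNIT_COST:>4} {count:>6} {total:>6}  {'#' * min(total, 40)}")
```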

The Applications of Regression Analysis in Auditing and Computer Systems

Description: This thesis describes regression analysis and shows how it can be used in account auditing and in computer system performance analysis. The study first introduces regression analysis techniques and statistics. Then, the use of regression analysis in auditing to detect "out of line" accounts and to determine audit sample size is discussed. These applications led to the concept of using regression analysis to predict job completion times in a computer system. The feasibility of this application of regression analysis was tested by constructing a predictive model to estimate job completion times using a computer system simulator. The predictive model's performance for the various job streams simulated shows that job completion time prediction is a feasible application for regression analysis.
Date: May 1977
Creator: Hubbard, Larry D.
Partner: UNT Libraries
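
As a rough illustration of the predictive model described above (not the study's simulator or its actual variables), a one-predictor least-squares fit estimating job completion time from an assumed workload measure:

```python
# A minimal sketch: fit a one-variable least-squares line predicting job
# completion time from an assumed workload measure, then predict a new job.
# The observations below are hypothetical, not data from the thesis.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical observations: (workload measure, observed completion time).
workload = [10, 20, 30, 40, 50]
completion = [12, 21, 33, 41, 52]

b0, b1 = fit_line(workload, completion)
print(f"predicted completion time for workload 35: {b0 + b1 * 35:.1f}")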

A General Economic Study of Patterns of Government Expenditures in Thailand

Description: An analysis of Thai government expenditures demonstrates that they expanded rapidly between 1900 and 1969, due primarily to rising prices and the extension of government functions, particularly during the post-war period. In contrast, the war effect had little influence on the growth of expenditures. During the period under study, Thai government expenditures were devoted largely to general, social, and economic services, with emphasis on transportation and communication, defense, agriculture, and education. Current expenditures (for defense, education, etc.) represented a higher percentage of total government expenditures than did capital expenditures (for public construction, social services, etc.). In general, the case of Thailand indicates that levels of government expenditure were higher in conjunction with greater emphasis on economic and social development.
Date: August 1984
Creator: Chandraprasert, Poch
Partner: UNT Libraries

A Proposed Technical Communication Degree Program for Texas Colleges and Universities

Description: This investigation is concerned with the problem of Texas employers' inability to hire adequately trained technical communication personnel because Texas universities and colleges do not offer a bachelor's degree program for that career field. This study contains the results of five separate surveys that investigate the backgrounds and training of present technical communication personnel and the training desired by supervisory personnel. The study also recommends a bachelor's degree program in technical communication with three technological specialties: electronics, mechanical, and chemical/petroleum. Anticipated problems in setting up such a degree program and possible solutions to the problems are discussed in the study. The suggested freshman and sophomore curriculum could be used as a guideline for a junior college associate program.
Date: May 1978
Creator: Walker, Ronald O.
Partner: UNT Libraries

A Comparison of File Organization Techniques

Description: This thesis compares the file organization techniques that are implemented on two different types of computer systems, the large-scale and the small-scale. File organizations from representative computers in each class are examined in detail: the IBM System/370 (OS/370) and the Harris 1600 Distributed Processing System with the Extended Communications Operating System (ECOS). In order to establish the basic framework for comparison, an introduction to file organizations is presented. Additionally, the functional requirements for file organizations are described by their characteristics and user demands. Concluding remarks compare file organization techniques and discuss likely future developments of file systems.
Date: August 1977
Creator: Rogers, Roy Lee
Partner: UNT Libraries

Learned Helplessness: The Result of the Uncontrollability of Reinforcement or the Result of the Uncontrollability of Aversive Stimuli?

Description: This research demonstrates that experience with uncontrollable reinforcement, here defined as continuous non-contingent positive feedback to solution attempts of insoluble problems, fails to produce the proactive interference phenomenon, learned helplessness, while uncontrollable aversive events, here defined as negative feedback to solution attempts of insoluble problems, produces that phenomenon. These results partially support the "learned helplessness" hypothesis of Seligman (1975) which predicts that experience with uncontrollable reinforcement, the offset of negative events or the onset of positive ones, results in learning that responding is independent of reinforcement and that learning transfers to subsequent situations. This research further demonstrates that experience with controllability, here defined as solubility, results in enhanced competence.
Date: August 1975
Creator: Benson, James S.
Partner: UNT Libraries

RADIX 95n: Binary-to-Text Data Conversion

Description: This paper presents Radix 95n, a binary to text data conversion algorithm. Radix 95n (base 95) is a variable length encoding scheme that offers slightly better efficiency than is available with conventional fixed length encoding procedures. Radix 95n advances previous techniques by allowing a greater pool of 7-bit combinations to be made available for 8-bit data translation. Since 8-bit data (i.e. binary files) can prove to be difficult to transfer over 7-bit networks, the Radix 95n conversion technique provides a way to convert data such as compiled programs or graphic images to printable ASCII characters and allows for their transfer over 7-bit networks.
Date: August 1991
Creator: Jones, Greg, 1963-2017.
Partner: UNT Libraries
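
A rough sketch of the base-95 idea described above, though not the paper's exact variable-length scheme: the byte string is treated as a single large integer and re-expressed using the 95 printable ASCII characters. The alphabet choice and the whole-integer treatment are assumptions made for illustration.

```python
# Sketch of base-95 binary-to-text conversion in the spirit of the abstract;
# the real Radix 95n algorithm is a variable-length scheme and may differ.

ALPHABET = [chr(c) for c in range(32, 127)]   # the 95 printable ASCII characters
BASE = len(ALPHABET)                          # 95

def encode95(data: bytes) -> str:
    value = int.from_bytes(data, "big")
    if value == 0:
        return ALPHABET[0]
    digits = []
    while value:
        value, rem = divmod(value, BASE)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

def decode95(text: str, length: int) -> bytes:
    value = 0
    for ch in text:
        value = value * BASE + ALPHABET.index(ch)
    return value.to_bytes(length, "big")

payload = bytes([0x00, 0xFF, 0x10, 0x80])     # arbitrary 8-bit data
encoded = encode95(payload)
assert decode95(encoded, len(payload)) == payload
print(encoded)
```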

A C Navigational System

Description: The C Navigational System (CNS) is a proposed programming environment for the C programming language. The introduction covers the major influences of programming environments and the components of a programming environment. The system is designed to support the design, coding and maintenance phases of software development. CNS provides multiple views to both the source and documentation for a programming project. User-defined and system-defined links allow the source and documentation to be hierarchically searched. CNS also creates a history list and function interface for each function in a module. The final chapter compares CNS and several other programming environments (Microscope, Rn, Cedar, PECAN, and Marvel).
Date: May 1989
Creator: Hammerquist, James D. (James Daniel)
Partner: UNT Libraries

Computer Graphics Primitives and the Scan-Line Algorithm

Description: This paper presents the scan-line algorithm which has been implemented on the Lisp Machine. The scan-line algorithm resides beneath a library of primitive software routines which draw more fundamental objects: lines, triangles and rectangles. This routine, implemented in microcode, applies the A(BC)*D approach to word boundary alignments in order to create an extremely fast, efficient, and general purpose drawing primitive. The scan-line algorithm improves on previous methodologies by limiting the number of CPU intensive instructions and by minimizing the number of words referenced. This paper will describe how to draw scan-lines and the constraints imposed upon the scan-line algorithm by the Lisp Machine's hardware and software.
Date: December 1988
Creator: Myjak, Michael D. (Michael David)
Partner: UNT Libraries
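
A minimal sketch of a scan-line fill in the spirit of the primitive described above, not the Lisp Machine microcode or the A(BC)*D word-alignment technique: for each raster row covered by a triangle, the row's entry and exit points are computed and every pixel between them is set.

```python
# Sketch only: a character-grid stand-in for a framebuffer, filled one
# horizontal scan-line at a time.

def fill_triangle(grid, p0, p1, p2, ch="#"):
    ys = [p[1] for p in (p0, p1, p2)]
    for y in range(min(ys), max(ys) + 1):
        xs = []
        for (x1, y1), (x2, y2) in ((p0, p1), (p1, p2), (p2, p0)):
            if y1 == y2:
                continue                      # skip horizontal edges
            lo, hi = sorted((y1, y2))
            if lo <= y <= hi:
                t = (y - y1) / (y2 - y1)
                xs.append(x1 + t * (x2 - x1))
        if len(xs) >= 2:
            for x in range(round(min(xs)), round(max(xs)) + 1):
                grid[y][x] = ch

W, H = 24, 10
grid = [["." for _ in range(W)] for _ in range(H)]
fill_triangle(grid, (2, 1), (20, 3), (8, 9))
print("\n".join("".join(row) for row in grid))
```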

The Cantatas of Jean-Philippe Rameau

Description: By the early eighteenth century, French music was tangibly influenced by the Italian style which had already permeated much of Europe. The French Cantata is symptomatic of that often disparaged influx. The cantatas of Rameau are a significant contribution to an important form. Written almost entirely in the early years of the artist's career, they hold details of his stylistic development. In the present study of Rameau's cantatas several aspects of his style are discussed as they relate both to his theoretic writings and to the various influences of the time. Examples of those stylistic elements found in the cantatas are cited and discussed. There is, as well, a comparison of the works to the poetic form standardized by Rousseau.
Date: May 1991
Creator: McManus, Catherine
Partner: UNT Libraries

Linear Unification

Description: Efficient unification is considered within the context of logic programming. Unification is explained in terms of equivalence classes made up of terms, where there is a constraint that no equivalence class may contain more than one function term. It is demonstrated that several well-known "efficient" but nonlinear unification algorithms continually maintain the said constraint as a consequence of their choice of data structure for representing equivalence classes. The linearity of the Paterson-Wegman unification algorithm is shown largely to be a consequence of its use of unbounded lists of pointers for representing equivalences between terms, which allows it to avoid the nonlinearity of "union-find".
Date: December 1989
Creator: Wilbanks, John W. (John Winston)
Partner: UNT Libraries
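
A small sketch of unification over equivalence classes in the union-find style the abstract contrasts with Paterson-Wegman, so this is the "efficient but nonlinear" flavour rather than the linear algorithm itself; the term representation is assumed, and the occurs check is omitted for brevity.

```python
# Sketch only: terms are either variable names (strings) or function terms
# (tuples of functor followed by arguments). Each equivalence class may hold
# at most one function term, maintained as classes are merged.
parent = {}      # union-find parent pointers (terms are dict keys)
schema = {}      # class representative -> the one function term in the class

def find(t):
    parent.setdefault(t, t)
    while parent[t] != t:
        parent[t] = parent[parent[t]]    # path halving
        t = parent[t]
    return t

def func_term(rep):
    # the function term carried by a class, if any; a function term that is
    # its own representative carries itself
    f = schema.get(rep)
    if f is None and isinstance(rep, tuple):
        f = rep
    return f

def unify(s, t):
    rs, rt = find(s), find(t)
    if rs == rt:
        return True
    fs, ft = func_term(rs), func_term(rt)
    parent[rs] = rt                       # merge the two equivalence classes
    if fs is not None and ft is not None:
        # only one function term may remain, so the two must agree on
        # functor and arity, and their arguments must unify in turn
        if fs[0] != ft[0] or len(fs) != len(ft):
            return False
        return all(unify(a, b) for a, b in zip(fs[1:], ft[1:]))
    if fs is not None or ft is not None:
        schema[rt] = fs if fs is not None else ft
    return True

# unify f(x, g(y)) with f(g(z), x): succeeds with x ~ g(y) ~ g(z) and y ~ z
# (no occurs check is performed in this sketch)
print(unify(("f", "x", ("g", "y")), ("f", ("g", "z"), "x")))
```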

A Simulation Study Comparing Various Confidence Intervals for the Mean of Voucher Populations in Accounting

Description: This research examined the performance of three parametric methods for confidence intervals: the classical, the Bonferroni, and the bootstrap-t method, as applied to estimating the mean of voucher populations in accounting. Usually auditing populations do not follow standard models. The population for accounting audits generally is a nonstandard mixture distribution in which the audit data set contains a large number of zero values and a comparatively small number of nonzero errors. This study assumed a situation in which only overstatement errors exist. The nonzero errors were assumed to be normally, exponentially, and uniformly distributed. Five indicators of performance were used. The classical method was found to be unreliable. The Bonferroni method was conservative for all population conditions. The bootstrap-t method was excellent in terms of reliability, but the lower limit of the confidence intervals produced by this method was unstable for all population conditions. The classical method provided the shortest average width of the confidence intervals among the three methods. This study provided initial evidence as to how the parametric bootstrap-t method performs when applied to the nonstandard distribution of audit populations of line items. Further research should provide a reliable confidence interval for a wider variety of accounting populations.
Date: December 1992
Creator: Lee, Ihn Shik
Partner: UNT Libraries
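
A minimal bootstrap-t sketch under assumed data: the audit population is mimicked by a sample that is mostly zeros with a few exponentially distributed overstatement errors, and a 95% interval for the mean is formed from bootstrap-studentized statistics. None of the parameters below come from the study.

```python
# Sketch only: bootstrap-t confidence interval for the mean of a zero-heavy
# "audit" sample; sample size, error rate, and error scale are all assumed.
import random, statistics

random.seed(1)
sample = [0.0] * 180 + [random.expovariate(1 / 50.0) for _ in range(20)]

def mean_se(xs):
    m = statistics.mean(xs)
    se = statistics.stdev(xs) / len(xs) ** 0.5
    return m, se

m, se = mean_se(sample)
t_stats = []
for _ in range(2000):
    boot = random.choices(sample, k=len(sample))
    bm, bse = mean_se(boot)
    if bse > 0:                       # skip degenerate all-zero resamples
        t_stats.append((bm - m) / bse)
t_stats.sort()
lo_q = t_stats[int(0.975 * len(t_stats))]
hi_q = t_stats[int(0.025 * len(t_stats))]
print(f"bootstrap-t 95% CI for the mean: ({m - lo_q * se:.2f}, {m - hi_q * se:.2f})")
```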

Independent Quadtrees

Description: This dissertation deals with the problem of manipulating and storing an image using quadtrees. A quadtree is a tree in which each node has four ordered children or is a leaf. It can be used to represent an image via hierarchical decomposition. The image is broken into four regions. A region can be a solid color (homogeneous) or a mixture of colors (heterogeneous). If a region is heterogeneous it is broken into four subregions, and the process continues recursively until all subregions are homogeneous. The traditional quadtree suffers from dependence on the underlying grid. The grid coordinate system is implicit, and therefore fixed. The fixed coordinate system implies a rigid tree. A rigid tree cannot be translated, scaled, or rotated. Instead, a new tree must be built which is the result of one of these transformations. This dissertation introduces the independent quadtree. The independent quadtree is free of any underlying coordinate system. The tree is no longer rigid and can be easily translated, scaled, or rotated. Algorithms to perform these operations are presented. The translation and rotation algorithms take constant time. The scaling algorithm is linear in the number of nodes in the tree. The disadvantage of independent quadtrees is the longer generation and display time. This dissertation also introduces an alternate method of hierarchical decomposition. This new method finds the largest homogeneous block with respect to the corners of the image. This block defines the division point for the decomposition. If the size of the block is below some cutoff point, it is deemed too small to make the overhead worthwhile and the traditional method is used instead. This new method is compared to the traditional method on randomly generated rectangles, triangles, and circles. The new method is shown to use significantly less space for all three ...
Date: December 1986
Creator: Atwood, Larry D. (Larry Dale)
Partner: UNT Libraries
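
A short sketch of the traditional hierarchical decomposition the abstract builds on (not the coordinate-free independent quadtree itself): a square image is split into four quadrants recursively until every region is homogeneous.

```python
# Sketch only: traditional quadtree decomposition of a small square image;
# the example image and the list-based node representation are assumptions.

def build(image, x, y, size):
    colors = {image[y + dy][x + dx] for dy in range(size) for dx in range(size)}
    if len(colors) == 1:
        return colors.pop()                      # leaf: a homogeneous block
    half = size // 2
    return [build(image, x, y, half),            # NW quadrant
            build(image, x + half, y, half),     # NE quadrant
            build(image, x, y + half, half),     # SW quadrant
            build(image, x + half, y + half, half)]  # SE quadrant

image = [
    "0011",
    "0011",
    "0000",
    "0100",
]
print(build(image, 0, 0, 4))
```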

Vox Organalis

Description: Vox Organalis is a concerto for organ and orchestra. It employs an ensemble comprising the complement of wind, percussion, and string instruments normally available within a contemporary symphony orchestra with augmented brass and woodwind sections. It is intended to be performed with a large organ such as might be found in a symphony hall or large church. The work is in two movements, and its intended performance time is twenty-five minutes. Use of the concerto format within Vox Organalis results in a new approach to organizing the interaction between the solo part and the orchestral accompaniment. The organ part is notated in traditional metered notation, but the orchestral notation is organized in units of clock time (seconds). The horizontal spatial arrangement of the orchestral notation corresponds to the timing of the metered organ part. Pitch organization in Vox Organalis is derived from a twelve-tone row based upon the natural harmonic series. Several techniques of serial composition were used to organize and select elements of the tone row for use in the construction of the work. Use of the tone row for horizontal and vertical pitch structures provides unity to the pitch organization of the work. Vox Organalis is constructed in 12 sections which help define the formal shape of the work. Four of these sections comprise Movement I, and eight are contained by Movement II. The lengths of the formal sections are based upon the series of natural harmonic numbers from which the tone row was derived.
Date: December 1989
Creator: Baczewski, Philip
Partner: UNT Libraries

The Wang Institute of Graduate Studies: A Historical Perspective

Description: The Wang Institute of Graduate Studies was an independent, non-profit corporate college located in Tyngsboro, Massachusetts, that originated through the benevolence of An Wang. This study focuses on the problems in education and industry that acted as the impetus for this institute and develops a historical perspective of Wang Institute from its inception in 1979 until its end in August, 1987. The study describes the philosophy, organizational structure, curriculum, faculty, and students of Wang Institute. Wang Institute of Graduate Studies no longer exists. The facility used by Wang Institute of Graduate Studies is now known as Wang Institute of Boston University.
Date: December 1987
Creator: Green, Patricia Ann Naizer
Partner: UNT Libraries

A Comparison of Recall by University Bible Students After Discussion and After Self-Study

Description: Recall of expository prose after one of two learning techniques was determined. Pearson correlation did not reveal a significant difference between the recall writings of the examinees who studied by discussion and those who studied by underlining. A test of the significance of the difference between two proportions found that the group which underlined recalled significantly better than the group which discussed what they had read. This highly significant difference was almost identical when all synonyms from the Turbo Lightning computer program were considered correct recall and analyzed by the significance of the difference between two proportions.
Date: May 1987
Creator: Stovall, Johnny Harold
Partner: UNT Libraries
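
A minimal sketch of the two-proportion significance test named above; the recall counts used here are invented for illustration and are not the study's data.

```python
# Sketch only: pooled two-proportion z-test with a two-tailed p-value.
from math import sqrt, erf

def two_proportion_z(success1, n1, success2, n2):
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-tailed
    return z, p_value

# hypothetical recall counts: idea units recalled out of idea units presented
z, p = two_proportion_z(118, 200, 86, 200)
print(f"z = {z:.2f}, two-tailed p = {p:.4f}")
```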

The Development and Evaluation of a Forecasting System that Incorporates ARIMA Modeling with Autoregression and Exponential Smoothing

Description: This research was designed to develop and evaluate an automated alternative to the Box-Jenkins method of forecasting. The study involved two major phases. The first phase was the formulation of an automated ARIMA method; the second was the combination of forecasts from the automated ARIMA with forecasts from two other automated methods, the Holt-Winters method and the Stepwise Autoregressive method. The development of the automated ARIMA, based on a decision criterion suggested by Akaike, borrows heavily from the work of Ang, Chuaa and Fatema. Seasonality and small data set handling were some of the modifications made to the original method to make it suitable for use with a broad range of time series. Forecasts were combined by means of both the simple average and a weighted averaging scheme. Empirical and generated data were employed to perform the forecasting evaluation. The 111 sets of empirical data came from the M-Competition. The twenty-one sets of generated data arose from ARIMA models that Box, Tiao and Pack analyzed using the Box-Jenkins method. To compare the forecasting abilities of the Box-Jenkins and the automated ARIMA alone and in combination with the other two methods, two accuracy measures were used. These measures, which are free of magnitude bias, are the mean absolute percentage error (MAPE) and the median absolute percentage error (MdAPE).
Date: May 1985
Creator: Simmons, Laurette Poulos
Partner: UNT Libraries
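
A sketch of two pieces of the evaluation machinery named above, combining component forecasts by a simple and by a weighted average and scoring them with MAPE and MdAPE; the series, the individual forecasts, and the weights are all invented for illustration.

```python
# Invented series and forecasts, used only to show the combination and
# scoring steps the abstract describes.
import statistics

def mape(actual, forecast):
    return 100 * statistics.mean(abs((a - f) / a) for a, f in zip(actual, forecast))

def mdape(actual, forecast):
    return 100 * statistics.median(abs((a - f) / a) for a, f in zip(actual, forecast))

actual = [120, 130, 125, 140, 150]
arima  = [118, 127, 131, 138, 147]   # stand-in for the automated ARIMA forecasts
holt   = [122, 126, 128, 142, 155]   # stand-in for Holt-Winters
stepar = [115, 133, 124, 137, 149]   # stand-in for stepwise autoregression

simple   = [(a + b + c) / 3 for a, b, c in zip(arima, holt, stepar)]
weighted = [0.5 * a + 0.3 * b + 0.2 * c               # assumed weights
            for a, b, c in zip(arima, holt, stepar)]

for name, fc in (("simple average", simple), ("weighted average", weighted)):
    print(f"{name}: MAPE = {mape(actual, fc):.2f}%, MdAPE = {mdape(actual, fc):.2f}%")
```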

Cenotaph: A Composition for Computer-Generated Sound

Description: Cenotaph is a work of fifteen minutes duration for solo tape realized on the Synclavier Digital Music System at the Center for Experimental Music and Intermedia. All of the sound materials in the work consist of resynthesized timbres derived from the analysis of digital recordings of seven different human voices, each speaking the last name of one of the Challenger astronauts. The work's harmonic resources are derived in a unique way involving partitioning of the octave by powers of the Golden Section. The work is in a single movement divided into three sections which function as prologue, action, and epilogue, respectively. This formal structure is reinforced by differentiation of harmonic materials and texture. Although Cenotaph cannot be performed "live" and exists only as a recording, a graphic score is included to assist analysis and study.
Date: August 1990
Creator: Rogers, Rowell S. (Rowell Seldon)
Partner: UNT Libraries

Validation and Investigation of the Four Aspects of Cycle Regression: A New Algorithm for Extracting Cycles

Description: The cycle regression analysis algorithm is the most recent addition to a group of techniques developed to detect "hidden periodicities." This dissertation investigates four major aspects of the algorithm. The objectives of this research are 1. To develop an objective method of obtaining an initial estimate of the cycle period; the present procedure of obtaining this estimate involves considerable subjective judgment; 2. To validate the algorithm's success in extracting cycles from multi-cyclical data; 3. To determine if a consistent relationship exists among the smallest amplitude, the error standard deviation, and the number of replications of a cycle contained in the data; 4. To investigate the behavior of the algorithm in the predictions of major drops.
Date: December 1982
Creator: Mehta, Mayur Ravishanker
Partner: UNT Libraries

Retail Site Selection Using Multiple Regression Analysis

Description: Samples of stores were drawn from two chains, Pizza Hut and Zale Corporation. Two different samples were taken from Pizza Hut. Site specific material and sales data were furnished by the companies and demographic material relative to each site was gathered. Analysis of variance tests for linearity were run on the three regression equations developed from the data, and each of the three regression equations was found to have a statistically significant linear relationship. Statistically significant differences were found among similar variables used in the prediction of sales by using Fisher's Z' Transformations on the correlation coefficients. Eight of the eighteen variables used in the Pizza Hut study were found to be statistically different between the two regions used in the study. Additionally, analysis of variance tests were used to show that traffic pattern variables were not better predictors than demographic variables.
Date: December 1978
Creator: Taylor, Ronald D. (Ronald Dean)
Partner: UNT Libraries
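
A minimal sketch of the Fisher's Z' comparison of two correlation coefficients mentioned above; the correlation values and sample sizes below are hypothetical, not the study's figures.

```python
# Sketch only: Fisher's Z' transformation and a two-tailed z-test for the
# difference between two independent correlation coefficients.
from math import log, sqrt, erf

def fisher_z(r):
    return 0.5 * log((1 + r) / (1 - r))

def compare_correlations(r1, n1, r2, n2):
    z = (fisher_z(r1) - fisher_z(r2)) / sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-tailed
    return z, p_value

# hypothetical correlations of the same predictor with sales in two regions
z, p = compare_correlations(0.62, 40, 0.31, 38)
print(f"z = {z:.2f}, two-tailed p = {p:.4f}")
```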

Environmental Pollution, Material Scarcity and the Development of Aluminum Recycling Reverse Channels of Distribution

Description: The purpose of this study was to analyze the developing organizational and management paradigms in the aluminum packaging and container industry, where reverse channels of distribution offer an excellent vehicle for studying organizations which are "closing the distribution circle." Based on the analysis, several conclusions are offered. 1. The extent to which primary manufacturers have entered aluminum packaging and container recycling and subsequently developed effective reverse channels of distribution is contingent upon needs for resources. 2. The most successful recycling programs are those which have decentralized organizations. 3. Central to beverage producers' decisions to develop extensive reverse channels of distribution is the belief that recycling is (1) a deterrent to container legislation, (2) a source of favorable publicity, (3) a source of company profits, and (4) can improve supply relationships with primary aluminum suppliers. 4. Regional beverage companies in the environmentally conscious Far West have the most successful and comprehensive recycling operations. 5. Loose organizational federations such as those of the soft drink franchise do not seem amenable to the development of reverse channels of distribution. 6. Where it serves the needs of the enterprise, firms are developing sophisticated and efficient reverse channels of distribution. The institution of reverse channel intermediary functions reflects a new management and organizational paradigm based on environmental considerations. 7. A major stumbling block to further reverse channel development is the uncertainty caused by proposed container legislation.
Date: August 1977
Creator: Ginter, Peter M.
Partner: UNT Libraries