84 Matching Results


Internal and External Drivers of Consumers’ Product Return Behaviors

Description: Product return is a necessary part of the exchange process between companies and customers. It accounts for approximately 16% of total sales and reduces retailer/manufacturer profits by 3.8% on average. However, industry data also indicate that a significant portion of products are returned for reasons other than product failure, e.g., change of mind, a lower price found elsewhere, or fraudulent and unethical motives. Consequently, many firms (e.g., REI) have tightened their generous return policies to protect their profits. However, restrictive return policies have been found to reduce customer satisfaction, increase perceived risk, and thus weaken customers' loyalty toward a particular store or firm, so a restrictive return policy does not help either. Extant literature focuses mainly on return policies; little attention has been devoted to the return behavior itself, thus missing the consumer's perspective. This study therefore focuses on understanding consumers' return behaviors, including the different types of return behaviors and the drivers and consequences of each. Toward this goal, the study first categorizes all possible types of consumers' return behaviors into two broad categories: legitimate return behaviors and opportunistic return behaviors. Second, both internal drivers (i.e., variety seeking, impulsiveness, perceived uniqueness, materialism, level of morality, and self-monitoring) and external drivers (i.e., product compatibility, returning cost, perceived risk, complexity of procedure, and social group influence) of consumers' product return behaviors are identified. Third, the relationships between these drivers and the type of return behavior are examined. Finally, the influence of the two types of return behaviors on consumers' re-patronage intentions is examined. This study uses a survey method to collect data in two phases, a pilot phase and a main study. In the pilot phase, data were collected from students and used to ...
Date: August 2015
Creator: Pei, Zhi
Partner: UNT Libraries

Customer Induced Uncertainty and Its Impact on Organizational Design

Description: How firms facing environmental uncertainty should organize their activities remains an important and challenging question for today's managers and organizational researchers. Proponents of contingency theory have argued that organizations must adjust their activities to fit the level of environmental uncertainty to ensure long-term survival. Although much work has been done on contingency theory, it is clear that our understanding of uncertainty is far from complete. One important aspect of today's organizations is their focus on service, mass customization, and continuous innovation. This focus often results in the customer being brought either into the organization or at least into closer contact with it. Even though the literature provides ample evidence of this increasing customer focus, it has yet to empirically explain how the complications of customer-organizational interactions might create uncertainty for contemporary organizations. The traditional measure of uncertainty still treats customers as an environmental factor causing demand uncertainty while ignoring the complex nature of customer and organizational encounters. Seeking to further refine the concept of uncertainty and focusing on contemporary business phenomena, this study develops measures of aspects of customer induced uncertainty and examines their relationships with three organizational design variables. Specifically, this study explains the complicated nature of customer-organizational encounters that creates organizational uncertainty. Also, this study develops three operational measurement instruments for the three aspects of customer induced uncertainty. Finally, this study shows specific relationships between aspects of customer induced uncertainty and specific organizational design variables. This study conducted a mail survey of middle-level managers. With a sample size of 118, the measurement instruments were shown to have validity and reliability using factor analysis and Cronbach's alpha. Regression analyses indicate the presence of specific rather than general relationships between customer induced uncertainty variables and organizational design variables. Regression results suggested that the relationships between customer induced ...
Date: August 1999
Creator: Chowdhury, Sanjib Kumar
Partner: UNT Libraries
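
The instrument validation described above relies on Cronbach's alpha. As a point of reference, a minimal sketch of that computation in Python (the five-item scale and 118 simulated respondents are illustrative stand-ins, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 118 respondents answering a 5-item Likert-style scale
# whose items all tap one latent construct.
rng = np.random.default_rng(0)
latent = rng.normal(size=(118, 1))
responses = latent + rng.normal(scale=0.8, size=(118, 5))
print(f"alpha = {cronbach_alpha(responses):.2f}")  # high alpha: items cohere
```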

Determinants of Corporate Governance Choices: Evidence from Listed Foreign Firms on U.S. Stock Exchanges

Description: This study analyzes corporate governance practices of foreign (non-U.S.) issuers listed on the New York Stock Exchange (NYSE) and Nasdaq. Specifically, I examine the extent to which these foreign issuers voluntarily comply with U.S. stock exchange corporate governance requirements applicable to domestic issuers. My sample consists of 201 foreign companies primarily domiciled in Brazil, China, Israel, and the United Kingdom. I find that 151 (75 percent) of the sample firms do not elect to comply with any of the U.S. corporate governance requirements. Logistic regression analysis generally supports the hypotheses that conformance with U.S. GAAP and the percentage of managerial ownership are positively associated, and the percentage of ownership by major shareholders is negatively associated, with foreign firms' election to comply with U.S. corporate governance rules. This evidence is relevant for regulators and investors.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: May 2017
Creator: Attachot, Weerapat
Partner: UNT Libraries

Optimal design of Dutch auctions with discrete bid levels.

Description: Auction theory has become an active research area spanning multiple disciplines such as economics, finance, marketing, and management science. A close examination of the literature reveals, however, that most existing studies deal with ascending (i.e., English) auctions and assume that bid increments are continuous. There is a clear lack of research on optimal descending (i.e., Dutch) auction design with discrete bid levels. This dissertation aims to fill this void by considering single-unit, open-bid, first-price Dutch auctions in which the bid levels are restricted to a finite set of values, the number of bidders may be certain or uncertain, and a secret reserve price may be present or absent. These types of auctions are most attractive for selling products that are perishable (e.g., flowers) or whose value decreases with time (e.g., airline seats and concert tickets) (Carare and Rothkopf, 2005). I began by conducting a comprehensive survey of the current literature to identify the key dimensions of an auction model. I then zeroed in on the particular combination of parameters that characterizes the Dutch auctions of interest. As a significant departure from the traditional methods employed by applied economists and game theorists, a novel approach is taken by formulating the auctioning problem as a constrained mathematical program and applying standard nonlinear optimization techniques to solve it. In the basic Dutch auction model and each of its two extensions, interesting properties of the optimal bid levels and the auctioneer's maximum expected revenue are uncovered. Numerical examples are provided to illustrate the major propositions where appropriate. The superiority of the optimal strategy recommended in this study over two commonly used heuristic procedures for setting bid levels is also demonstrated both theoretically and empirically. Finally, economic as well as managerial implications of the findings reported ...
Date: May 2010
Creator: Li, Zhen
Partner: UNT Libraries
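
The constrained-program formulation described above can be illustrated with a toy model. The sketch below assumes i.i.d. Uniform(0, 1) valuations and myopic bidders who accept the first announced level at or below their valuation; these assumptions and the SciPy solver are illustrative choices, not the dissertation's actual model:

```python
import numpy as np
from scipy.optimize import minimize

def expected_revenue(levels, n_bidders):
    """Expected revenue for descending discrete bid levels.

    Assumes valuations are i.i.d. Uniform(0, 1) and each bidder accepts the
    first announced level at or below their valuation, so the sale occurs at
    the highest level not exceeding the maximum valuation.
    """
    l = np.sort(np.clip(levels, 0.0, 1.0))[::-1]     # descending levels
    upper = np.concatenate(([1.0], l[:-1]))          # interval tops, l_0 = 1
    sale_prob = upper ** n_bidders - l ** n_bidders  # P(max valuation in [l_j, upper_j))
    return float(np.dot(l, sale_prob))

# Maximize revenue over k = 5 levels with 3 bidders (SLSQP, bounded in [0, 1]).
k, n = 5, 3
res = minimize(lambda x: -expected_revenue(x, n),
               x0=np.linspace(0.8, 0.3, k),
               bounds=[(0.0, 1.0)] * k, method="SLSQP")
print(np.sort(res.x)[::-1], -res.fun)
```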

Effects of Auditor-provided Tax Services on Book-tax Differences and Investors’ Mispricing of Book-tax Differences

Description: In this study, I investigate the effect of auditor-provided tax services (ATS) on firms’ levels of book-tax differences and investors’ mispricing of book-tax differences. The joint provision of audit and tax services has been a controversial issue among regulators and academic researchers. Evidence on whether ATS improve or impair the overall accounting quality is inconclusive as a result of the specific testing circumstances involved in different studies. Book-tax differences capture managers’ earnings management and/or tax avoidance intended to maximize reported financial income and to minimize tax expense. Therefore, my first research question investigates whether ATS improve or impair audit quality by examining the relation between ATS and firms’ levels of book-tax differences. My results show that ATS are negatively related to book-tax differences, suggesting that ATS improve the overall audit quality and reduce aggressive financial and/or tax reporting. My second research question examines whether the improved earnings quality for firms acquiring ATS leads to reduced mispricing of book-tax differences among investors. Recent studies document that despite the rich information about firms’ future earnings contained in book-tax differences, investors process such information inefficiently, leading to systematic pricing errors among firms with large book-tax differences. My empirical evidence indicates that ATS mitigate such mispricing, with pricing errors being lower among firms acquiring ATS compared with firms without ATS. Collectively, these results support the notion that ATS improve audit quality through knowledge spillover. Moreover, the improved earnings quality among firms acquiring ATS in turn helps reduce investors’ mispricing of book-tax differences.
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: May 2015
Creator: Luo, Bing
Partner: UNT Libraries

Information Content of Non-GAAP Earnings of Cross-Listed Companies

Description: To supplement earnings reported under generally accepted accounting principles (GAAP), public companies often voluntarily report alternative measures of earnings called non-GAAP earnings (NGE). These companies assert that NGE exclude the effect of non-recurring transactions, thereby helping users of financial information to better assess the company's past performance and prospects. Because NGE measures are not well defined, managers can exploit the inherent discretion in calculating NGE to mislead users. Prior studies provide arguments and evidence on the informative as well as the opportunistic use of NGE. However, these studies have examined the characteristics and informativeness of NGE with a focus on U.S. companies. Their results may not be generalizable to cross-listed companies because foreign financial reporting standards differ from U.S. GAAP. Further, prior studies report a difference in the earnings quality of U.S. and cross-listed firms, which can also produce a difference in the informativeness of their NGE. To fill this gap in the literature, I examine whether the informativeness of NGE of cross-listed companies differs from that of U.S. companies. This study contributes to the debate on the informativeness of NGE. It provides evidence that, in general, NGE are equally informative for U.S. and foreign companies, but foreign companies are more opportunistic in excluding recurring items from NGE. The results of this study are of potential interest to investors, regulators, and academics who are interested in and interact with cross-listed companies.
Date: May 2018
Creator: Adhikari, Subash
Partner: UNT Libraries

Creating a Criterion-Based Information Agent Through Data Mining for Automated Identification of Scholarly Research on the World Wide Web

Description: This dissertation creates an information agent that correctly identifies Web pages containing scholarly research approximately 96% of the time. It does this by analyzing each Web page against a set of criteria and then using a classification tree to arrive at a decision. The criteria were gathered from the literature on selecting print and electronic materials for academic libraries. A Delphi study was conducted with an international panel of librarians to expand and refine the criteria until a list of 41 operationalizable criteria was agreed upon. A Perl program was then designed to analyze a Web page and determine a numerical value for each criterion. A large collection of Web pages was gathered, comprising 5,000 pages that contain full works of scholarly research and 5,000 random pages, representative of user searches, that do not contain scholarly research. Datasets were built by running the Perl program on these Web pages. The datasets were split into model-building and testing sets. Data mining was then used to create different classification models. Four techniques were used: logistic regression, nonparametric discriminant analysis, classification trees, and neural networks. The models were created with the model-building datasets and then tested against the test dataset. Precision and recall were used to judge the effectiveness of each model. In addition, a set of pages that were difficult to classify because of their similarity to scholarly research was gathered and classified with the models. The classification tree produced the most effective model, with a precision ratio of 96% and a recall ratio of 95.6%. However, logistic regression created a model that was able to correctly classify more of the problematic pages. This agent can be used to create a database of scholarly research published on the Web. In addition, the technique can be used to create a ...
Date: May 2000
Creator: Nicholson, Scott
Partner: UNT Libraries
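
A minimal sketch of the classify-and-evaluate step described above, using scikit-learn in place of the study's tools; the 41-criterion feature vector mirrors the abstract, while the synthetic pages and tree settings are illustrative assumptions:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import precision_score, recall_score

# Synthetic stand-in: 10,000 pages scored on 41 criteria, label 1 = scholarly.
rng = np.random.default_rng(42)
X = rng.normal(size=(10_000, 41))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=1.0, size=10_000) > 0).astype(int)

# Split into model-building and testing sets, as in the study design.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_train, y_train)

pred = tree.predict(X_test)
print(f"precision = {precision_score(y_test, pred):.3f}")
print(f"recall    = {recall_score(y_test, pred):.3f}")
```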

Developing Criteria for Extracting Principal Components and Assessing Multiple Significance Tests in Knowledge Discovery Applications

Description: With advances in computer technology, organizations are able to store large amounts of data in data warehouses. There are two fundamental issues researchers must address: the dimensionality of data and the interpretation of multiple statistical tests. The first issue addressed by this research is the determination of the number of components to retain in principal components analysis. This research establishes regression, asymptotic-theory, and neural network approaches to estimating the mean and 95th-percentile eigenvalues needed to implement Horn's parallel analysis procedure for retaining components. Certain methods perform better for specific combinations of sample size and number of variables. The adjusted normal order statistic estimator (ANOSE), an asymptotic procedure, performs the best overall. Future research is warranted on combining methods to increase accuracy. The second issue involves interpreting multiple statistical tests. This study uses simulation to show that Parker and Rothenberg's technique, which models p-values with a density function consisting of a mixture of betas, is viable for p-values from central and non-central t distributions. The simulation study shows that final estimates obtained in the proposed mixture approach reliably estimate the true proportion of the distributions associated with the null and non-null hypotheses. Modeling the density of p-values allows for better control of the true experimentwise error rate and provides insight into grouping hypothesis tests for clustering purposes. Future research will expand the simulation to include p-values generated from additional distributions. The techniques presented are applied to data from Lake Texoma, where the size of the database and the number of hypotheses of interest call for nontraditional data mining techniques. The issue is to determine whether information technology can be used to monitor chlorophyll levels in the lake as chloride is removed upstream. A relationship established between chlorophyll and energy reflectance, which can be measured by satellites, enables ...
Date: August 1999
Creator: Keeling, Kellie Bliss
Partner: UNT Libraries
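
Horn's parallel analysis, central to the first part of the study above, can be sketched briefly: retain a component only if its observed eigenvalue exceeds the mean or 95th-percentile eigenvalue obtained from random data of the same shape. The data dimensions and simulation count below are illustrative:

```python
import numpy as np

def parallel_analysis(X, n_sims=200, percentile=95, seed=0):
    """Horn's parallel analysis: compare observed correlation-matrix
    eigenvalues against those from random normal data of the same
    n-by-p shape; retain components exceeding the percentile threshold."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    rand = np.empty((n_sims, p))
    for i in range(n_sims):
        R = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
        rand[i] = np.linalg.eigvalsh(R)[::-1]
    threshold = np.percentile(rand, percentile, axis=0)
    return int(np.sum(obs > threshold)), obs, threshold

# Hypothetical data: 300 cases, 10 variables driven by 2 latent factors.
rng = np.random.default_rng(1)
F = rng.normal(size=(300, 2))
X = F @ rng.normal(size=(2, 10)) + rng.normal(scale=0.7, size=(300, 10))
k, _, _ = parallel_analysis(X)
print(f"components to retain: {k}")
```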

Structural Holes and Simmelian Ties: Exploring Social Capital, Task Interdependence, and Individual Effectiveness

Description: Two contrasting notions have been put forward on how social capital may influence individual effectiveness in organizations. Burt (1992) sets forth the informational and control advantages that are possible by building an open network characterized by large numbers of structural holes. In contrast, Coleman (1990) and Simmel (1950) have suggested that network closure, exemplified by large numbers of Simmelian ties, enables actors to develop the trust, cohesiveness, and norms that contribute to effectiveness. Simmelian ties are strong, reciprocal ties shared by three actors. It is proposed that an actor's network cannot be dominated by both structural holes and Simmelian ties. Thus, this study examines whether a moderating variable is at work. It is proposed that the actor's task interdependence in the workplace influences the relationship between network closure and individual effectiveness. Actors in less task-interdependent environments will benefit especially from the information and control benefits afforded by a network characterized by structural holes. Conversely, actors in highly interdependent environments will benefit especially from the trust and cooperation created by large numbers of Simmelian ties. Data were collected on 113 subjects in three organizations. Subjects were asked to rate the strength of their relationships with all organization members and their own level of task interdependence. Contrary to expectations, nearly all subjects reported high levels of task interdependence. Raters in each organization provided individual effectiveness measures for all subjects. Hypotheses were tested using hierarchical set regression and bivariate correlation. The results indicated support for the hypothesized relationship between Simmelian ties and task interdependence. When examining all cases, no support was found for the hypothesized relationships of structural holes and Simmelian ties with individual effectiveness, or of structural holes with task interdependence. Nonetheless, additional analyses provided some indication of an association between Simmelian ties and individual effectiveness. Task interdependence did ...
Date: December 1999
Creator: Engle, Scott L.
Partner: UNT Libraries
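
The two network measures contrasted above can be illustrated with networkx. Burt's constraint (`nx.constraint`) is a standard operationalization of structural holes; the Simmelian-tie counter below is a simplified reading of the abstract's definition (a reciprocal tie embedded in a closed triad of reciprocal ties), and the toy advice network is invented:

```python
import networkx as nx

# Toy directed nomination network; an undirected "strong tie" exists only
# where nominations are reciprocated.
D = nx.DiGraph([("a", "b"), ("b", "a"), ("b", "c"), ("c", "b"),
                ("a", "c"), ("c", "a"), ("c", "d"), ("d", "c"), ("d", "e")])
G = nx.Graph((u, v) for u, v in D.edges if D.has_edge(v, u))

def simmelian_ties(G, node):
    """Count reciprocal ties of `node` that sit inside at least one
    closed triad of reciprocal ties (simplified definition)."""
    return sum(1 for nbr in G[node]
               if any(G.has_edge(nbr, w) for w in G[node] if w != nbr))

cons = nx.constraint(G)  # Burt's constraint: high values = few structural holes
for v in G:
    print(v, "simmelian ties:", simmelian_ties(G, v),
          "constraint:", round(cons[v], 2))
```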

An investigation of technical support issues influencing user satisfaction

Description: The widespread distribution of personal computers (PCs) throughout organizations has made a substantial impact on information systems. Additionally, the tremendous growth of the Internet has changed the way business is carried out. As the user population evolves into a much more technical and demanding group, their needs are also changing. With this change, Management Information Systems (MIS) departments must develop new ways of providing service and support to the user community. This study investigates the relationship between information systems support structures, support services, service quality, and the characteristics of a diverse user population, including the technical support issues that influence user satisfaction. This study attempts to improve the understanding of the support function within MIS. The results clarify the support needs of users and identify user satisfaction factors, as well as factors related to the quality of the support received. Six streams of prior research were reviewed in developing the research framework. These include user support, end users and end-user computing, identifying and classifying user types, information centers, user satisfaction, service quality, and other sources of computer support. A survey instrument was designed using the user satisfaction (UIS) instrument developed by Doll and Torkzadeh (1988) and the SERVQUAL instrument as modified by Kettinger and Lee (1994). The survey was distributed to 720 individuals. A total of 155 usable responses were analyzed, providing mixed results. Of the ten hypotheses, only four were rejected. The findings of this study differ from those in earlier studies. The variables found to be significant to users for service quality are the method of support provided to the user (i.e., help desk or local MIS support) and the support technician's experience level. For user satisfaction, the location of the service personnel made a difference to the end ...
Date: May 2000
Creator: Gutierrez, Charletta Frances
Partner: UNT Libraries

A Discrimination of Software Implementation Success Criteria

Description: Software implementation projects struggle with the delicate balance of low cost, on-time delivery, and quality. The methodologies and processes used to create and maintain a quality software system are expensive to deploy and result in long development cycle times. However, without their deployment in the software implementation life cycle, a software system will be undependable and unsuccessful. The purpose of this research is to identify a succinct set of software implementation success criteria and assess the key independent constructs, the activities carried out to ensure a successful implementation project. The research assesses the success of a software implementation project as the dependent construct of interest and uses the software process model (methodology) as the independent construct. This field research involved three phases: (1) criteria development, (2) data collection, and (3) testing of hypotheses and discriminant analysis. The first phase resulted in the development of the measurement instruments for the independent and dependent constructs. The measurement instrument for the independent construct was representative of the criteria from highly regarded software implementation process models and methodologies, e.g., ISO 9000 and the Software Engineering Institute's Capability Maturity Model (SEI CMM). The dependent construct was developed from the categories and criteria of the DeLone and McLean (1992) MIS list of success measures. The data collection and assessment phase employed a field survey of 80 companies involved in internal software implementation. Both successful and unsuccessful software implementation projects (identified by the DeLone/McLean model) participated. Results from 165 projects were collected, 28 unsuccessful and 137 successful. The third phase used ANOVA to test the first 11 hypotheses and employed discriminant analysis for the 12th hypothesis to identify the "best set" of variables, or criteria, that discriminate between successful and unsuccessful software implementation projects. Twelve discriminating variables out of 67 were identified and supported as significant discriminators between successful and unsuccessful projects. ...
Date: August 1999
Creator: Pryor, Alan N.
Partner: UNT Libraries
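
A hedged sketch of the discriminant-analysis phase described above, with scikit-learn standing in for the study's software; the 165-project, 67-criterion shape mirrors the abstract, but the data and model settings are invented:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in: 165 projects (137 successful = 1, 28 unsuccessful = 0)
# scored on 67 candidate criteria; by construction 12 criteria discriminate.
rng = np.random.default_rng(7)
y = np.array([1] * 137 + [0] * 28)
X = rng.normal(size=(165, 67)) + 0.6 * y[:, None] * (np.arange(67) < 12)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("in-sample accuracy:", round(lda.score(X, y), 3))

# Coefficient magnitudes hint at which criteria discriminate most strongly.
top = np.argsort(-np.abs(lda.coef_[0]))[:12]
print("top discriminating criteria (indices):", sorted(top))
```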

The Cluster Hypothesis: A Visual/Statistical Analysis

Description: By allowing judgments based on a small number of exemplar documents to be applied to a larger number of unexamined documents, clustered presentation of search results represents an intuitively attractive possibility for reducing the cognitive resource demands on human users of information retrieval systems. However, clustered presentation of search results is sensible only to the extent that naturally occurring similarity relationships among documents correspond to topically coherent clusters. The Cluster Hypothesis posits just such a systematic relationship between document similarity and topical relevance. To date, experimental validation of the Cluster Hypothesis has proved problematic, with collection-specific results both supporting and failing to support this fundamental theoretical postulate. The present study consists of two computational information visualization experiments, representing a two-tiered test of the Cluster Hypothesis under adverse conditions. Both experiments rely on multidimensionally scaled representations of interdocument similarity matrices. Experiment 1 is a term-reduction condition, in which descriptive titles are extracted from Associated Press news stories drawn from the TREC information retrieval test collection. The clustering behavior of these titles is compared to the behavior of the corresponding full text via statistical analysis of the visual characteristics of a two-dimensional similarity map. Experiment 2 is a dimensionality reduction condition, in which inter-item similarity coefficients for full text documents are scaled into a single dimension and then rendered as a two-dimensional visualization; the clustering behavior of relevant documents within these unidimensionally scaled representations is examined via visual and statistical methods. Taken as a whole, results of both experiments lend strong though not unqualified support to the Cluster Hypothesis. In Experiment 1, semantically meaningful 6.6-word document surrogates systematically conform to the predictions of the Cluster Hypothesis. In Experiment 2, the majority of the unidimensionally scaled datasets exhibit a marked nonuniformity of distribution of relevant documents, further supporting the Cluster Hypothesis. Results of ...
Access: This item is restricted to UNT Community Members. Login required if off-campus.
Date: May 2000
Creator: Sullivan, Terry
Partner: UNT Libraries
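
The multidimensional scaling step underlying both experiments above can be sketched compactly: turn interdocument similarities into dissimilarities and scale them into a two-dimensional map. The TF-IDF/cosine pipeline and four-document corpus below are illustrative assumptions, not the study's TREC data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.manifold import MDS

docs = ["oil prices rise on supply fears", "crude oil supply tightens",
        "election results announced today", "voters head to the polls"]

# Interdocument similarity matrix, converted to dissimilarities for MDS.
tfidf = TfidfVectorizer().fit_transform(docs)
dissim = 1.0 - cosine_similarity(tfidf)

# Two-dimensional similarity map, as in the visualization experiments.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
print(coords)  # related stories should land near each other
```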

Firm Performance and Analyst Forecast Accuracy Following Discontinued Operations: Evidence from the Pre-SFAS 144 and SFAS 144 Eras

Description: Because of the non-recurring and transitory nature of discontinued operations, accounting standards require that the results of discontinued operations be separately reported on the income statement. Prior accounting literature supports the view that discontinued operations are non-recurring or transitory in nature, and also suggests that income classified as transitory has minimal relevance in firm valuation. Finance and management literature, however, suggest that firms discontinue operations to strategically utilize their scarce resources. Assuming that discontinued operations result from managerial motives to strategically concentrate resources in remaining continued operations, this dissertation examines the informativeness of discontinued operations. In doing so, it empirically tests the financial performance, investment efficiency, valuation, and analyst forecast accuracy effects of discontinued operations. In 2001, the Financial Accounting Standards Board's (FASB) Statement of Financial Accounting Standards No. 144 (hereafter SFAS 144) replaced the Accounting Principles Board's Opinion No. 30 (hereafter APB 30) and broadened the scope of divestiture transactions to be presented in discontinued operations. Some stakeholders of financial statements argued that discontinued operations were less decision-useful in the SFAS 144 era because too many transactions that do not represent a strategic shift in operations were separately stated as discontinued operations on the income statement. Given the possibility that the discontinued operations reported in the SFAS 144 era may not reflect a major strategic reallocation of resources, this dissertation examines whether the relationships between discontinued operations, firm performance, investment efficiency, and analyst forecast accuracy differ between the pre-SFAS 144 and SFAS 144 eras. Using a sample of firms that discontinued operations between 1990 and 2012, this dissertation finds limited evidence that firms experience improvement in financial performance following discontinued operations, and that such improvement is only observed in the pre-SFAS 144 era. The results also suggest that any improvement in financial performance documented is conditional on the profitability ...
Date: May 2017
Creator: Guragai, Binod
Partner: UNT Libraries

Framework to Evaluate Entropy Based Data Fusion Methods in Supply Chain Management

Description: This dissertation explores data fusion methodology to deduce an overall inference from data gathered from multiple heterogeneous sources. Typically, if there existed a data source in which the data were reliable and unbiased, then data fusion would not be necessary. Data fusion methodology combines data from multiple diverse sources so that the desired information, such as the population mean, is improved despite redundancies, inaccuracies, biases, and inflated variability in the data. Examples of data fusion include estimating average demand from similar sources and integrating fatality counts from different media sources after a catastrophe. The approach in this study combines "inputs" from distinct sources so that the information is "fused"; another way of describing this process is "data integration." Important assumptions are: (1) several sources provide "inputs" for information used to estimate parameters of a probability distribution; (2) since distributions for the data from the sources are heterogeneous, some sources are less reliable; (3) distortions, bias, censorship, and systematic errors may be more prominent in data from certain sources; and (4) the sample size of a source's data, the number of "inputs," may be very small. Examples of information from multiple sources are abundant: traffic information from sensors at intersections, multiple economic indicators from various sources, demand data for a product using similar retail stores as sources, polling data from various sources, and counts of fatalities from different media sources after a catastrophic event. This dissertation seeks to address a gap in the operations literature through three research questions regarding entropy-based data fusion (EBDF) approaches to estimation. Three separate, but unifying, essays address these research questions. Essay 1 provides an overview of the supporting literature; a numerical analysis of airline maximum wait time data illustrates the underlying issues involved in EBDF methods. This ...
Date: December 2016
Creator: Tran, Huong Thi
Partner: UNT Libraries
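
Entropy-based weighting can take several forms; one simple scheme (an illustrative choice, not necessarily the dissertation's estimator) weights each source by exp(-differential entropy), which under a normal assumption down-weights noisier sources in proportion to their standard deviation:

```python
import numpy as np

def entropy_fuse(samples_by_source):
    """Fuse source means, weighting each source by exp(-differential entropy).

    Under a normal assumption H = 0.5 * ln(2*pi*e*sigma^2), so exp(-H) is
    proportional to 1/sigma: noisier, higher-entropy sources count less.
    """
    means, weights = [], []
    for x in samples_by_source:
        x = np.asarray(x, dtype=float)
        sigma2 = x.var(ddof=1)
        H = 0.5 * np.log(2 * np.pi * np.e * sigma2)  # differential entropy
        means.append(x.mean())
        weights.append(np.exp(-H))
    w = np.array(weights) / np.sum(weights)
    return float(np.dot(w, means)), w

# Hypothetical: three sources reporting the same wait time; source C is erratic.
sources = [[10.1, 10.3, 9.8, 10.0], [9.9, 10.2, 10.1, 9.7],
           [3.0, 22.0, 1.0, 18.0]]
fused, w = entropy_fuse(sources)
print("weights:", w.round(3), "fused estimate:", round(fused, 2))
```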

Leader Emergence and Effectiveness in Virtual Workgroups: Dispositional and Social Identity Perspectives

Description: In today's global competitive environment, many organizations utilize virtual workgroups to overcome geographic and organizational boundaries. Research into their dynamics has received the attention of scholars within multiple disciplines, and the potential for an integrative approach to the study of virtual workgroups exists. This dissertation is a first step towards such an approach. The primary aim of this research is to examine antecedent and contextual factors that affect the emergence and effectiveness of leaders in virtual workgroups. To achieve this aim, an integrative model assembled from theory and empirical findings in leadership, management, social identity, and communications research is posited. Hypothesized relationships depicted in the model identify key dispositional and contextual variables linked to leader emergence, member behavior, and leader effectiveness within virtual workgroups. This study employed a nonexperimental research design, in which leader emergence and social identity manifest as naturally occurring phenomena. Data collection occurred via two web-based surveys administered at different points in time. Hypothesized relationships were tested utilizing correlational and hierarchical moderated multiple regression analyses. The findings of this dissertation suggest that traits, such as personality and cognitive ability, are not associated with leader emergence in virtual workgroups. In addition, the results indicate that the exhibition of relationship-oriented leader behaviors enhances group identity. In turn, identification is associated with increases in perceptions of leader effectiveness and decreases in counterproductive behavior exhibited by group members. This dissertation exposes an important limitation to the application of trait leadership theory. It also demonstrates the importance of relationship-oriented behavior and social identity in virtual contexts. Further, it advances an integrative theoretical model for the study of virtual workgroup phenomena. These contributions should assist and inform other researchers, as well as practitioners, interested in leadership and group member behavior in virtual workgroups.
Date: August 2009
Creator: Hite, Dwight M.
Partner: UNT Libraries
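
The hierarchical moderated regression used above enters main effects first, then the interaction, and tests the R-squared increment. A minimal statsmodels sketch with invented variable names (behavior, identity, effectiveness) and synthetic data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: does identification moderate behavior -> effectiveness?
rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame({"behavior": rng.normal(size=n),
                   "identity": rng.normal(size=n)})
df["effectiveness"] = (0.4 * df.behavior + 0.3 * df.identity
                       + 0.25 * df.behavior * df.identity
                       + rng.normal(size=n))

step1 = smf.ols("effectiveness ~ behavior + identity", df).fit()
step2 = smf.ols("effectiveness ~ behavior * identity", df).fit()  # adds interaction
print(f"R2 step 1: {step1.rsquared:.3f}  R2 step 2: {step2.rsquared:.3f}")
print(step2.compare_f_test(step1))  # F-test of the R-squared increment
```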

Accident versus Essence: Investigating the Relationship Among Information Systems Development and Requirements Capabilities and Perceptions of Enterprise Architecture

Description: Information systems (IS) are indelibly linked to the global economy and are indispensable to society and organizations. Despite the decisive function of IS in organizations today, IS development problems continue to plague organizations. The failure to get the system requirements right is considered to be one of the primary, if not the most significant, reasons for this high IS failure rate. Getting requirements right is most notably identified with Frederick Brooks' contention that requirements are the essence of what IT professionals do, all the rest being accidents or risk management. However, enterprise architecture (EA) may also provide the discipline to bridge the gap between effective requirements, organizational objectives, and actual IS implementations. The intent of this research is to examine the relationship between IS development capabilities and requirements analysis and design capabilities within the context of enterprise architecture. To accomplish this, a survey of IT professionals within the Society for Information Management (SIM) was conducted. Results indicate support for the hypothesized relationship between IS development and requirements capabilities. The hypothesized relationships with the organizational demographics were not supported, nor was the hypothesized positive relationship between requirements capabilities and EA perceptions. However, the nature of the relationship between requirements and EA provided important insight, leading to several explanations of its meaning and contributions to research and practice. This research contributes to IS development knowledge by providing evidence of the essential role of requirements in IS development capabilities and in IS development maturity. Furthermore, contributions to the nascent field of EA research and practice include key insight into EA maturity, EA implementation success, and the role of IT professionals in EA teams. Moreover, these results provide a template and research plan of action for further EA research exploring EA maturity models and critical success factors, ...
Date: August 2009
Creator: Salmans, Brian R.
Partner: UNT Libraries

Monitoring or moral hazard? Evidence from real activities manipulation by venture-backed companies.

Description: Prior literature suggests two competing theories regarding the role of venture capitalists (VCs) in their portfolio companies. The VC monitoring hypothesis argues that VCs effectively resolve the managerial agency problem through close monitoring and by restraining managers' earnings management behavior. The VC moral hazard hypothesis argues that VCs aggravate the private-benefits agency problem by exerting influence over managers to artificially inflate the exit stock price through earnings management. Using a sample of IPO firms between 1987 and 2002, and after controlling for the magnitude of accruals manipulation (AM), I compare the magnitude of real activities manipulation (RM) between venture-backed and non-venture-backed companies. I find that relative to non-venture-backed companies, venture-backed companies show significantly less RM in the first post-IPO fiscal year. The results are robust after controlling for VC selection endogeneity. This finding supports the VC monitoring hypothesis that VCs restrain managers' RM behavior. Furthermore, I document that venture-backed companies differ significantly from non-venture-backed companies only in the first post-IPO fiscal year; the difference between the two groups in either the IPO year or the second post-IPO fiscal year is not significant, or is at best weak. This finding is consistent with the argument that VCs tighten their control during the lockup expiration period, when insiders such as managers or founders have strong incentives to inflate earnings. By the end of the second post-IPO fiscal year, when VCs exit the portfolio companies, their impact on portfolio companies' RM decreases dramatically, which makes the difference between the two groups less significant. In addition, using a sample of venture-backed IPOs from 1987 to 2002, I find that companies backed by high-reputation VCs show significantly less RM than those backed by low-reputation VCs in the first post-IPO fiscal year. The results are robust to alternative VC reputation proxies. This finding is consistent ...
Date: December 2009
Creator: Liu, Xiang
Partner: UNT Libraries

Tsunami disaster response: A case analysis of the information society in Thailand.

Description: The December 2004 Indian Ocean Tsunami claimed thousands of lives and destroyed homes and livelihoods, losses that could have been avoided with timely and better information. Information is needed at a fundamental level, much like water, food, medicine, or shelter. This dissertation examines the development of the Thai information society, in terms of the share of the information workforce and the level of diffusion of information and communication technologies (ICT), as well as the role of the Thai information society in responding to the tsunami disaster. The study combined historical and political economy analyses to explain the factors influencing the growth of the information workforce and the development of ICT in Thailand. Interviews conducted in 2007-08 revealed that the Thai information society responded to the 2004 tsunami, the first global Internet-mediated natural disaster, in two areas: on-site assistance in collecting and recording identification information of tsunami disaster victims, and online dissemination of disaster relief information. The effectiveness of ICT institutions in providing tsunami disaster relief and in advancing the development of the information society was assessed using statistical procedures analyzing the perceptions of Internet-based survey respondents. The disaster's effects on survey respondents were also assessed. The study's findings include: (1) the Thai information sector's development pattern confirmed a key difference between the development patterns of information sectors in developed and developing countries; (2) the increase in the number of Thai information workers was due more to the expansion of government than to expansion in the manufacturing and service sectors during the 1997-98 Asian financial crisis; (3) Thailand's expansion of ICT infrastructure was influenced not only by economic profitability but also by political desirability; and (4) volunteers were crucial in humanitarian aid and disaster relief.
Date: December 2009
Creator: Aswalap, Supaluk Joy
Partner: UNT Libraries

Team performance: Using financial measures to evaluate the effect of support systems on team performance.

Description: Organizations invest in team-based systems in order to generate innovative practices that will give them a competitive edge. High-performing teams require training and other support systems to gain the skills they need as well as to create and maintain an environment conducive to their success. The challenge for managers is to make resource allocation decisions among investment alternatives to maximize team effectiveness and still ensure a financial return for company investors. This study has three objectives. The first objective is to investigate whether there is a positive relationship among organizational environment, team potency (the team's collective belief it will succeed) and team performance. Results indicate that the presence of four organizational support systems influences team potency and performance. These support systems are the Design and Measurement, Rewards, Training and Communications Systems. In addition, results indicate that team potency is a mediating variable between the Design and Measurement and Communications Systems and team performance. These results suggest that companies are able to influence team performance by investing in environmental support systems. The second objective is to examine whether team members and managers view the organizational environment differently. Results indicate that managers view the Training and Communications Systems as more important, while teams perceive the Design and Measurement System and the Rewards System to be more important to their success. Since the systems that team managers view as important may influence their investment decisions, these differences may suggest a resource alignment issue. Third, a measure of team effectiveness based on financial measures is introduced. Published literature emphasizes attitudinal, behavioral and operational measures of performance. A financial measure offers a method of evaluating performance that is similar to methods used in capital budgeting and may be consistently applied across different types of teams with different purposes. The data collection process was performed by ...
Date: May 2002
Creator: Kennedy, Frances Anne
Partner: UNT Libraries

When and Where Does It Pay to Be Green: Intra- and Inter-organizational Factors Influencing the Environmental/Financial Performance Link

Description: Managers are coming under increasing pressure from a wide array of stakeholders to improve the environmental performance of their firms while still achieving financial performance objectives. One of the most researched questions in the business and the natural environment (B&NE) literature is whether it pays to be green. Despite more than three decades of research, scholars have been unable to clearly answer this question. The purpose of this dissertation was to identify the antecedents that lead to increased firm-level environmental performance and the conditions under which firms are then able to profit from enhanced environmental performance. First, I assessed three intra-organizational factors of top management teams (i.e., female representation, concern for non-financial stakeholders, and risk-seeking propensity) that theory indicates are associated with increased corporate environmental performance (CEP). Theory also suggests that top management teams with these attributes should perform better in dynamic settings, so I tested whether industry dynamism moderates these relationships. Second, I examined industry-level forces that theory indicates would moderate the relationship between CEP and corporate financial performance (CFP). These moderating forces include industry profitability, industry dynamism, and the degree of industry environmental regulation. Hypotheses were tested using panel data obtained from the KLD, Compustat, and Environmental Protection Agency databases for the years 2000 to 2011. The sample consists of firms comprising the Standard and Poor's 500 and was analyzed using fixed-effects regression; moderating variables were analyzed using the Johnson-Neyman technique.
Date: May 2014
Creator: Cox, Marcus Z.
Partner: UNT Libraries
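
The Johnson-Neyman technique mentioned above finds the moderator values at which the conditional effect of CEP on CFP becomes statistically significant. A sketch of the standard computation on synthetic data (the variable names `cep`, `dyn`, and `cfp` are invented; nothing here reproduces the dissertation's models):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Hypothetical data: environmental performance (cep), moderator (dyn), cfp.
rng = np.random.default_rng(11)
n = 500
df = pd.DataFrame({"cep": rng.normal(size=n), "dyn": rng.normal(size=n)})
df["cfp"] = 0.1 * df.cep + 0.2 * df.dyn + 0.3 * df.cep * df.dyn \
            + rng.normal(size=n)

fit = smf.ols("cfp ~ cep * dyn", df).fit()
b1, b3 = fit.params["cep"], fit.params["cep:dyn"]
V = fit.cov_params()
v11 = V.loc["cep", "cep"]
v33 = V.loc["cep:dyn", "cep:dyn"]
v13 = V.loc["cep", "cep:dyn"]
t2 = stats.t.ppf(0.975, fit.df_resid) ** 2

# Johnson-Neyman: solve (b1 + b3*m)^2 = t2 * Var(b1 + b3*m) for moderator m.
a = b3**2 - t2 * v33
b = 2 * (b1 * b3 - t2 * v13)
c = b1**2 - t2 * v11
roots = np.roots([a, b, c])
print("J-N boundaries (dyn values):", np.sort(roots.real).round(3))
```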

A Comparison of Permanent and Measured Income Inequality

Description: The degree of inequality present in the distribution of income may be measured with a Gini coefficient. If the distribution is found empirically to fit a particular distribution function, then the Gini coefficient may be derived from the mean value of income and the variation from the mean. For the purpose of this study, the Beta II distribution was used as the function that most closely approximates the actual distribution of income. The Beta II function provides the skewness normally found in an income distribution as well as fulfilling other required characteristics. The degree of inequality was approximated for the distribution of income from all sources and from ten separate components of income sources in constant (1973) dollars. Next, permanent income from all sources and from the ten component sources was estimated based upon actual income using the double exponential smoothing forecasting technique. The estimations of permanent income, which can be thought of as expected income, were used to derive measures of permanent income inequality. The degree of actual income inequality and the degree of permanent income inequality, both represented by the hypothetical Gini coefficient, were compared and tested for statistical differences. For the entire period under investigation, 1952 to 1979, the net effect was no statistically significant difference between permanent and actual income inequality, as was expected. However, significant differences were found in year-by-year comparisons. Relating permanent income inequality to the underlying structural inequality present in a given distribution, conclusions were drawn regarding the role of mobility in altering the actual distribution of income. The impact of business fluctuations on the distribution of permanent income relative to the distribution of actual income was studied in an effort to reach general conclusions. In general, cyclical upswings tend to reduce permanent inequality ...
Date: August 1986
Creator: McHargue, Susan L. (Susan Layne)
Partner: UNT Libraries
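
For intuition about the inequality measure above: a Gini coefficient can also be computed directly from a sample as the mean absolute difference between incomes divided by twice the mean income. A small sketch on an invented right-skewed sample:

```python
import numpy as np

def gini(incomes):
    """Gini coefficient: mean absolute difference / (2 * mean).

    0 = perfect equality, 1 = one person holds all income.
    """
    x = np.sort(np.asarray(incomes, dtype=float))
    n = len(x)
    # Closed form equivalent to sum_i sum_j |x_i - x_j| / (2 * n^2 * mean),
    # using the cumulative shares of the sorted sample.
    cum = np.cumsum(x)
    return float((n + 1 - 2 * (cum / cum[-1]).sum()) / n)

# Hypothetical right-skewed income sample (Beta II-like skewness).
rng = np.random.default_rng(5)
sample = rng.lognormal(mean=10, sigma=0.6, size=10_000)
print(f"Gini = {gini(sample):.3f}")
```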

The Effect of SFAS No. 141 and SFAS No. 142 on the Accuracy of Financial Analysts' Earnings Forecasts after Mergers

Description: This study examines the impact of Statements of Financial Accounting Standards No. 141 and No. 142 (hereafter SFAS 141, 142) on the characteristics of financial analysts' earnings forecasts after mergers. Specifically, I predict lower forecast errors for firms that experienced mergers after the enactment of SFAS 141, 142 than for firms that went through business combinations before those accounting changes. Study results present strong evidence that earnings forecast errors for companies involved in merger and acquisition activity decreased after the adoption of SFAS 141, 142. Test results also suggest that lower earnings forecast errors are attributable to factors specific to merging companies, such as SFAS 141, 142, and not common to merging and non-merging companies. In addition, evidence implies that information in the corporate annual reports of merging companies plays the critical role in this decrease of earnings forecast error. Summarily, I report that SFAS 141, 142 were effective in achieving greater transparency of financial reporting after mergers. In my complementary analysis, I also document the structure of analysts' coverage in leader/follower terms and test for differences in this structure (1) across the pre- and post-SFAS 141, 142 environments, and (2) between merging and non-merging firms. Although I do not identify any significant differences in coverage structure across environments, my findings suggest that lead analysts are not as accurate as followers when predicting earnings for firms actively involved in mergers. I also detect a significant interaction between the SFAS-environment code and the leader/follower classification, which indicates greater improvement in lead analysts' forecast accuracy in the post-SFAS 141, 142 environment relative to their followers. This interesting discovery demands future investigation and confirms the importance of financial reporting transparency in the accounting treatment of business combinations.
Date: May 2005
Creator: Mintchik, Natalia Maksimovna
Partner: UNT Libraries

Development and Validation of an Instrument to Operationalize Information System Requirements Capabilities

Description: As a discipline, information systems (IS) has struggled with the challenge of aligning its product (primarily software and the infrastructure needed to run it) with the needs of the organization it supports. This has been characterized as the pursuit of alignment of information technology (IT) with the business or organization, which begins with gathering the requirements of the organization; these then guide the creation of the IS requirements, which in turn guide the creation of the IT solution itself. This research is primarily focused on developing and validating an instrument to operationalize such requirements capabilities. Requirements capabilities at the development of software or the implementation of a specific IT solution are referred to as capabilities for software requirements or, more commonly, systems analysis and design (SA&D) capabilities. This research describes an instrument for SA&D capabilities and validates it for content validity, construct validity, and internal consistency, along with an exploratory factor analysis. SA&D capabilities were expected to coalesce strongly around a single dimension. Yet in validating the SA&D capabilities instrument, it became apparent that SA&D capabilities are not the unidimensional construct traditionally perceived. Instead, it appears that four dimensions underlie SA&D capabilities, and these are associated with alignment maturity (governance, partnership, communications, and value). These subfactors of requirements capabilities are described in this research and represent distinct capabilities critical to the successful alignment of IT with the business.
Date: May 2014
Creator: Pettit, Alex Z.
Partner: UNT Libraries
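
The four-factor result above comes from exploratory factor analysis. A hedged sketch of that kind of analysis with scikit-learn's varimax-rotated factor analysis; the synthetic 12-item, four-factor survey below is invented and bears no relation to the study's instrument:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic survey: 12 items driven by 4 latent factors (3 items each).
rng = np.random.default_rng(9)
n = 400
factors = rng.normal(size=(n, 4))
loading = np.zeros((4, 12))
for f in range(4):
    loading[f, 3 * f: 3 * f + 3] = 0.8          # each factor loads on 3 items
X = factors @ loading + rng.normal(scale=0.5, size=(n, 12))

fa = FactorAnalysis(n_components=4, rotation="varimax").fit(X)
# Rows = factors, columns = items; the 3-item blocks should load together.
print(np.round(fa.components_, 2))
```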