- Accessing Information on the World Wide Web: Predicting Usage Based on Involvement
- Advice for Web designers often includes an admonition to use short, scannable, bullet-pointed text, reflecting the common belief that browsing the Web most often involves scanning rather than reading. Literature from several disciplines focuses on the myriad combinations of factors related to online reading, but studies of users' interests and motivations appear to offer a more promising avenue for understanding how users utilize information on Web pages. This study utilized the modified Personal Involvement Inventory (PII), a ten-item instrument used primarily in the marketing and advertising fields, to measure interest and motivation toward a topic presented on the Web. Two sites were constructed from Reader's Digest Association, Inc., online articles, and a program was written to track students' use of the sites. Behavior was measured by the initial choice of short versus longer versions of the main page, the number of pages visited, and the amount of time spent on the site. Data were gathered from students at a small, private university in the southwestern United States to test six hypotheses, which posited that subjects with higher involvement in a topic presented on the Web and a more positive attitude toward the Web would tend to select the longer text version, visit more pages, and spend more time on the site. While attitude toward the Web did not correlate significantly with any of the behavioral factors, the level of involvement was associated with the use of the sites in two of three hypotheses, but only partially in the manner hypothesized. Increased involvement with a Web topic did correlate with the choice of a longer, more detailed initial Web page, but was inversely related to the number of pages viewed, so that the higher the involvement, the fewer pages visited. An additional indicator of usage, the average amount of time spent on each page, was measured and revealed that more involved users spent more time on each page.
- The Adoption and Use of Electronic Information Resources by a Non-Traditional User Group: Automotive Service Technicians.
- The growing complexity of machines has led to a concomitant increase in the amount and complexity of the information needed by those charged with servicing them. This, in turn, has led to a need for more robust methods for storing and distributing information and for a workforce more sophisticated in its use of information resources. As a result, the service trades have "professionalized," adopting more rigorous academic standards and developing ongoing certification programs. The current paper deals with the acceptance of advanced electronic information technology by skilled service personnel, specifically, automotive service technicians. The theoretical basis of the study is Davis' technology acceptance model. The purpose of the study is to determine the effects of three external factors on the operation of the model: age, work experience, and education/certification level. The research design is in two parts, beginning with an onsite observation and interviews to establish the environment. During the second part of the research process a survey was administered to a sample of automotive service technicians. Results indicated significant inverse relationships between age and acceptance and between experience and acceptance. A significant positive relationship was shown between education, particularly certification, and acceptance.
- Are Online Catalogs for Children Giving Them What They Need? Children's Cognitive Development and Information Seeking and Their Impact on Design
- Research shows children in an online environment often search by browsing, which relies heavily on recognition and content knowledge, so catalog systems for children must use effective symbols or pictorial representations that correspond with children's own cognitive schema and level of recognition knowledge. This study was designed to look at the success of young children (ages 5 to 8) in searching 3 online public library catalogs designed for them, and it focused specifically on the pictorial representations and text descriptors used in the systems' browsing hierarchy. The research sought to answer whether young children (ages 5 to 8) are really poor searchers because of cognitive development and lack of technology skills, or if system design is the major reason for poor search results; i.e., do current children's online catalog designs function in a manner that is compatible with information seeking by children? Although these results cannot be generalized, this study indicates that there was a disconnect between the cognitive abilities of young users and catalog design. The study looked at search success on the 3 catalogs in relation to the catalog characteristics and individual user characteristics and makes 3 significant contributions to the field of library and information science. The first contribution is the modification of an existing model proposed by Cooper and O'Connor and modified by Abbas (2002). The second significant contribution is the proposal of a new model, Creel's second best choice (SBC) model, that addresses the cognitive gap and design flaws that impact the choices participants made. The third significant contribution is that this study addresses and fills a gap in the literature.
- Assessment of a Library Learning Theory by Measuring Library Skills of Students Completing an Online Library Instruction Tutorial
- This study is designed to reveal whether students acquire the domains and levels of library skills discussed in a learning library skills theory after participating in an online library instruction tutorial. The acquisition of the library skills is demonstrated through a review of the scores on online tutorial quizzes, responses to a library skills questionnaire, and bibliographies of course research papers. Additional areas to be studied are the characteristics of the participants enrolled in traditional and online courses at a community college and the possible influence of these characteristics on the demonstrated learning of library skills. Multiple measurement methods, identified through assessment of library instruction literature, are used to verify the effectiveness of the library skills theory and to strengthen the validity and reliability of the study results.
- Children's Color Association for Digital Image Retrieval.
- In the field of information sciences, attention has been focused on developing mature information retrieval systems that abstract information automatically from the contents of information resources, such as books, images, and films. As a subset of information retrieval research, content-based image retrieval systems automatically abstract elementary information from images in terms of colors, shapes, and texture. Color is the feature most commonly used in similarity measurement for content-based image retrieval systems. Human-computer interface design and image retrieval methods benefit from studies based on an understanding of their potential users. Today's children are exposed to digital technology at a very young age, and they will be the major technology users in five to ten years. This study focuses on children's color perception and color association with a controlled set of digital images. The method of survey research was used to gather data for this exploratory study of children's color association from a population of children, third to sixth graders. An online questionnaire with fifteen images was used to collect quantitative data on children's color selections. Face-to-face interviews investigated the rationale and factors affecting the color choices and children's interpretation of the images. The findings in this study indicate that the color children associated with an image was the one that took up the most space or the biggest part of the image. Another powerful factor in color selection was the vividness or saturation of the color. Colors that stood out the most generally attracted the greatest attention. Preferences of color, character, or subject matter in an image also strongly affected children's color association with images. One of the most unexpected findings was that children would choose a color to replace a color in an image. In general, children saw more things than were actually represented in the images.
However, the children's interpretation of the images had little effect on their color selections.
- The Cluster Hypothesis: A visual/statistical analysis
Access: Use of this item is restricted to the UNT Community.
By allowing judgments based on a small number of exemplar documents to be applied to a larger number of unexamined documents, clustered presentation of search results represents an intuitively attractive possibility for reducing the cognitive resource demands on human users of information retrieval systems. However, clustered presentation of search results is sensible only to the extent that naturally occurring similarity relationships among documents correspond to topically coherent clusters. The Cluster Hypothesis posits just such a systematic relationship between document similarity and topical relevance. To date, experimental validation of the Cluster Hypothesis has proved problematic, with collection-specific results both supporting and failing to support this fundamental theoretical postulate. The present study consists of two computational information visualization experiments, representing a two-tiered test of the Cluster Hypothesis under adverse conditions. Both experiments rely on multidimensionally scaled representations of interdocument similarity matrices. Experiment 1 is a term-reduction condition, in which descriptive titles are extracted from Associated Press news stories drawn from the TREC information retrieval test collection. The clustering behavior of these titles is compared to the behavior of the corresponding full text via statistical analysis of the visual characteristics of a two-dimensional similarity map. Experiment 2 is a dimensionality reduction condition, in which inter-item similarity coefficients for full text documents are scaled into a single dimension and then rendered as a two-dimensional visualization; the clustering behavior of relevant documents within these unidimensionally scaled representations is examined via visual and statistical methods. Taken as a whole, results of both experiments lend strong though not unqualified support to the Cluster Hypothesis. 
In Experiment 1, semantically meaningful 6.6-word document surrogates systematically conform to the predictions of the Cluster Hypothesis. In Experiment 2, the majority of the unidimensionally scaled datasets exhibit a marked nonuniformity of distribution of relevant documents, further supporting the Cluster Hypothesis. Results of the two experiments are profoundly question-specific. Post hoc analyses suggest that it may be possible to predict the success of clustered searching based on the lexical characteristics of users' natural-language expression of their information need.
- Cognitive Playfulness, Innovativeness, and Belief of Essentialness: Characteristics of Educators who have the Ability to Make Enduring Changes in the Integration of Technology into the Classroom Environment.
- Research on the adoption of innovation is largely limited to factors affecting immediate change with few studies focusing on enduring or lasting change. The purpose of the study was to examine the personality characteristics of cognitive playfulness, innovativeness, and essentialness beliefs in educators who were able to make an enduring change in pedagogy based on the use of technology in the curriculum within their assigned classroom settings. The study utilized teachers from 33 school districts and one private school in Texas who were first-year participants in the Intel® Teach to the Future program. The research design focused on how cognitive playfulness, innovativeness, and essentialness beliefs relate to a sustained high level of information technology use in the classroom. The research questions were: 1) Are individuals who are highly playful more likely to continue to demonstrate an ability to integrate technology use in the classroom at a high level than those who are less playful? 2) Are individuals who are highly innovative more likely to continue to demonstrate an ability to integrate technology use in the classroom at a high level than those who are less innovative? 3) Are individuals who believe information technology use is critical and indispensable to their teaching more likely to continue to demonstrate an ability to integrate technology use in the classroom at a high level than those who believe it is supplemental and not essential? The findings of the current study indicated that playfulness, innovativeness, and essentialness scores as defined by the scales used were significantly correlated to an individual's sustained ability to use technology at a high level. Playfulness was related to the educator's level of innovativeness, as well. Also, educators who believed the use of technology was critical and indispensable to their instruction were more likely to be able to demonstrate a sustained high level of technology integration. 
Further research is recommended to investigate numerous personality traits, such as playfulness, innovativeness, creativity, and risk-taking that might relate to technology adoption. Doing so may lead to modifications of professional development, assisting individuals in adapting better and faster to systemic change.
- A Common Representation Format for Multimedia Documents
- Multimedia documents are composed of multiple file format combinations, such as image and text, image and sound, or image, text and sound. The type of multimedia document determines the form of analysis for knowledge architecture design and retrieval methods. Over the last few decades, theories of text analysis have been proposed and applied effectively. In recent years, theories of image and sound analysis have been proposed to work with text retrieval systems and have progressed quickly, due in part to rapid progress in computer processing speed. Retrieval of multimedia documents formerly was divided into the categories of image and text, and image and sound. While the standard retrieval process begins from text only, methods are developing that allow the retrieval process to be accomplished simultaneously using text and image. Although image processing for feature extraction and text processing for term extraction are well understood, there are no prior methods that can combine these two features into a single data structure. This dissertation introduces a common representation format for multimedia documents (CRFMD) composed of both images and text. For image and text analysis, two techniques are used: the Lorenz Information Measurement and the Word Code. A new process named Jeong's Transform is demonstrated for extraction of text and image features, combining the two previous measurements to form a single data structure. Finally, this single data structure is analyzed by using multidimensional scaling. This allows multimedia objects to be represented on a two-dimensional graph as vectors. The distance between vectors represents the magnitude of the difference between multimedia documents. This study shows that image classification on a given test set is dramatically improved when text features are encoded together with image features. 
This effect appears to hold true even when the available text is diffused and is not uniform with the image features. This retrieval system works by representing a multimedia document as a single data structure. CRFMD is applicable to other areas of multimedia document retrieval and processing, such as medical image retrieval, World Wide Web searching, and museum collection retrieval.
- A Comparative Analysis of Style of User Interface Look and Feel in a Synchronous Computer Supported Cooperative Work Environment
- The purpose of this study is to determine whether the style of a user interface (i.e., its look and feel) has an effect on the usability of a synchronous computer supported cooperative work (CSCW) environment for delivering Internet-based collaborative content. The problem motivating this study is that people who are located in different places need to be able to communicate with one another. One way to do this is by using complex computer tools that allow users to share information, documents, programs, etc. As an increasing number of business organizations require workers to use these types of complex communication tools, it is important to determine how users regard these types of tools and whether they are perceived to be useful. If a tool, or interface, is not perceived to be useful then it is often not used, or used ineffectively. As organizations strive to improve communication with and among users by providing more Internet-based collaborative environments, the users' experience in this form of delivery may be tied to a style of user interface look and feel that could negatively affect their overall acceptance and satisfaction of the collaborative environment. The significance of this study is that it applies the technology acceptance model (TAM) as a tool for evaluating style of user interface look and feel in a collaborative environment, and attempts to predict which factors of that model, perceived ease of use and/or perceived usefulness, could lead to better acceptance of collaborative tools within an organization.
- A Comparison of Communication Motives of On-Site and Off-Site Students in Videoconference-Based Courses
- The objective of this investigation is to determine whether student site location in an instructional videoconference is related to students' motives for communicating with their instructor. The study is based, in part, on the work of Martin et al., who identify five separate student-teacher communication motives. These motives, or dimensions, are termed relational, functional, excuse, participation, and sycophancy, and are measured by a 30-item questionnaire. Several communication-related theories were used to predict differences between on-site and off-site students. Media richness theory was used, foundationally, to explain differences between mediated and face-to-face communication, and other theories such as uncertainty reduction theory were used in conjunction with media richness theory to predict specific differences. Two hundred eighty-one completed questionnaires were obtained from Education and Library and Information Science students in 17 separate course-sections employing interactive video at the University of North Texas during the Spring and Summer semesters of the 2001/2002 school year. This study concludes that off-site students in an instructional videoconference are more likely than their on-site peers to report being motivated to communicate with their instructor for participation reasons. If off-site students are more motivated than on-site students to communicate as a means to participate, then it may be important for instructors to watch for actual differences in participation levels, and instructors may need to be well versed in pedagogical methods that attempt to increase participation. The study also suggests that current teaching methods being employed in interactive video environments may be adequate with regard to functional, excuse-making, relational and sycophantic communication.
- A Complex Systems Model for Understanding the Causes of Corruption: Case Study - Turkey
- This dissertation attempts to draw an explanatory interdisciplinary framework to clarify the causes of systemic corruption. Following an intense review of the political science, economics, and sociology literatures on the issue, a complex systems theoretical model is constructed. A political system consists of five main components: society, interest aggregators, legislative, executive and private sector, and the human actors in these domains. It is hypothesized that when the legitimacy level of the system is low and the morality of the systemic actors is flawed, selected political, social and economic incentives and opportunities that may exist within the structure of the systemic components might, individually or as a group, trigger corrupt transactions between the actors of the system. If left untouched, corruption might spread through the system by repetition and social learning, eventually becoming a source of corruption itself. By eroding the already weak legitimacy and morality, it may increase the risk of corruption even further. This theoretical explanation is used to study the causes of systemic corruption in the Turkish political system. Under the guidance of complex systems theory, the initial systemic conditions, the legacy of Turkey's predecessor, the Ottoman Empire, are evaluated first, and then the political, social and economic factors that are presumed to be breeding corruption in contemporary Turkey are investigated. In this section, special focus is given to the formation and operation of amoral social networks and their contribution to the entrenchment of corruption within the system. Based upon the findings of the case study, the theoretical model that is informed by the literature is reformed: thirty-five system- and actor-level variables are identified as being related to systemic corruption, and the nature of the causality between them and corruption is explained.
Although the results of this study cannot be academically generalized for obvious reasons, the analytical framework proposed here can be referenced by policy makers who are willing to trace the roots of systemic corruption in developing countries.
- Computer Support Interactions: Verifying a Process Model of Problem Trajectory in an Information Technology Support Environment.
- Observations in the information technology (IT) support environment and generalizations from the literature regarding problem resolution behavior indicate that computer support staff seldom store reusable solution information effectively for IT problems. A comprehensive model of the processes encompassing problem arrival and assessment, expertise selection, problem resolution, and solution recording has not been available to facilitate research in this domain. This investigation employed the findings from a qualitative pilot study of IT support staff information behaviors to develop and explicate a detailed model of problem trajectory. Based on a model from clinical studies, this model encompassed a trajectory scheme that included the communication media, characteristics of the problem, decision points in the problem resolution process, and knowledge creation in the form of solution storage. The research design included the administration of an extensive scenario-based online survey to a purposive sample of IT support staff at a medium-sized state-supported university, with additional respondents from online communities of IT support managers and call-tracking software developers. The investigator analyzed 109 completed surveys and conducted email interviews of a stratified nonrandom sample of survey respondents to evaluate the suitability of the model. The investigation employed mixed methods including descriptive statistics, effect size analysis, and content analysis to interpret the results and verify the sufficiency of the problem trajectory model. The study found that expertise selection relied on the factors of credibility, responsibility, and responsiveness. Respondents referred severe new problems for resolution and recorded formal solutions more often than other types of problems, whereas they retained moderate recurring problems for resolution and seldom recorded those solutions. 
Work experience above and below the 5-year mark affected decisions to retain, refer, or defer problems, as well as solution storage and broadcasting behaviors. The veracity of the problem trajectory model was verified and it was found to be an appropriate tool and explanatory device for research in the IT domain.
- A Conceptual Map for Understanding the Terrorist Recruitment Process: Observation and Analysis of Turkish Hezbollah Terrorist Organizations.
- Terrorism is a historical problem; however, it has become one of the biggest problems of the 21st century. September 11 and the following Madrid, Istanbul and London attacks showed that it is the most significant problem threatening world peace and security. Governments have started to deal with terrorism by improving security measures and making new investments to stop terrorism. Most of the focus of governments and scholars is on immediate threats and causes of terrorism, instead of on long-term solutions such as the root causes and underlying reasons of terrorism, and the recruitment style of terrorist organizations. If terrorist recruitment does not stop, then it is safe to say terrorist activities cannot be stopped. This study focused on the recruitment process by observing two different terrorist organizations, DHKP/C and Turkish Hezbollah. The researcher brings 13 years of field experience and first-person data gathered from inside the terrorist organizations. The research questions of this study were: (i) How can an individual be prevented from joining or carrying out terrorist activities?; (ii) What factors are correlated with joining a terrorist organization?; (iii) What are the recruitment processes of the DHKP/C, PKK, and Turkish Hezbollah?; (iv) Is there any common process of becoming a member of these three terrorist organizations?; and (v) What are the similarities and differences among these terrorist organizations? As a result of this analysis, a terrorist recruitment process map was created. With the help of this map, social organizations such as families and schools may be able to identify ways to prevent individuals from joining terrorist organizations. This map will also be helpful for government organizations, such as counterterrorism and intelligence agencies, in achieving the same goal.
- Constraints on Adoption of Innovations: Internet Availability in the Developing World.
Access: Use of this item is restricted to the UNT Community.
In a world that is increasingly united in time and distance, I examine why the world is increasingly divided socially, economically, and digitally. Using data for 35 variables from 93 countries, I separate the countries into groups of 31 each by gross domestic product per capita. These groups of developed, lesser developed and least developed countries are used in comparative analysis. Through a review of relevant literature and tests of bivariate correlation, I select eight key variables that are significantly related to information communication technology development and to human development. For this research, adoption of the Internet in the developing world is the innovation of particular interest. Thus, for comparative purposes, I chose Internet Users per 1000 persons per country and the Human Development Index as the dependent variables upon which the independent variables are regressed. Although Internet users are small in number among the least developed countries, I find Internet use to be the most powerful influence on human development for the poorest countries. The research focuses on key obstacles as well as variables of opportunity for Internet usage in developing countries. The greatest obstacles are in fact related to Internet availability and the cost/need ratio for infrastructure expansion. However, innovations for expanded Internet usage in developing countries are expected to show positive results for increased Internet usage, as well as for greater human development and human capital. In addition to the diffusion of innovations in terms of the Internet, the diffusion of cultures through migration is also discussed in terms of the effect on social capital and the drain on human capital from developing countries.
- Coyote Ugly Librarian: A Participant Observer Examination of Knowledge Construction in Reality TV.
- Reality TV is the most popular genre of television programming today. The number of reality television shows has grown exponentially over the last fifteen years since the premiere of The Real World in 1992. Although reality TV uses styles similar to those used in documentary film, the “reality” of the shows is questioned by critics and viewers alike. The current study focuses on the “reality” that is presented to viewers and how that “reality” is created and may differ from what the participants of the shows experience. I appeared on two reality shows, Faking It and That's Clever, and learned a great deal as a participant observer. Within the study, I outline my experience and demonstrate how editing changed the reality I experienced into what was presented to the viewers. O'Connor's (1996) representation context web serves as a model for the realities created through reality television. People derive various benefits from watching reality TV. Besides the obvious entertainment value of reality TV, viewers also gather information via this type of programming. Viewers want to see real people on television reacting to unusual circumstances without the use of scripts. By surveying reality TV show viewers and participants, this study gives insight into how real the viewers believe the shows are and how authentic they actually are. If these shows are presented as reality, viewers are probably taking what they see as historical fact. The results of the study indicate more must be done so that the “reality” of reality TV does not misinform viewers.
- Creating a Criterion-Based Information Agent Through Data Mining for Automated Identification of Scholarly Research on the World Wide Web
- This dissertation creates an information agent that correctly identifies Web pages containing scholarly research approximately 96% of the time. It does this by analyzing the Web page with a set of criteria, and then uses a classification tree to arrive at a decision. The criteria were gathered from the literature on selecting print and electronic materials for academic libraries. A Delphi study was done with an international panel of librarians to expand and refine the criteria until a list of 41 operationalizable criteria was agreed upon. A Perl program was then designed to analyze a Web page and determine a numerical value for each criterion. A large collection of Web pages was gathered comprising 5,000 pages that contain the full work of scholarly research and 5,000 random pages, representative of user searches, which do not contain scholarly research. Datasets were built by running the Perl program on these Web pages. The datasets were split into model building and testing sets. Data mining was then used to create different classification models. Four techniques were used: logistic regression, nonparametric discriminant analysis, classification trees, and neural networks. The models were created with the model datasets and then tested against the test dataset. Precision and recall were used to judge the effectiveness of each model. In addition, a set of pages that were difficult to classify because of their similarity to scholarly research was gathered and classified with the models. The classification tree created the most effective classification model, with a precision ratio of 96% and a recall ratio of 95.6%. However, logistic regression created a model that was able to correctly classify more of the problematic pages. This agent can be used to create a database of scholarly research published on the Web. In addition, the technique can be used to create a database of any type of structured electronic information.
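The precision and recall ratios used to judge the models in the abstract above follow the standard information retrieval definitions. A minimal sketch in Python, with an invented toy labeling (not the study's data or criteria), of how such ratios are computed:

```python
# Precision and recall for a binary classifier such as the
# scholarly-research agent described above. The labels below are
# invented for illustration; "scholarly" is the positive class.

def precision_recall(actual, predicted, positive="scholarly"):
    tp = sum(1 for a, p in zip(actual, predicted) if a == positive and p == positive)
    fp = sum(1 for a, p in zip(actual, predicted) if a != positive and p == positive)
    fn = sum(1 for a, p in zip(actual, predicted) if a == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of pages flagged, how many are right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of true pages, how many were found
    return precision, recall

actual    = ["scholarly", "scholarly", "other", "other", "scholarly"]
predicted = ["scholarly", "other", "other", "scholarly", "scholarly"]
p, r = precision_recall(actual, predicted)
# tp = 2, fp = 1, fn = 1, so precision = 2/3 and recall = 2/3
```

On the dissertation's scale, a precision of 96% would mean that 96% of the pages the model labels as scholarly actually are, and a recall of 95.6% would mean the model finds 95.6% of all truly scholarly pages in the test set.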
- CT3 as an Index of Knowledge Domain Structure: Distributions for Order Analysis and Information Hierarchies
- The problem with which this study is concerned is articulating all possible CT3 and KR21 reliability measures for every case of a 5x5 binary matrix (32,996,500 possible matrices). The study has three purposes. The first purpose is to calculate CT3 for every matrix and compare the results to the proposed optimum range of .3 to .5. The second purpose is to compare the results from the calculation of KR21 and CT3 reliability measures. The third purpose is to calculate CT3 and KR21 on every strand of a class test whose item set has been reduced using the difficulty strata identified by Order Analysis. The study was conducted by writing a computer program to articulate all possible 5x5 matrices. The program also calculated CT3 and KR21 reliability measures for each matrix. The nonparametric technique of Order Analysis was applied to two sections of test items to stratify the items into difficulty levels. The difficulty levels were used to reduce the item set from 22 to 9 items. All possible strands or chains of these items were identified so that both reliability measures (CT3 and KR21) could be calculated. One major finding of this study indicates that .3 to .5 is a desirable range for CT3 (cumulative p=.86 to p=.98) if cumulative frequencies are measured. A second major finding is that the KR21 reliability measure produced an invalid result more than half the time. The last major finding is that CT3, rescaled to range between 0 and 1, supports DeVellis's guidelines for reliability measures. The major conclusion is that CT3 is a better measure of reliability since it considers both inter- and intra-item variances.
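CT3's formula is not widely published, so the sketch below shows only the standard KR-21 companion measure applied to small binary score matrices like those the study enumerates. The example matrices are invented; the second one illustrates the kind of out-of-range ("invalid") KR-21 value the abstract reports.

```python
# Hedged sketch of KR-21 on binary score matrices (rows = persons,
# columns = items). CT3 is omitted; only the textbook KR-21 formula
# is shown, applied to made-up matrices.
import numpy as np

def kr21(matrix: np.ndarray) -> float:
    """KR-21 reliability: (k/(k-1)) * (1 - M*(k-M) / (k * var)),
    where k is the number of items, M the mean total score, and
    var the population variance of persons' total scores."""
    k = matrix.shape[1]
    totals = matrix.sum(axis=1)
    mean, var = totals.mean(), totals.var()
    if var == 0:  # undefined when all total scores are equal
        return float("nan")
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * var))

scores = np.array([
    [1, 1, 1, 1, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
])
print(round(kr21(scores), 4))  # 0.8816 (exactly 67/76)

# A matrix where every person alternates answers drives KR-21 far
# below zero, an "invalid" reliability value.
alternating = np.array([
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
])
print(kr21(alternating))  # -5.0
```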
- Discovering a Descriptive Taxonomy of Attributes of Exemplary School Library Websites
- This descriptive study examines effective online school library practice. A Delphi panel selected a sample of 10 exemplary sites and helped to create two research tools: taxonomies designed to analyze the features and characteristics of school library Websites. Using the expert-identified sites as a sample, a content analysis was conducted to systematically identify site features and characteristics. Anne Clyde's longitudinal content analysis of school library Websites was used as a baseline to examine trends in practice; in addition, the national guidelines document, Information Power: Building Partnerships for Learning, was examined to explore ways in which the traditional mission and roles of school library programs are currently translated online. Results indicated great variation in depth and coverage even among Websites considered exemplary. Sites in the sample are growing more interactive and student-centered, using blogs as supplemental communication strategies. Nevertheless, even these exemplary sites were slow to adopt the advances in technology to meet the learning needs and interests of young adult users. Ideally, the study's findings will contribute to an understanding of the state of the art, serve to identify trends, and serve as a guide to practitioners in planning, developing, and maintaining school library Websites.
- An E-government Readiness Model
- The purpose of this study is to develop an e-government readiness model and to test this model. Consistent with this model, several instruments, IS assessment (ISA), IT governance (ITG), and organization-IS alignment (IS-ALIGN), are examined for their ability to measure the readiness of one organization for e-government and to test the instruments' fit in the proposed e-government model. The ISA instrument used is the result of adapting and combining the IS-SERVQUAL instrument proposed by Van Dyke, Kappelman, and Prybutok (1997) and the IS-SUCCESS instrument developed by Kappelman and Chong (2001) for the City of Denton (COD) project at UNT. The IS Success Model was first proposed by DeLone and McLean (1992), but they did not validate this model. The ITG instrument was based on the goals of the COD project for IT governance and was developed by Sanchez and Kappelman (2001) from UNT. The IS-ALIGN instrument was also developed by Sanchez and Kappelman (2001) for the COD project. It is an instrument based on the Malcolm Baldrige National Quality Award (MBNQA) that measures how effectively a government organization utilizes IT to support its various objectives. The EGOV instrument was adapted from the study of the Action-Audience Model developed by Koh and Balthazard (1997) to measure how well a government organization is prepared to usher in e-government in terms of various success factors at the planning, system, and data levels. An online survey was conducted with employees of the City of Denton, Texas. An invitation letter to participate in the survey was sent to the 1100 employees of the City of Denton via email; 339 responses were received, yielding a response rate of 31%. Of these, 168 responses were discarded because they were incomplete or had missing values, leaving 171 usable surveys and a usable response rate of 16%.
Although the proposed and some alternate models were partially consistent with the hypothesized theory, the confirmation of the relationships among the constructs warrants further research, either by replication of this research or by development of a new theoretical model. However, the significant validity and reliability measures obtained in this study indicate that the e-government readiness model has the potential for use in future studies.
- The Effect of Information Literacy Instruction on Library Anxiety Among International Students.
Access: Use of this item is restricted to the UNT Community.
- This study explored what effect information literacy instruction (ILI) may have on both a generalized anxiety state and library anxiety specifically. The population studied was international students using resources in a community college. Library anxiety among international students begins with certain barriers that cause anxiety (i.e., language/communication barriers, adjusting to a new education/library system, and general cultural adjustments). Library anxiety is common among college students and is characterized by negative emotions including rumination, tension, fear, and mental disorganization (Jiao & Onwuegbuzie, 1999a). This often occurs when a student contemplates conducting research in a library and is due to any number of perceived inabilities about using the library. In order for students to become successful in their information-seeking behavior, this anxiety needs to be reduced. The study used two groups of international students enrolled in credit courses in the English for Speakers of Other Languages (ESOL) program. Each student completed Bostick's Library Anxiety Scale (LAS) and Spielberger's State-Trait Anxiety Inventory (STAI) to assess anxiety level before and after treatment. Subjects were given a research assignment that required them to use library resources. Treatment: Group 1 (experimental group) attended several library instruction classes (the instruction used Kuhlthau's information search process model). Group 2 (control group) worked in the library on the assignment but did not receive any formal library instruction. After the treatment, the researcher and ESOL program instructor(s) measured the level of anxiety between groups. ANCOVA was used to analyze Hypotheses 1 and 2, which compared pretest and posttest scores for each group. Research assignment grades were used to analyze Hypothesis 3, comparing outcomes between the two groups.
The results of the analysis ascertained that ILI was associated with reducing state and library anxiety among international students when given an assignment using library resources.
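The ANCOVA comparison above can be illustrated in miniature as a regression of posttest anxiety on a group indicator plus the pretest covariate. Everything below is simulated; the effect size and variable names are invented and are not the study's data.

```python
# Hedged sketch of the ANCOVA idea: compare posttest anxiety between an
# instruction group and a control group while adjusting for pretest
# scores, via least squares with a group dummy and a covariate.
import numpy as np

rng = np.random.default_rng(1)
n = 60
group = np.repeat([1, 0], n)            # 1 = received library instruction
pretest = rng.normal(50, 10, size=2 * n)
# In this simulation, instruction lowers posttest anxiety by ~8 points.
posttest = 0.8 * pretest - 8 * group + rng.normal(0, 5, size=2 * n)

# Design matrix: intercept, group dummy, pretest covariate.
X = np.column_stack([np.ones(2 * n), group, pretest])
coef, *_ = np.linalg.lstsq(X, posttest, rcond=None)
print(f"adjusted group effect: {coef[1]:.2f}")  # near -8 by construction
```

The coefficient on the group dummy is the pretest-adjusted treatment effect, which is what an ANCOVA F-test evaluates.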
- The Effect of Personality Type on the Use of Relevance Criteria for Purposes of Selecting Information Sources.
- Even though information scientists generally recognize that relevance judgments are multidimensional and dynamic, there is still discussion and debate regarding the degree to which certain internal (cognition, personality) and external (situation, social relationships) factors affect the use of criteria in reaching those judgments. Much of the debate centers on the relationship of those factors to the criteria and on reliable methods for measuring those relationships. This study researched the use of relevance criteria to select an information source by undergraduate students whose task was to create a course schedule for a semester. During registration periods, when creating their semester schedules, students filled out a two-part questionnaire. After completing the questionnaire, the students completed a Myers-Briggs Type Indicator instrument in order to determine their personality type. Data were analyzed using one-way ANOVAs and chi-square. A positive correlation exists between personality type as expressed by the MBTI and the information source selected as most important by the subject. A correlation also exists between personality type and relevance criteria use. The correlation is stronger for some criteria than for others. Therefore, one can expect personality type to have an effect on the use of relevance criteria while selecting information sources.
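The two tests named above can be sketched with SciPy on invented data: a one-way ANOVA on a criterion-importance rating across personality-type groups, and a chi-square test on a personality-type by chosen-source contingency table. The group means and table counts are illustrative assumptions, not the study's results.

```python
# Hedged sketch: one-way ANOVA and chi-square on made-up data mirroring
# the analyses described in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# One-way ANOVA: ratings of one relevance criterion by three MBTI groups
# (means chosen so the difference is detectable).
g1 = rng.normal(4.0, 0.8, 30)
g2 = rng.normal(3.2, 0.8, 30)
g3 = rng.normal(3.1, 0.8, 30)
f_stat, p_anova = stats.f_oneway(g1, g2, g3)

# Chi-square: contingency table of personality type x preferred source.
table = np.array([[25, 5],     # rows: two types; cols: two sources
                  [8, 22]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
print(f"ANOVA p={p_anova:.4f}, chi-square p={p_chi2:.4f}")
```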
- The Effectiveness of Using Lego Mindstorms Robotics Activities to Influence Self-regulated Learning in a University Introductory Computer Programming Course.
- The research described in this dissertation examines the possible link between self-regulated learning and LEGO Mindstorms robotics activities in teaching concepts in an introductory university computer programming course. The areas of student motivation, learning strategies, and mastery of course objectives are investigated. In all three cases, analysis failed to reveal any statistically significant differences between the traditional control group and the experimental LEGO Mindstorms group as measured by the Motivated Strategies for Learning Questionnaire and course exams. Possible reasons for the lack of positive results include technical problems and limitations of the LEGO Mindstorms systems, the limited number and availability of robots outside of class, the limited amount of time during the semester for the robotics activities, and a possible difference in effectiveness based on gender. Responses to student follow-up questions, however, suggest that at least some of the students really enjoyed the LEGO activities. As with any teaching tool or activity, there are numerous ways in which LEGO Mindstorms can be incorporated into learning. This study explores whether or not LEGO Mindstorms are an effective tool for teaching introductory computer programming at the university level and how these systems can best be utilized.
- An Empirical Investigation of Critical Factors that Influence Data Warehouse Implementation Success in Higher Educational Institutions
Access: Use of this item is restricted to the UNT Community.
- Data warehousing (DW) has in the last decade become the technology of choice for building the data management infrastructures that provide organizations the decision-making capabilities needed to effectively carry out their activities. Despite its phenomenal growth and importance to organizations, the rate of DW implementation success has been less than stellar. Many DW implementation projects fail due to technical or organizational reasons. There has been limited research on organizational factors and their role in DW implementations. It is important to understand the role and impact of both technical and organizational factors in DW implementations and their relative importance to implementation performance. A research model was developed to test the significance of technical and organizational factors in the three phases of implementation with DW implementation performance. The independent variables were technical (data, technology, and expertise) and organizational (management, goals, users, organization). The dependent variable was performance (content, accuracy, format, ease of use, and timeliness). The data collection method was a Web-based survey of DW implementers and DW users, with 26 sampled from a population of 108 identified DW implementations. Regression was used as the multivariate statistical technique to analyze the data. The results show that organizational factors are significantly related to performance. Also, some variables in the post-implementation phase have a significant relationship with performance. Based on the results of the tests, the model was revised to reflect the relative impact of technical and organizational factors on DW performance. Results suggest that in some cases organizational factors have a significant relationship with DW implementation performance. The implications and interpretation of these results provide researchers and practitioners insights and a new perspective in the area of DW implementations.
- Empowering Agent for Oklahoma School Learning Communities: An Examination of the Oklahoma Library Improvement Program
- The purposes of this study were to determine the initial impact of the Oklahoma Library Media Improvement Grants on Oklahoma school library media programs; assess whether the Oklahoma Library Media Improvement Grants continue to contribute to Oklahoma school learning communities; and examine possible relationships between school library media programs and student academic success. It also sought to document the history of the Oklahoma Library Media Improvement Program 1978 - 1994 and increase awareness of its influence upon Oklahoma school library media programs. Methods of data collection included: examining Oklahoma Library Media Improvement Program archival materials; sending a survey to 1703 school principals in Oklahoma; and interviewing Oklahoma Library Media Improvement Program participants. Data collection took place over a one year period. Data analyses were conducted in three primary phases: descriptive statistics and frequencies were disaggregated to examine mean scores as they related to money spent on school library media programs; opinions of school library media programs; and possible relationships between school library media programs and student academic achievement. Analysis of variance was used in the second phase of data analysis to determine if any variation between means was significant as related to Oklahoma Library Improvement Grants, time spent in the library media center by library media specialists, principal gender, opinions of library media programs, student achievement indicators, and the region of the state in which the respondent was located. The third phase of data analysis compared longitudinal data collected in the 2000 survey with past data.
The primary results indicated that students in Oklahoma schools having a centralized library media center, served by a full-time library media specialist, and having received one or more Library Media Improvement Grants scored significantly higher academically than students in schools having no centralized library media center, not served by a full-time library media specialist, and having received no Library Media Improvement Grants. Students in schools having even one of these components scored higher academically than students in schools with none of these components.
- An Evaluation of the Effect of Learning Styles and Computer Competency on Students' Satisfaction on Web-Based Distance Learning Environments
Access: Use of this item is restricted to the UNT Community.
- This study investigates the correlation between students' learning styles, computer competency, and student satisfaction in Web-based distance learning. Three hundred and one graduate students participated in the current study during the Summer and Fall semesters of 2002 at the University of North Texas. Participants took the courses 100% online and came to the campus only once for software training. Computer competency and student satisfaction were measured using the Computer Skill and Use Assessment and the Student Satisfaction Survey questionnaires. Kolb's Learning Style Inventory measured students' learning styles. The study concludes that there is a significant difference among the different learning styles with respect to student satisfaction level when the subjects differ with regard to computer competency. For accommodating and diverging styles, a higher level of computer competency results in a higher level of student satisfaction. But for converging and assimilating styles, a higher level of computer competency suggests a lower level of student satisfaction. A significant correlation was found between computer competency and student satisfaction level within Web-based courses for accommodating styles; no significant correlations were found for the other learning styles.
- An Examination of the Variation in Information Systems Project Cost Estimates: The Case of Year 2000 Compliance Projects
- The year 2000 (Y2K) problem presented a fortuitous opportunity to explore the relationship between estimated costs of software projects and five cost influence dimensions described by the Year 2000 Enterprise Cost Model (Kappelman et al., 1998): organization, problem, solution, resources, and stage of completion. This research was a field study survey of Y2K project managers in industry, government, and education, and part of a joint project that began in 1996 between the University of North Texas and the Y2K Working Group of the Society for Information Management (SIM). Evidence was found to support relationships between estimated costs and organization, problem, resources, and project stage, but not for the solution dimension. Project stage appears to moderate the relationships for organization, particularly IS practices, and resources. A history of superior IS practices appears to mean lower estimated costs, especially for projects in larger IS organizations. Acquiring resources, especially external skills, appears to increase costs. Moreover, projects apparently have many individual differences, many related to size and to project stage, and their influences on costs appear to be at the sub-dimension or even the individual variable level. A Revised Year 2000 Enterprise Model is presented incorporating this granularity. Two primary conclusions can be drawn from this research: (1) large software projects are very complex, and thus cost estimating is also; and (2) the devil of cost estimating is in the details of knowing which of the many possible variables are the important ones for each particular enterprise and project. This points to the importance of organizations keeping software project metrics and historically calibrating their cost-estimating practices.
Project managers must understand the relevant details and their interaction and importance in order to successfully develop a cost estimate for a particular project, even when rational cost models are used. This research also indicates that software cost estimating has political as well as rational influences at play.
- An experimental study of teachers' verbal and nonverbal immediacy, student motivation, and cognitive learning in video instruction
- This study used an experimental design and a direct test of recall to provide data about teacher immediacy and student cognitive learning. Four hypotheses and a research question addressed two research problems: first, how verbal and nonverbal immediacy function together and/or separately to enhance learning; and second, how immediacy affects cognitive learning in relation to student motivation. These questions were examined in the context of video instruction to provide insight into distance learning processes and to ensure maximum control over experimental manipulations. Participants (N = 347) were drawn from university students in an undergraduate communication course. Students were randomly assigned to groups, completed a measure of state motivation, and viewed a 15-minute video lecture containing part of the usual course content delivered by a guest instructor. Participants were unaware that the video instructor was actually performing one of four scripted manipulations reflecting higher and lower combinations of specific verbal and nonverbal cues, representing the four cells of the 2x2 research design. Immediately after the lecture, students completed a recall measure, consisting of portions of the video text with blanks in the place of key words. Participants were to fill in the blanks with exact words they recalled from the videotape. Findings strengthened previous research associating teacher nonverbal immediacy with enhanced cognitive learning outcomes. However, higher verbal immediacy, in the presence of higher and lower nonverbal immediacy, was not shown to produce greater learning among participants in this experiment. No interaction effects were found between higher and lower levels of verbal and nonverbal immediacy. Recall scores were comparatively low in the presence of higher verbal and lower nonverbal immediacy, suggesting that nonverbal expectancy violations may have hindered cognitive learning. 
Student motivation was not found to be a significant source of error in measuring immediacy's effects, and no interaction effects were detected between levels of student motivation, teacher verbal immediacy, and teacher nonverbal immediacy.
- An exploration of the diffusion of a new technology from communities of practice perspective: Web services technologies in digital libraries.
- This study explored and described decision factors related to technology adoption. The research used diffusion of innovations and communities of practice (CoP) theoretical frameworks and a case study of Web services technology in the digital library (DL) environment to develop an understanding of the decision-making process. A qualitative case study approach was used to investigate the research problems, and data were collected through semi-structured interviews, documentary evidence (e.g., meeting minutes), and a comprehensive member check. The research involved face-to-face and phone interviews with seven respondents with different job titles (administrative vs. technical) from five different DL programs, selected based on distinctive characteristics such as the size of the DL program. Findings of the research suggested that the decision-making process is a complex process in which a number of factors are considered when making technology adoption decisions. These factors are categorized as organizational, individual, and technology-specific factors. Further, data showed that DL CoPs played an important role in enabling staff members of a DL program to access up-to-date and experience-based knowledge, provided a distributed problem-solving and learning environment, facilitated informal communication and collaborative activities, and informed the decision-making process.
- An exploratory study of factors that influence student user success in an academic digital library.
- The complex nature of digital libraries calls for appropriate models to study user success. Calls have been made to incorporate into these models factors that capture the interplay between people, organizations, and technology. In order to address this, two research questions were formulated: (1) To what extent does the comprehensive digital library user success model (DLUS), based on a combination of the EUCS and flow models, describe overall user success in a prototype digital library environment; and (2) To what extent does a combined model of DeLone & McLean's reformulated information system success model and the comprehensive digital library user success model (DLUS) explain digital library user success in a prototype digital library environment? Participants were asked to complete an online survey questionnaire. A total of 160 completed and useable questionnaires were obtained. Data analyses through exploratory and confirmatory factor analyses and structural equation modeling produced results that support the two models. However, some relationships between latent variables hypothesized in the model were not confirmed. A modified version of the proposed combined user success model in a digital library environment was tested and supported through model fit statistics. This model was recommended as a possible alternative model of user success. The dissertation also makes a number of recommendations for future research.
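The exploratory-factor-analysis step mentioned above can be sketched on simulated survey responses: six observed questionnaire items generated from two latent user-success factors, which factor analysis then recovers. This is not the study's analysis; the loadings, sample, and factor labels are invented, and scikit-learn's `FactorAnalysis` stands in for the SEM tooling such a study would use.

```python
# Hedged sketch: simulate six survey items driven by two latent factors
# and recover the structure with exploratory factor analysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 160  # same sample size as the study, but simulated responses

# Two independent latent factors (e.g., "satisfaction" and "flow"),
# with three items loading cleanly on each.
latent = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.85, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.85]])
items = latent @ loadings.T + rng.normal(scale=0.3, size=(n, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(items)

# Each item should load mainly on one recovered factor.
recovered = np.abs(fa.components_)       # shape (2, 6)
dominant = recovered.argmax(axis=0)      # which factor each item follows
print(dominant)
```

With clean block structure like this, items 0-2 share one dominant factor and items 3-5 the other; real questionnaire data are messier, which is why the study also needed confirmatory fit statistics.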
- A Framework of Automatic Subject Term Assignment: An Indexing Conception-Based Approach
- The purpose of this dissertation is to examine whether an understanding of the subject indexing processes conducted by human indexers has a positive impact on the effectiveness of automatic subject term assignment through text categorization (TC). More specifically, human indexers' subject indexing approaches or conceptions in conjunction with semantic sources were explored in the context of a typical scientific journal article data set. Based on the premise that subject indexing approaches or conceptions with semantic sources are important for automatic subject term assignment through TC, this study proposed an indexing conception-based framework. For the purpose of this study, three hypotheses were tested: 1) the effectiveness of semantic sources, 2) the effectiveness of an indexing conception-based framework, and 3) the effectiveness of each of three indexing conception-based approaches (the content-oriented, the document-oriented, and the domain-oriented approaches). The experiments were conducted using a support vector machine implementation in WEKA (Witten & Frank, 2000). The experiment results showed that cited works, source title, and title were as effective as the full text, while keywords were found more effective than the full text. In addition, the findings showed that an indexing conception-based framework was more effective than the full text. In particular, the content-oriented and the document-oriented indexing approaches were found more effective than the full text. Among the three indexing conception-based approaches, the content-oriented approach and the document-oriented approach were more effective than the domain-oriented approach. In other words, in the context of a typical scientific journal article data set, the objective contents and authors' intentions received more focus than the possible users' needs.
The research findings of this study support the conclusion that incorporating human indexers' indexing approaches or conceptions, in conjunction with semantic sources, has a positive impact on the effectiveness of automatic subject term assignment.
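The general setup named above, assigning subject terms to documents with an SVM text categorizer, can be sketched in a few lines. The study used WEKA; the sketch below uses scikit-learn instead, and the documents and subject terms are toy examples, not the study's data set.

```python
# Hedged sketch of SVM-based text categorization for subject term
# assignment: tf-idf features plus a linear SVM, on invented documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

train_docs = [
    "protein folding simulation of enzymes",
    "enzyme kinetics and protein structure",
    "galaxy formation in the early universe",
    "dark matter and galaxy cluster surveys",
]
train_terms = ["biochemistry", "biochemistry", "astronomy", "astronomy"]

# Represent titles/abstracts as tf-idf vectors; the linear SVM then
# assigns the best-scoring subject term to a new document.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(train_docs)
clf = LinearSVC().fit(X, train_terms)

new_doc = ["dark matter maps and galaxy surveys"]
print(clf.predict(vectorizer.transform(new_doc))[0])  # astronomy
```

The study's framework went further, weighting different semantic sources (title, keywords, cited works) rather than a single bag of words.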
- Functional Ontology Construction: A Pragmatic Approach to Addressing Problems Concerning the Individual and the Informing Environment
- Functional ontology construction (FOC) is an approach for modeling the relationships between a user and the informing environment by means of analysis of the user's behavior and the elements of the environment that have behavioral function. The FOC approach is an application of behavior analytic techniques and concepts to problems within information science. The FOC approach is both an alternative and a complement to the cognitive viewpoint commonly found in models of behavior in information science. The basis for the synthesis of behavior analysis and information science is a shared tradition of pragmatism between the fields. The application of behavior analytic concepts brings with it the notion of selection by consequence. Selection is examined on the biological, behavioral, and cultural levels. Two perspicuous examples of the application of the FOC modeling approach are included. The first example looks at the document functioning as a reinforcer in a human operant experimental setting. The second example is an examination of the verbal behavior of expert film analyst Raymond Bellour, the structure of a film he analyzed, and the elements of the film's structure that had behavioral function for Bellour. The FOC approach is examined within the ontological space of information science.
- The gathering and use of information by fifth grade students with access to Palm® handhelds.
- Handheld computers may hold the possibility of a one-to-one computer-to-student ratio. The impact of the use of Palm® (Palm, Inc.) handhelds on information acquisition and use by 5th grade students in a North Texas school during a class research project was investigated. Five research questions were examined using observation, interviews, surveys, and document analysis. Are there differences in information gathering and use with the Palm between gifted, dyslexic, and regular learners? What relevance criteria do students use to evaluate a Web site to determine whether to download the site to the Palm and, afterwards, whether to use the downloaded site's information in the report? How do the Palms affect the writing process? Do the animations and concept maps produced on the Palm demonstrate understanding of the intended concepts? Are there significant differences in results (i.e., final product grades) between Palm users and non-Palm users? Three groups of learners in the class, gifted, dyslexic, and regular learners, participated in the study. The regular and dyslexic students reported using Web sites that had not been downloaded to the Palm. Students reported several factors used to decide whether to download Web sites, but the predominant deciding factor was the amount of information. The students used a combination of writing on paper and the Palm in the preparation of the report. Many students flipped between two programs, FreeWrite and Fling-It, finding information and then writing the facts into the report. The peer review process was more difficult with the Palm. Most students had more grammatical errors in this research report than in previous research projects. By creating animated drawings on the Palm handheld, the students demonstrated their understanding of the invention, though sometimes the media or the student's drawing skills limited the quality of the final product.
Creating the animations was motivational and addressed different learning styles than a written report alone. No statistically significant difference was found in the scores of the three 6+1 Traits categories; however, the Palm users did not meet the page-length requirement for the research project, while the majority of the control class did.
- Global response to cyberterrorism and cybercrime: A matrix for international cooperation and vulnerability assessment.
- Cyberterrorism and cybercrime present new challenges for law enforcement and policy makers. Because these threats are transnational in nature, a real and sound response requires international cooperation involving the participation of all concerned parties in the international community. However, vulnerability emerging from increased reliance on technology, a lack of legal measures, and a lack of cooperation at the national and international levels represents a real obstacle to an effective response to these threats. In sum, the lack of global consensus in responding to cyberterrorism and cybercrime is the general problem. Terrorists and cyber criminals will exploit vulnerabilities, including technical, legal, political, and cultural ones. Such a broad range of vulnerabilities can be dealt with by comprehensive cooperation, which requires efforts at both the national and international levels. The "Vulnerability-Comprehensive Cooperation-Freedom Scale," or "Ozeren Scale," was constructed from variables identified on the basis of expert opinions. The study also presented a typology of cyberterrorism, which involves three general classifications: disruptive and destructive information attacks; facilitation of technology to support the ideology; and communication, fund raising, recruitment, and propaganda (C-F-R-P). Such a typology is expected to help those involved in decision-making and investigative activities, as well as academicians in the area of terrorism. The matrix for international cooperation and vulnerability assessment is expected to be used as a model for a global response to cyberterrorism and cybercrime.
- Human concept cognition and semantic relations in the unified medical language system: A coherence analysis.
- There is almost universal agreement among scholars in information retrieval (IR) research that knowledge representation needs improvement. Improvement of the knowledge representation system, a core component of an IR system, has so far involved manipulating that component according to principles such as the vector space model, the probabilistic approach, inference networks, and language modeling, yet the required improvement is still far from fruition. One promising approach that is highly touted to offer a potential solution exists in the cognitive paradigm, where knowledge representation practice should involve, or start from, modeling the human conceptual system. This study, based on two related cognitive theories (the theory-based approach to concept representation and the psychological theory of semantic relations), ventured to explore the connection between the human conceptual model and the knowledge representation model (represented by samples of concepts and relations from the Unified Medical Language System, UMLS). Guided by these theories and employing appropriate data-analytic tools, such as nonmetric multidimensional scaling, hierarchical clustering, and content analysis, this exploratory investigation addressed four related questions. Divided into two groups, a total of 89 research participants took part in two sets of cognitive tasks. The first group (49 participants) sorted 60 food names into categories and simultaneously described the derived categories to explain the rationale for their category judgments. The second group (40 participants) sorted 47 semantic relations (the nonhierarchical associative types) into 5 categories known a priori. The cognitive tasks yielded three datasets: food-sorting data, relation-sorting data, and free, unstructured text of category descriptions.
Using the data-analytic tools mentioned, the analysis produced results and findings that offer plausible answers to the four research questions. Major results include the following: (a) through discriminant analysis, category members were predicted consistently 70% of the time; (b) the bases for categorization are largely simplified rules, naïve explanations, and features; (c) individuals' theoretical explanations remain valid and stable across category members; (d) the human conceptual model can be fairly reconstructed in a low-dimensional space in which 93% of the variance is accounted for by the subjects' performance; (e) participants consistently classified 29 of the 47 semantic relations; and (f) individuals perform better on the functional and spatial dimensions of the semantic relations classification task and poorly on the conceptual dimension.
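Sorting data of the kind described above is typically converted into a pairwise dissimilarity matrix before nonmetric multidimensional scaling or hierarchical clustering can be applied. A minimal sketch of that standard conversion step, with invented items and sorts rather than the study's data:

```python
# Hypothetical sketch: turning card-sort data into pairwise dissimilarities,
# the usual preprocessing step before nonmetric MDS or hierarchical clustering.
# Item names and participant sorts below are invented for illustration.
from itertools import combinations

def cooccurrence_dissimilarity(sorts, items):
    """Dissimilarity of a pair = 1 - (fraction of participants who placed
    the pair in the same category)."""
    n = len(sorts)
    dis = {}
    for a, b in combinations(items, 2):
        together = sum(
            1 for categories in sorts
            if any(a in group and b in group for group in categories)
        )
        dis[(a, b)] = 1 - together / n
    return dis

# Three participants each sorting four food names into categories:
sorts = [
    [{"apple", "pear"}, {"beef", "pork"}],
    [{"apple", "pear", "beef"}, {"pork"}],
    [{"apple"}, {"pear"}, {"beef", "pork"}],
]
d = cooccurrence_dissimilarity(sorts, ["apple", "pear", "beef", "pork"])
print(round(d[("apple", "pear")], 3))  # together 2 of 3 times -> 0.333
```

The resulting matrix feeds directly into an MDS or clustering routine; items sorted together by most participants end up close in the recovered space.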
- Identification of Remote Leadership Patterns in Academic and Public Libraries
- Seminal works on leadership, including those in librarianship, define a traditional model of interaction between leaders and followers without reference to the information technology-driven environment. Remote leadership, by contrast, indicates a different model, one focused on the interaction of leaders and their staff through digital technology. Although leaders still use face-to-face interaction, they also recognize the need, given varied work schedules and job responsibilities, to lead employees remotely. Leadership studies in the library literature have not addressed how library leaders use information technology to lead employees remotely, nor have they addressed remote leadership and remote employees, except for some articles on telecommuting. This research was conducted to address that gap, providing an exploratory foundation of emergent patterns of remote leadership and its associated leadership dimensions rooted in personality traits, behaviors, and skills. Quantitative and qualitative data were obtained from a small sample of academic- and public-library leaders in the United States who participated in a Web-based survey designed specifically for this study, limiting generalizations. Factor analysis was the principal methodology used to obtain findings; its composite factor scores were also used in t-test and chi-square analyses. The study identifies some emergent patterns of remote leadership in the library and information-science field, exploring whether library leaders use information technology to be effective remote leaders in a technology-driven environment, and whether existing leadership attributes could be identified as part of the remote-leadership model.
Because this study's findings indicated that library leaders are not quite the traditional leader but are not fully integrated into remote leadership, it becomes apparent that they would function with a blend of both face-to-face and electronic interactions, due to the nature of library work. Additionally, this research revealed underlying issues and challenges faced by library leaders as they transition from a traditional-leadership model to a blended model of face-to-face and remote leadership. Future research could include increasing the sample size and response rate to conduct factor analysis properly, and conducting longitudinal studies.
- Identifying At-Risk Students: An Assessment Instrument for Distributed Learning Courses in Higher Education
- The current period of rapid technological change, particularly in the area of mediated communication, has combined with new philosophies of education and market forces to bring upheaval to the realm of higher education. Technical capabilities exceed our knowledge of whether expenditures on hardware and software lead to corresponding gains in student learning. Educators do not yet possess sophisticated assessments of what we may be gaining or losing as we widen the scope of distributed learning. The purpose of this study was not to draw sweeping conclusions with respect to the costs or benefits of technology in education. The researcher focused on a single issue involved in educational quality: assessing the ability of a student to complete a course. Previous research in this area indicates that attrition rates are often higher in distributed learning environments. Educators and students may benefit from a reliable instrument to identify those students who may encounter difficulty in these learning situations. This study is aligned with research focused on the individual engaged in seeking information, assisted or hindered by the capabilities of the computer information systems that create and provide access to information. Specifically, the study focused on the indicators of completion for students enrolled in video conferencing and Web-based courses. In the final version, the Distributed Learning Survey encompassed thirteen indicators of completion. The results of this study of 396 students indicated that the Distributed Learning Survey represented a reliable and valid instrument for identifying at-risk students in video conferencing and Web-based courses where the student population is similar to the study participants. 
Educational level, GPA, credit hours taken in the semester, study environment, motivation, computer confidence, and the number of previous distributed learning courses accounted for most of the predictive power in the discriminant function based on student scores from the survey.
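A discriminant function of the kind named above is, at its core, a weighted sum of predictor scores compared against a cutoff. A purely illustrative sketch of that mechanism; the weights, scaling, and zero cutoff below are invented, not the survey's actual coefficients:

```python
# Purely illustrative: a linear discriminant score over the predictors the
# study names. All coefficients and the 0.0 cutoff are invented assumptions.

def discriminant_score(student, weights, intercept=0.0):
    """Weighted sum of (standardized) predictor scores."""
    return intercept + sum(weights[k] * student[k] for k in weights)

weights = {  # hypothetical standardized coefficients
    "educational_level": 0.30, "gpa": 0.45, "credit_hours": -0.20,
    "study_environment": 0.25, "motivation": 0.40,
    "computer_confidence": 0.20, "prior_dl_courses": 0.15,
}
# A student at the mean on everything except GPA (above) and motivation (below):
student = {k: 0.0 for k in weights}
student.update(gpa=1.2, motivation=-0.8)

z = discriminant_score(student, weights)
print(round(z, 2))     # 0.45*1.2 + 0.40*(-0.8) = 0.22
at_risk = z < 0.0      # flagged only if below the (assumed) cutoff
```

The point of the sketch is the shape of the classifier, not the numbers: the actual coefficients would come from the discriminant analysis of the 396 survey responses.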
- The Impact of Predisposition Towards Group Work on Intention to Use a CSCW System
- Groupware packages are increasingly being used to support content delivery, class discussion, student-to-student and student-to-faculty interactions, and group work on projects. This research focused on groupware packages used to support students who are located in different places but who are assigned group projects as part of their coursework requirements. In many cases, students are being asked to use unfamiliar technologies that are very different from those that support personal productivity. For example, computer-supported cooperative work (CSCW) technology differs from more traditional, stand-alone software applications because it requires the user to interact with the computer as well as with other users. However, familiarity with the technology is not the only requirement for successful completion of a group project. For a group to be successful, it must also have a desire to work together on the project. If this prerequisite is not present within the group, then the technology will only create additional communication and coordination barriers. How much of an impact does each of these factors have on the acceptance of CSCW technology? The significance of this study is threefold. First, the research contributed to understanding how users' predisposition toward group work affects their acceptance of CSCW technology. Second, it helped identify ways to overcome some of the obstacles associated with group work and the use of CSCW technology in an academic online environment. Finally, it helped identify early adopters of CSCW software and how these users can form the critical mass required to diffuse the technology. This dissertation reports the impact of predisposition toward group work and prior computer experience on the intention to use synchronous CSCW. It was found that predisposition toward group work was not only positively associated with perceived usefulness; it was also related to intention to use.
It also found that perceived ease of use, at least in this study, had a direct and positive impact on intention, and was not mediated through perceived usefulness. These findings hold implications for academia and how it uses complex collaborative software. Avenues for further research are identified.
- Implications of the inclusion of document retrieval systems as actors in a social network.
- Traditionally, social network analysis (SNA) techniques enable the examination of relationships and the flow of information within networks of human members or groups of humans. This study extended traditional social network analysis to include a nonhuman group member, specifically a document retrieval system. The importance of document retrieval systems as information sources, the changes in business environments that necessitate the use of information and communication technologies, and the attempts to make computer systems more life-like provide the reasons for considering the information system as a group member. The review of literature for this study does not encompass a single body of knowledge; instead, several areas combined to inform it, including social informatics for its consideration of the intersection of people and information technology, network theory and social network analysis, organizations and information, organizational culture, and storytelling in organizations as a means of transferring information. The methodology included distribution of surveys to two small businesses that used the same document retrieval system, followed by semi-structured interviews of selected group members, which allowed elaboration on the survey findings. The group members rated each other and the system on four interaction criteria relating to four social networks of interest: awareness, access, information flow, and problem solving. Traditional measures of social networks, specifically density, degree, reciprocity, transitivity, distance, degree centrality, and closeness centrality, provided insight into the positioning of the nonhuman member within the social group. The human members of the group were able to respond to a survey that included the system but were not ready to consider the system equivalent to the other human members.
SNA measures positioned the system as an average member of the group, not a star, but not isolated either. Examination of the surveys or the interviews in isolation would not have given a complete picture of the system's place within the group.
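Two of the SNA measures listed, density and degree centrality, are simple enough to sketch directly. The sketch below is not the study's code and uses an invented network; it only shows how a retrieval system enters the calculation as just another node:

```python
# Illustrative sketch (invented network, not the study's data): density and
# normalized out-degree centrality for a small directed "who goes to whom"
# network in which the retrieval system is a node like any other.

def density(adj):
    """Directed density: observed arcs / possible arcs n*(n-1)."""
    n = len(adj)
    arcs = sum(len(targets) for targets in adj.values())
    return arcs / (n * (n - 1))

def degree_centrality(adj):
    """Normalized out-degree centrality for each node."""
    n = len(adj)
    return {node: len(targets) / (n - 1) for node, targets in adj.items()}

# Who reports going to whom for problem solving (names invented):
adj = {
    "alice":  {"bob", "system"},
    "bob":    {"system"},
    "carol":  {"alice", "bob", "system"},
    "system": set(),
}
print(density(adj))                      # 6 arcs / 12 possible = 0.5
print(degree_centrality(adj)["carol"])   # 3 of 3 possible ties = 1.0
```

Measures such as closeness centrality and transitivity are computed from the same adjacency structure, which is why a nonhuman node needs no special handling once it appears in the survey responses.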
- Improving Recall of Browsing Sets in Image Retrieval from a Semiotics Perspective
- The purpose of this dissertation is to utilize connotative messages to enhance image retrieval and browsing. Adopting semiotics as a theoretical tool, the study explores problems of image retrieval and proposes an image retrieval model. The semiotic approach conceptually demonstrates that: 1) a fundamental reason for the dissonance between retrieved images and user needs is the representation of connotative messages, and 2) an image retrieval model that makes use of denotative index terms can help users browse connotatively related images effectively even when their needs are expressed in the form of a denotative query. Two experiments were performed to verify the semiotics-based image retrieval model and to evaluate its effectiveness. As data sources, 5,199 records were collected from Artefacts Canada: Humanities by the Canadian Heritage Information Network, and candidate terms of connotation and denotation were extracted from the Art & Architecture Thesaurus. The first experiment, applying term association measures, verifies that the connotative messages of an image can be derived from its denotative messages. The second experiment reveals that an association thesaurus constructed from the associations between connotation and denotation facilitates assigning connotative terms to image documents. In addition, the relevance judgments show that the association thesaurus improves the relative recall of retrieved image documents as well as the relative recall of browsing sets. This study concludes that an association thesaurus indicating associations between connotation and denotation can improve the accessibility of connotative messages.
It is hoped that the results of the study contribute to the conceptual knowledge of image retrieval by providing an understanding of connotative messages within an image, and to the practical design of image retrieval systems by proposing an association thesaurus that can compensate for the limitations of current content-based image retrieval (CBIR) systems.
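The abstract does not name the specific term association measures applied in the first experiment, so the sketch below illustrates the general idea with one common choice, the Dice coefficient; the terms and counts are invented:

```python
# Hypothetical illustration of a term association measure of the kind the
# first experiment applies: the Dice coefficient between a denotative and a
# connotative term, computed from their (co-)occurrence counts across records.

def dice(n_ab, n_a, n_b):
    """Dice association: 2 * co-occurrences / (occurrences of a + of b)."""
    return 2 * n_ab / (n_a + n_b)

# Invented counts for a denotative term ("chalice") and a connotative term
# ("ceremonial") across a record set:
n_chalice, n_ceremonial, n_both = 40, 60, 25
score = dice(n_both, n_chalice, n_ceremonial)
print(score)  # 2*25 / (40+60) = 0.5
```

Pairs scoring above a chosen threshold would become candidate connotation-denotation links in the association thesaurus.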
- The Information Environment of Academic Library Directors: Use of Information Resources and Communication Technologies
- This study focuses on the use of information resources and communication technologies, both traditional and electronic, by academic library directors. The purpose is to improve understanding of managerial behavior when using information resources and communication technologies within a shared information environment. Taylor's concept of an information use environment is used to capture the elements associated with information use and communication within the context of decision-making styles, managerial roles, organizational environments, and professional communities. This qualitative study uses interviews, observations, questionnaires, and documents. Library directors participating in the study are from doctoral-degree-granting universities in the southwestern United States. Data collection involved on-site observations with a PDA (personal digital assistant), structured interviews with library directors and their administrative assistants, the Decision Style Inventory, and a questionnaire based on Mintzberg's managerial roles. Findings show the existence of a continuum in managerial activities between an Administrator and an Administrator/Academic as critical to understanding information use and communication patterns among library directors. There is a gap between self-perception of managerial activities and actual performance, a finding that would not have surfaced without the use of multiple methods. Other findings include the need for a technical ombudsman, a managerial-level position reporting to the library director; the importance of information management as an administrative responsibility; the importance of trust when evaluating information; and the importance of integrating information and communication across formats, time, and managerial activities.
- Information Needs of Art Museum Visitors: Real and Virtual
- Museums and libraries are considered large repositories of human knowledge and human culture. They have similar missions and goals in distributing accumulated knowledge to society. Current digitization projects allow both museums and libraries to reach a broader audience and share their resources with a variety of users. While studies of information-seeking behavior, retrieval systems, and metadata have a long history in library science, such research in museum environments is at an early, experimental stage. There are few studies concerning the information-seeking behavior and needs of virtual museum visitors, especially with respect to the use of images from museum collections available on the Web. The current study identifies the preferences of a variety of user groups regarding specific information on current exhibits, museum collection metadata, and the use of multimedia. Studying the information-seeking behavior of user groups of museum digital or cultural collections allows examination and analysis of users' information needs and of the organization of cultural information, including descriptive metadata and the quantity of information that may be required. In addition, the study delineates information needs that different categories of users may have in common: teachers in high schools, students in colleges and universities, museum professionals, art historians and researchers, and the general public. This research also compares the informational and educational needs of real visitors with those of virtual visitors. The educational needs of real visitors are based on various studies conducted and summarized by Falk and Dierking (2000) and on an evaluation of art museum websites previously conducted to support the current study.
- Information systems assessment: development of a comprehensive framework and contingency theory to assess the effectiveness of the information systems function.
- The purpose of this research is to develop a comprehensive IS assessment framework, using existing IS assessment theory as a base and incorporating suggestions from other disciplines. To validate the framework and begin the investigation of current IS assessment practice, a survey instrument was developed. A small group of subject matter experts evaluated and improved the instrument, which was then further evaluated using a small sample of IS representatives. Results of this research include a reexamination of the IS function measurement problem using new frameworks of analysis, yielding (a) guidance for the IS manager or executive on which IS measures might best fit their organization, (b) further verification of the important measures most widely used by IS executives, (c) a comprehensive, theoretically derived IS assessment framework, and (d) enhancement of IS assessment theory by incorporating ideas from actual practice. The body of knowledge gains a comprehensive IS assessment framework that can be further tested for usefulness and applicability. Future research is recommended to substantiate and improve on these findings. Chapter 2 is a complete survey of prior research, subdivided by relevant literature divisions such as organizational effectiveness, quality management, and IS assessment. Chapter 3 develops and supports the research questions, the IS assessment framework, and the research model. Chapter 4 describes how the research was conducted, including a brief justification for the research approach, descriptions of how the framework was evaluated and how the survey instrument was developed and evaluated, a description of the participants and how they were selected, a synopsis of the data collection procedures, a brief description of follow-up procedures, and a summary. Chapter 5 presents the results of the research, and Chapter 6 is a summary and conclusion.
Finally, the appendices include definitions of terms and copies of the original and improved survey instruments.
- Intangible Qualities of Rare Books: Toward a Decision-Making Framework for Preservation Management in Rare Book Collections, Based Upon the Concept of the Book as Object
- For rare book collections, evaluating collection materials in terms of their inherent value, which includes the textual and intangible information the materials provide for the collection's users, poses a considerable challenge. Preservation management in rare book collections is a complex and costly process. As digitization and other advances in surrogate technology have provided new forms of representation, new dilemmas have arisen in weighing the rare book's inherently valuable characteristics against the possibly lesser financial costs of surrogates. No model has been in wide use to guide preservation management decisions. An initial iteration of such a model is developed here, based on Delphi-like iterative questioning of a group of experts in the field of rare books. The results are used to synthesize a preservation management framework for rare book collections, and a small-scale test of the framework has been completed through two independent analyses of five rare books in a functioning collection. Utilizing a standardized template for making preservation decisions offers a variety of benefits. Preservation decisions may include prioritizing action upon the authentic objects, or developing and maintaining surrogates in lieu of retaining costly original collection materials. The framework constructed in this study provides a method for reducing the subjectivity of preservation decision-making and facilitating the development of a standard of practice for preservation management within rare book collections.
- The intersection of social networks in a public service model: A case study.
- Examining human interaction networks contributes to an understanding of factors that improve and constrain collaboration. This study examined multiple network levels of information exchanges within a public service model designed to strengthen community partnerships by connecting city services to the neighborhoods. The research setting was the Neighbourhood Integrated Service Teams (NIST) program in Vancouver, B.C., Canada. A literature review related information dimensions to the municipal structure, including social network theory, social network analysis, social capital, transactive memory theory, public goods theory, and the information environment of the public administration setting. The research method involved multiple instruments and included surveys of two bounded populations. First, the membership of the NIST program received a survey asking for identification of up to 20 people they contact for NIST-related work. Second, a network component of the NIST program, 23 community centre coordinators in the Parks and Recreation Department, completed a survey designed to identify their information exchanges relating to regular work responsibilities and the infusion of NIST issues. Additionally, 25 semi-structured interviews with the coordinators and other program members, collection of organization documents, field observation, and feedback sessions provided valuable insight into the complexity of the model. This research contributes to the application of social network theory and analysis in information environments and provides insight for public administrators into the operation of the model and reasons for the program's network effectiveness.
- Knowledge management in times of change: Tacit and explicit knowledge transfers.
- This study examined the importance and challenges of knowledge management in times of great change. To understand the information phenomena of interest, the impacts on knowledge workers and knowledge documents in times of great organizational change, the study is positioned within a major consolidation of state agencies in Texas. It pays special attention to how the changes were perceived by knowledge workers, interviewing those who were affected by the reorganization. The overall goal is to assess knowledge management in times of great organizational change by analyzing the impact of consolidation on knowledge management in Texas's Health and Human Services (HHS) agencies. The overarching research question is: what happened to the knowledge management structure during this time of great change? Three subsidiary questions follow: What was the knowledge worker environment during the time of change? What was the knowledge management environment of the agencies? And did consolidation of the HHS agencies diminish the ability to transition from tacit to explicit knowledge? Additionally, the study investigates how the bill that mandated the consolidation was covered in the local media, as well as the actual budget and employee losses the consolidation produced, in order to better understand the impacts of major organizational restructuring on knowledge workers and knowledge documents. The findings have both theoretical and practical implications for information science, knowledge management, and project management.
- Knowledge synthesis in the biomedical literature: Nordihydroguaiaretic acid and breast cancer.
- This dissertation refines knowledge synthesis from publicly accessible databases, based on the model of D.R. Swanson. Knowledge synthesis endeavors bring together two or more non-interactive literatures to create combinatorial research data on a specific topic. Here, the biomedical literature was searched on the anti-neoplastic agent nordihydroguaiaretic acid (NDGA) for its potential role as a functional food in the chemoprevention of breast cancer. Bibliometric cocitation was utilized to identify complementary but non-interactive literatures in the disciplines of biomedicine and dietary science. The continuing specialization and fragmentation of the cancer literature diminishes the potential usefulness of cross-disciplinary research and information. As the biomedical sciences become more specialized, the potential increases for discoveries to remain isolated and for science to fail to connect with the needs of the people. Within the information science discipline, several techniques are available to bridge the isolation between discoveries recorded in different sets of literatures. Electronic database searching with combinatorial keyword entries, syllogistic modeling, and bibliometric author cocitation analysis are the principal techniques applied here. The research questions address the absence or presence of human in vivo research on breast cancer involving the potentially chemopreventive functional food NDGA. Using a syllogistic model, the literatures of functional foods, nordihydroguaiaretic acid, and breast cancer were searched with designated combinatorial keywords. The documents retrieved were subjected to author cocitation analysis to demonstrate the disjointness or connectivity of the two complementary literatures. The results demonstrated a possible preventive relationship between breast cancer in women and nordihydroguaiaretic acid, a phytochemical antioxidant and potential functional food.
The results of this study are consistent with D.R. Swanson's pioneering work in knowledge synthesis. Swanson's methods can be used to identify non-interactive, disjoint literatures. Continuing support for his techniques has been demonstrated.
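The bookkeeping at the core of author cocitation analysis can be sketched briefly: two authors are cocited whenever a third document's reference list contains both. The reference lists below are invented for illustration (only the Swanson name comes from the abstract):

```python
# Sketch of bibliometric author cocitation counting. Each inner list stands
# for the cited authors of one retrieved document; the lists are invented.
from itertools import combinations
from collections import Counter

def cocitation_counts(reference_lists):
    """Count, for every author pair, how many documents cite both."""
    counts = Counter()
    for refs in reference_lists:
        for a, b in combinations(sorted(set(refs)), 2):
            counts[(a, b)] += 1
    return counts

docs = [
    ["Swanson", "Smalheiser", "Garfield"],
    ["Swanson", "Garfield"],
    ["Smalheiser", "Garfield"],
]
c = cocitation_counts(docs)
print(c[("Garfield", "Swanson")])  # cocited in 2 documents
```

In a disjointness analysis of two literatures, few or no cocitations across the two author sets is the signal that the literatures are non-interactive.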
- E-Learning and In-Service Training: An Exploration of the Beliefs and Practices of Trainers and Trainees in the Turkish National Police
- This targeted research study, carried out by an officer of the Turkish National Police (TNP), investigated the perceptions and beliefs of TNP trainers and trainees regarding the potential adoption and implementation of e-learning technology for in-service police training. Drawing on diffusion of innovations theory (DOI) (Rogers, 1995) and the conceptual technology integration process model (CTIM) (Nicolle, 2005), two different surveys were administered: one to the trainers and one to the trainees. The factor analyses revealed three shared trainer and trainee perceptions: a positive perception of e-learning, personally and for the TNP; a belief in the importance of administrative support for e-learning integration; and a belief in the importance of appropriate resources to facilitate integration and maintain implementation. Three major recommendations were made for the TNP. First, the research findings could be used as a road map by the TNP Education Department to provide a more flexible system for disseminating in-service training information. Second, two-way channels of communication should be established between the administration and TNP personnel to efficiently operationalize the adoption and integration of e-learning technology. Third, the administration should provide the necessary hardware, software, and technical support.
- Makeshift Information Constructions: Information Flow and Undercover Police
- This dissertation presents the social virtual interface (SVI) model, which was born out of the need for a viable model of the complex interactions, information flow, and information-seeking behaviors among undercover officers. The SVI model was created from a combination of philosophies and models in the literatures of information seeking, communication, and philosophy. This research answers two questions: 1. Can we use models and concepts familiar to or drawn from information science to construct a model of undercover police work that effectively represents the large number of entities and relationships? 2. Will undercover police officers recognize this model as realistic? The study used a descriptive qualitative research method. An online survey and a hard-copy survey were distributed to police officers who had worked in an undercover capacity, and groups of officers were interviewed about their opinions of the SVI model. The data gathered were analyzed, and the model was validated by the results of the surveys and interviews.
- Measuring the accuracy of four attributes of sound for conveying changes in a large data set.
- Human auditory perception is well suited to receiving and interpreting information from the environment, but this knowledge has not been used extensively in designing computer-based information exploration tools. It is not known which aspects of sound are useful for accurately conveying information in an auditory display. An auditory display was created using PD, a graphical programming language used primarily to manipulate digital sound. The interface for the auditory display was a blank window; as the cursor moved around in this window, the sound generated changed based on the underlying data value at any given point. An experiment was conducted to determine which attribute of sound most accurately represents data values in an auditory display. The four attributes tested were frequency (sine waveform), frequency (sawtooth waveform), loudness, and tempo. Twenty-four subjects were given the task of finding the highest data point using sound alone, once with each of the four sound treatments. Three dependent variables were measured: distance accuracy, numeric accuracy, and time on task. Repeated-measures ANOVA procedures conducted on these variables did not reach statistical significance (α = .05); no sound treatment represented the underlying data values more accurately than the others. 52% of the trials were accurate within 50 pixels of the highest data point (the target). An interesting finding was the tendency for the frequency-sine waveform to figure in the largest share of the least accurate trial attempts (38%), while loudness accounted for very few of them (12.5%). In completing the experimental task, subjects employed four different search techniques: perimeter, parallel sweep, sector, and quadrant. The perimeter technique was the most commonly used.
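The mapping at the heart of such a display (data value under the cursor mapped onto a sound attribute) can be sketched outside PD as a simple linear rescaling. The frequency, loudness, and tempo ranges below are invented, not those of the study's patch:

```python
# Minimal sketch (assumed ranges, not the study's PD patch): mapping the
# underlying data value at the cursor position onto each of the four sound
# attributes tested.

def scale(value, lo, hi, out_lo, out_hi):
    """Linearly rescale value from [lo, hi] to [out_lo, out_hi]."""
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

def sonify(value, vmin=0.0, vmax=1.0):
    return {
        "sine_hz":     scale(value, vmin, vmax, 220.0, 880.0),  # frequency, sine
        "sawtooth_hz": scale(value, vmin, vmax, 220.0, 880.0),  # frequency, saw
        "loudness_db": scale(value, vmin, vmax, 40.0, 90.0),    # loudness
        "tempo_bpm":   scale(value, vmin, vmax, 60.0, 240.0),   # tempo
    }

print(sonify(0.5)["sine_hz"])  # midpoint of 220-880 Hz = 550.0
```

In a treatment, only one of the four attributes would vary while the others stayed fixed, so that accuracy differences can be attributed to that attribute alone.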
- A mechanism for richer representation of videos for children: Calibrating calculated entropy to perceived entropy
- This study explores the use of the information theory entropy equation in representations of videos for children. The calculated rates of information in the videos are calibrated to the corresponding perceived rates of information as elicited from twelve 7- to 10-year-old girls who were shown the video documents. Entropy measures are calculated for several video elements: set time, set incidence, verbal time, verbal incidence, set constraint, nonverbal dependence, and character appearance. As hypothesized, the mechanically calculated entropy measure (CEM) was found to be sufficiently similar to the perceived entropy measure (PEM) made by the children that it can serve as a useful and predictive element of representations of children's videos. The relationships between CEM and PEM show that CEM could stand in for PEM to enrich representations of video documents for this age group. Speculations are offered on transferring the CEM-to-PEM calibration to different age groups and document types, along with further implications for the field of information science.
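The information theory entropy equation underlying such a calculated measure is Shannon's H = -Σ pᵢ log₂ pᵢ. A minimal sketch, applied to an invented distribution of one video element (how screen time is split among characters):

```python
# Shannon entropy over a distribution of a video element; the character
# screen-time proportions below are invented for illustration.
from math import log2

def entropy(proportions):
    """H = -sum(p * log2 p) over the nonzero proportions (in bits)."""
    return -sum(p * log2(p) for p in proportions if p > 0)

# Four characters with equal screen time: maximum entropy, log2(4) bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# One dominant character: lower entropy, i.e., a more predictable element.
print(round(entropy([0.7, 0.1, 0.1, 0.1]), 3))
```

A higher H for an element (sets, speech, characters) means the element changes less predictably, which is the calculated rate of information that the study calibrates against the children's perceived rate.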