UNT Libraries - 121 Matching Results

Public School Educators' Use of Computer-Mediated Communication
This study examined the uses of computer-mediated communication (CMC) by educators in selected public schools, using Rogers' Diffusion of Innovation Theory as its theoretical underpinning. CMC refers to any exchange of information that involves the use of computers for communication between individuals, or between individuals and a machine. This study explored the difficulties users confront, the services they access, and the tasks they accomplish when using CMC, and investigated the factors that affect the use of CMC. The sample population was drawn from registered users of TENET, the Texas Education Network, as of December 1997. Frequencies and percentages were used to describe the educators' demographic data. For the research, eight indices were selected to test how strongly these user and environmental attributes were associated with the use of CMC: (1) education, (2) position, (3) place of employment, (4) geographic location, (5) district size, (6) organization vitality, (7) adopter resources, and (8) instrumentality. Two dependent variables were used to test for usage: (1) depth, or frequency of CMC usage and amount of time spent online, and (2) breadth, or variety of Internet utilities used. Additionally, the users' perception of network benefits was measured. Network benefits were correlated with social interaction and perception of CMC to investigate what tasks educators were accomplishing with CMC. Correlations, crosstabulations, and ANOVAs were used to analyze the data for testing the four hypotheses. The major findings of the study, based on the hypotheses tested, were that the socioeconomic variables of education and position influenced the use of CMC. A significant finding is that teachers used e-mail and Internet resources less frequently than those in other positions. An interesting finding was that frequency of use was more significant for usage than amount of ...
Identifying At-Risk Students: An Assessment Instrument for Distributed Learning Courses in Higher Education
The current period of rapid technological change, particularly in the area of mediated communication, has combined with new philosophies of education and market forces to bring upheaval to the realm of higher education. Technical capabilities exceed our knowledge of whether expenditures on hardware and software lead to corresponding gains in student learning. Educators do not yet possess sophisticated assessments of what we may be gaining or losing as we widen the scope of distributed learning. The purpose of this study was not to draw sweeping conclusions with respect to the costs or benefits of technology in education. The researcher focused on a single issue involved in educational quality: assessing the ability of a student to complete a course. Previous research in this area indicates that attrition rates are often higher in distributed learning environments. Educators and students may benefit from a reliable instrument to identify those students who may encounter difficulty in these learning situations. This study is aligned with research focused on the individual engaged in seeking information, assisted or hindered by the capabilities of the computer information systems that create and provide access to information. Specifically, the study focused on the indicators of completion for students enrolled in video conferencing and Web-based courses. In the final version, the Distributed Learning Survey encompassed thirteen indicators of completion. The results of this study of 396 students indicated that the Distributed Learning Survey represented a reliable and valid instrument for identifying at-risk students in video conferencing and Web-based courses where the student population is similar to the study participants. 
Educational level, GPA, credit hours taken in the semester, study environment, motivation, computer confidence, and the number of previous distributed learning courses accounted for most of the predictive power in the discriminant function based on student scores from the survey.
Creating a Criterion-Based Information Agent Through Data Mining for Automated Identification of Scholarly Research on the World Wide Web
This dissertation creates an information agent that correctly identifies Web pages containing scholarly research approximately 96% of the time. It does this by analyzing the Web page with a set of criteria, and then uses a classification tree to arrive at a decision. The criteria were gathered from the literature on selecting print and electronic materials for academic libraries. A Delphi study was done with an international panel of librarians to expand and refine the criteria until a list of 41 operationalizable criteria was agreed upon. A Perl program was then designed to analyze a Web page and determine a numerical value for each criterion. A large collection of Web pages was gathered comprising 5,000 pages that contain the full work of scholarly research and 5,000 random pages, representative of user searches, which do not contain scholarly research. Datasets were built by running the Perl program on these Web pages. The datasets were split into model building and testing sets. Data mining was then used to create different classification models. Four techniques were used: logistic regression, nonparametric discriminant analysis, classification trees, and neural networks. The models were created with the model datasets and then tested against the test dataset. Precision and recall were used to judge the effectiveness of each model. In addition, a set of pages that were difficult to classify because of their similarity to scholarly research was gathered and classified with the models. The classification tree created the most effective classification model, with a precision ratio of 96% and a recall ratio of 95.6%. However, logistic regression created a model that was able to correctly classify more of the problematic pages. This agent can be used to create a database of scholarly research published on the Web. In addition, the technique can be used to create a ...
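Precision and recall, the measures used above to judge each classification model, can be computed directly from predicted and actual labels. A minimal sketch, assuming a boolean-list representation of the classifier's output (the function name and inputs are illustrative, not taken from the dissertation):

```python
def precision_recall(predicted, actual):
    """Precision and recall for a binary classifier.

    predicted, actual: parallel lists of booleans, True meaning
    "this page contains scholarly research".
    """
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)      # true positives
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)  # false positives
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)  # false negatives
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```

A model like the classification tree described above, with precision near 96% and recall near 95.6%, would rarely flag a non-scholarly page and would miss few scholarly ones.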
An Analysis of the Ability of an Instrument to Measure Quality of Library Service and Library Success
This study consisted of an examination of how service quality should be measured within libraries and how library service quality relates to library success. A modified version of the SERVQUAL instrument was evaluated to determine how effectively it measures library service quality. Instruments designed to measure information center success and information system success were evaluated to determine how effectively they measure library success and how they relate to SERVQUAL. A model of library success was developed to examine how library service quality relates to other variables associated with library success. Responses from 385 end users at two U.S. Army Corps of Engineers libraries were obtained through a mail survey. Results indicate that library service quality is best measured with a performance-based version of SERVQUAL, and that measuring importance may be as critical as measuring expectations for management purposes. Results also indicate that library service quality is an important factor in library success and that library success is best measured with a combination of SERVQUAL and library success instruments. The findings have implications for the development of new instruments to more effectively measure library service quality and library success as well as for the development of new models of library service quality and library success.
A Personal Documentation System for Scholars: A Tool for Thinking
This exploratory research focused on a problem stated years ago by Vannevar Bush: "The problem is how creative men think, and what can be done to help them think." The study explored the scholarly work process and the use of computer tools to augment thinking. Based on a review of several related literatures, a framework of 7 major categories and 28 subcategories of scholarly thinking was proposed. The literature was used to predict problems scholars have in organizing their information, potential solutions, and specific computer tool features to augment scholarly thinking. Info Select, a personal information manager with most of these features (text and outline processing, sophisticated searching and organizing), was chosen as a potential tool for thinking. The study looked at how six scholars (faculty and doctoral students in social science fields at three universities) organized information using Info Select as a personal documentation system for scholarly work. These multiple case studies involved four in-depth, focused interviews, written evaluations, direct observation, and analysis of computer logs and files collected over a 3- to 6-month period. A content analysis of interviews and journals supported the proposed AfFORD-W taxonomy: Scholarly work activities consisted of Adding, Filing, Finding, Organizing, Reminding, and Displaying information to produce a Written product. Very few activities fell outside this framework, and activities were distributed evenly across all categories. Problems, needs, and likes mentioned by scholars, however, clustered mainly in the filing, finding, and organizing categories. All problems were related to human memory. Both predictions and research findings imply a need for tools that support information storage and retrieval in personal documentation systems, for references and notes, with fast and easy input of source material. 
A computer tool for thinking should support categorizing and organizing, reorganizing and transporting information. It should provide a simple search engine and support ...
Korean Studies in North America 1977-1996: A Bibliometric Study
This research is a descriptive bibliometric study of the literature of the field of Korean studies. Its goal is to quantitatively describe the literature and serve as a model for such research in other area studies fields. This study analyzed 193 source articles and the 7,166 citations in those articles in four representative Korean and Asian studies journals published in North America from 1977 to 1996. The journals included in this study were Korean Studies (KS), the Journal of Korean Studies (JKS), the Journal of Asian Studies (JAS), and the Harvard Journal of Asiatic Studies (HJAS). Subject matters and author characteristics of the source articles were examined, along with various characteristics such as the form, date, language, country of origin, subject, key authors, and key titles of the literature cited in the source articles. Research in Korean studies falls within fourteen broad disciplines but is concentrated in a few of them. Americans have been the most active authors in Korean studies, followed closely by authors of Korean ethnicity. Monographic literature was used most. The mean age of publications cited was 20.87 years and the median age was 12 years. The Price Index of Korean studies as a whole is 21.9 percent. Sources written in English were cited most (47.1%), and references to Korean-language sources amounted to only 34.9% of all sources. In general, authors preferred sources published in their own countries. Sources on history were cited most by other disciplines. No significant core authors were identified, nor was any significant core literature. This study indicates that Korean studies is still evolving. Some ways of promoting research in less studied disciplines and of facilitating formal communication between Korean scholars in Korea and Koreanists in North America need to be sought in order to promote well-balanced development in the field. This study ...
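The Price Index reported above is conventionally the percentage of cited references published within five years of the citing work, a measure of how current a field's literature is. A minimal sketch under that standard definition (the function name and arguments are illustrative assumptions):

```python
def price_index(citation_years, source_year, window=5):
    """Price Index: percentage of cited works published within
    `window` years of the citing article (recency of a literature)."""
    recent = sum(1 for y in citation_years if source_year - y <= window)
    return 100.0 * recent / len(citation_years)
```

A Price Index of 21.9 percent, as found here, indicates a field that leans heavily on older literature, consistent with the reported mean citation age of roughly 21 years.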
MEDLINE Metric: A method to assess medical students' MEDLINE search effectiveness
Medical educators advocate the need for medical students to acquire information management skills, including the ability to search the MEDLINE database. There has been no published validated method available to use for assessing medical students' MEDLINE information retrieval skills. This research proposes and evaluates a method, designed as the MEDLINE Metric, for assessing medical students' search skills. MEDLINE Metric consists of: (a) the development, by experts, of realistic clinical scenarios that include highly constructed search questions designed to test defined search skills; (b) timed tasks (searches) completed by subjects; (c) the evaluation of search results; and (d) instructive feedback. A goal is to offer medical educators a valid, reliable, and feasible way to judge mastery of information searching skill by measuring results (search retrieval) rather than process (search behavior) or cognition (knowledge about searching). Following a documented procedure for test development, search specialists and medical content experts formulated six clinical search scenarios and questions. One hundred and forty-five subjects completed the six-item test under timed conditions. Subjects represented a wide range of MEDLINE search expertise. One hundred twenty complete cases were used, representing 53 second-year medical students (44%), 47 fourth-year medical students (39%), and 20 medical librarians (17%). Data related to educational level, search training, search experience, confidence in retrieval, difficulty of search, and score were analyzed. Evidence supporting the validity of the method includes the agreement by experts about the skills and knowledge necessary to successfully retrieve information relevant to a clinical question from the MEDLINE database. Also, the test discriminated among different performance levels. 
There were statistically significant, positive relationships between test score and level of education, self-reported previous MEDLINE training, and self-reported previous search experience. The findings from this study suggest that MEDLINE Metric is a valid method for constructing and administering a performance-based test to identify ...
Empowering Agent for Oklahoma School Learning Communities: An Examination of the Oklahoma Library Improvement Program
The purposes of this study were to determine the initial impact of the Oklahoma Library Media Improvement Grants on Oklahoma school library media programs; assess whether the Oklahoma Library Media Improvement Grants continue to contribute to Oklahoma school learning communities; and examine possible relationships between school library media programs and student academic success. It also seeks to document the history of the Oklahoma Library Media Improvement Program 1978 - 1994 and increase awareness of its influence upon the Oklahoma school library media programs. Methods of data collection included: examining Oklahoma Library Media Improvement Program archival materials; sending a survey to 1703 school principals in Oklahoma; and interviewing Oklahoma Library Media Improvement Program participants. Data collection took place over a one year period. Data analyses were conducted in three primary phases: descriptive statistics and frequencies were disaggregated to examine mean scores as they related to money spent on school library media programs; opinions of school library media programs; and possible relationships between school library media programs and student academic achievement. Analysis of variance was used in the second phase of data analysis to determine if any variation between means was significant as related to Oklahoma Library Improvement Grants, time spent in the library media center by library media specialists, principal gender, opinions of library media programs, student achievement indicators, and the region of the state in which the respondent was located. The third phase of data analysis compared longitudinal data collected in the 2000 survey with past data. 
The primary results indicated students in Oklahoma from schools with a centralized library media center, served by a full-time library media specialist, and the school having received one or more Library Media Improvement Grants scored significantly higher academically than students in schools not having a centralized library media center, not served by a ...
An Experimental Study of Teachers' Verbal and Nonverbal Immediacy, Student Motivation, and Cognitive Learning in Video Instruction
This study used an experimental design and a direct test of recall to provide data about teacher immediacy and student cognitive learning. Four hypotheses and a research question addressed two research problems: first, how verbal and nonverbal immediacy function together and/or separately to enhance learning; and second, how immediacy affects cognitive learning in relation to student motivation. These questions were examined in the context of video instruction to provide insight into distance learning processes and to ensure maximum control over experimental manipulations. Participants (N = 347) were drawn from university students in an undergraduate communication course. Students were randomly assigned to groups, completed a measure of state motivation, and viewed a 15-minute video lecture containing part of the usual course content delivered by a guest instructor. Participants were unaware that the video instructor was actually performing one of four scripted manipulations reflecting higher and lower combinations of specific verbal and nonverbal cues, representing the four cells of the 2x2 research design. Immediately after the lecture, students completed a recall measure, consisting of portions of the video text with blanks in the place of key words. Participants were to fill in the blanks with exact words they recalled from the videotape. Findings strengthened previous research associating teacher nonverbal immediacy with enhanced cognitive learning outcomes. However, higher verbal immediacy, in the presence of higher and lower nonverbal immediacy, was not shown to produce greater learning among participants in this experiment. No interaction effects were found between higher and lower levels of verbal and nonverbal immediacy. Recall scores were comparatively low in the presence of higher verbal and lower nonverbal immediacy, suggesting that nonverbal expectancy violations may have hindered cognitive learning. 
Student motivation was not found to be a significant source of error in measuring immediacy's effects, and no interaction effects were detected ...
The Second Vatican Council and American Catholic Theological Research: A Bibliometric Analysis of Theological Studies: 1940-1995
A descriptive analysis was given of the characteristics of the authors and citations of the articles in the journal Theological Studies from 1940-1995. Data were gathered on the institutional affiliation, geographic location, occupation, gender, and personal characteristics of the authors. The citation characteristics examined were the cited authors, date and age of the citations, format, language, place of publication, and journal titles. These characteristics were compared across the periods before and after the Second Vatican Council in order to detect any changes that might have occurred after certain recommendations by the council were made to theologians. Subject dispersion of the literature was also analyzed. Analyses of Lotka's Law of author productivity and Bradford's Law of title dispersion were also performed for this literature. The profile of author characteristics showed that the proportion of articles published by women and laypersons has increased since the recommendations of the council. The data had a good fit to Lotka's Law for the pre-Vatican II time period but not for the period after Vatican II. The data were a good fit to Bradford's Law for the predicted number of journals in the nucleus and Zone 2, but the observed number of journals in Zone 3 was higher than predicted for all time periods. Subject dispersion of research from disciplines other than theology is low, but citation to works from the fields of education, psychology, social sciences, and science has increased since Vatican II. The results of the analysis of the citation characteristics showed no significant change in the age, format, languages used, or geographic location of the publisher of the cited works after Vatican II. Citation characteristics showed that authors prefer research from monographs published in English and in U.S. locations for all time periods. Research ...
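Lotka's Law, tested above, holds that the number of authors producing n papers falls off roughly as the inverse square of n. A minimal sketch of the expected distribution under the classic inverse-square form (the function name and default exponent are illustrative, not from this study):

```python
def lotka_expected(single_author_count, max_n=5, exponent=2.0):
    """Expected number of authors publishing n papers under Lotka's
    Law: authors(n) = authors(1) / n**exponent, with exponent = 2
    in the classic inverse-square formulation."""
    return {n: single_author_count / n ** exponent
            for n in range(1, max_n + 1)}
```

Comparing such expected counts to observed author productivity is how a literature's fit to Lotka's Law, good before Vatican II and poor after, would be assessed.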
An Examination Of The Variation In Information Systems Project Cost Estimates: The Case Of Year 2000 Compliance Projects
The year 2000 (Y2K) problem presented a fortuitous opportunity to explore the relationship between estimated costs of software projects and five cost influence dimensions described by the Year 2000 Enterprise Cost Model (Kappelman, et al., 1998) -- organization, problem, solution, resources, and stage of completion. This research was a field study survey of Y2K project managers in industry, government, and education, and was part of a joint project that began in 1996 between the University of North Texas and the Y2K Working Group of the Society for Information Management (SIM). Evidence was found to support relationships between estimated costs and organization, problem, resources, and project stage, but not for the solution dimension. Project stage appears to moderate the relationships for organization, particularly IS practices, and resources. A history of superior IS practices appears to mean lower estimated costs, especially for projects in larger IS organizations. Acquiring resources, especially external skills, appears to increase costs. Moreover, projects apparently have many individual differences, many related to size and to project stage, and their influences on costs appear to be at the sub-dimension or even the individual variable level. A Revised Year 2000 Enterprise Model is presented incorporating this granularity. Two primary conclusions can be drawn from this research: (1) large software projects are very complex, and thus so is cost estimating; and (2) the devil of cost estimating is in the details of knowing which of the many possible variables are the important ones for each particular enterprise and project. This points to the importance of organizations keeping software project metrics and the historical calibration of cost-estimating practices. 
Project managers must understand the relevant details and their interaction and importance in order to successfully develop a cost estimate for a particular project, even when rational cost models are used. This research also indicates ...
A Study of Graphically Chosen Features for Representation of TREC Topic-Document Sets
Document representation is important for computer-based text processing. Good document representations must include at least the most salient concepts of the document. Documents exist in a multidimensional space, which makes it difficult to identify which concepts to include. A current problem is to measure the effectiveness of the different strategies that have been proposed to accomplish this task. As a contribution towards this goal, this dissertation studied the visual inter-document relationships in a dimensionally reduced space. The same treatment was applied to full text and to three document representations. Two of the representations were based on the assumption that the salient features in a document set follow the chi-distribution across the whole document set. The third document representation identified features through a novel method: a Coefficient of Variability was calculated by normalizing the Cartesian distance of the discriminating value in the relevant and the non-relevant document subsets. Also, the local dictionary method was used. Cosine similarity values measured the inter-document distance in the information space and formed a matrix to serve as input to the Multidimensional Scaling (MDS) procedure. A Precision-Recall procedure was averaged across all treatments to statistically compare them. Treatments were not found to be statistically the same, and the null hypotheses were rejected.
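The inter-document similarity computation described above can be sketched as follows; the term-weight-vector representation and function names are illustrative assumptions, not the dissertation's own code:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two term-weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def similarity_matrix(docs):
    """Pairwise cosine similarities among documents; such a matrix
    is the kind of input fed to a Multidimensional Scaling (MDS)
    procedure for visualization in a reduced space."""
    return [[cosine_similarity(a, b) for b in docs] for a in docs]
```

MDS then seeks low-dimensional coordinates whose pairwise distances best preserve these similarities, making inter-document relationships visually inspectable.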
A Theory for the Measurement of Internet Information Retrieval
The purpose of this study was to develop and evaluate a measurement model for Internet information retrieval strategy performance evaluation whose theoretical basis is a modification of the classical measurement model embodied in the Cranfield studies and their progeny. Though not the first, the Cranfield studies were the most influential of the early evaluation experiments. The general problem with this model was, and continues to be, the subjectivity of the concept of relevance. In cyberspace, information scientists are using quantitative measurement models for evaluating information retrieval performance that are based on the Cranfield model. This research modified that model by incorporating end-user relevance judgments rather than objective relevance judgments, and by adopting a fundamental unit of measure developed for the cyberspace of Internet information retrieval rather than recall- and precision-type measures. The proposed measure, the Content-bearing Click (CBC) Ratio, was developed as a quantitative measure reflecting the performance of an Internet IR strategy. Since the hypertext "click" is common to many Internet IR strategies, it was chosen as the fundamental unit of measure rather than the "document." The CBC Ratio is a ratio of hypertext click counts that can be viewed as a false drop measure that determines the average number of irrelevant content-bearing clicks that an end user checks before retrieving relevant information. After measurement data were collected, they were used to evaluate the reliability of several methods for aggregating relevance judgments. After reliability coefficients were calculated, the measurement model was used to compare web catalog and web database performance in an experimental setting. 
Conclusions were then reached concerning the reliability of the proposed measurement model and its ability to measure Internet IR performance, as well as implications for clinical use of the Internet and for future research in Information Science.
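One plausible reading of the CBC Ratio as described above, the average number of irrelevant content-bearing clicks before relevant information is reached, can be sketched as follows. This is an illustrative interpretation, not the dissertation's formal definition; the session representation and function name are assumptions:

```python
def cbc_ratio(sessions):
    """Sketch of a CBC-Ratio-style measure: average count of
    irrelevant content-bearing clicks per search session before the
    first click the end user judges relevant. Each session is a list
    of booleans, True meaning the clicked page was relevant."""
    irrelevant_before_hit = []
    for clicks in sessions:
        count = 0
        for relevant in clicks:
            if relevant:
                break  # stop at the first relevant click
            count += 1
        irrelevant_before_hit.append(count)
    return sum(irrelevant_before_hit) / len(irrelevant_before_hit)
```

A lower value would indicate a more effective Internet IR strategy: fewer wasted clicks before useful content.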
Information Seeking in a Virtual Learning Environment
Replicating a time-series study conducted by Kuhlthau and associates in 1989, this study examines the applicability of the Information Search Process (ISP) Model in the context of a virtual learning environment. This study confirms that students given an information seeking task in a virtual learning environment do exhibit the stages indicated by the ISP Model. The six-phase ISP Model is shown to be valid for describing the different stages of cognitive, affective, and physical tasks individuals progress through when facing a situation where they must search for information to complete an academic task in a virtual learning environment. The findings in this study further indicate there is no relationship between the amount of computer experience subjects possess and demonstrating the patterns of thoughts, feelings, and actions described by the ISP Model. The study demonstrates the ISP Model to be independent of the original physical library environments where the model was developed. An attempt is made to represent the ISP Model in a slightly different manner that provides more of the sense of motion and interaction among the components of thoughts, feelings, and actions than is currently provided for in the model. The study suggests that the development of non-self-reporting data collection techniques would be useful in complementing and furthering research to enhance and refine the representation of the ISP Model. Additionally, expanding the research to include the examination of group interaction is called for to enhance the ISP Model and develop further applications that could potentially aid educational delivery in all types of learning environments.
The Effects of Task-Based Documentation Versus Online Help Menu Documentation on the Acceptance of Information Technology
The objectives of this study were (1) to identify and describe task-based documentation; (2) to identify and describe any purported changes in users' attitudes when IT migration was preceded by task-based documentation; and (3) to suggest implications of task-based documentation for users' attitudes toward IT acceptance. Questionnaires were given to 150 university students, all of whom participated in the study. The study determined the following: (1) whether favorable pre-implementation attitudes toward a new e-mail system increase as a result of training when users expect it to be easy to learn and use; (2) whether user acceptance of an e-mail program increases as perceived usefulness increases, as delineated by task-based documentation; (3) whether task-based documentation is more effective than standard help menus while learning a new application program; and (4) whether training that requires active student participation increases the acceptance of a new e-mail system. The following conclusions were reached: (1) Positive pre-implementation attitudes toward a new e-mail system are not affected by training, even if users expect it to be easy to learn and use. (2) User acceptance of an e-mail program does not increase as perceived usefulness increases when aided by task-based documentation. (3) Task-based documentation is not more effective than standard help menus when learning a new application program. (4) Training that requires active student participation does not increase the acceptance of a new e-mail system.
The Validity of Health Claims on the World Wide Web: A Case Study of the Herbal Remedy Opuntia
The World Wide Web has become a significant source of medical information for the public, but there is concern that much of the information is inaccurate, misleading, and unsupported by scientific evidence. This study analyzes the validity of health claims on the World Wide Web for the herbal remedy Opuntia using an evidence-based approach, and supports the observation that individuals must critically assess health information in this relatively new medium of communication. A systematic search by means of nine search engines and online resources of Web sites relating to herbal remedies was conducted, and specific sites providing information on the cactus herbal remedy from the genus Opuntia were retrieved. Validity of therapeutic health claims on the Web sites was checked by comparison with reports in the scientific literature subjected to two established quality assessment rating instruments. A total of 184 Web sites from a variety of sources were retrieved and evaluated, and 98 distinct health claims were identified. Fifty-three scientific reports were retrieved to validate claims: 25 involved human subjects, and 28 involved animal or laboratory models. Only 33 (34%) of the claims were addressed in the scientific literature. For 3% of the claims, evidence from the scientific reports was conflicting or contradictory. Of the scientific reports involving human subjects, none met the predefined criteria for high quality as determined by quality assessment rating instruments. Two-thirds of the claims were unsupported by scientific evidence and were based on folklore, or indirect evidence from related sources. Information on herbal remedies such as Opuntia is well represented on the World Wide Web. Health claims on Web sites were numerous and varied widely in subject matter. The determination of the validity of information about claims made for herbals on the Web would help individuals assess their value in medical treatment. However, the Web is conducive to dubious ...
The Cluster Hypothesis: A Visual/Statistical Analysis
By allowing judgments based on a small number of exemplar documents to be applied to a larger number of unexamined documents, clustered presentation of search results represents an intuitively attractive possibility for reducing the cognitive resource demands on human users of information retrieval systems. However, clustered presentation of search results is sensible only to the extent that naturally occurring similarity relationships among documents correspond to topically coherent clusters. The Cluster Hypothesis posits just such a systematic relationship between document similarity and topical relevance. To date, experimental validation of the Cluster Hypothesis has proved problematic, with collection-specific results both supporting and failing to support this fundamental theoretical postulate. The present study consists of two computational information visualization experiments, representing a two-tiered test of the Cluster Hypothesis under adverse conditions. Both experiments rely on multidimensionally scaled representations of interdocument similarity matrices. Experiment 1 is a term-reduction condition, in which descriptive titles are extracted from Associated Press news stories drawn from the TREC information retrieval test collection. The clustering behavior of these titles is compared to the behavior of the corresponding full text via statistical analysis of the visual characteristics of a two-dimensional similarity map. Experiment 2 is a dimensionality reduction condition, in which inter-item similarity coefficients for full text documents are scaled into a single dimension and then rendered as a two-dimensional visualization; the clustering behavior of relevant documents within these unidimensionally scaled representations is examined via visual and statistical methods. Taken as a whole, results of both experiments lend strong though not unqualified support to the Cluster Hypothesis. 
In Experiment 1, semantically meaningful 6.6-word document surrogates systematically conform to the predictions of the Cluster Hypothesis. In Experiment 2, the majority of the unidimensionally scaled datasets exhibit a marked nonuniformity of distribution of relevant documents, further supporting the Cluster Hypothesis. Results of ...
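The interdocument similarity matrices that feed the multidimensional scaling in both experiments can be illustrated with a minimal sketch: cosine similarity over simple term-frequency vectors. The example documents below are invented; the dissertation's actual TREC data and weighting scheme are not reproduced here.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)  # Counter returns 0 for absent terms
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def similarity_matrix(docs):
    """Pairwise interdocument similarity matrix from raw text."""
    vecs = [Counter(d.lower().split()) for d in docs]
    return [[cosine(u, v) for v in vecs] for u in vecs]

docs = [
    "stocks rally as markets close higher",
    "markets rally and stocks close higher",
    "rain and storms expected this weekend",
]
m = similarity_matrix(docs)
# Topically related documents (0 and 1) score far higher than unrelated pairs,
# which is the regularity the Cluster Hypothesis predicts.
```

A matrix like `m` (or its complement as a dissimilarity matrix) is the kind of input a multidimensional scaling routine would project into one or two dimensions for the visual analysis the study describes.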
Modeling Utilization of Planned Information Technology
Implementations of information technology solutions to address specific information problems are successful only when the technology is utilized. The antecedents of technology use involve user, system, task, and organization characteristics, as well as externalities that can affect all of these entities. However, measurement of the interaction effects between these entities can act as a proxy for individual attribute values. A model is proposed which, based upon evaluation of these interaction effects, can predict technology utilization. This model was tested with systems being implemented at a pediatric health care facility. Results from this study provide insight into the relationships among the antecedents of technology utilization. Specifically, task time had a significant direct causal effect on utilization. Indirect causal effects were identified in the task value and perceived utility constructs. Perceived utility, along with organizational support, also had a direct causal effect on user satisfaction, while task value affected user satisfaction indirectly. Finally, the results provide a predictive model and a taxonomy of variables which can be applied to predict or influence the likelihood of utilization for planned technology.
Evaluation of Text-Based and Image-Based Representations for Moving Image Documents
Document representation is a fundamental concept in information retrieval (IR), and has been relied upon in textual IR systems since the advent of library catalogs. The reliance upon text-based representations of stored information has been perpetuated in conventional systems for the retrieval of moving images as well. Although newer systems have added image-based representations of moving image documents as aids to retrieval, there has been little research examining how humans interpret these different types of representations. Such basic research has the potential to inform IR system designers about how best to aid users of their systems in retrieving moving images. One key requirement for the effective use of document representations in either textual or image form is the degree to which these representations are congruent with the original documents. A measure of congruence is the degree to which human responses to representations are similar to responses produced by the document being represented. The aim of this study was to develop a model for the representation of moving images based upon human judgements of representativeness. The study measured the degree of congruence between moving image documents and their representations, both text and image based, in a non-retrieval environment with and without task constraints. Multidimensional scaling (MDS) was used to examine the dimensional dispersions of human judgements for the full moving images and their representations.
Perceived features and similarity of images: An investigation into their relationships and a test of Tversky's contrast model.
The creation, storage, manipulation, and transmission of images have become less costly and more efficient. Consequently, the numbers of images and their users are growing rapidly. This poses challenges to those who organize and provide access to them. One of these challenges is similarity matching. Most current content-based image retrieval (CBIR) systems, which can extract only low-level visual features such as color, shape, and texture, use similarity measures based on geometric models of similarity. However, most human similarity judgment data violate the metric axioms of these models. Tversky's (1977) contrast model, which defines similarity as a feature contrast task and equates the degree of similarity of two stimuli to a linear combination of their common and distinctive features, explains human similarity judgments much better than the geometric models do. This study tested the contrast model as a conceptual framework to investigate the nature of the relationships between features and similarity of images as perceived by human judges. Data were collected from 150 participants who performed two tasks: an image description task and a similarity judgment task. Qualitative (content analysis) and quantitative (correlational) methods were used to seek answers to four research questions related to the relationships between common and distinctive features and similarity judgments of images, as well as measures of their common and distinctive features. Structural equation modeling, correlation analysis, and regression analysis confirmed the relationships between perceived features and similarity of objects hypothesized by Tversky (1977). 
Tversky's (1977) contrast model, based upon a combination of two methods for measuring common and distinctive features and two methods for measuring similarity, produced statistically significant structural coefficients between the independent latent variables (common and distinctive features) and the dependent latent variable (similarity). This model fit the data well for a sample of 30 (435 pairs of) images and 150 participants (χ2 = 16.97, ...
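The contrast model itself is straightforward to express in code. The sketch below uses set cardinality for the salience function f and entirely hypothetical perceived-feature sets; the weights θ, α, and β are illustrative defaults, not values estimated in the study.

```python
def tversky_similarity(a: set, b: set, theta=1.0, alpha=0.5, beta=0.5) -> float:
    """Tversky's (1977) contrast model with f = set cardinality:
    S(a, b) = theta*|A ∩ B| - alpha*|A - B| - beta*|B - A|."""
    return (theta * len(a & b)
            - alpha * len(a - b)
            - beta * len(b - a))

# Hypothetical perceived-feature sets for three images.
lion = {"animal", "mane", "four legs", "tail"}
tiger = {"animal", "stripes", "four legs", "tail"}
teapot = {"handle", "spout", "ceramic"}

# More shared features and fewer distinctive ones yield higher similarity,
# so the lion/tiger pair scores above the lion/teapot pair.
```

Because common and distinctive features enter with separate weights, the model can reproduce asymmetric human judgments (a ≁ b in general when α ≠ β), which geometric distance models cannot.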
The Impact of Predisposition Towards Group Work on Intention to Use a CSCW System
Groupware packages are increasingly being used to support content delivery, class discussion, student-to-student and student-to-faculty interactions, and group work on projects. This research focused on groupware packages that are used to support students who are located in different places but who are assigned group projects as part of their coursework requirements. In many cases, students are being asked to use unfamiliar technologies that are very different from those that support personal productivity. For example, computer-supported cooperative work (CSCW) technology is different from other more traditional, stand-alone software applications because it requires the user to interact with the computer as well as with other users. However, familiarity with the technology is not the only requirement for successful completion of a group-assigned project. For a group to be successful, it must also have a desire to work together on the project. If this prerequisite is not present within the group, then the technology will only create additional communication and coordination barriers. How much of an impact does each of these factors have on the acceptance of CSCW technology? The significance of this study is threefold. First, this research contributed to understanding how a user's predisposition toward group work affects acceptance of CSCW technology. Second, it helped identify ways to overcome some of the obstacles associated with group work and the use of CSCW technology in an academic online environment. Finally, it helped identify early adopters of CSCW software and how these users can form the critical mass required to diffuse the technology. This dissertation reports the impact of predisposition toward group work and prior computer experience on the intention to use synchronous CSCW. It was found that predisposition toward group work was not only positively associated with perceived usefulness; it was also related to intention to use. It ...
A Comparative Analysis of Style of User Interface Look and Feel in a Synchronous Computer Supported Cooperative Work Environment
The purpose of this study is to determine whether the style of a user interface (i.e., its look and feel) has an effect on the usability of a synchronous computer supported cooperative work (CSCW) environment for delivering Internet-based collaborative content. The problem motivating this study is that people who are located in different places need to be able to communicate with one another. One way to do this is by using complex computer tools that allow users to share information, documents, programs, etc. As an increasing number of business organizations require workers to use these types of complex communication tools, it is important to determine how users regard these types of tools and whether they are perceived to be useful. If a tool, or interface, is not perceived to be useful then it is often not used, or used ineffectively. As organizations strive to improve communication with and among users by providing more Internet-based collaborative environments, the users' experience in this form of delivery may be tied to a style of user interface look and feel that could negatively affect their overall acceptance and satisfaction of the collaborative environment. The significance of this study is that it applies the technology acceptance model (TAM) as a tool for evaluating style of user interface look and feel in a collaborative environment, and attempts to predict which factors of that model, perceived ease of use and/or perceived usefulness, could lead to better acceptance of collaborative tools within an organization.
Makeshift Information Constructions: Information Flow and Undercover Police
This dissertation presents the social virtual interface (SVI) model, which was born out of a need to develop a viable model of the complex interactions, information flow, and information seeking behaviors among undercover officers. The SVI model was created from a combination of various philosophies and models in the literature of information seeking, communication, and philosophy. The questions this research answers are as follows: 1. Can we make use of models and concepts familiar to or drawn from Information Science to construct a model of undercover police work that effectively represents the large number of entities and relationships? and 2. Will undercover police officers recognize this model as realistic? This study used a descriptive qualitative research method to examine the research questions. An online survey and a hard-copy survey were distributed to police officers who had worked in an undercover capacity. In addition, groups of officers were interviewed about their opinions of the SVI model. The data gathered were analyzed, and the model was validated by the results of the survey and interviews.
Global response to cyberterrorism and cybercrime: A matrix for international cooperation and vulnerability assessment.
Cyberterrorism and cybercrime present new challenges for law enforcement and policy makers. Due to their transnational nature, a real and sound response to such threats requires international cooperation involving participation of all concerned parties in the international community. However, the vulnerability that emerges from increased reliance on technology, a lack of legal measures, and a lack of cooperation at the national and international levels represents a real obstacle to an effective response to these threats. In sum, the lack of global consensus in responding to cyberterrorism and cybercrime is the general problem. Terrorists and cyber criminals will exploit vulnerabilities, whether technical, legal, political, or cultural. Such a broad range of vulnerabilities can be dealt with only by comprehensive cooperation, which requires efforts at both the national and international levels. The "Vulnerability-Comprehensive Cooperation-Freedom Scale," or "Ozeren Scale," identifies the variables that constitute the scale, based on expert opinions. The study also presents a typology of cyberterrorism involving three general classifications: disruptive and destructive information attacks; facilitation of technology to support the ideology; and communication, fund raising, recruitment, and propaganda (C-F-R-P). Such a typology is expected to help those in decision-making and investigative positions, as well as academicians in the area of terrorism. The matrix for international cooperation and vulnerability assessment is expected to be used as a model for a global response to cyberterrorism and cybercrime.
Assessment of a Library Learning Theory by Measuring Library Skills of Students Completing an Online Library Instruction Tutorial
This study is designed to reveal whether students acquire the domains and levels of library skills discussed in a learning library skills theory after participating in an online library instruction tutorial. The acquisition of the library skills is demonstrated through a review of the scores on online tutorial quizzes, responses to a library skills questionnaire, and bibliographies of course research papers. Additional areas to be studied are the characteristics of the participants enrolled in traditional and online courses at a community college and the possible influence of these characteristics on the demonstrated learning of library skills. Multiple measurement methods, identified through assessment of library instruction literature, are used to verify the effectiveness of the library skills theory and to strengthen the validity and reliability of the study results.
A Common Representation Format for Multimedia Documents
Multimedia documents are composed of multiple file format combinations, such as image and text, image and sound, or image, text and sound. The type of multimedia document determines the form of analysis for knowledge architecture design and retrieval methods. Over the last few decades, theories of text analysis have been proposed and applied effectively. In recent years, theories of image and sound analysis have been proposed to work with text retrieval systems and have progressed quickly, due in part to rapid progress in computer processing speed. Retrieval of multimedia documents was formerly divided into the categories of image and text, and image and sound. While the standard retrieval process begins from text only, methods are developing that allow the retrieval process to be accomplished using text and image simultaneously. Although image processing for feature extraction and text processing for term extraction are well understood, there are no prior methods that can combine these two features into a single data structure. This dissertation introduces a common representation format for multimedia documents (CRFMD) composed of both images and text. For image and text analysis, two techniques are used: the Lorenz Information Measurement and the Word Code. A new process, named Jeong's Transform, is demonstrated for extraction of text and image features, combining the two previous measurements to form a single data structure. Finally, this single data structure is analyzed by using multidimensional scaling. This allows multimedia objects to be represented as vectors on a two-dimensional graph. The distance between vectors represents the magnitude of the difference between multimedia documents. This study shows that image classification on a given test set is dramatically improved when text features are encoded together with image features. This effect appears to hold true even when the available ...
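Since the Lorenz Information Measurement, the Word Code, and Jeong's Transform are specific to the dissertation and not publicly specified, the sketch below shows only the generic idea behind a common representation: normalizing text and image feature vectors onto a comparable scale and concatenating them into one structure. All names and numbers here are illustrative assumptions, not the study's actual method.

```python
def normalize(vec):
    """Scale a feature vector to unit sum so the two modalities are comparable."""
    total = sum(vec) or 1.0
    return [v / total for v in vec]

def combined_representation(text_features, image_features):
    """Single data structure holding both modalities of a multimedia document.
    Generic fusion sketch; stands in for the dissertation's Jeong's Transform."""
    return normalize(text_features) + normalize(image_features)

# Hypothetical term weights and image (e.g., color-histogram) features.
doc = combined_representation([3.0, 1.0], [0.2, 0.6, 0.2])
# doc is one vector; pairwise distances between such vectors could then be
# fed to multidimensional scaling, as the abstract describes.
```

The design point the abstract makes survives even in this toy form: once text and image features live in one vector, a single distance function (and hence a single MDS projection or classifier) can use both sources of evidence at once.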
Sanctioned and Controlled Message Propagation in a Restrictive Information Environment: The Small World of Clandestine Radio Broadcasting
This dissertation seeks to identify the elements that inform the model for competing message propagation systems in a restrictive environment. It pays attention to message propagation by sanctioned and clandestine radio stations in pre- and post-independence Zimbabwe. This dissertation uses two models of message propagation in a limiting information environment: Sturges' information model of national liberation struggle and Chatman's small world information model. All the message propagation elements in the Sturges and Chatman models are present in the broadcast texts analyzed. However, the findings of this dissertation indicate that communication in a restrictive information environment is designed such that its participants make sense of their situation and come up with ways to solve the challenges of their small world. Also, a restrictive information environment is situational, and message propagators operating in it are subject to tactical changes at different times, accordingly altering their cognitive maps. The two models fail to address these concerns. This dissertation focused on message propagation in Zimbabwe because military belligerence is involved in the information warfare there. It therefore provides an extreme situation, which can help our understanding of more everyday instances of communication and interference with communication. The findings of this dissertation point to the need to emphasize that information input, output, and suppression are components dependent on each other, not discrete and independent categories of information activities.
An exploratory study of factors that influence student user success in an academic digital library.
The complex nature of digital libraries calls for appropriate models to study user success. Calls have been made to incorporate into these models factors that capture the interplay between people, organizations, and technology. In order to address this, two research questions were formulated: (1) To what extent does the comprehensive digital library user success model (DLUS), based on a combination of the EUCS and flow models, describe overall user success in a prototype digital library environment; and (2) To what extent does a combined model of DeLone & McLean's reformulated information system success model and comprehensive digital library user success model (DLUS) explain digital library user success in a prototype digital library environment? Participants were asked to complete an online survey questionnaire. A total of 160 completed and useable questionnaires were obtained. Data analyses through exploratory and confirmatory factor analyses and structural equation modeling produced results that support the two models. However, some relationships between latent variables hypothesized in the model were not confirmed. A modified version of the proposed comprehensive plus user success model in a digital library environment was tested and supported through model fit statistics. This model was recommended as a possible alternative model of user success. The dissertation also makes a number of recommendations for future research.
The intersection of social networks in a public service model: A case study.
Examining human interaction networks contributes to an understanding of factors that improve and constrain collaboration. This study examined multiple network levels of information exchanges within a public service model designed to strengthen community partnerships by connecting city services to the neighborhoods. The research setting was the Neighbourhood Integrated Service Teams (NIST) program in Vancouver, B.C., Canada. A literature review related information dimensions to the municipal structure, including social network theory, social network analysis, social capital, transactive memory theory, public goods theory, and the information environment of the public administration setting. The research method involved multiple instruments and included surveys of two bounded populations. First, the membership of the NIST program received a survey asking for identification of up to 20 people they contact for NIST-related work. Second, a network component of the NIST program, 23 community centre coordinators in the Parks and Recreation Department, completed a survey designed to identify their information exchanges relating to regular work responsibilities and the infusion of NIST issues. Additionally, 25 semi-structured interviews with the coordinators and other program members, collection of organization documents, field observation, and feedback sessions provided valuable insight into the complexity of the model. This research contributes to the application of social network theory and analysis in information environments and provides insight for public administrators into the operation of the model and reasons for the program's network effectiveness.
Intangible Qualities of Rare Books: Toward a Decision-Making Framework for Preservation Management in Rare Book Collections, Based Upon the Concept of the Book as Object
For rare book collections, a considerable challenge is involved in evaluating collection materials in terms of their inherent value, which includes the textual and intangible information the materials provide for the collection's users. Preservation management in rare book collections is a complex and costly process. As digitization and other technological advances in surrogate technology have provided new forms of representation, new dilemmas have arisen in weighing the rare book's inherently valuable characteristics against the possibly lesser financial costs of surrogates. No model has been in wide use to guide preservation management decisions. An initial iteration of such a model is developed, based on a Delphi-like iterative questioning of a group of experts in the field of rare books. The results are used to synthesize a preservation management framework for rare book collections, and a small-scale test of the framework has been completed through two independent analyses of five rare books in a functioning collection. Utilizing a standardized template for making preservation decisions offers a variety of benefits. Preservation decisions may include prioritizing action upon the authentic objects, or developing and maintaining surrogates in lieu of retaining costly original collection materials. The framework constructed in this study provides a method for reducing the subjectivity of preservation decision-making and facilitating the development of a standard of practice for preservation management within rare book collections.
Information Needs of Art Museum Visitors: Real and Virtual
Museums and libraries are considered large repositories of human knowledge and human culture. They have similar missions and goals in distributing accumulated knowledge to society. Current digitization projects allow both museums and libraries to reach a broader audience and share their resources with a variety of users. While studies of information seeking behavior, retrieval systems, and metadata in library science have a long history, such research in museum environments is at an early experimental stage. There are few studies concerning the information seeking behavior and needs of virtual museum visitors, especially with the use of images from the museums' collections available on the Web. The current study identifies the preferences of a variety of user groups regarding information specifics on current exhibits, museum collection metadata, and the use of multimedia. The study of the information seeking behavior of user groups of museum digital or cultural collections allows examination and analysis of users' information needs and the organization of cultural information, including descriptive metadata and the quantity of information that may be required. In addition, the study delineates information needs that different categories of users may have in common: teachers in high schools, students in colleges and universities, museum professionals, art historians and researchers, and the general public. This research also compares the informational and educational needs of real visitors with those of virtual visitors. Educational needs of real visitors are based on various studies conducted and summarized by Falk and Dierking (2000), and on an evaluation of art museum websites previously conducted to support the current study.
Implications of the inclusion of document retrieval systems as actors in a social network.
Traditionally, social network analysis (SNA) techniques enable the examination of relationships and the flow of information within networks of human members or groups of humans. This study extended traditional social network analysis to include a nonhuman group member, specifically a document retrieval system. The importance of document retrieval systems as information sources, the changes in business environments that necessitate the use of information and communication technologies, and the attempts to make computer systems more life-like provide the reasons for considering the information system as a group member. The review of literature for this study does not encompass a single body of knowledge. Instead, several areas combined to inform this study, including social informatics for its consideration of the intersection of people and information technology, network theory and social network analysis, organizations and information, organizational culture, and finally, storytelling in organizations as a means of transferring information. The methodology included distribution of surveys to two small businesses that used the same document retrieval system, followed by semi-structured interviews of selected group members, which allowed elaboration on the survey findings. The group members rated each other and the system on four interaction criteria relating to four social networks of interest: awareness, access, information flow, and problem solving. Traditional measures of social networks, specifically density, degree, reciprocity, transitivity, distance, degree centrality, and closeness centrality, provided insight into the positioning of the nonhuman member within the social group. The human members of the group were able to respond to the survey that included the system but were not ready to consider the system equivalent to the other human members. 
SNA measures positioned the system as an average member of the group, not a star, but not isolated either. Examination of the surveys or the interviews in isolation would not have given a ...
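Two of the traditional measures the study relies on, density and degree centrality, can be sketched for a toy directed network. The nodes, ties, and the "DRS" system member below are hypothetical illustrations, not the study's data.

```python
def density(nodes, edges):
    """Directed network density: observed ties / possible ties, n*(n-1)."""
    n = len(nodes)
    return len(edges) / (n * (n - 1)) if n > 1 else 0.0

def degree_centrality(nodes, edges):
    """In-degree and out-degree counts per node (directed ties)."""
    indeg = {v: 0 for v in nodes}
    outdeg = {v: 0 for v in nodes}
    for src, dst in edges:
        outdeg[src] += 1
        indeg[dst] += 1
    return indeg, outdeg

# Hypothetical awareness network: three people plus the retrieval system "DRS".
nodes = ["Ann", "Bob", "Cal", "DRS"]
edges = [("Ann", "Bob"), ("Bob", "Ann"), ("Ann", "DRS"),
         ("Bob", "DRS"), ("Cal", "DRS"), ("DRS", "Ann")]

d = density(nodes, edges)
indeg, outdeg = degree_centrality(nodes, edges)
# Here the system is well connected (high in-degree) without dominating the
# network, consistent with an "average member, not a star" positioning.
```

In practice such measures are computed per network (awareness, access, information flow, problem solving) and compared across human and nonhuman members.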
Terrorism as a social information entity: A model for early intervention.
This dissertation studies different social aspects of terrorists and terrorist organizations in an effort to better deal with terrorism, especially in the long run. The researcher, who also worked as a Police Captain in the Turkish National Police Anti-Terrorism Department, seeks solutions to today's global problem by studying both the literature and a Delphi examination of a survey of 1070 imprisoned terrorists. The research questions include "What are the reasons behind terrorism?", "Why does terrorism occur?", "What ideologies provide the framework for terrorist violence?", "Why do some individuals become terrorists and others do not?" and "Under what conditions will terrorists end their violence?" The results of the study present the complexity of the terrorism problem as a social experience and the impossibility of a single solution or remedy for the global problem of terrorism. Through his examination of the findings, the researcher shows that terrorism is a social phenomenon with criminal consequences that needs to be dealt with by means of a two-dimensional approach. The first dimension is the social dimension of terrorism and the second is the criminal dimension of terrorism. Based on this, the researcher constructed a conceptual model which addresses both of these dimensions under the titles of long-term solutions and short-term solutions. The long-term solutions deal with the social aspects of terrorism under the title of Proactive Approach to Terrorism, and the short-term solutions deal with the criminal aspects of terrorism under the title of The Immediate Fight against Terrorism. The researcher constructed this model because there seems to be a tendency not to ask the question "Why does terrorism occur?" Instead, the focus is usually on dealing with the consequences of terrorism and future terrorist threats. While it is essential that governments provide the finest security measures for their societies, ...
A Complex Systems Model for Understanding the Causes of Corruption: Case Study - Turkey
This dissertation attempts to draw an explanatory interdisciplinary framework to clarify the causes of systemic corruption. Following an intense review of the political science, economics, and sociology literatures on the issue, a complex systems theoretical model is constructed. A political system consists of five main components: society, interest aggregators, the legislature, the executive, and the private sector, together with the human actors in these domains. It is hypothesized that when the legitimacy level of the system is low and the morality of the systemic actors is flawed, selected political, social, and economic incentives and opportunities that may exist within the structure of the systemic components might, individually or as a group, trigger corrupt transactions between the actors of the system. If left untouched, corruption might spread through the system by repetition and social learning, eventually becoming a source of corruption itself. By eroding the already weak legitimacy and morality, it may increase the risk of corruption even further. This theoretical explanation is used to study the causes of systemic corruption in the Turkish political system. Under the guidance of complex systems theory, the initial systemic conditions, the legacy of Turkey's predecessor the Ottoman Empire, are evaluated first, and then the political, social, and economic factors that are presumed to breed corruption in contemporary Turkey are investigated. In this section, special focus is given to the formation and operation of amoral social networks and their contribution to the entrenchment of corruption within the system. Based upon the findings of the case study, the theoretical model informed by the literature is revised: thirty-five system- and actor-level variables are identified as related to systemic corruption, and the nature of the causality between them and corruption is explained. 
Although the results of this study cannot be academically generalized, for obvious reasons, the analytical framework ...
A multi-dimensional entropy model of jazz improvisation for music information retrieval.
Jazz improvisation provides a case context for examining information in music; entropy provides a means for representing music for retrieval. Entropy measures are shown to distinguish between different improvisations on the same theme, thus demonstrating their potential for representing jazz information for analysis and retrieval. The calculated entropy measures are calibrated against human representation by means of a case study of an advanced jazz improvisation course, in which synonyms for "entropy" are frequently used by the instructor. The data sets are examined for insights in music information retrieval, music information behavior, and music representation.
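As an illustration of entropy as a music representation, the sketch below computes Shannon entropy over event (pitch) distributions. The note sequences are invented, and the dissertation's actual multi-dimensional measures (which presumably span more dimensions than pitch alone) are not reproduced here.

```python
from collections import Counter
from math import log2

def shannon_entropy(events) -> float:
    """Shannon entropy (bits) of a sequence of discrete musical events."""
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical pitch sequences: a repetitive theme vs. a varied improvisation.
theme = ["C4", "E4", "G4", "C4", "E4", "G4", "C4", "E4"]
solo = ["C4", "Eb4", "F4", "Gb4", "G4", "Bb4", "C5", "D5"]

h_theme = shannon_entropy(theme)
h_solo = shannon_entropy(solo)
# The improvisation's wider event distribution yields higher entropy, which is
# how such a measure could distinguish improvisations on the same theme.
```

Extending this to the multi-dimensional case would mean computing such entropies over several parallel event streams (e.g., pitch, duration, interval) and treating the resulting vector as the retrieval representation.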
Knowledge management in times of change: Tacit and explicit knowledge transfers.
This study examines the importance and challenges of knowledge management in times of great change. In order to understand the information phenomena of interest, the impacts on knowledge workers and knowledge documents in times of great organizational change, the study is positioned in a major consolidation of state agencies in Texas. It pays special attention to how the changes were perceived by the knowledge workers, interviewing those who were impacted by the changes resulting from the reorganization. The overall goal is to assess knowledge management in times of great organizational change by analyzing the impact of consolidation on knowledge management in Texas's Health and Human Services (HHS) agencies. The overarching research question is: what happened to the knowledge management structure during this time of great change? The first research question asks what the knowledge worker environment was during the time of change; the second, what the knowledge management environment of the agencies was during the time of change; and the last, whether consolidation of the HHS agencies diminished the ability to transition from tacit to explicit knowledge. Additionally, the study investigates how the bill that mandated the consolidation was covered in the local media, as well as the actual budget and employee-loss impact of the consolidation, in order to better understand the impacts on knowledge workers and knowledge documents as a result of major organizational restructuring. The findings have both theoretical and practical implications for information science, knowledge management, and project management.
Factors Influencing How Students Value Asynchronous Web Based Courses
This dissertation identified, through qualitative methods, the factors influencing how students value asynchronous Web-based courses. Data were collected through surveys, observations, interviews, email correspondence, and chat room and bulletin board transcripts. Instruments were tested in pilot studies during previous semesters. Factors were identified for two class formats: the asynchronous CD/Internet format and the synchronous online Web-based format. Factors were also uncovered for two of the instructional tools used in the course: the WebCT forum and WebCT testing. Factors were grouped as advantages or disadvantages under major categories. For the asynchronous CD/Internet format, the advantages were Convenience, Flexibility, Learning Enhancement, and Psychology; the disadvantages were Isolation, Learning Environment, and Technology. For the synchronous online Web-based format, the advantages were Convenience, Flexibility, Human Interaction, Learning Enhancement, and Psychology, whereas the disadvantages were Isolation, Learning Environment, and Technology. Concurrently, the study revealed the following factors as advantages of the WebCT forum: Help Each Other, Interaction, Socialization, Classroom News, and Time Independent. The disadvantages uncovered were Complaints, Technical Problems, and Isolation. Finally, the advantages identified for the WebCT testing tool were Convenience, Flexibility, and Innovations; its disadvantages were Surroundings Not Conducive to Learning and Technical Problems. Results indicate that not only classroom preference, learning style, and personality type influence how students value a Web-based course but, most importantly, a student's lifestyle (number of personal commitments, how far away they live, and life's priorities).
The WebCT forum (bulletin board) and WebCT testing (computerized testing) were seen by most students as good tools for encouraging classroom communication and for testing, because of the convenience and flexibility they offered. Still, further research, both quantitative and qualitative, is needed to ascertain the true weight of the factors discovered in this study.
Reading Interests and Activity of Older Adults and Their Sense of Life Satisfaction
This study addresses the problem of reading among older adults and the relation of such reading to their sense of life satisfaction. The study also considers the relation between reading interests and activity of older adults and the availability to them of library materials and services.
Supporting Computer-Mediated Collaboration through User Customized Agents
This research investigated a neglected problem: interruption of groups by agent advisory systems. The question was whether interruption by the agent advisory system was beneficial. This dissertation includes a survey of the literature in four areas: agents, online help, computer-supported cooperative work (CSCW), and awareness in CSCW. Based on this review, a human-subjects experiment was conducted that investigated whether the style of agent advisory interface improved the performance of group members. There were three sets of groups: a control set with no advisory agents, a set with system-provided advisory agents, and a set with group-customized advisory agents. The groups worked together using a CSCW application developed with GroupKit, a CSCW toolkit. The groups with group-customized advisory agents used an Agent Manager application to define advisory agents that would give them advice as they worked in the CSCW application. The findings showed that the type of advisory agent did not significantly influence the performance of the groups: the groups with customized agents performed slightly better than the other groups, but the difference was not statistically significant. When notified that advice was available, groups with customized agents and groups with provided agents seldom accessed it. General design guidelines for agent interruption remain an open problem; future work is needed, and the eventual solution may combine the three known individual design approaches.
An E-government Readiness Model
The purpose of this study is to develop and test an e-government readiness model. Consistent with this model, several instruments, IS assessment (ISA), IT governance (ITG), and organization-IS alignment (IS-ALIGN), are examined for their ability to measure one organization's readiness for e-government and to test the instruments' fit in the proposed e-government model. The ISA instrument is the result of adapting and combining the IS-SERVQUAL instrument proposed by Van Dyke, Kappelman, and Prybutok (1997) and the IS-SUCCESS instrument developed by Kappelman and Chong (2001) for the City of Denton (COD) project at UNT. The IS Success Model was first proposed by DeLone and McLean (1992), but they did not validate it. The ITG instrument was based on the IT-governance goals of the COD project and was developed by Sanchez and Kappelman (2001) at UNT. The IS-ALIGN instrument, also developed by Sanchez and Kappelman (2001) for the COD project, is based on the Malcolm Baldrige National Quality Award (MBNQA) and measures how effectively a government organization utilizes IT to support its various objectives. The EGOV instrument was adapted from the study of the Action-Audience Model developed by Koh and Balthazard (1997) to measure how well a government organization is prepared to usher in e-government in terms of various success factors at the planning, system, and data levels. An online survey was conducted with employees of the City of Denton, Texas. An invitation letter to participate was sent via email to the 1,100 employees of the City of Denton; 339 responses were received, yielding a response rate of 31%. About 168 responses were discarded because they were incomplete and contained missing values, leaving 171 usable surveys, for a usable set of responses that had a response ...
University Students and the Internet: Information Seeking Study
This study explored university students' information needs and seeking behaviors on the Internet. A Web-based survey was administered once; two hundred responses were received from the target sample within the two-week period of the study. Data were analyzed with descriptive statistics, factor analysis, and graphical representation. The study explored various issues related to Internet usability, preferences, and activities, such as searching tools, e-mail, search engines, and preferred primary sources for everyday-life information needs, as well as students' perceptions of the Internet and the traditional library. Kuhlthau's model of the information-seeking process, which includes six stages and affective components, was utilized and modified in constructing the Web survey, along with a study by Presno (1998) that describes four types of Internet anxiety. With regard to the six stages of Kuhlthau's model, the majority of the respondents were in stage 5, information gathering; stage 3 had the next highest number of respondents, and very few respondents were in stages 1 and 2. There was a systematic pattern in which the earlier the stage a respondent was in, the more negative adjectives they selected, and vice versa. The feeling-adjectives section showed a difference in behavior between males and females. The results indicated that most students had Internet time-delay anxiety. In general, the study found that students have a great interest in the Internet and consider it an important source of information for their personal, educational, and communication activities.
Relevance Thresholds: A Conjunctive/Disjunctive Model of End-User Cognition as an Evaluative Process
This investigation identifies end-user cognitive heuristics that facilitate judgment and evaluation during information retrieval (IR) system interactions. The study extends previous research on relevance as a key construct for representing the value end-users ascribe to items retrieved from IR systems and the perceived effectiveness of such systems. The Lens Model of user cognition serves as the foundation for the design and interpretation of the study; earlier research in problem solving, decision making, and attitude formation also contributes to the model and analysis. A self-report instrument collected evaluative responses from 32 end-users concerning 1,432 retrieved items, in relation to five characteristics of each item: topical, pertinence, utility, systematic, and motivational levels of relevance. The nominal nature of the data led to non-parametric statistical analyses, which indicated that end-user evaluation of retrieved items to resolve an information problem at hand is most likely a multi-stage process. That process appears to be a cognitive progression from topic to meaning (pertinence) to functionality (use). Each step in end-user evaluative processing engages a cognitive hierarchy of heuristics that includes consideration (of appropriate cues), differentiation (the positive or negative aspects of those cues considered), and aggregation (the combination of differentiated cue aspects needed to render an evaluative label of the item in relation to the information problem at hand). While individuals may differ in their judgments and evaluations of retrieved items, they appear to make those decisions using consistent heuristic approaches.
Effect of Technology Integration Education on the Attitudes of Teachers and their Students
This study analyzed the effect of technology integration education on teachers' and students' attitudes toward information technology. Two instruments measuring similar attributes were used to assess teachers' and students' attitudes. Differences in pre- and post-test scores were used to determine changes that occurred during the course of the study.
Diagnosing Learner Deficiencies in Algorithmic Reasoning
It is hypothesized that useful diagnostic information can reside in the wrong answers of multiple-choice tests, and that properly designed distractors can yield indications of misinformation and missing information in algorithmic reasoning on the part of the test taker. In addition to summarizing the literature regarding diagnostic research as opposed to scoring research, this study proposes a methodology for analyzing test results and compares the findings with those from the research of Birenbaum and Tatsuoka and others. The proposed method identifies the conditions of misinformation and missing information, and it contains a statistical compensation for careless errors. Strengths and weaknesses of the method are explored, and suggestions for further research are offered.
Selected Factors Associated With Reading Interests of Seventh- and Eighth-grade Pupils
This study sought to determine if there were differences in the types of reading interests of seventh- and eighth-grade pupils associated with their racial origins, their socioeconomic status, or their school environments. It also sought to consider the strength of reading interest scores as related to other variables and to consider the relationship between these scores and the number of hours spent in reading and the change in amount of reading since the previous school year.
Relation of Personal Characteristics to Type of Position Among Bibliographic Network Coordinators, Ex-coordinators, and Selected Library Department Heads
The objectives of this investigation were two-fold. The first was to determine the personal characteristics of Bibliographic Network Coordinators, both past and present; the second was to compare these identified characteristics with those of persons working in traditional library positions at comparable levels of responsibility.
Factors Related to the Professional Progress of Academic Librarians in Louisiana
Three groups of academic librarians in Louisiana were surveyed to determine what factors other than job performance influenced their professional progress (salary increases, promotion, and tenure). Staff development activities were also investigated to determine whether they played any significant role in influencing professional progress. Three opinion questions were also asked about the feasibility of using an index developed to quantitatively assess staff development activities.
The Personal Reading Interests of Third, Fourth, and Fifth Grade Children in Selected Arkansas Public Schools
The purpose of this study was to determine the personal reading interests of students in the third, fourth and fifth grades and to determine if advances in technology in the past twenty years have changed their reading interests.
Increasing Telecommunications Channel Capacity: Impacts on Firm Profitability
In calling for the deployment of high-capacity telecommunications infrastructures, the Clinton Administration is relying on market forces to drive demand toward self-sustaining development. There is little doubt that many firms will embrace the new telecommunications services for a variety of reasons, including market differentiation, vertical market integration, and other organization-specific factors. However, there is little evidence at the firm level that adopting increased-capacity telecommunications technologies is associated with improvements in firm profitability. This study seeks to identify impacts on firm income associated with the adoption of T1 telecommunications services.
Citation Accuracy in the Journal Literature of Four Disciplines: Chemistry, Psychology, Library Science, and English and American Literature
The primary purpose of this study was to determine if there is a relationship between the bibliographic citation practices of the members of a discipline and the emphasis placed on citation accuracy and purposes in the graduate instruction of the discipline.