Search Results

The Cluster Hypothesis: A Visual/Statistical Analysis
By allowing judgments based on a small number of exemplar documents to be applied to a larger number of unexamined documents, clustered presentation of search results represents an intuitively attractive possibility for reducing the cognitive resource demands on human users of information retrieval systems. However, clustered presentation of search results is sensible only to the extent that naturally occurring similarity relationships among documents correspond to topically coherent clusters. The Cluster Hypothesis posits just such a systematic relationship between document similarity and topical relevance. To date, experimental validation of the Cluster Hypothesis has proved problematic, with collection-specific results both supporting and failing to support this fundamental theoretical postulate. The present study consists of two computational information visualization experiments, representing a two-tiered test of the Cluster Hypothesis under adverse conditions. Both experiments rely on multidimensionally scaled representations of interdocument similarity matrices. Experiment 1 is a term-reduction condition, in which descriptive titles are extracted from Associated Press news stories drawn from the TREC information retrieval test collection. The clustering behavior of these titles is compared to the behavior of the corresponding full text via statistical analysis of the visual characteristics of a two-dimensional similarity map. Experiment 2 is a dimensionality reduction condition, in which inter-item similarity coefficients for full text documents are scaled into a single dimension and then rendered as a two-dimensional visualization; the clustering behavior of relevant documents within these unidimensionally scaled representations is examined via visual and statistical methods. Taken as a whole, results of both experiments lend strong though not unqualified support to the Cluster Hypothesis. 
In Experiment 1, semantically meaningful 6.6-word document surrogates systematically conform to the predictions of the Cluster Hypothesis. In Experiment 2, the majority of the unidimensionally scaled datasets exhibit a marked nonuniformity of distribution of relevant documents, further supporting the Cluster Hypothesis. Results of …
Creating a Criterion-Based Information Agent Through Data Mining for Automated Identification of Scholarly Research on the World Wide Web
This dissertation creates an information agent that correctly identifies Web pages containing scholarly research approximately 96% of the time. It does this by analyzing the Web page with a set of criteria, and then uses a classification tree to arrive at a decision. The criteria were gathered from the literature on selecting print and electronic materials for academic libraries. A Delphi study was done with an international panel of librarians to expand and refine the criteria until a list of 41 operationalizable criteria was agreed upon. A Perl program was then designed to analyze a Web page and determine a numerical value for each criterion. A large collection of Web pages was gathered comprising 5,000 pages that contain the full work of scholarly research and 5,000 random pages, representative of user searches, which do not contain scholarly research. Datasets were built by running the Perl program on these Web pages. The datasets were split into model building and testing sets. Data mining was then used to create different classification models. Four techniques were used: logistic regression, nonparametric discriminant analysis, classification trees, and neural networks. The models were created with the model datasets and then tested against the test dataset. Precision and recall were used to judge the effectiveness of each model. In addition, a set of pages that were difficult to classify because of their similarity to scholarly research was gathered and classified with the models. The classification tree created the most effective classification model, with a precision ratio of 96% and a recall ratio of 95.6%. However, logistic regression created a model that was able to correctly classify more of the problematic pages. This agent can be used to create a database of scholarly research published on the Web. In addition, the technique can be used to create a …
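The precision and recall ratios reported above follow directly from counts of true positives, false positives, and false negatives. A minimal sketch in Python; the counts below are hypothetical, chosen only to illustrate ratios of the same order as those reported, and are not the dissertation's data:

```python
def precision_recall(tp, fp, fn):
    """Precision = tp/(tp+fp); recall = tp/(tp+fn)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical confusion-matrix counts for a scholarly-vs-random page test set
tp, fp, fn = 4780, 199, 220  # true positives, false positives, false negatives
p, r = precision_recall(tp, fp, fn)
print(f"precision={p:.3f} recall={r:.3f}")
```

With these illustrative counts, precision is roughly 0.96 and recall roughly 0.956, mirroring the kind of figures used to compare the four classification models.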
An Examination Of The Variation In Information Systems Project Cost Estimates: The Case Of Year 2000 Compliance Projects
The year 2000 (Y2K) problem presented a fortuitous opportunity to explore the relationship between estimated costs of software projects and five cost influence dimensions described by the Year 2000 Enterprise Cost Model (Kappelman, et al., 1998) -- organization, problem, solution, resources, and stage of completion. This research was a field study survey of Y2K project managers in industry, government, and education, and part of a joint project that began in 1996 between the University of North Texas and the Y2K Working Group of the Society for Information Management (SIM). Evidence was found to support relationships between estimated costs and organization, problem, resources, and project stage, but not for the solution dimension. Project stage appears to moderate the relationships for organization, particularly IS practices, and resources. A history of superior IS practices appears to mean lower estimated costs, especially for projects in larger IS organizations. Acquiring resources, especially external skills, appears to increase costs. Moreover, projects apparently have many individual differences, many related to size and to project stage, and their influences on costs appear to be at the sub-dimension or even the individual-variable level. A Revised Year 2000 Enterprise Model is presented incorporating this granularity. Two primary conclusions can be drawn from this research: (1) large software projects are very complex, and thus so is cost estimating; and (2) the devil of cost estimating is in the details of knowing which of the many possible variables are the important ones for each particular enterprise and project. This points to the importance of organizations keeping software project metrics and the historical calibration of cost-estimating practices.
Project managers must understand the relevant details and their interaction and importance in order to successfully develop a cost estimate for a particular project, even when rational cost models are used. This research also indicates …
An Experimental Study of Teachers' Verbal and Nonverbal Immediacy, Student Motivation, and Cognitive Learning in Video Instruction
This study used an experimental design and a direct test of recall to provide data about teacher immediacy and student cognitive learning. Four hypotheses and a research question addressed two research problems: first, how verbal and nonverbal immediacy function together and/or separately to enhance learning; and second, how immediacy affects cognitive learning in relation to student motivation. These questions were examined in the context of video instruction to provide insight into distance learning processes and to ensure maximum control over experimental manipulations. Participants (N = 347) were drawn from university students in an undergraduate communication course. Students were randomly assigned to groups, completed a measure of state motivation, and viewed a 15-minute video lecture containing part of the usual course content delivered by a guest instructor. Participants were unaware that the video instructor was actually performing one of four scripted manipulations reflecting higher and lower combinations of specific verbal and nonverbal cues, representing the four cells of the 2x2 research design. Immediately after the lecture, students completed a recall measure, consisting of portions of the video text with blanks in the place of key words. Participants were to fill in the blanks with exact words they recalled from the videotape. Findings strengthened previous research associating teacher nonverbal immediacy with enhanced cognitive learning outcomes. However, higher verbal immediacy, in the presence of higher and lower nonverbal immediacy, was not shown to produce greater learning among participants in this experiment. No interaction effects were found between higher and lower levels of verbal and nonverbal immediacy. Recall scores were comparatively low in the presence of higher verbal and lower nonverbal immediacy, suggesting that nonverbal expectancy violations may have hindered cognitive learning. 
Student motivation was not found to be a significant source of error in measuring immediacy's effects, and no interaction effects were detected …
Identifying At-Risk Students: An Assessment Instrument for Distributed Learning Courses in Higher Education
The current period of rapid technological change, particularly in the area of mediated communication, has combined with new philosophies of education and market forces to bring upheaval to the realm of higher education. Technical capabilities exceed our knowledge of whether expenditures on hardware and software lead to corresponding gains in student learning. Educators do not yet possess sophisticated assessments of what we may be gaining or losing as we widen the scope of distributed learning. The purpose of this study was not to draw sweeping conclusions with respect to the costs or benefits of technology in education. The researcher focused on a single issue involved in educational quality: assessing the ability of a student to complete a course. Previous research in this area indicates that attrition rates are often higher in distributed learning environments. Educators and students may benefit from a reliable instrument to identify those students who may encounter difficulty in these learning situations. This study is aligned with research focused on the individual engaged in seeking information, assisted or hindered by the capabilities of the computer information systems that create and provide access to information. Specifically, the study focused on the indicators of completion for students enrolled in video conferencing and Web-based courses. In the final version, the Distributed Learning Survey encompassed thirteen indicators of completion. The results of this study of 396 students indicated that the Distributed Learning Survey represented a reliable and valid instrument for identifying at-risk students in video conferencing and Web-based courses where the student population is similar to the study participants. 
Educational level, GPA, credit hours taken in the semester, study environment, motivation, computer confidence, and the number of previous distributed learning courses accounted for most of the predictive power in the discriminant function based on student scores from the survey.
MEDLINE Metric: A method to assess medical students' MEDLINE search effectiveness
Medical educators advocate the need for medical students to acquire information management skills, including the ability to search the MEDLINE database. There has been no published validated method available to use for assessing medical students' MEDLINE information retrieval skills. This research proposes and evaluates a method, designed as the MEDLINE Metric, for assessing medical students' search skills. MEDLINE Metric consists of: (a) the development, by experts, of realistic clinical scenarios that include highly constructed search questions designed to test defined search skills; (b) timed tasks (searches) completed by subjects; (c) the evaluation of search results; and (d) instructive feedback. A goal is to offer medical educators a valid, reliable, and feasible way to judge mastery of information searching skill by measuring results (search retrieval) rather than process (search behavior) or cognition (knowledge about searching). Following a documented procedure for test development, search specialists and medical content experts formulated six clinical search scenarios and questions. One hundred and forty-five subjects completed the six-item test under timed conditions. Subjects represented a wide range of MEDLINE search expertise. One hundred twenty complete cases were used, representing 53 second-year medical students (44%), 47 fourth-year medical students (39%), and 20 medical librarians (17%). Data related to educational level, search training, search experience, confidence in retrieval, difficulty of search, and score were analyzed. Evidence supporting the validity of the method includes the agreement by experts about the skills and knowledge necessary to successfully retrieve information relevant to a clinical question from the MEDLINE database. Also, the test discriminated among different performance levels. 
There were statistically significant, positive relationships between test score and level of education, self-reported previous MEDLINE training, and self-reported previous search experience. The findings from this study suggest that MEDLINE Metric is a valid method for constructing and administering a performance-based test to identify …
Modeling Utilization of Planned Information Technology
Implementations of information technology solutions to address specific information problems are only successful when the technology is utilized. The antecedents of technology use involve user, system, task, and organization characteristics, as well as externalities which can affect all of these entities. However, measurement of the interaction effects between these entities can act as a proxy for individual attribute values. A model is proposed which, based upon evaluation of these interaction effects, can predict technology utilization. This model was tested with systems being implemented at a pediatric health care facility. Results from this study provide insight into the relationship between the antecedents of technology utilization. Specifically, task time provided significant direct causal effects on utilization. Indirect causal effects were identified in the task value and perceived utility constructs. Perceived utility, along with organizational support, also provided direct causal effects on user satisfaction. Task value also impacted user satisfaction in an indirect fashion. Finally, the results provide a predictive model and taxonomy of variables which can be applied to predict or manipulate the likelihood of utilization for planned technology.
A Study of Graphically Chosen Features for Representation of TREC Topic-Document Sets
Document representation is important for computer-based text processing. Good document representations must include at least the most salient concepts of the document. Documents exist in a multidimensional space, which makes it difficult to identify which concepts to include. A current problem is to measure the effectiveness of the different strategies that have been proposed to accomplish this task. As a contribution toward this goal, this dissertation studied visual inter-document relationships in a dimensionally reduced space. The same treatment was applied to full text and to three document representations. Two of the representations were based on the assumption that the salient features in a document set follow the chi distribution across the whole document set. The third document representation identified features through a novel method: a Coefficient of Variability was calculated by normalizing the Cartesian distance of the discriminating value in the relevant and non-relevant document subsets. The local dictionary method was also used. Cosine similarity values measured the inter-document distance in the information space and formed a matrix that served as input to the Multidimensional Scaling (MDS) procedure. A precision-recall procedure, averaged across all treatments, was used to compare them statistically. The treatments were not found to be statistically equivalent, and the null hypotheses were rejected.
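The inter-document matrix described above can be sketched with plain cosine similarity over term-frequency vectors; such a matrix is the usual input to an MDS procedure. The vectors below are toy values for illustration, not the TREC data:

```python
import math

def cosine(u, v):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def similarity_matrix(docs):
    """Pairwise cosine similarities; symmetric, with 1.0 on the diagonal."""
    return [[cosine(a, b) for b in docs] for a in docs]

# Toy term-frequency vectors over a shared three-term vocabulary
docs = [[2, 0, 1], [1, 1, 0], [0, 3, 1]]
m = similarity_matrix(docs)
```

An MDS routine would then project this matrix (or the corresponding distances) into two dimensions for visual inspection.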
The Validity of Health Claims on the World Wide Web: A Case Study of the Herbal Remedy Opuntia
The World Wide Web has become a significant source of medical information for the public, but there is concern that much of the information is inaccurate, misleading, and unsupported by scientific evidence. This study analyzes the validity of health claims on the World Wide Web for the herbal Opuntia using an evidence-based approach, and supports the observation that individuals must critically assess health information in this relatively new medium of communication. A systematic search by means of nine search engines and online resources of Web sites relating to herbal remedies was conducted and specific sites providing information on the cactus herbal remedy from the genus Opuntia were retrieved. Validity of therapeutic health claims on the Web sites was checked by comparison with reports in the scientific literature subjected to two established quality assessment rating instruments. 184 Web sites from a variety of sources were retrieved and evaluated, and 98 distinct health claims were identified. 53 scientific reports were retrieved to validate claims. 25 involved human subjects, and 28 involved animal or laboratory models. Only 33 (34%) of the claims were addressed in the scientific literature. For 3% of the claims, evidence from the scientific reports was conflicting or contradictory. Of the scientific reports involving human subjects, none met the predefined criteria for high quality as determined by quality assessment rating instruments. Two-thirds of the claims were unsupported by scientific evidence and were based on folklore, or indirect evidence from related sources. Information on herbal remedies such as Opuntia is well represented on the World Wide Web. Health claims on Web sites were numerous and varied widely in subject matter. The determination of the validity of information about claims made for herbals on the Web would help individuals assess their value in medical treatment. However, the Web is conducive to dubious …
Empowering Agent for Oklahoma School Learning Communities: An Examination of the Oklahoma Library Improvement Program
The purposes of this study were to determine the initial impact of the Oklahoma Library Media Improvement Grants on Oklahoma school library media programs; assess whether the Oklahoma Library Media Improvement Grants continue to contribute to Oklahoma school learning communities; and examine possible relationships between school library media programs and student academic success. It also seeks to document the history of the Oklahoma Library Media Improvement Program 1978 - 1994 and increase awareness of its influence upon the Oklahoma school library media programs. Methods of data collection included: examining Oklahoma Library Media Improvement Program archival materials; sending a survey to 1703 school principals in Oklahoma; and interviewing Oklahoma Library Media Improvement Program participants. Data collection took place over a one year period. Data analyses were conducted in three primary phases: descriptive statistics and frequencies were disaggregated to examine mean scores as they related to money spent on school library media programs; opinions of school library media programs; and possible relationships between school library media programs and student academic achievement. Analysis of variance was used in the second phase of data analysis to determine if any variation between means was significant as related to Oklahoma Library Improvement Grants, time spent in the library media center by library media specialists, principal gender, opinions of library media programs, student achievement indicators, and the region of the state in which the respondent was located. The third phase of data analysis compared longitudinal data collected in the 2000 survey with past data. 
The primary results indicated students in Oklahoma from schools with a centralized library media center, served by a full-time library media specialist, and the school having received one or more Library Media Improvement Grants scored significantly higher academically than students in schools not having a centralized library media center, not served by a …
The Second Vatican Council and American Catholic Theological Research: A Bibliometric Analysis of Theological Studies: 1940-1995
A descriptive analysis was given of the characteristics of the authors and citations of the articles in the journal Theological Studies from 1940-1995. Data were gathered on the institutional affiliation, geographic location, occupation, gender, and personal characteristics of the authors. Citation characteristics were examined for the cited authors, date and age of the citations, format, language, place of publication, and journal titles. These characteristics were compared for the periods before and after the Second Vatican Council in order to detect any changes that might have occurred after certain recommendations were made by the council to theologians. Subject dispersion of the literature was also analyzed. Analyses based on Lotka's Law of author productivity and Bradford's Law of title dispersion were also performed for this literature. The profile of author characteristics showed that the proportion of articles published by women and laypersons has increased since the recommendations of the council. The data fit Lotka's Law well for the pre-Vatican II period but not for the period after Vatican II. The data fit Bradford's Law for the predicted number of journals in the nucleus and Zone 2, but the observed number of journals in Zone 3 was higher than predicted for all time periods. Subject dispersion of research from disciplines other than theology is low, but citation of works from the fields of education, psychology, the social sciences, and science has increased since Vatican II. Analysis of citation characteristics showed no significant change in the age, format, languages used, or geographic location of the publisher of the cited works after Vatican II. Citation characteristics showed that authors prefer research from monographs published in English and in U.S. locations for all time periods. Research …
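Lotka's Law in its classic form predicts that the number of authors contributing n papers falls off roughly as 1/n^2. A minimal sketch of the expected proportions under this inverse-power form (the exponent and truncation point are illustrative defaults; the dissertation's actual fitting procedure is not given here):

```python
def lotka_expected(n_max, exponent=2.0):
    """Expected proportion of authors contributing n papers under Lotka's Law:
    f(n) is proportional to 1/n**exponent, normalized over n = 1..n_max."""
    weights = [1.0 / n ** exponent for n in range(1, n_max + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# With the classic exponent of 2, authors with 2 papers should be
# about one quarter as numerous as authors with 1 paper.
props = lotka_expected(10)
```

Goodness of fit to observed author-productivity counts is then typically judged with a Kolmogorov-Smirnov-style test.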
Public School Educators' Use of Computer-Mediated Communication
This study examined the uses of computer-mediated communication (CMC) by educators in selected public schools. It used Rogers' Diffusion of Innovation Theory as the underpinnings of the study. CMC refers to any exchange of information that involves the use of computers for communication between individuals or between individuals and a machine. This study was an exploration of the difficulties users confront, the services they access, and the tasks they accomplish when using CMC. It investigated the factors that affect the use of CMC. The sample population was drawn from registered users on TENET, the Texas Education Network, as of December 1997. The educators were described with frequencies and percentages analyzing the demographic data. For the research, eight indices were selected to test how strongly these user and environmental attributes were associated with the use of CMC. These variables were (1) education, (2) position, (3) place of employment, (4) geographic location, (5) district size, (6) organization vitality, (7) adopter resources, and (8) instrumentality. Two dependent variables were used to test for usage: (1) depth, or frequency of CMC usage and amount of time spent online, and (2) breadth, or variety of Internet utilities used. Additionally, the users' perception of network benefits was measured. Network benefits were correlated with social interaction and perception of CMC to investigate what tasks educators were accomplishing with CMC. Correlations, crosstabulations, and ANOVAs were used to analyze the data for testing the four hypotheses. The major findings of the study, based on the hypotheses tested, were that the socioeconomic variables of education and position influenced the use of CMC. A significant finding is that teachers used e-mail and Internet resources less frequently than those in other positions. An interesting finding was that frequency of use was more significant for usage than amount of …
Relevance Thresholds: A Conjunctive/Disjunctive Model of End-User Cognition as an Evaluative Process
This investigation identifies end-user cognitive heuristics that facilitate judgment and evaluation during information retrieval (IR) system interactions. The study extends previous research surrounding relevance as a key construct for representing the value end-users ascribe to items retrieved from IR systems and the perceived effectiveness of such systems. The Lens Model of user cognition serves as the foundation for the design and interpretation of the study; earlier research in problem solving, decision making, and attitude formation also contributes to the model and analysis. A self-reporting instrument collected evaluative responses from 32 end-users related to 1432 retrieved items in relation to five characteristics of each item: topical, pertinence, utility, systematic, and motivational levels of relevance. The nominal nature of the data collected led to non-parametric statistical analyses, which indicated that end-user evaluation of retrieved items to resolve an information problem at hand is most likely a multi-stage process. That process appears to be a cognitive progression from topic to meaning (pertinence) to functionality (use). Each step in end-user evaluative processing engages a cognitive hierarchy of heuristics that includes consideration (of appropriate cues), differentiation (the positive or negative aspects of those cues considered), and aggregation (the combination of differentiated cue aspects needed to render an evaluative label of the item in relation to the information problem at hand). While individuals may differ in their judgments and evaluations of retrieved items, they appear to make those decisions by using consistent heuristic approaches.
University Students and the Internet: Information Seeking Study
This study explored university students' information needs and seeking behaviors on the Internet. A Web-based survey was administered once. Two hundred responses were received from the target sample within the two-week period of the study. Data were analyzed with descriptive statistics, factor analysis, and graphical representation. The study explored various issues related to the usability, preferences, and activities of the Internet, such as searching tools, e-mail, search engines, and preferred primary sources of everyday-life information needs. The study also explored students' perceptions of the Internet and the traditional library. Kuhlthau's model of the information-seeking process, which includes six stages and affective components, was utilized and modified in the construction of the Web survey, as was a study by Presno (1998) on the four types of Internet anxiety. With regard to the six stages of Kuhlthau's model, the majority of the respondents experienced stage 5, which concerns information gathering; stage 3 had the next highest number of respondents. Very few respondents experienced stages 1 and 2. There was a systematic pattern in which the earlier the stage the respondents were in, the more negative adjectives they selected, and vice versa. The feeling-adjectives section showed a difference in behavior between males and females. The results indicated that most students had Internet time delay anxiety. In general, the study found that students have a great interest in the Internet and consider it an important source of information for their personal, educational, and communication activities.
A mechanism for richer representation of videos for children: Calibrating calculated entropy to perceived entropy
This study explores the use of the information theory entropy equation in representations of videos for children. The calculated rates of information in the videos are calibrated to the corresponding perceived rates of information as elicited from the twelve 7- to 10-year-old girls who were shown the video documents. Entropy measures are calculated for several video elements: set time, set incidence, verbal time, verbal incidence, set constraint, nonverbal dependence, and character appearance. As hypothesized, the mechanically calculated entropy measure (CEM) was found to be sufficiently similar to the perceived entropy measure (PEM) made by children that the two can serve as useful and predictive elements of representations of children's videos. The relationships between the CEM and the PEM show that CEM could stand in for PEM in order to enrich representations of video documents for this age group. Speculations on transferring the CEM-to-PEM calibration to different age groups and different document types are offered, as well as further implications for the field of information science.
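The information theory entropy equation referenced above is Shannon's H = -Σ p·log2(p). A minimal sketch of how such a measure could be computed for one video element; the proportions are illustrative only, not the study's measurements:

```python
import math

def entropy(proportions):
    """Shannon entropy H = -sum(p * log2(p)) in bits.
    Zero-probability terms are skipped, since lim p->0 of p*log2(p) is 0."""
    return -sum(p * math.log2(p) for p in proportions if p > 0)

# Illustrative screen-time proportions for, say, four distinct sets in a video
h = entropy([0.4, 0.3, 0.2, 0.1])
```

A uniform distribution over the element's states gives the maximum entropy (e.g., two equally likely states yield exactly 1 bit), while a single dominant state drives the measure toward zero.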
An E-government Readiness Model
The purpose of this study is to develop an e-government readiness model and to test this model. Consistent with this model, several instruments, IS assessment (ISA), IT governance (ITG), and Organization-IS alignment (IS-ALIGN), are examined for their ability to measure the readiness of one organization for e-government and to test the instruments' fit in the proposed e-government model. The ISA instrument used is the result of adapting and combining the IS-SERVQUAL instrument proposed by Van Dyke, Kappelman, and Prybutok (1997), and the IS-SUCCESS instrument developed by Kappelman and Chong (2001) for the City of Denton (COD) project at UNT. The IS Success Model was first proposed by DeLone and McLean (1992), but they did not validate it. The ITG instrument was based on the goals of the COD project for IT governance and was developed by Sanchez and Kappelman (2001) of UNT. The IS-ALIGN instrument was also developed by Sanchez and Kappelman (2001) for the COD project. It is an instrument based on the Malcolm Baldrige National Quality Award (MBNQA) that measures how effectively a government organization utilizes IT to support its various objectives. The EGOV instrument was adapted from the study of the Action-Audience Model developed by Koh and Balthazard (1997) to measure how well a government organization is prepared to usher in e-government in terms of various success factors at the planning, system, and data levels. An online survey was conducted with employees of the City of Denton, Texas. An invitation letter to participate in the survey was sent to the 1100 employees of the City of Denton via email; 339 responses were received, yielding a response rate of 31%. About 168 responses were discarded because they were incomplete and had missing values, leaving 171 usable surveys, for a usable set of responses that had a response …
Supporting Computer-Mediated Collaboration through User Customized Agents
This research investigated a neglected problem: interruption of groups by agent advisory systems. The question was whether interruption by the agent advisory system was beneficial. A survey of literature in four areas is included in this dissertation: Agents, Online Help, Computer Supported Cooperative Work (CSCW), and Awareness in CSCW. Based on the review, a human-subject experiment was conducted. The study investigated whether the style of agent advisory interface improved the performance of group members. There were three sets of groups: a control set that did not have advisory agents, a set that had system-provided advisory agents, and a set that had group-customized advisory agents. The groups worked together using a CSCW application developed with GroupKit, a CSCW toolkit. The groups with group-customized advisory agents used an Agent Manager application to define advisory agents that would give them advice as they worked in the CSCW application. The findings showed that the type of advisory agents did not significantly influence the performance of the groups. The groups with customized agents performed slightly better than the other groups, but the difference was not statistically significant. When notified that advice was issued, groups with customized agents and groups with provided agents seldom accessed the agents' advice. General design guidelines for agent interruption remain unresolved, and further work is needed; the definitive solution may be some mixture of the three known individual design solutions.
The Information Environment of Academic Library Directors: Use of Information Resources and Communication Technologies
This study focuses on the use of information resources and communication technologies, both traditional and electronic, by academic library directors. The purpose is to improve understanding of managerial behavior when using information resources and communication technologies within a shared information environment. Taylor's concept of an information use environment is used to capture the elements associated with information use and communication within the context of decision-making styles, managerial roles, organizational environments, and professional communities. This qualitative study uses interviews, observations, questionnaires, and documents. The library directors participating in the study are from doctoral-degree-granting universities in the southwestern United States. Data collection involved on-site observations with a PDA (personal digital assistant), structured interviews with library directors and their administrative assistants, the Decision Style Inventory, and a questionnaire based on Mintzberg's managerial roles. Findings show that a continuum of managerial activities between an Administrator and an Administrator/Academic is critical to understanding information use and communication patterns among library directors. There is a gap between self-perception of managerial activities and actual performance, a finding that would not have surfaced without the use of multiple methods. Other findings include the need for a technical ombudsman, a managerial-level position reporting to the library director; the importance of information management as an administrative responsibility; the importance of trust when evaluating information; and the importance of integrating information and communication across formats, time, and managerial activities.
Smoothing the information seeking path: Removing representational obstacles in the middle-school digital library.
Middle school students' interactions within a digital library are explored. Issues of interface features used, obstacles encountered, search strategies and search techniques used, and representation obstacles are examined. A mechanism for evaluating users' descriptors is tested, and the effects on retrieval of augmenting the system's resource descriptions with these descriptors are explored. Transaction log analysis (TLA) was used, with external corroborating achievement data provided by teachers. Analysis was conducted using quantitative and qualitative methods. Coding schemes were developed for the failure analysis, the search strategies and techniques analysis, the extent-of-match analysis between terms in students' questions and their search terms, and the extent-of-match analysis between search terms and the controlled vocabulary. There are five chapters with twelve supporting appendices. Chapter One presents an introduction to the problem and reviews the pilot study. Chapter Two presents the literature review and theoretical basis for the study. Chapter Three describes the research questions, hypotheses, and methods. Chapter Four presents findings. Chapter Five presents a summary of the findings and their support of the hypotheses. Unanticipated findings, limitations, speculations, and areas of further research are indicated. Findings indicate that middle school users interact with the system in various sequences of patterns. User groups' interactions and scaffold use are influenced by the teacher's objectives for using the ADL. Users preferred single-word searches over Boolean, phrase, or natural language searches. Users tended to repeat the same exact search instead of using the advanced scaffolds. A high percentage of users attempted at least one search that included spelling or typographical errors, punctuation, or sequentially repeated searches. Search terms matched the DQs in some instantiation in 54% of all searches.
Terms used by the system to represent the resources do not adequately represent the user groups' information needs, however, …
A Comparison of Communication Motives of On-Site and Off-Site Students in Videoconference-Based Courses
The objective of this investigation is to determine whether student site location in an instructional videoconference is related to students' motives for communicating with their instructor. The study is based, in part, on the work of Martin et al., who identify five separate student-teacher communication motives. These motives, or dimensions, are termed relational, functional, excuse, participation, and sycophancy, and are measured by a 30-item questionnaire. Several communication-related theories were used to predict differences between on-site and off-site students. Media richness theory was used, foundationally, to explain differences between mediated and face-to-face communication, and other theories, such as uncertainty reduction theory, were used in conjunction with media richness theory to predict specific differences. Two hundred eighty-one completed questionnaires were obtained from Education and Library and Information Science students in 17 separate course sections employing interactive video at the University of North Texas during the Spring and Summer semesters of the 2001/2002 school year. This study concludes that off-site students in an instructional videoconference are more likely than their on-site peers to report being motivated to communicate with their instructor for participation reasons. If off-site students are more motivated than on-site students to communicate as a means to participate, then it may be important for instructors to watch for actual differences in participation levels, and instructors may need to be well versed in pedagogical methods that attempt to increase participation. The study also suggests that current teaching methods being employed in interactive video environments may be adequate with regard to functional, excuse-making, relational, and sycophantic communication.
Solutions for Dynamic Channel Assignment and Synchronization Problem for Distributed Wireless Multimedia System
Recent advances in mobile computing and distributed multimedia systems allow mobile hosts (clients) to access wireless multimedia data anywhere and at any time. In accessing multimedia information on distributed multimedia servers from wireless personal communication service systems, a channel assignment problem and synchronization problems must be solved efficiently. Demand for mobile telephone service has been growing rapidly while the electromagnetic spectrum of frequencies allocated for this purpose remains limited. Any solution to the channel assignment problem is subject to this limitation, as well as to the interference constraint between adjacent channels in the spectrum. Channel allocation schemes provide flexible and efficient access to bandwidth in wireless and mobile communication systems. In this dissertation, two solutions are proposed: an efficient distributed algorithm for dynamic channel allocation based upon a mutual exclusion model, and an efficient distributed synchronization algorithm using a Quasi-sink for wireless and mobile multimedia systems to ensure and facilitate mobile client access to multimedia objects. The algorithms' performance with several channel systems using different types of call arrival patterns is determined analytically. A set of simulation experiments evaluates the performance of the proposed scheme in terms of message complexity and buffer usage at each frame arrival time.
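The adjacent-channel interference constraint described above can be illustrated with a minimal greedy sketch. The cell topology, channel count, and separation rule below are invented for illustration; this is not the mutual-exclusion-based distributed algorithm the dissertation proposes.

```python
# Minimal greedy sketch of dynamic channel assignment under an
# adjacent-channel interference constraint.  Hypothetical model: a cell
# must use a channel that differs by at least MIN_SEPARATION from every
# channel in use at a neighboring cell.

MIN_SEPARATION = 2   # rules out co-channel (0) and adjacent-channel (1) reuse
NUM_CHANNELS = 10

def assign_channel(cell, neighbors, in_use):
    """Return the lowest interference-free channel for `cell`, or None."""
    blocked = {c for n in neighbors[cell] for c in in_use.get(n, set())}
    for ch in range(NUM_CHANNELS):
        if all(abs(ch - b) >= MIN_SEPARATION for b in blocked):
            in_use.setdefault(cell, set()).add(ch)
            return ch
    return None  # call is blocked: no channel satisfies the constraints

neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
in_use = {}
print(assign_channel("A", neighbors, in_use))  # 0
print(assign_channel("B", neighbors, in_use))  # 2 (must differ from A's 0 by >= 2)
print(assign_channel("C", neighbors, in_use))  # 0 (A is not C's neighbor)
```

Channel reuse in non-adjacent cells (A and C both get channel 0) is what makes a limited spectrum serve many cells.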
A Common Representation Format for Multimedia Documents
Multimedia documents are composed of multiple file format combinations, such as image and text, image and sound, or image, text and sound. The type of multimedia document determines the form of analysis for knowledge architecture design and retrieval methods. Over the last few decades, theories of text analysis have been proposed and applied effectively. In recent years, theories of image and sound analysis have been proposed to work with text retrieval systems and have progressed quickly, due in part to rapid progress in computer processing speed. Retrieval of multimedia documents formerly was divided into the categories of image and text, and image and sound. While the standard retrieval process begins from text only, methods are developing that allow the retrieval process to be accomplished simultaneously using text and image. Although image processing for feature extraction and text processing for term extraction are well understood, there are no prior methods that can combine these two features into a single data structure. This dissertation introduces a common representation format for multimedia documents (CRFMD) composed of both images and text. For image and text analysis, two techniques are used: the Lorenz Information Measurement and the Word Code. A new process named Jeong's Transform is demonstrated for extraction of text and image features, combining the two previous measurements to form a single data structure. Finally, this single data structure is analyzed by using multidimensional scaling. This allows multimedia objects to be represented on a two-dimensional graph as vectors. The distance between vectors represents the magnitude of the difference between multimedia documents. This study shows that image classification on a given test set is dramatically improved when text features are encoded together with image features. This effect appears to hold true even when the available …
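The core retrieval idea, comparing documents by the distance between combined image-and-text feature vectors, can be sketched as follows. The feature values here are invented, and the dissertation's actual measures (the Lorenz Information Measurement, the Word Code, and Jeong's Transform) are not reproduced.

```python
import math

# Sketch: concatenate hypothetical image and text feature vectors into
# one combined structure, then compare documents by Euclidean distance
# between the combined vectors.

def combined_vector(image_features, text_features):
    return tuple(image_features) + tuple(text_features)

def distance(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

doc1 = combined_vector([0.2, 0.8], [0.1, 0.9, 0.0])
doc2 = combined_vector([0.25, 0.75], [0.2, 0.8, 0.1])
doc3 = combined_vector([0.9, 0.1], [0.8, 0.1, 0.7])

# Documents 1 and 2 are more alike than documents 1 and 3.
print(distance(doc1, doc2) < distance(doc1, doc3))  # True
```

Multidimensional scaling would then project such distance relationships into two dimensions for display, with nearby points representing similar documents.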
CT3 as an Index of Knowledge Domain Structure: Distributions for Order Analysis and Information Hierarchies
The problem with which this study is concerned is articulating all possible CT3 and KR21 reliability measures for every case of a 5x5 binary matrix (32,996,500 possible matrices). The study has three purposes. The first purpose is to calculate CT3 for every matrix and compare the results to the proposed optimum range of .3 to .5. The second purpose is to compare the results from the calculation of KR21 and CT3 reliability measures. The third purpose is to calculate CT3 and KR21 on every strand of a class test whose item set has been reduced using the difficulty strata identified by Order Analysis. The study was conducted by writing a computer program to articulate all possible 5 x 5 matrices. The program also calculated CT3 and KR21 reliability measures for each matrix. The nonparametric technique of Order Analysis was applied to two sections of test items to stratify the items into difficulty levels. The difficulty levels were used to reduce the item set from 22 to 9 items. All possible strands or chains of these items were identified so that both reliability measures (CT3 and KR21) could be calculated. One major finding of this study indicates that .3 to .5 is a desirable range for CT3 (cumulative p=.86 to p=.98) if cumulative frequencies are measured. A second major finding is that the KR21 reliability measure produced an invalid result more than half the time. The last major finding is that CT3, rescaled to range between 0 and 1, supports De Vellis' guidelines for reliability measures. The major conclusion is that CT3 is a better measure of reliability since it considers both inter- and intra-item variances.
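The per-matrix calculation at the heart of the study's enumeration program can be sketched for the KR-21 half of the comparison, using the standard KR-21 formula. The CT3 formula is not reproduced here, and the example matrix is invented.

```python
# Sketch of the KR-21 reliability computation on a small binary score
# matrix (rows = persons, columns = items).  Standard KR-21:
#   r = (k / (k - 1)) * (1 - M * (k - M) / (k * s^2))
# where k = items, M = mean total score, s^2 = variance of total scores.

def kr21(matrix):
    """KR-21 for a persons-by-items binary matrix; None when undefined."""
    k = len(matrix[0])                      # number of items
    totals = [sum(row) for row in matrix]   # total score per person
    n = len(totals)
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n   # population variance
    if var == 0 or k < 2:
        return None  # undefined: no score variance to work with
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * var))

matrix = [
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 1, 1, 1, 0],
    [1, 1, 1, 1, 1],
]
print(round(kr21(matrix), 3))  # 0.5
```

Matrices with zero score variance are one way an enumeration over all 5x5 binary matrices produces undefined or invalid KR-21 values, consistent with the study's observation that KR21 frequently yielded invalid results.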
The Effect of Personality Type on the Use of Relevance Criteria for Purposes of Selecting Information Sources.
Even though information scientists generally recognize that relevance judgments are multidimensional and dynamic, there is still discussion and debate regarding the degree to which certain internal (cognition, personality) and external (situation, social relationships) factors affect the use of criteria in reaching those judgments. Much of the debate centers on the relationship of those factors to the criteria and on reliable methods for measuring those relationships. This study researched the use of relevance criteria to select an information source by undergraduate students whose task was to create a course schedule for a semester. During registration periods, when creating their semester schedules, students filled out a two-part questionnaire. After completing the questionnaire, the students completed a Myers-Briggs Type Indicator instrument in order to determine their personality type. Data were analyzed using one-way ANOVAs and chi-square tests. A positive correlation exists between personality type as expressed by the MBTI and the information source selected as most important by the subject. A correlation also exists between personality type and relevance criteria use. The correlation is stronger for some criteria than for others. Therefore, one can expect personality type to have an effect on the use of relevance criteria while selecting information sources.
A study of on-line use and perceived effectiveness of compliance-gaining in health-related banner advertisements for senior citizens.
This research investigated banner ads on the World Wide Web, specifically the types of messages used in those ads and the effectiveness of the ads as seen by their intended audience. The focus was on health-related banner advertisements targeting senior citizens. The study first sought to determine the frequency of appearance of those ads when classified into categories of compliance-gaining tactics provided by research scholars. Second, the study explored the relative perceived effectiveness among those categories. Two graduate students from a Central Texas university sorted text messages into predetermined compliance-gaining categories. Chi square tests looked for significant differences in the frequencies of banner ads in each category. Forty-five senior citizens from the Central Texas area completed surveys regarding the perceived effectiveness of a randomly ordered, randomly selected set of categorized banner ads. A repeated measures test attempted to determine whether some compliance-gaining strategies used in health-related banner ads were perceived as more effective than others. The hypothesis stated that there would be differences in frequencies of compliance-gaining strategies used among the compliance-gaining categories in health-related banner ads for senior citizens. The hypothesis was supported. The research question asked if some categories of compliance-gaining strategies used in health-related banner ads were perceived as more effective than others by senior citizens. There was no evidence that senior citizens perceived any compliance-gaining category as being more effective than any other. However, post hoc analyses revealed trends in the types of compliance-gaining messages senior citizens perceived as more effective. These trends provide a basis for directional predictions in future studies.
Accessing Information on the World Wide Web: Predicting Usage Based on Involvement
Advice for Web designers often includes an admonition to use short, scannable, bullet-pointed text, reflecting the common belief that browsing the Web most often involves scanning rather than reading. Literature from several disciplines focuses on the myriad combinations of factors related to online reading, but studies of users' interests and motivations appear to offer a more promising avenue for understanding how users utilize information on Web pages. This study utilized the modified Personal Involvement Inventory (PII), a ten-item instrument used primarily in the marketing and advertising fields, to measure interest and motivation toward a topic presented on the Web. Two sites were constructed from Reader's Digest Association, Inc. online articles, and a program was written to track students' use of the sites. Behavior was measured by the initial choice of the short versus the longer version of the main page, the number of pages visited, and the amount of time spent on the site. Data were gathered from students at a small private university in the southwestern United States to test six hypotheses, which posited that subjects with higher involvement in a topic presented on the Web and a more positive attitude toward the Web would tend to select the longer text version, visit more pages, and spend more time on the site. While attitude toward the Web did not correlate significantly with any of the behavioral factors, the level of involvement was associated with the use of the sites in two of three hypotheses, but only partially in the manner hypothesized. Increased involvement with a Web topic did correlate with the choice of a longer, more detailed initial Web page, but was inversely related to the number of pages viewed, so that the higher the involvement, the fewer pages visited. An additional indicator of usage, the average amount …
An Empirical Investigation of Critical Factors that Influence Data Warehouse Implementation Success in Higher Educational Institutions
Data warehousing (DW) has in the last decade become the technology of choice for building data management infrastructures that provide organizations the decision-making capabilities needed to carry out their activities effectively. Despite its phenomenal growth and importance to organizations, the rate of DW implementation success has been less than stellar. Many DW implementation projects fail for technical or organizational reasons. There has been limited research on organizational factors and their role in DW implementations. It is important to understand the role and impact of both technical and organizational factors in DW implementations and their relative importance to implementation performance. A research model was developed to test the significance of technical and organizational factors in the three phases of implementation with respect to DW implementation performance. The independent variables were technical (data, technology, and expertise) and organizational (management, goals, users, organization). The dependent variable was performance (content, accuracy, format, ease of use, and timeliness). The data collection method was a Web-based survey of DW implementers and DW users, sampled (26) from a population of 108 identified DW implementations. Regression was used as the multivariate statistical technique to analyze the data. The results show that organizational factors are significantly related to performance, and that some variables in the post-implementation phase have a significant relationship with performance. Based on the results of the tests, the model was revised to reflect the relative impact of technical and organizational factors on DW performance. Results suggest that in some cases organizational factors have a significant relationship with DW implementation performance. The implications and interpretation of these results provide researchers and practitioners insights and a new perspective in the area of DW implementations.
Measuring the accuracy of four attributes of sound for conveying changes in a large data set.
Human auditory perception is suited to receiving and interpreting information from the environment, but this knowledge has not been used extensively in designing computer-based information exploration tools. It is not known which aspects of sound are useful for accurately conveying information in an auditory display. An auditory display was created using Pd, a graphical programming language used primarily to manipulate digital sound. The interface for the auditory display was a blank window. When the cursor was moved around in this window, the sound generated would change based on the underlying data value at any given point. An experiment was conducted to determine which attribute of sound most accurately represents data values in an auditory display. The four attributes of sound tested were frequency (sine waveform), frequency (sawtooth waveform), loudness, and tempo. Twenty-four subjects were given the task of finding the highest data point using sound alone, under each of the four sound treatments. Three dependent variables were measured: distance accuracy, numeric accuracy, and time on task. Repeated measures ANOVA procedures conducted on these variables did not rise to the level of statistical significance (α=.05). No sound treatment was more accurate than the others in representing the underlying data values. Fifty-two percent of the trials were accurate within 50 pixels of the highest data point (target). An interesting finding was the tendency for the frequency (sine waveform) treatment to figure in the least accurate trial attempts (38%). Loudness, on the other hand, accounted for very few (12.5%) of the least accurate trial attempts. In completing the experimental task, subjects employed four different search techniques: perimeter, parallel sweep, sector, and quadrant. The perimeter technique was the most commonly used.
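The value-to-sound mappings tested in the study can be sketched as simple linear scalings of a normalized data value onto each candidate attribute. The parameter ranges below are invented for illustration and do not reproduce the dissertation's Pd (Pure Data) patches.

```python
# Sketch: map a single normalized data value (0.0-1.0) onto each of the
# sound attributes under test.  All ranges are hypothetical.

def to_frequency(value, low_hz=220.0, high_hz=880.0):
    """Map a 0-1 data value to a pitch (usable for sine or sawtooth)."""
    return low_hz + value * (high_hz - low_hz)

def to_loudness(value, min_db=30.0, max_db=90.0):
    """Map a 0-1 data value to a loudness level in dB."""
    return min_db + value * (max_db - min_db)

def to_tempo(value, min_bpm=40.0, max_bpm=240.0):
    """Map a 0-1 data value to a pulse rate in beats per minute."""
    return min_bpm + value * (max_bpm - min_bpm)

# A higher underlying data value yields a higher pitch, a louder tone,
# and a faster pulse as the cursor moves over it.
print(to_frequency(0.5))  # 550.0
print(to_loudness(1.0))   # 90.0
print(to_tempo(0.0))      # 40.0
```

In the experiment, exactly one such mapping was active per treatment, so subjects judged data height from a single varying attribute at a time.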
Perceived value of journals for academic prestige, general reading and classroom use: A study of journals in educational and instructional technology.
Conducting research, evaluating research, and publishing scholarly works all play an extremely prominent role for university faculty members. Tenure and promotion decisions are greatly influenced by the perceived value of publications as viewed by members of faculty evaluation committees. Faculty members seeking tenure may be restricted to publishing in a small group of journals perceived to be valuable by members of an academic committee. This study attempted to determine the value of various kinds of periodicals (journals, magazines, and e-journals), based on three principal criteria, as perceived by professionals (university faculty, K-12 practitioners, and corporate trainers) in the educational/instructional technology (E/IT) field. The criteria for journal evaluation were Academic Prestige, General Reading, and Classroom Use. The perceived values of journals based on each criterion were compared to determine any significant differences. Members of the Association for Educational Communications and Technology (AECT) were asked to rate 30 journals in the E/IT field using the three criteria. Statistically significant differences were found among ratings for 63% of the journals. The statistical analyses indicated that differences in the perceived value of journals among E/IT professionals across the three criteria (Academic Prestige, General Reading, and Classroom Use) were statistically significant. Refereed journals were rated higher than nonrefereed journals on the Academic Prestige criterion. Survey respondents indicated that individual journals were not valued for the same reasons. This finding implies that any equitable measure for determining the value of faculty members' journal article publications would best be based on definable criteria determined by colleagues.
Lists of valued journals for each area of faculty assessment would provide standards of excellence both inside and outside the E/IT field for those who serve on tenure and promotion committees in educational institutions.
Children's Color Association for Digital Image Retrieval.
In the field of information science, attention has been focused on developing mature information retrieval systems that abstract information automatically from the contents of information resources, such as books, images, and films. As a subset of information retrieval research, content-based image retrieval systems automatically abstract elementary information from images in terms of colors, shapes, and texture. Color is the feature most commonly used in similarity measurement for content-based image retrieval systems. Human-computer interface design and image retrieval methods benefit from studies based on an understanding of their potential users. Today's children are exposed to digital technology at a very young age, and they will be the major technology users in five to ten years. This study focuses on children's color perception and color association with a controlled set of digital images. The method of survey research was used to gather data for this exploratory study of color association from a population of children, third through sixth graders. An online questionnaire with fifteen images was used to collect quantitative data on children's color selections. Face-to-face interviews investigated the rationale and factors affecting the color choices and children's interpretations of the images. The findings in this study indicate that the color children associated with an image was the one that took up the most space, or the biggest part, of the image. Another powerful factor in color selection was the vividness or saturation of the color. Colors that stood out the most generally attracted the greatest attention. Preferences for a color, character, or subject matter in an image also strongly affected children's color associations with images. One of the most unexpected findings was that children would choose a color to replace a color in an image. In general, children saw more things than were actually represented in the images. 
However, the children's interpretation …
An Evaluation of the Effect of Learning Styles and Computer Competency on Students' Satisfaction on Web-Based Distance Learning Environments
This study investigates the correlation between students' learning styles, computer competency, and student satisfaction in Web-based distance learning. Three hundred and one graduate students participated in the current study during the Summer and Fall semesters of 2002 at the University of North Texas. Participants took the courses 100% online and came to the campus only once, for software training. Computer competency and student satisfaction were measured using the Computer Skill and Use Assessment and the Student Satisfaction Survey questionnaires. Kolb's Learning Style Inventory measured students' learning styles. The study concludes that there is a significant difference among the different learning styles with respect to student satisfaction level when the subjects differ with regard to computer competency. For accommodating and diverging styles, a higher level of computer competency results in a higher level of student satisfaction, but for converging and assimilating styles, a higher level of computer competency suggests a lower level of student satisfaction. A significant correlation was found between computer competency and student satisfaction level within Web-based courses for accommodating styles; no significant results were found for the other learning styles.
Information systems assessment: development of a comprehensive framework and contingency theory to assess the effectiveness of the information systems function.
The purpose of this research is to develop a comprehensive IS assessment framework, using existing IS assessment theory as a base and incorporating suggestions from other disciplines. To validate the framework and to begin the investigation of current IS assessment practice, a survey instrument was developed. A small group of subject matter experts evaluated and improved the instrument. The instrument was further evaluated using a small sample of IS representatives. Results of this research include a reexamination of the IS function measurement problem using new frameworks of analysis, yielding (a) guidance for the IS manager or executive on which IS measures might best fit their organization, (b) further verification of the important measures most widely used by IS executives, (c) a comprehensive, theoretically derived IS assessment framework, and (d) enhancement of IS assessment theory by incorporating ideas from actual practice. The body of knowledge gains a comprehensive IS assessment framework that can be further tested for usefulness and applicability. Future research is recommended to substantiate and improve on these findings. Chapter 2 is a complete survey of prior research, subdivided by relevant literature divisions, such as organizational effectiveness, quality management, and IS assessment. Chapter 3 includes development of and support for the research questions, the IS assessment framework, and the research model. Chapter 4 describes how the research was conducted. It includes a brief justification for the research approach, a description of how the framework was evaluated, a description of how the survey instrument was developed and evaluated, a description of the participants and how they were selected, a synopsis of the data collection procedures, a brief description of follow-up procedures, and a summary. Chapter 5 presents the results of the research. Chapter 6 is a summary and conclusion of the research. 
Finally, included in the appendices are definitions …
The Physiology of Collaboration: An Investigation of Library-Museum-University Partnerships
Collaboration appears to be a magical solution for many problems when there is a scarcity of resources, a lack of knowledge or skills, and/or environmental threats. However, there is little knowledge about the nature of collaboration. A holistic conceptual framework was developed for the collaborative process, and the conceptualization used a systems-thinking approach. The author selectively chose conceptualizations and/or research by a limited subset of scholars whose ideas appeared most relevant and useful for exploring the type of collaboration studied here; in other words, the selection of the literature was eclectic. Multiple cases were used in this research to understand the factors that are components of collaborative effort among non-profit organizations and the relationships among those factors. This study also investigated the stages of the collaborative process. Data were collected from 54 participants who were partners in collaborative projects funded by the Institute of Museum and Library Services (IMLS). Among these 54 participants, 50 answered the online questionnaire and 38 participated in telephone interviews. The data collected were analyzed using cluster analysis, multidimensional scaling, internal consistency reliability, and descriptive statistics. The component factors of collaboration were grouped under the following seven concepts: trustworthiness, competence, dependency, misunderstanding and/or conflict, complexity, commitment, and mechanism of coordination. This study showed twelve relationships among these factors. For instance, different points of view and partners' capacity to maintain inter-organizational relationships were found to be opposite concepts. In addition, the findings in this study indicate that 84% of participants reported the presence of the five pre-defined stages: execution, networking, definition, relationship, and common evaluation.
The gathering and use of information by fifth grade students with access to Palm® handhelds.
Handheld computers may make a one-to-one computer-to-student ratio possible. The impact of the use of Palm® (Palm, Inc.) handhelds on information acquisition and use by 5th grade students in a North Texas school during a class research project was investigated. Five research questions were examined using observation, interviews, surveys, and document analysis. Are there differences in information gathering and use with the Palm among gifted, dyslexic, and regular learners? What relevance criteria do students use to evaluate a Web site to determine whether to download the site to the Palm and, afterwards, whether to use the downloaded site's information in the report? How do the Palms affect the writing process? Do the animations and concept maps produced on the Palm demonstrate understanding of the intended concepts? Are there significant differences in results (i.e., final product grades) between Palm users and non-Palm users? Three groups of learners in the class, gifted, dyslexic, and regular learners, participated in the study. The regular and dyslexic students reported using Web sites that had not been downloaded to the Palm. Students reported several factors used to decide whether to download Web sites, but the predominant deciding factor was the amount of information. The students used a combination of writing on paper and on the Palm in the preparation of the report. Many students flipped between two programs, FreeWrite and Fling-It, finding information and then writing the facts into the report. The peer review process was more difficult with the Palm. Most students had more grammatical errors in this research report than in previous research projects. By creating animated drawings on the Palm handheld, the students demonstrated their understanding of the invention, though sometimes the medium or the student's drawing skills limited the quality of the final product. Creating the animations was motivational and …
Knowledge synthesis in the biomedical literature: Nordihydroguaiaretic acid and breast cancer.
This dissertation refines knowledge synthesis from publicly accessible databases, based on the model of D.R. Swanson. Knowledge synthesis endeavors bring together two or more non-interactive literatures to create combinatorial research data on a specific topic. In this endeavor the biomedical literature was searched on the anti-neoplastic agent nordihydroguaiaretic acid (NDGA) for its potential role as a functional food in the chemoprevention of breast cancer. Bibliometric cocitation was utilized to identify complementary but non-interactive literatures in the disciplines of biomedicine and dietary science. The continuing specialization and fragmentation of the cancer literature degrades the potential usefulness of cross-disciplinary research and information. As the biomedical sciences become more specialized, the potential increases for isolation of discoveries and for failures to connect science to the needs of the people. Within the information science discipline several techniques are available to bridge the isolation between discoveries recorded in different sets of literatures. Electronic database searching with combinatorial keyword entries, syllogistic modeling, and bibliometric author cocitation analysis are the principal techniques applied in this endeavor. The research questions address the absence or presence of human in vivo research on breast cancer with the potentially chemopreventive functional food NDGA. Utilizing a syllogistic model, the literatures of functional foods, nordihydroguaiaretic acid, and breast cancer were searched with designated combinatorial keywords. The documents retrieved were subjected to author cocitation analysis to demonstrate disjointness or connectivity of the two complementary literatures. The results demonstrated a possible preventive relationship between breast cancer in women and nordihydroguaiaretic acid, a phytochemical antioxidant and potential functional food.
The results of this study are consistent with D.R. Swanson's pioneering work in knowledge synthesis. Swanson's methods can be used to identify non-interactive, disjoint literatures. Continuing support for his techniques has been demonstrated.
Testing a model of the relationships among organizational performance, IT-business alignment and IT governance.
Information Technology (IT) is often viewed as a resource that is capable of enhancing organizational performance. However, it is difficult for organizations to measure the actual contribution of IT investments. Despite an abundance of literature, there is an insufficiency of generally applicable frameworks and instruments to help organizations definitively assess the relationship among organizational performance, IT-business alignment, and IT governance. Previous studies have emphasized IT-business alignment as an important enabler of organizational effectiveness; however, the direct and indirect effects of IT governance have not been incorporated into these studies. The purpose of this study was (1) to propose a new model that defines the relationships among IT governance, IT-business alignment, and organizational performance, (2) to develop and validate measures for the IT governance and IT-business alignment constructs, and (3) to test this IT Governance-Alignment-Performance or "IT GAP" model. This study made some novel contributions to the understanding of the factors affecting organizational performance. The quest for IT-business alignment in the MIS literature has been based on the presumption that IT contributes directly to organizational performance. However, this study found that although IT-business alignment does contribute to organizational performance, IT governance is an important antecedent of both IT-business alignment and organizational performance. The major contributions of this work are the development and validation of uni-dimensional scales for both IT-business alignment and IT governance, and the confirmation of the validity of the IT GAP model to explain the hypothesized relationships among the three constructs. Future studies may improve upon this research by using different organizational settings, industries, and stakeholders.
This study indicates that in order for organizations to improve the value, contribution, and alignment of IT investments they first need to improve the ways in which they govern their IT activities and the processes and mechanisms by which IT decisions are made.
The Effect of Information Literacy Instruction on Library Anxiety Among International Students
This study explored what effect information literacy instruction (ILI) may have on both a generalized anxiety state and library anxiety specifically. The population studied was international students using resources in a community college. Library anxiety among international students begins with certain barriers that cause anxiety (i.e., language/communication barriers, adjusting to a new education/library system, and general cultural adjustments). Library anxiety is common among college students and is characterized by negative emotions including rumination, tension, fear, and mental disorganization (Jiao & Onwuegbuzie, 1999a). This often occurs when a student contemplates conducting research in a library and is due to any number of perceived inabilities about using the library. In order for students to become successful in their information seeking behavior, this anxiety needs to be reduced. The study used two groups of international students enrolled in the English for Speakers of Other Languages (ESOL) program taking credit courses. Each student completed Bostick's Library Anxiety Scale (LAS) and Spielberger's State-Trait Anxiety Inventory (STAI) to assess anxiety level before and after treatment. Subjects were given a research assignment that required them to use library resources. Treatment: Group 1 (experimental group) attended several library instruction classes (the instruction used Kuhlthau's information search process model). Group 2 (control group) worked on the assignment in the library but did not receive any formal library instruction. After the treatment the researcher and ESOL program instructor(s) measured the level of anxiety between groups. ANCOVA was used to analyze Hypotheses 1 and 2, which compared pretest and posttest for each group. Research assignment grades were used to analyze Hypothesis 3, comparing outcomes between the two groups.
The results of the analysis indicated that ILI was associated with reduced state and library anxiety among international students given an assignment requiring library resources.
A Mythic Perspective of Commodification on the World Wide Web
Capitalism's success, according to Karl Marx, is based on continued development of new markets and products. As globalization shrinks the world marketplace, corporations are forced to seek both new customers and new products to sell. Commodification is the process of transforming objects, ideas, and even people into merchandise. The recent growth of the World Wide Web has caught the attention of the corporate world, which is attempting to convert a free-share-based medium into a profit-based outlet. To be successful, corporations must change Web users' perception of the nature of the Web itself. This study asks the question: Is there mythic evidence of commodification on the World Wide Web? It examines how the World Wide Web is presented to readers of three national publications (Wired, Newsweek, and Business Week) from 1993 to 2000. It uses Barthes' two-tiered model of myths to examine the descriptors used to modify and describe the World Wide Web. The descriptors were clustered into 11 general categories: connectivity, social, being, scene, consumption, revolution, tool, value, biology, arena, and other. Wired articles did not demonstrate a trend in categorical change from 1993 to 2000; the category of choice shifted back and forth among revolution, connectivity, scene, and being. Newsweek articles demonstrated an obvious directional shift: connectivity is the dominant myth from 1994 to 1998, when the revolution category comes to dominate. Similarly, Business Week follows the prevailing myth of connectivity from 1994 to 1997; from 1998 on, the competition-related categories of revolution and arena lead all categories. The study finds evidence of commodification on the World Wide Web, based on the trend in categories in Newsweek and Business Week that moves from a foundational myth presenting a perception of cooperation in 1994 to one of competition in 1998 and later. The study recommends further in-depth research of the target publications, …
An Observational Investigation of On-Duty Critical Care Nurses' Information Behavior in a Nonteaching Community Hospital
Critical care nurses work in an environment rich in informative interactions. Although there have been post hoc self-report survey studies of nurses' information seeking, there have been no observational studies of the patterns and content of their on-duty information behavior. This study used participant observation and in-context interviews to describe 50 hours of the observable information behavior of a representative sample of critical care nurses in a 20-bed critical care hospital unit. The researcher used open, in vivo, and axial coding to develop a grounded theory model of their consistent pattern of multimedia interactions. The resulting Nurse's Patient-Chart Cycle describes nurses' activities during the shift as centering on a regular alternation between the patient and the patient's chart (various record systems), clearly bounded by nursing "report" interactions at the beginning and the end of the shift. The nurses' demeanor markedly changed between interactions with the chart and interactions with the patient. Other informative interactions were observed with other health care workers and the patient's family, friends, and visitors. The nurses' information seeking was centered on the patient. They mostly sought information from people, the patient record, and other digital systems. They acted on or passed on most of the information they found. Some information they recorded for their personal use during the shift. The researcher observed the nurses using mostly patient-specific information, but they also used some social and logistic information. They occasionally sought knowledge-based information. Barriers to information acquisition included illegible handwriting, difficult navigation of online systems, equipment failure, unavailable people, social protocols, and mistakes caused by multi-tasking people working with multiple complex systems. No formal use was observed of standardized nursing diagnoses, nursing interventions, or nursing outcomes taxonomies.
While the nurses expressed respect for evidence-based practice, there clearly was no time or opportunity for reading research …
Wayfinding tools in public library buildings: A multiple case study.
Wayfinding is the process of using one or more tools to move from one location to another in order to accomplish a task or to achieve a goal. This qualitative study explores the process of wayfinding as it applies to locating information in a public library. A group of volunteers were asked to find a selection of items in three types of libraries (traditional, contemporary, and modern). The retrieval process was timed, and the reactions of the volunteers were recorded, documented, and analyzed. The impact of various wayfinding tools (architecture, layout, color, signage, computer support, and collection organization) on the retrieval process was also identified. The study revealed that many of the wayfinding tools currently available in libraries do not facilitate item retrieval. Inconsistencies, ambiguities, obstructions, disparities, and operational deficiencies all contributed to end-user frustration and retrieval failure. The study suggests that if these issues are not addressed, library patrons (end users increasingly interested in finding information with minimal expenditure of time and effort) may turn to other information-retrieval strategies and abandon a system that they find confusing and frustrating.
Cognitive Playfulness, Innovativeness, and Belief of Essentialness: Characteristics of Educators who have the Ability to Make Enduring Changes in the Integration of Technology into the Classroom Environment.
Research on the adoption of innovation is largely limited to factors affecting immediate change with few studies focusing on enduring or lasting change. The purpose of the study was to examine the personality characteristics of cognitive playfulness, innovativeness, and essentialness beliefs in educators who were able to make an enduring change in pedagogy based on the use of technology in the curriculum within their assigned classroom settings. The study utilized teachers from 33 school districts and one private school in Texas who were first-year participants in the Intel® Teach to the Future program. The research design focused on how cognitive playfulness, innovativeness, and essentialness beliefs relate to a sustained high level of information technology use in the classroom. The research questions were: 1) Are individuals who are highly playful more likely to continue to demonstrate an ability to integrate technology use in the classroom at a high level than those who are less playful? 2) Are individuals who are highly innovative more likely to continue to demonstrate an ability to integrate technology use in the classroom at a high level than those who are less innovative? 3) Are individuals who believe information technology use is critical and indispensable to their teaching more likely to continue to demonstrate an ability to integrate technology use in the classroom at a high level than those who believe it is supplemental and not essential? The findings of the current study indicated that playfulness, innovativeness, and essentialness scores as defined by the scales used were significantly correlated to an individual's sustained ability to use technology at a high level. Playfulness was related to the educator's level of innovativeness, as well. Also, educators who believed the use of technology was critical and indispensable to their instruction were more likely to be able to demonstrate a sustained …
Information Needs of Art Museum Visitors: Real and Virtual
Museums and libraries are considered large repositories of human knowledge and human culture. They have similar missions and goals in distributing accumulated knowledge to society. Current digitization projects allow both museums and libraries to reach a broader audience and share their resources with a variety of users. While studies of information seeking behavior, retrieval systems, and metadata in library science have a long history, such research studies in museum environments are at an early experimental stage. There are few studies concerning the information seeking behavior and needs of virtual museum visitors, especially with respect to the use of images in museums' collections available on the Web. The current study identifies the preferences of a variety of user groups regarding information on current exhibits, museum collections metadata, and the use of multimedia. The study of the information seeking behavior of user groups of museum digital collections or cultural collections allows examination and analysis of users' information needs and the organization of cultural information, including descriptive metadata and the quantity of information that may be required. In addition, the study delineates information needs that different categories of users may have in common: teachers in high schools, students in colleges and universities, museum professionals, art historians and researchers, and the general public. This research also compares the informational and educational needs of real visitors with the needs of virtual visitors. Educational needs of real visitors are based on various studies conducted and summarized by Falk and Dierking (2000), and on an evaluation of art museum websites previously conducted to support the current study.
A Comparative Analysis of Style of User Interface Look and Feel in a Synchronous Computer Supported Cooperative Work Environment
The purpose of this study is to determine whether the style of a user interface (i.e., its look and feel) has an effect on the usability of a synchronous computer supported cooperative work (CSCW) environment for delivering Internet-based collaborative content. The problem motivating this study is that people who are located in different places need to be able to communicate with one another. One way to do this is by using complex computer tools that allow users to share information, documents, programs, etc. As an increasing number of business organizations require workers to use these types of complex communication tools, it is important to determine how users regard these tools and whether they are perceived to be useful. If a tool, or interface, is not perceived to be useful, then it is often not used, or used ineffectively. As organizations strive to improve communication with and among users by providing more Internet-based collaborative environments, the users' experience in this form of delivery may be tied to a style of user interface look and feel that could negatively affect their overall acceptance of and satisfaction with the collaborative environment. The significance of this study is that it applies the technology acceptance model (TAM) as a tool for evaluating style of user interface look and feel in a collaborative environment, and attempts to predict which factors of that model, perceived ease of use and/or perceived usefulness, could lead to better acceptance of collaborative tools within an organization.
The Impact of Predisposition Towards Group Work on Intention to Use a CSCW System
Groupware packages are increasingly being used to support content delivery, class discussion, student-to-student and student-to-faculty interactions, and group work on projects. This research focused on groupware packages that are used to support students who are located in different places but who are assigned group projects as part of their coursework requirements. In many cases, students are being asked to use unfamiliar technologies that are very different from those that support personal productivity. For example, computer-supported cooperative work (CSCW) technology is different from other more traditional, stand-alone software applications because it requires the user to interact with the computer as well as with other users. However, familiarity with the technology is not the only requirement for successful completion of an assigned group project. For a group to be successful, it must also have a desire to work together on the project. If this prerequisite is not present within the group, then the technology will only create additional communication and coordination barriers. How much of an impact does each of these factors have on the acceptance of CSCW technology? The significance of this study is threefold. First, this research contributed to understanding how a user's predisposition toward group work affects their acceptance of CSCW technology. Second, it helped identify ways to overcome some of the obstacles associated with group work and the use of CSCW technology in an academic online environment. Finally, it helped identify early adopters of CSCW software and how these users can form the critical mass required to diffuse the technology. This dissertation reports the impact of predisposition toward group work and prior computer experience on the intention to use synchronous CSCW. It was found that predisposition toward group work was not only positively associated with perceived usefulness; it was also related to intention to use. It …
Perceived features and similarity of images: An investigation into their relationships and a test of Tversky's contrast model.
The creation, storage, manipulation, and transmission of images have become less costly and more efficient. Consequently, the numbers of images and their users are growing rapidly. This poses challenges to those who organize and provide access to them. One of these challenges is similarity matching. Most current content-based image retrieval (CBIR) systems, which can extract only low-level visual features such as color, shape, and texture, use similarity measures based on geometric models of similarity. However, most human similarity judgment data violate the metric axioms of these models. Tversky's (1977) contrast model, which defines similarity as a feature contrast task and equates the degree of similarity of two stimuli to a linear combination of their common and distinctive features, explains human similarity judgments much better than the geometric models. This study tested the contrast model as a conceptual framework to investigate the nature of the relationships between features and similarity of images as perceived by human judges. Data were collected from 150 participants who performed two tasks: an image description task and a similarity judgment task. Qualitative (content analysis) and quantitative (correlational) methods were used to seek answers to four research questions related to the relationships between common and distinctive features and similarity judgments of images, as well as measures of their common and distinctive features. Structural equation modeling, correlation analysis, and regression analysis confirmed the relationships between perceived features and similarity of objects hypothesized by Tversky (1977).
Tversky's (1977) contrast model, based upon a combination of two methods for measuring common and distinctive features and two methods for measuring similarity, produced statistically significant structural coefficients between the independent latent variables (common and distinctive features) and the dependent latent variable (similarity). This model fit the data well for a sample of 30 (435 pairs of) images and 150 participants (χ² = 16.97, …
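The contrast model described in this abstract, similarity as a weighted combination of common features minus distinctive features, can be sketched in a few lines of Python. The feature sets, the cardinality-based salience measure, and the weight values below are illustrative assumptions for exposition, not the parameters or data used in the study:

```python
def tversky_similarity(a, b, theta=1.0, alpha=0.5, beta=0.5):
    """Tversky (1977) contrast model:
    S(a, b) = theta*f(A ∩ B) - alpha*f(A - B) - beta*f(B - A),
    where f is taken here (as a simplifying assumption) to be set cardinality."""
    a, b = set(a), set(b)
    common = len(a & b)      # features shared by both stimuli
    only_a = len(a - b)      # features distinctive to a
    only_b = len(b - a)      # features distinctive to b
    return theta * common - alpha * only_a - beta * only_b

# Hypothetical perceived-feature sets for two images
img1 = {"red", "round", "fruit", "glossy"}
img2 = {"red", "round", "fruit", "matte"}
print(tversky_similarity(img1, img2))  # 1.0*3 - 0.5*1 - 0.5*1 = 2.0
```

Because theta, alpha, and beta are free weights, the model is not forced to be symmetric (setting alpha ≠ beta makes S(a, b) ≠ S(b, a)), which is one way it accommodates human judgment data that violate the metric axioms of geometric similarity models.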
Privacy Concerns and Personality Traits Influencing Online Behavior: A Structural Model
The concept of privacy has proven difficult to analyze because of its subjective nature and susceptibility to psychological and contextual influences. This study challenges the concept of privacy as a valid construct for addressing individuals' concerns regarding online disclosure of personal information, based on the premise that underlying behavioral traits offer a more reliable and temporally stable measure of privacy-oriented behavior than do snapshots of environmentally induced emotional states typically measured by opinion polls. This study investigated the relationship of personality characteristics associated with individuals' general privacy-related behavior to their online privacy behaviors and concerns. Two latent constructs, Functional Privacy Orientation and Online Privacy Orientation, were formulated. Functional Privacy Orientation is defined as a general measure of individuals' perception of control over their privacy. It was measured using the factors General Disclosiveness, Locus of Control, Generalized Trust, Risk Orientation, and Risk Propensity as indicator variables. Online Privacy Orientation is defined as a measure of individuals' perception of control over their privacy in an online environment. It was measured using the factors Willingness to Disclose Online, Level of Privacy Concern, Information Management Privacy Concerns, and Reported Online Disclosure as indicator variables. A survey questionnaire that included two new instruments to measure online disclosure and a willingness to disclose online was used to collect data from a sample of 274 adults. Indicator variables for each of the latent constructs, Functional Privacy Orientation and Online Privacy Orientation, were evaluated using corrected item-total correlations, factor analysis, and coefficient alpha. 
The measurement models and relationship between Functional Privacy Orientation and Online Privacy Orientation were assessed using exploratory factor analysis and structural equation modeling respectively. The structural model supported the hypothesis that Functional Privacy Orientation significantly influences Online Privacy Orientation. Theoretical, methodological, and practical implications and suggestions for analysis of privacy concerns and behavior are presented.
Quality Management in Museum Information Systems: A Case Study of ISO 9001-2000 as an Evaluative Technique
Museums are service-oriented information systems that provide access to information bearing materials contained in the museum's collections. Within museum environments, the primary vehicle for quality assurance and public accountability is the accreditation process of the American Association of Museums (AAM). Norbert Wiener founded the field of cybernetics, employing concepts of information feedback as a mechanism for system modification and control. W. Edwards Deming applied Wiener's principles to management theory, initiating the wave of change in manufacturing industries from production-driven to quality-driven systems. Today, the principles are embodied in the ISO 9000 International Standards for quality management systems (QMS), a globally-recognized set of standards, widely employed as a vehicle of quality management in manufacturing and service industries. The International Organization for Standardization defined a process for QMS registration against ISO 9001 that is similar in purpose to accreditation. This study's goals were to determine the degree of correspondence between elements of ISO 9001 and quality-related activities within museum environments, and to ascertain the relevance of ISO 9001-2000 as a technique of museum evaluation, parallel to accreditation. A content analysis compared museum activities to requirements specified in the ISO 9001-2000 International Standard. The study examined museum environment surrogates which consisted of (a) web sites of nine museum studies programs in the United States and (b) web sites of two museum professional associations, the AAM and the International Council of Museums (ICOM). Data items consisted of terms and phrases from the web sites and the associated context of each item. Affinity grouping of the data produced high degrees of correspondence to the categories and functional subcategories of ISO 9001. 
Many quality-related activities were found at the operational levels of museum environments, although not integrated as a QMS. If activities were unified as a QMS, the ISO 9001 Standard has potential for application as …
A Complex Systems Model for Understanding the Causes of Corruption: Case Study - Turkey
This dissertation attempts to draw an explanatory interdisciplinary framework to clarify the causes of systemic corruption. Following an extensive review of the political science, economics, and sociology literatures on the issue, a complex systems theoretical model is constructed. A political system consists of five main components: society, interest aggregators, the legislature, the executive, and the private sector, together with the human actors in these domains. It is hypothesized that when the legitimacy level of the system is low and the morality of the systemic actors is flawed, selected political, social, and economic incentives and opportunities that may exist within the structure of the systemic components might, individually or as a group, trigger corrupt transactions between the actors of the system. If left untouched, corruption might spread through the system by repetition and social learning, eventually becoming a source of further corruption itself. By eroding the already weak legitimacy and morality, it may increase the risk of corruption even further. This theoretical explanation is used to study the causes of systemic corruption in the Turkish political system. Under the guidance of complex systems theory, the initial systemic conditions (the legacy of Turkey's predecessor, the Ottoman Empire) are evaluated first, and then the political, social, and economic factors that are presumed to breed corruption in contemporary Turkey are investigated. In this section, special focus is given to the formation and operation of amoral social networks and their contribution to the entrenchment of corruption within the system. Based upon the findings of the case study, the theoretical model informed by the literature is refined: thirty-five system-level and actor-level variables are identified as related to systemic corruption, and the nature of the causality between them and corruption is explained.
Although the results of this study cannot be generalized, for obvious reasons, the analytical framework …
Global response to cyberterrorism and cybercrime: A matrix for international cooperation and vulnerability assessment.
Cyberterrorism and cybercrime present new challenges for law enforcement and policy makers. Due to their transnational nature, a real and sound response to such threats requires international cooperation involving participation of all concerned parties in the international community. However, the vulnerability that emerges from increased reliance on technology, a lack of legal measures, and a lack of cooperation at the national and international levels represents a real obstacle to an effective response to these threats. In sum, the lack of global consensus in responding to cyberterrorism and cybercrime is the general problem. Terrorists and cyber criminals will exploit vulnerabilities, whether technical, legal, political, or cultural. Such a broad range of vulnerabilities can be dealt with only by comprehensive cooperation, which requires efforts at both the national and international levels. The "Vulnerability-Comprehensive Cooperation-Freedom Scale," or "Ozeren Scale," identified the variables that construct the scale, based on expert opinions. The study also presented a typology of cyberterrorism involving three general classifications: disruptive and destructive information attacks; facilitation of technology to support the ideology; and communication, fund raising, recruitment, and propaganda (C-F-R-P). Such a typology is expected to help those in decision-making and investigative positions, as well as academicians in the area of terrorism. The matrix for international cooperation and vulnerability assessment is expected to be used as a model for a global response to cyberterrorism and cybercrime.
Makeshift Information Constructions: Information Flow and Undercover Police
This dissertation presents the social virtual interface (SVI) model, which was born out of a need to develop a viable model of the complex interactions, information flow, and information seeking behaviors among undercover officers. The SVI model was created from a combination of various philosophies and models in the literature of information seeking, communication, and philosophy. The questions this research answers are as follows: 1. Can we make use of models and concepts familiar to or drawn from information science to construct a model of undercover police work that effectively represents the large number of entities and relationships? and 2. Will undercover police officers recognize this model as realistic? This study used a descriptive qualitative research method to examine the research questions. An online survey and a hard copy survey were distributed to police officers who had worked in an undercover capacity. In addition, groups of officers were interviewed about their opinions of the SVI model. The data gathered were analyzed, and the model was validated by the results of the survey and interviews.