UNT Libraries - 212 Matching Results

Search Results

Increasing Telecommunications Channel Capacity: Impacts on Firm Profitability
In calling for the deployment of high-capacity telecommunications infrastructures, the Clinton Administration is relying on market forces to drive demand toward self-sustaining development. There is little doubt that many firms will embrace the new telecommunications services for a variety of reasons including market differentiation, vertical market integration, and other organization-specific factors. However, there is little evidence at the firm level that adopting the use of increased-capacity telecommunications technologies is associated with improvements in firm profitability. This study seeks to identify the presence of impacts on firm income that can be associated with the adoption of T1 telecommunications services.
An Analysis of the Ability of an Instrument to Measure Quality of Library Service and Library Success
This study consisted of an examination of how service quality should be measured within libraries and how library service quality relates to library success. A modified version of the SERVQUAL instrument was evaluated to determine how effectively it measures library service quality. Instruments designed to measure information center success and information system success were evaluated to determine how effectively they measure library success and how they relate to SERVQUAL. A model of library success was developed to examine how library service quality relates to other variables associated with library success. Responses from 385 end users at two U.S. Army Corps of Engineers libraries were obtained through a mail survey. Results indicate that library service quality is best measured with a performance-based version of SERVQUAL, and that measuring importance may be as critical as measuring expectations for management purposes. Results also indicate that library service quality is an important factor in library success and that library success is best measured with a combination of SERVQUAL and library success instruments. The findings have implications for the development of new instruments to more effectively measure library service quality and library success as well as for the development of new models of library service quality and library success.
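For readers comparing the two scoring approaches named in this abstract, the brief Python sketch below contrasts a gap-based SERVQUAL score (performance minus expectation) with a purely performance-based score; the item values are hypothetical and are not drawn from the study's survey.

```python
# Illustrative only: hypothetical responses on five service-quality items (1-7 scale).
performance = [6, 5, 7, 6, 5]    # perceived service delivered
expectation = [7, 6, 7, 7, 6]    # service the respondent expected

# Classic SERVQUAL gap score: average of (performance - expectation).
gap_score = sum(p - e for p, e in zip(performance, expectation)) / len(performance)

# Performance-based variant: average of the performance ratings alone.
perf_score = sum(performance) / len(performance)

print(f"gap-based score:        {gap_score:.2f}")
print(f"performance-only score: {perf_score:.2f}")
```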
Information Use Environment of Self-managed Teams: A Case Study
This research investigated how self-managed teams get the information they need to perform their job tasks. Two important factors prompted this study: the growing importance of self-managed teams in the workplace and the impact of the information system on team performance.
A multi-dimensional entropy model of jazz improvisation for music information retrieval.
Jazz improvisation provides a case context for examining information in music; entropy provides a means for representing music for retrieval. Entropy measures are shown to distinguish between different improvisations on the same theme, thus demonstrating their potential for representing jazz information for analysis and retrieval. The calculated entropy measures are calibrated against human representation by means of a case study of an advanced jazz improvisation course, in which synonyms for "entropy" are frequently used by the instructor. The data sets are examined for insights into music information retrieval, music information behavior, and music representation.
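As a rough illustration of the kind of measure described here, the sketch below computes Shannon entropy over a pitch-class sequence; the two sequences are invented, and the study's multi-dimensional model is not reproduced.

```python
# Minimal sketch, not the dissertation's code: Shannon entropy of the empirical
# distribution of pitch classes in an improvised line.
from collections import Counter
import math

def shannon_entropy(events):
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical pitch-class sequences for two improvisations on the same theme.
solo_a = ["C", "E", "G", "C", "E", "G", "A", "C"]       # repetitive, lower entropy
solo_b = ["C", "Db", "E", "F#", "G", "Bb", "A", "B"]    # varied, higher entropy
print(shannon_entropy(solo_a), shannon_entropy(solo_b))
```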
A Conceptual Map for Understanding the Terrorist Recruitment Process: Observation and Analysis of Turkish Hezbollah Terrorist Organizations.
Terrorism is a historical problem; however, it has become one of the biggest problems of the 21st century. September 11 and the subsequent Madrid, Istanbul, and London attacks showed that it is the most significant problem threatening world peace and security. Governments have started to deal with terrorism by improving security measures and making new investments to stop it. Most governments and scholars focus on immediate threats and causes of terrorism instead of looking at long-term solutions such as the root causes and underlying reasons of terrorism and the recruitment practices of terrorist organizations. If terrorist recruitment does not stop, then it is safe to say terrorist activities cannot be stopped. This study focused on the recruitment process by observing two different terrorist organizations, DHKP/C and Turkish Hezbollah. The researcher brings 13 years of field experience and first-person data gathered from inside the terrorist organizations. The research questions of this study were: (i) How can an individual be prevented from joining or carrying out terrorist activities? (ii) What factors are correlated with joining a terrorist organization? (iii) What are the recruitment processes of the DHKP/C, PKK, and Turkish Hezbollah? (iv) Is there any common process of becoming a member of these three terrorist organizations? and (v) What are the similarities and differences among these terrorist organizations? As a result of this analysis, a terrorist recruitment process map was created. With the help of this map, social organizations such as families and schools may be able to identify ways to prevent individuals from joining terrorist organizations. The map will also be helpful to government organizations such as counterterrorism and intelligence agencies in achieving the same goal.
Constraints on Adoption of Innovations: Internet Availability in the Developing World.
In a world that is increasingly united in time and distance, I examine why the world is increasingly divided socially, economically, and digitally. Using data for 35 variables from 93 countries, I separate the countries into groups of 31 each by gross domestic product per capita. These groups of developed, lesser developed, and least developed countries are used in comparative analysis. Through a review of relevant literature and tests of bivariate correlation, I select eight key variables that are significantly related to information communication technology development and to human development. For this research, adoption of the Internet in the developing world is the innovation of particular interest. Thus, for comparative purposes, I chose Internet Users per 1000 persons per country and the Human Development Index as the dependent variables upon which the independent variables are regressed. Although Internet users are small in number among the least developed countries, I find Internet use to be the most powerful influence on human development for the poorest countries. The research focuses on key obstacles as well as variables of opportunity for Internet usage in developing countries. The greatest obstacles are in fact related to Internet availability and the cost/need ratio for infrastructure expansion. However, innovations for expanded Internet usage in developing countries are expected to show positive results for increased Internet usage, as well as for greater human development and human capital. In addition to the diffusion of innovations in terms of the Internet, the diffusion of cultures through migration is also discussed in terms of the effect on social capital and the drain on human capital from developing countries.
Detecting the Presence of Disease by Unifying Two Methods of Remote Sensing.
There is currently no effective tool available to quickly and economically measure a change in landmass for biomedical professionals and environmental specialists. The purpose of this study is to structure and demonstrate a statistical change-detection method using remotely sensed data that can detect the presence of an infectious land-borne disease. Data sources included the Texas Department of Health database, which provided the types of infectious land-borne diseases and indicated the geographical area to study. Methods of data collection included gathering images produced by digital orthophoto quadrangle, aerial videography, and Landsat. A method was also developed to statistically identify the severity of changes in the landmass over a three-year period. Data analysis included using a unique statistical detection procedure to measure the severity of change in landmass when the disease was not present and when it was present. The statistical detection method was applied to two different remotely sensed platform types and again to two like remotely sensed platform types. The results indicated that when the statistical change-detection method was used with two different types of remote sensing mediums (i.e., digital orthophoto quadrangle and aerial videography), the results were negative due to skewed and unreliable data. However, when two like remote sensing mediums were used (i.e., videography to videography and Landsat to Landsat), the results were positive and the data were reliable.
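A minimal sketch of a generic per-pixel change-severity calculation of the sort the abstract describes, assuming two co-registered single-band images as NumPy arrays; it stands in for, and is not, the study's actual detection procedure.

```python
import numpy as np

def change_severity(image_t1, image_t2, z_threshold=2.0):
    """Fraction of pixels whose change between two dates is statistically unusual."""
    diff = image_t2.astype(float) - image_t1.astype(float)
    z = (diff - diff.mean()) / diff.std()      # standardize the change image
    return (np.abs(z) > z_threshold).mean()    # share of landmass flagged as changed

rng = np.random.default_rng(0)
before = rng.normal(100, 10, (50, 50))         # hypothetical reflectance values
after = before + rng.normal(0, 10, (50, 50))
after[10:20, 10:20] += 40                      # simulated patch of land-cover change
print(f"severity: {change_severity(before, after):.3f}")
```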
Coyote Ugly Librarian: A Participant Observer Examination of Knowledge Construction in Reality TV.
Reality TV is the most popular genre of television programming today. The number of reality television shows has grown exponentially over the last fifteen years since the premiere of The Real World in 1992. Although reality TV uses styles similar to those used in documentary film, the “reality” of the shows is questioned by critics and viewers alike. The current study focuses on the “reality” that is presented to viewers and how that “reality” is created and may differ from what the participants of the shows experience. I appeared on two reality shows, Faking It and That's Clever, and learned a great deal as a participant observer. Within the study, I outline my experience and demonstrate how editing changed the reality I experienced into what was presented to the viewers. O'Connor's (1996) representation context web serves as a model for the realities created through reality television. People derive various benefits from watching reality TV. Besides the obvious entertainment value of reality TV, viewers also gather information via this type of programming. Viewers want to see real people on television reacting to unusual circumstances without the use of scripts. By surveying reality TV show viewers and participants, this study gives insight into how real the viewers believe the shows are and how authentic they actually are. If these shows are presented as reality, viewers are probably taking what they see as historical fact. The results of the study indicate more must be done so that the “reality” of reality TV does not misinform viewers.
Discovering a Descriptive Taxonomy of Attributes of Exemplary School Library Websites
This descriptive study examines effective online school library practice. A Delphi panel selected a sample of 10 exemplary sites and helped to create two research tools--taxonomies designed to analyze the features and characteristics of school library Websites. Using the expert-identified sites as a sample, a content analysis was conducted to systematically identify site features and characteristics. Anne Clyde's longitudinal content analysis of school library Websites was used as a baseline to examine trends in practice; in addition, the national guidelines document, Information Power: Building Partnerships for Learning, was examined to explore ways in which the traditional mission and roles of school library programs are currently translated online. Results indicated great variation in depth and coverage even among Websites considered exemplary. Sites in the sample are growing more interactive and student-centered, using blogs as supplemental communication strategies. Nevertheless, even these exemplary sites were slow to adopt advances in technology to meet the learning needs and interests of young adult users. Ideally, the study's findings will contribute to an understanding of the state of the art, serve to identify trends, and serve as a guide to practitioners in planning, developing, and maintaining school library Websites.
The Effect of Media on Citizens' Fear of Crime in Turkey.
This study was conducted on-site in Istanbul, Turkey, to determine the effects that mass media has on citizens' perceptions about fear of crime, in particular, and fear, in general. Specifically, the study was designed to (1) determine the tendency of citizens' media consumption, (2) determine the level of fear of crime among Turkish citizens, (3) establish the effect of media on citizens' fear of crime, and (4) determine if gender, age, educational level, neighborhood, and monthly income have an independent effect on fear of crime. To achieve this purpose, after administering a survey in Istanbul, the researcher collected appropriate data and then utilized regression analysis to examine the relationship between media variables and fear of crime. A survey consisting of three parts was administered to 545 Turkish citizens over the age of 18 who currently reside in Istanbul, Turkey. In Part I of the survey, respondents were asked to identify their trends in relation to media consumption, and in Part II respondents were asked to report their feelings about fear of crime. Finally, Part III consisted of socio-demographic characteristics including gender, age, marital status, level of education, and income. The media variables used for this study were general TV viewing, watching crime dramas, watching TV news, listening to radio news, reading newspaper news, and reading Internet news. Regarding the independent effects of socio-demographic variables on fear of crime, only gender was found to be significantly related, thereby supporting the research hypothesis. Of the six media variables, only watching crime dramas and reading Internet news were found to be related to individuals' fear of crime; however, this relationship disappeared after controlling for socio-demographic variables. In addition, no cultivation effect could be found among the sub-groups of the sample.
The Effect of Personality Type on the Use of Relevance Criteria for Purposes of Selecting Information Sources.
Even though information scientists generally recognize that relevance judgments are multidimensional and dynamic, there is still discussion and debate regarding the degree to which certain internal (cognition, personality) and external (situation, social relationships) factors affect the use of criteria in reaching those judgments. Much of the debate centers on the relationship of those factors to the criteria and reliable methods for measuring those relationships. This study researched the use of relevance criteria to select an information source by undergraduate students whose task was to create a course schedule for a semester. During registration periods, when creating their semester schedules, students filled out a two-part questionnaire. After completing the questionnaire, the students completed a Myers-Briggs Type Indicator instrument to determine their personality type. Data were analyzed using one-way ANOVAs and chi-square tests. A positive correlation exists between personality type as expressed by the MBTI and the information source selected as most important by the subject. A correlation also exists between personality type and relevance criteria use. The correlation is stronger for some criteria than for others. Therefore, one can expect personality type to have an effect on the use of relevance criteria while selecting information sources.
Information Structures in Notated Music: Statistical Explorations of Composers' Performance Marks in Solo Piano Scores
Written notation has a long history in many musical traditions and has been particularly important in the composition and performance of Western art music. This study adopted the conceptual view that a musical score consists of two coordinated but separate communication channels: the musical text and a collection of composer-selected performance marks that serve as an interpretive gloss on that text. Structurally, these channels are defined by largely disjoint vocabularies of symbols and words. While the sound structures represented by musical texts are well studied in music theory and analysis, the stylistic patterns of performance marks and how they acquire contextual meaning in performance is an area with fewer theoretical foundations. This quantitative research explored the possibility that composers exhibit recurring patterns in their use of performance marks. Seventeen solo piano sonatas written between 1798 and 1913 by five major composers were analyzed from modern editions by tokenizing and tabulating the types and usage frequencies of their individual performance marks without regard to the associated musical texts. Using analytic methods common in information science, the results demonstrated persistent statistical similarities among the works of each composer and differences among the work groups of different composers. Although based on a small sample, the results still offered statistical support for the existence of recurring stylistic patterns in composers' use of performance marks across their works.
Impetuses for First, Second, and Third Year Law Student Information Seeking Behavior, and Perception of Common Knowledge and Citation
This dissertation examined how previous information literacy training, law student gender, age, and previously obtained education affect first-, second-, and third-year law students' selection of information sources, their understanding of common knowledge, and their decision of whether or not to give attribution to these sources. To examine these factors, this study drew on a paradigm called the principle of least effort, which contends that humans generally tend to do the least amount of work possible to complete a given task. This study sought to discover whether law students follow this same path of least work when finishing assigned tasks, and whether this behavior affects information source selection, citation, and understanding of common knowledge. I conducted six focus groups and crafted and disseminated an online survey to examine these factors. Through this data collection, it was discovered that law students do exhibit some differences in understanding of citation and citation behavior based on age and their year in law school. They also exhibited some differences regarding common knowledge based on their year in law school, where they received their information literacy training, and where they attend law school. Yet no statistically significant differences were discovered between where one attends law school and citation and source selection. Further, this study revealed that law students do follow this paradigm and seek the path of least resistance to accomplish law school assignments.
Development of an Instrument to Measure the Level of Acceptability and Tolerability of Cyber Aggression: Mixed-Methods Research on Saudi Arabian Social Media Users
Cyber aggression came about as a result of advances in information communication technology and the aggressive usage of the technology in real life. Cyber aggression can take on many forms and facets. However, the main focus of this study is cyberbullying and cyberstalking through information sharing practices that might constitute digital aggressive acts. Human aggression has been extensively investigated. Studies focusing on understanding the causes and effects that can lead to physical and digital aggression have shown the prevalence of cyber aggression in different settings. Moreover, these studies have shown a strong relationship between cyber aggression and the physiological and physical trauma experienced by both perpetrators and their victims. Nevertheless, the literature shows a lack of studies that could measure the level of acceptance and tolerance of these dangerous digital acts. This study is divided into two main stages. Stage one is a qualitative pilot study carried out to explore the concept of cyber aggression and its existence in Saudi Arabia. In-depth interviews were conducted with 14 Saudi social media users to collect understandings and meanings of cyber aggression. The researcher followed Colaizzi’s method to analyze the descriptive data. A proposed model was generated to describe cyber aggression in social media applications. The results showed that there is a level of acceptance of some cyber aggression acts due to a number of factors. The second stage of the study focused on developing scales with reliable items that could determine the acceptability and tolerability of cyber aggression. In this second stage, the researcher used the factors discovered during the first stage as a source for creating the scales’ items. The proposed methods and scales were analyzed and tested to increase reliability as indicated by the Cronbach’s alpha value. The scales were designed to measure how acceptable and tolerable cyber-bullying and cyber-stalking are in Saudi ...
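Since scale reliability in this study is reported through Cronbach's alpha, a short sketch of that statistic may be useful; the response matrix below is hypothetical, not the study's data.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: respondents x items matrix of Likert-style responses."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical responses: 5 respondents x 4 scale items.
responses = [[4, 5, 4, 4], [2, 2, 3, 2], [5, 5, 4, 5], [3, 3, 3, 2], [4, 4, 5, 4]]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```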
Poststructuralist Critical Rhetorical Analysis as a Problem Analysis Tool: A Case Study of Information Impact in Denton’s Hydraulic Fracturing Debate
Energy and the natural environment are central concerns among stakeholders across the globe. Decisions on this scale often require interaction among a myriad of institutions and individuals who navigate a complex variety of challenges. In Denton, Texas in 2014, voters were asked to make such a decision when tasked with a referendum to determine whether the city would continue to allow hydraulic fracturing activity within its borders. For social scientists, this situation requires further analysis in an effort to better understand how and why individuals make the decisions they do. One possible approach for exploring this process is a method of poststructuralist critical rhetorical analysis, which is concerned with how individuals’ identities change through interaction with institutions. This study reflects upon the texts themselves through a poststructuralist critical rhetorical analysis of images employed by those in favor of and those against Denton’s ban on hydraulic fracturing in an attempt to identify images that alter the grid of intelligibility for the audience. The paper includes deliberation about the relative merits, subsequent disadvantages, and possible questions for further study as they relate to the theoretical implications of critical rhetorical analysis as information science. Ultimately, the study identifies poststructuralist critical rhetorical analysis as a method for solving information science problems in a way that considers closely the way identity is shaped through engagement with institutions.
Conversational Use of Photographic Images on Facebook: Modeling Visual Thinking on Social Media
Modeling the "thick description" of photographs began at the intersection of personal and institutional descriptions. Comparing institutional descriptions of particular photos that were also used in personal online conversations was the initial phase. Analyzing conversations that started with a photographic image from the collection of the Library of Congress (LC) or the collection of the Manchester Historic Association (MHA) provided insights into how cultural heritage institutions could enrich the description of photographs by using informal descriptions such as those applied by Facebook users. Taking photos of family members, friends, places, and interesting objects is something people do often in their daily lives. Some photographic images are stored, and some are shared with others in gatherings, occasions, and holidays. Face-to-face conversations about remembering some of the details of photographs and the event they record are themselves rarely recorded. Digital cameras make it easy to share personal photos in Web conversations and to duplicate old photos and share them on the Internet. The World Wide Web even makes it simple to insert images from cultural heritage institutions in order to enhance conversations. Images have been used as tokens within conversations along with the sharing of information and background knowledge about them. The recorded knowledge from conversations using photographic images on Social Media (SM) has resulted in a repository of rich descriptions of photographs that often include information of a type that does not result from standard archival practices. Closed group conversations on Facebook among members of a community of interest/practice often involve the use of photographs to start conversations, convey details, and initiate story-telling about objets, events, and people. Modeling of the conversational use of photographic images on SM developed from the exploratory analyses of the historical photographic images of the Manchester, NH group on Facebook. The model was influenced by the ...
Assessing Terrorist Cyber Threats: Engineering a Functional Construct
Terrorist organizations and individuals make use of the Internet for supportive activities such as communication, recruiting, financing, training, and planning operations. However, little is known about the level of computer-based (“cyber”) threat such terrorist organizations and individuals pose. One step in facilitating the examination and assessment of the level of cyber threat posed by terrorist organizations and individuals is development of an assessment tool or methodology. This tool would guide intelligence collection efforts and would support and facilitate comparative assessment of the cyber threat posed by terrorist organizations and individuals through the provision of a consistent method of assessment across time, amongst organizations and individuals, and between analysts. This study leveraged the professional experience of experts to engineer a new functional construct – a structured analytical technique designed to assess the cyber threat posed by terrorist entities and individuals. The resultant instrument was a novel structured analytical construct that uses defined indicators of a terrorist organization/individual’s intent to carry out cyber attacks, and their capability to actually do so as measures of an organization/individual’s overall level of cyber threat.
Customers' Attitudes toward Mobile Banking Applications in Saudi Arabia
Mobile banking services have changed the design and delivery of financial services and the whole banking sector. Financial service companies employ mobile banking applications as new alternative channels to increase customers' convenience and to reduce costs and maintain profitability. The primary focus of this study was to explore the Saudi bank customers' perceptions about the adoption of mobile banking applications and to test the relationships between the factors that influence mobile banking adoption as independent variables and the action to adopt them as the dependent variable. Saudi customers' perceptions were tested based on the extended versions of IDT, TAM and other diffusion of innovation theories and frameworks to generate a model of constructs that can be used to study the use and the adoption of mobile technology by users. Koenig-Lewis, Palmer, & Moll's (2010) model was used to test its constructs of (1) perceived usefulness, (2) perceived ease of use, (3) perceived compatibility, (4) perceived credibility, (5) perceived trust, (6) perceived risk, and (7) perceived cost, and these were the independent variables in the current study. This study revealed a high level of adoption: 82.7% of Saudis had adopted mobile banking applications. Also, the findings of this study identified a statistically significant relationship between mobile banking adoption and each of the demographic variables of gender, education level, monthly income, and profession among adopters and non-adopters. Seven attributes relating to the adoption of mobile banking applications were evaluated in this study to assess which variables affected Saudi bank customers in their adoption of mobile banking services. The findings indicated that the attributes that significantly affected the adoption of mobile banking applications among Saudis were perceived trust, perceived cost, and perceived risk. These three predictors, as a result, explained more than 60% of the variance in intention to adopt mobile banking technology in Saudi Arabia. ...
Understanding the Information Seeking of Pre-Kindergarten Students: An Ethnographic Exploration of Their Seeking Behaviors in a Preschool Setting
Although there has been research conducted in the area of information seeking behavior in children, the research focusing on young children, more specifically on pre-kindergarten students, is almost nonexistent. Children at this age are in the preoperational developmental stage. They tend to display curiosity about the world around them, and use other people as a means to gain the information they are seeking. Due to President Obama's insistence on implementing pre-kindergarten programs for all low- and middle-class children, the need to understand the cognitive, emotional, and physical needs of these children is becoming increasingly imperative. To researchers, the actions displayed by these young children on a daily basis remain vital in determining the methods by which they are categorized, studied, and even taught. This study employed Deci and Ryan's self-determination theory (SDT), Dervin's sense-making theory, Kuhlthau's information search process model (ISP), and Shenton and Dixon's microcosmic model of information seeking via people to lay the theoretical foundational framework. This ethnographic study aimed to fill the age gap found in the information seeking literature. By observing young children in the school setting, I gained insight into how these children seek information. The data collected via field observations and semi-structured interviews were coded based on Shenton and Dixon's model of information seeking via people. The findings, in Chapter 5, revealed emerging codes and trends in the information seeking behaviors of pre-kindergarten students.
From the Outside In: A Multivariate Correlational Analysis of Effectiveness in Communities of Practice
Online communities of practice (CoPs) provide social spaces for people to connect, learn, and engage with one another around shared interests and passions. CoPs are innovatively employed within industry and education for their inherent knowledge management characteristics and as a means of improving professional practice. Measuring the success of a CoP is a challenge researchers are examining through various strategies. Recent literature supports measuring community effectiveness through the perceptions of its members; however, evaluating a community by means of member perception introduces complicating factors from outside the community. In order to gain insight into the importance of external factors, this quantitative study examined the influence of factors in the professional lives of educators on their perceptions of their CoP experience. Through an empirical examination of CoPs employed to connect educators and advance their professional learning, canonical correlation analysis was used to examine correlations between factors believed to be influential on the experiences of community members.
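A minimal sketch of canonical correlation analysis on synthetic data, standing in for the two variable blocks examined here (external professional factors and perceived community experience); the scikit-learn call is illustrative, not the study's analysis.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
external_factors = rng.normal(size=(120, 4))   # hypothetical external-factor block
# Hypothetical "perceived experience" block, loosely related to the first block.
experience = external_factors @ rng.normal(size=(4, 3)) + rng.normal(size=(120, 3))

cca = CCA(n_components=2)
x_scores, y_scores = cca.fit_transform(external_factors, experience)
for i in range(2):   # canonical correlation = correlation of each pair of variates
    r = np.corrcoef(x_scores[:, i], y_scores[:, i])[0, 1]
    print(f"canonical correlation {i + 1}: {r:.2f}")
```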
Journalist as Information Provider: Examining the One-Voice Model of a Corporate Sports Account
While journalists were once viewed as gatekeepers, dispensing news and information via one-way communication channels, their role as information provider has evolved. Nowhere is this more apparent than on the social networking site Twitter, where information seekers have unprecedented access to information providers. The two-way communication that these information seekers have come to expect can be challenging for organizations such as ESPN that have multiple Twitter accounts and millions of followers. By designating one team of people as responsible for the organization's largest Twitter account, SportsCenter, ESPN has sought to establish manageable methods of interacting with this account's followers, while furthering the goals of the organization and providing sports news around the clock. This study provides a better understanding of the group responsible for ESPN's SportsCenter Twitter account: the motivation and strategies behind the group's Twitter use as well as the dynamics of this network, such as information flow and collaboration. Relying on the Information Seeking and Communication Model, this study also provides a better understanding of information exchanges with those outside the network, specifically a selection of the account's Twitter followers. Additionally, the role of journalist as information provider and certain themes that emerged from the content of the tweets are discussed. The research employed social network analysis and exploratory, descriptive case study methods. The results of this study contribute to social network and information theory as well as to journalistic and information science practice.
The Cluster Hypothesis: A Visual/Statistical Analysis
By allowing judgments based on a small number of exemplar documents to be applied to a larger number of unexamined documents, clustered presentation of search results represents an intuitively attractive possibility for reducing the cognitive resource demands on human users of information retrieval systems. However, clustered presentation of search results is sensible only to the extent that naturally occurring similarity relationships among documents correspond to topically coherent clusters. The Cluster Hypothesis posits just such a systematic relationship between document similarity and topical relevance. To date, experimental validation of the Cluster Hypothesis has proved problematic, with collection-specific results both supporting and failing to support this fundamental theoretical postulate. The present study consists of two computational information visualization experiments, representing a two-tiered test of the Cluster Hypothesis under adverse conditions. Both experiments rely on multidimensionally scaled representations of interdocument similarity matrices. Experiment 1 is a term-reduction condition, in which descriptive titles are extracted from Associated Press news stories drawn from the TREC information retrieval test collection. The clustering behavior of these titles is compared to the behavior of the corresponding full text via statistical analysis of the visual characteristics of a two-dimensional similarity map. Experiment 2 is a dimensionality reduction condition, in which inter-item similarity coefficients for full text documents are scaled into a single dimension and then rendered as a two-dimensional visualization; the clustering behavior of relevant documents within these unidimensionally scaled representations is examined via visual and statistical methods. Taken as a whole, results of both experiments lend strong though not unqualified support to the Cluster Hypothesis. In Experiment 1, semantically meaningful 6.6-word document surrogates systematically conform to the predictions of the Cluster Hypothesis. In Experiment 2, the majority of the unidimensionally scaled datasets exhibit a marked nonuniformity of distribution of relevant documents, further supporting the Cluster Hypothesis. Results of ...
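The core technique named here, multidimensional scaling of an inter-document similarity matrix, can be sketched as follows; the similarity values are invented, and the TREC documents are not used.

```python
import numpy as np
from sklearn.manifold import MDS

similarity = np.array([            # hypothetical cosine similarities for 4 documents
    [1.0, 0.8, 0.7, 0.1],
    [0.8, 1.0, 0.6, 0.2],
    [0.7, 0.6, 1.0, 0.1],
    [0.1, 0.2, 0.1, 1.0],
])
dissimilarity = 1.0 - similarity   # MDS expects distances rather than similarities

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
print(coords)  # 2-D map; topically related documents should land close together
```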
Creating a Criterion-Based Information Agent Through Data Mining for Automated Identification of Scholarly Research on the World Wide Web
This dissertation creates an information agent that correctly identifies Web pages containing scholarly research approximately 96% of the time. It does this by analyzing the Web page with a set of criteria, and then uses a classification tree to arrive at a decision. The criteria were gathered from the literature on selecting print and electronic materials for academic libraries. A Delphi study was done with an international panel of librarians to expand and refine the criteria until a list of 41 operationalizable criteria was agreed upon. A Perl program was then designed to analyze a Web page and determine a numerical value for each criterion. A large collection of Web pages was gathered comprising 5,000 pages that contain the full work of scholarly research and 5,000 random pages, representative of user searches, which do not contain scholarly research. Datasets were built by running the Perl program on these Web pages. The datasets were split into model building and testing sets. Data mining was then used to create different classification models. Four techniques were used: logistic regression, nonparametric discriminant analysis, classification trees, and neural networks. The models were created with the model datasets and then tested against the test dataset. Precision and recall were used to judge the effectiveness of each model. In addition, a set of pages that were difficult to classify because of their similarity to scholarly research was gathered and classified with the models. The classification tree created the most effective classification model, with a precision ratio of 96% and a recall ratio of 95.6%. However, logistic regression created a model that was able to correctly classify more of the problematic pages. This agent can be used to create a database of scholarly research published on the Web. In addition, the technique can be used to create a ...
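A hedged sketch of the evaluation pattern described above: a classification tree scored with precision and recall. The 41 synthetic features merely echo the number of criteria; none of the study's Web-page data or Perl code is reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import precision_score, recall_score

# Synthetic stand-in for criterion scores (label 1 = scholarly research, 0 = not).
X, y = make_classification(n_samples=2000, n_features=41, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_train, y_train)
pred = tree.predict(X_test)
print(f"precision: {precision_score(y_test, pred):.3f}")
print(f"recall:    {recall_score(y_test, pred):.3f}")
```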
A Theory for the Measurement of Internet Information Retrieval
The purpose of this study was to develop and evaluate a measurement model for Internet information retrieval strategy performance evaluation whose theoretical basis is a modification of the classical measurement model embodied in the Cranfield studies and their progeny. Though not the first, the Cranfield studies were the most influential of the early evaluation experiments. The general problem with this model was and continues to be the subjectivity of the concept of relevance. In cyberspace, information scientists are using quantitative measurement models for evaluating information retrieval performance that are based on the Cranfield model. This research modified this model by incorporating enduser relevance judgment rather than using objective relevance judgments, and by adopting a fundamental unit of measure developed for the cyberspace of Internet information retrieval rather than using recall and precision-type measures. The proposed measure, the Content-bearing Click (CBC) Ratio, was developed as a quantitative measure reflecting the performance of an Internet IR strategy. Since the hypertext "click" is common to many Internet IR strategies, it was chosen as the fundamental unit of measure rather than the "document." The CBC Ratio is a ratio of hypertext click counts that can be viewed as a false drop measure that determines the average number of irrelevant content-bearing clicks that an enduser checks before retrieving relevant information. After measurement data were collected, they were used to evaluate the reliability of several methods for aggregating relevance judgments. After reliability coefficients were calculated, the measurement model was used to compare web catalog and web database performance in an experimental setting. Conclusions were then reached concerning the reliability of the proposed measurement model and its ability to measure Internet IR performance, as well as implications for clinical use of the Internet and for future research in Information Science.
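The abstract does not give the exact formula for the Content-bearing Click (CBC) Ratio, so the following is only a hypothetical illustration of a click-count ratio of that general kind, reading it as irrelevant content-bearing clicks per relevant retrieval.

```python
def cbc_ratio(irrelevant_content_clicks, relevant_retrievals):
    """Hypothetical reading: average irrelevant content-bearing clicks per relevant hit."""
    return irrelevant_content_clicks / relevant_retrievals

# Invented session: 9 content-bearing clicks, 3 of which retrieved relevant content.
print(cbc_ratio(irrelevant_content_clicks=6, relevant_retrievals=3))  # -> 2.0
```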
An Experimental Study of Teachers' Verbal and Nonverbal Immediacy, Student Motivation, and Cognitive Learning in Video Instruction
This study used an experimental design and a direct test of recall to provide data about teacher immediacy and student cognitive learning. Four hypotheses and a research question addressed two research problems: first, how verbal and nonverbal immediacy function together and/or separately to enhance learning; and second, how immediacy affects cognitive learning in relation to student motivation. These questions were examined in the context of video instruction to provide insight into distance learning processes and to ensure maximum control over experimental manipulations. Participants (N = 347) were drawn from university students in an undergraduate communication course. Students were randomly assigned to groups, completed a measure of state motivation, and viewed a 15-minute video lecture containing part of the usual course content delivered by a guest instructor. Participants were unaware that the video instructor was actually performing one of four scripted manipulations reflecting higher and lower combinations of specific verbal and nonverbal cues, representing the four cells of the 2x2 research design. Immediately after the lecture, students completed a recall measure, consisting of portions of the video text with blanks in the place of key words. Participants were to fill in the blanks with exact words they recalled from the videotape. Findings strengthened previous research associating teacher nonverbal immediacy with enhanced cognitive learning outcomes. However, higher verbal immediacy, in the presence of higher and lower nonverbal immediacy, was not shown to produce greater learning among participants in this experiment. No interaction effects were found between higher and lower levels of verbal and nonverbal immediacy. Recall scores were comparatively low in the presence of higher verbal and lower nonverbal immediacy, suggesting that nonverbal expectancy violations may have hindered cognitive learning. Student motivation was not found to be a significant source of error in measuring immediacy's effects, and no interaction effects were detected ...
Empowering Agent for Oklahoma School Learning Communities: An Examination of the Oklahoma Library Improvement Program
The purposes of this study were to determine the initial impact of the Oklahoma Library Media Improvement Grants on Oklahoma school library media programs; assess whether the Oklahoma Library Media Improvement Grants continue to contribute to Oklahoma school learning communities; and examine possible relationships between school library media programs and student academic success. It also sought to document the history of the Oklahoma Library Media Improvement Program, 1978-1994, and to increase awareness of its influence upon the Oklahoma school library media programs. Methods of data collection included: examining Oklahoma Library Media Improvement Program archival materials; sending a survey to 1703 school principals in Oklahoma; and interviewing Oklahoma Library Media Improvement Program participants. Data collection took place over a one-year period. Data analyses were conducted in three primary phases: descriptive statistics and frequencies were disaggregated to examine mean scores as they related to money spent on school library media programs; opinions of school library media programs; and possible relationships between school library media programs and student academic achievement. Analysis of variance was used in the second phase of data analysis to determine if any variation between means was significant as related to Oklahoma Library Improvement Grants, time spent in the library media center by library media specialists, principal gender, opinions of library media programs, student achievement indicators, and the region of the state in which the respondent was located. The third phase of data analysis compared longitudinal data collected in the 2000 survey with past data. The primary results indicated that students in Oklahoma from schools with a centralized library media center, served by a full-time library media specialist, and having received one or more Library Media Improvement Grants scored significantly higher academically than students in schools not having a centralized library media center, not served by a ...
An Examination Of The Variation In Information Systems Project Cost Estimates: The Case Of Year 2000 Compliance Projects
The year 2000 (Y2K) problem presented a fortuitous opportunity to explore the relationship between estimated costs of software projects and five cost influence dimensions described by the Year 2000 Enterprise Cost Model (Kappelman, et al., 1998) -- organization, problem, solution, resources, and stage of completion. This research was a field study survey of (Y2K) project managers in industry, government, and education and part of a joint project that began in 1996 between the University of North Texas and the Y2K Working Group of the Society for Information Management (SIM). Evidence was found to support relationships between estimated costs and organization, problem, resources, and project stage but not for the solution dimension. Project stage appears to moderate the relationships for organization, particularly IS practices, and resources. A history of superior IS practices appears to mean lower estimated costs, especially for projects in larger IS organizations. Acquiring resources, especially external skills, appears to increase costs. Moreover, projects apparently have many individual differences, many related to size and to project stage, and their influences on costs appear to be at the sub-dimension or even the individual variable level. A Revised Year 2000 Enterprise Model is presented incorporating this granularity. Two primary conclusions can be drawn from this research: (1) large software projects are very complex and thus cost estimating is also; and (2) the devil of cost estimating is in the details of knowing which of the many possible variables are the important ones for each particular enterprise and project. This points to the importance of organizations keeping software project metrics and the historical calibration of cost-estimating practices. Project managers must understand the relevant details and their interaction and importance in order to successfully develop a cost estimate for a particular project, even when rational cost models are used. This research also indicates ...
Identifying At-Risk Students: An Assessment Instrument for Distributed Learning Courses in Higher Education
The current period of rapid technological change, particularly in the area of mediated communication, has combined with new philosophies of education and market forces to bring upheaval to the realm of higher education. Technical capabilities exceed our knowledge of whether expenditures on hardware and software lead to corresponding gains in student learning. Educators do not yet possess sophisticated assessments of what we may be gaining or losing as we widen the scope of distributed learning. The purpose of this study was not to draw sweeping conclusions with respect to the costs or benefits of technology in education. The researcher focused on a single issue involved in educational quality: assessing the ability of a student to complete a course. Previous research in this area indicates that attrition rates are often higher in distributed learning environments. Educators and students may benefit from a reliable instrument to identify those students who may encounter difficulty in these learning situations. This study is aligned with research focused on the individual engaged in seeking information, assisted or hindered by the capabilities of the computer information systems that create and provide access to information. Specifically, the study focused on the indicators of completion for students enrolled in video conferencing and Web-based courses. In the final version, the Distributed Learning Survey encompassed thirteen indicators of completion. The results of this study of 396 students indicated that the Distributed Learning Survey represented a reliable and valid instrument for identifying at-risk students in video conferencing and Web-based courses where the student population is similar to the study participants. Educational level, GPA, credit hours taken in the semester, study environment, motivation, computer confidence, and the number of previous distributed learning courses accounted for most of the predictive power in the discriminant function based on student scores from the survey.
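A minimal sketch of a discriminant function predicting course completion from survey-style indicators, fit to synthetic data; the indicators and values are placeholders, not the Distributed Learning Survey items.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
X = rng.normal(size=(396, 7))   # e.g., GPA, motivation, computer confidence (hypothetical)
y = (X @ rng.normal(size=7) + rng.normal(size=396)) > 0   # True = completed, False = at risk

lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"classification accuracy on the synthetic sample: {lda.score(X, y):.2f}")
```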
Relevance Thresholds: A Conjunctive/Disjunctive Model of End-User Cognition as an Evaluative Process
This investigation identifies end-user cognitive heuristics that facilitate judgment and evaluation during information retrieval (IR) system interactions. The study extends previous research surrounding relevance as a key construct for representing the value end-users ascribe to items retrieved from IR systems and the perceived effectiveness of such systems. The Lens Model of user cognition serves as the foundation for design and interpretation of the study; earlier research in problem solving, decision making, and attitude formation also contributes to the model and analysis. A self-reporting instrument collected evaluative responses from 32 end-users related to 1432 retrieved items in relation to five characteristics of each item: topical, pertinence, utility, systematic, and motivational levels of relevance. The nominal nature of the data collected led to non-parametric statistical analyses that indicated that end-user evaluation of retrieved items to resolve an information problem at hand is most likely a multi-stage process. That process appears to be a cognitive progression from topic to meaning (pertinence) to functionality (use). Each step in end-user evaluative processing engages a cognitive hierarchy of heuristics that includes consideration (of appropriate cues), differentiation (the positive or negative aspects of those cues considered), and aggregation (the combination of differentiated cue aspects needed to render an evaluative label of the item in relation to the information problem at hand). While individuals may differ in their judgments and evaluations of retrieved items, they appear to make those decisions by using consistent heuristic approaches.
Quality Management in Museum Information Systems: A Case Study of ISO 9001-2000 as an Evaluative Technique
Museums are service-oriented information systems that provide access to information bearing materials contained in the museum's collections. Within museum environments, the primary vehicle for quality assurance and public accountability is the accreditation process of the American Association of Museums (AAM). Norbert Wiener founded the field of cybernetics, employing concepts of information feedback as a mechanism for system modification and control. W. Edwards Deming applied Wiener's principles to management theory, initiating the wave of change in manufacturing industries from production-driven to quality-driven systems. Today, the principles are embodied in the ISO 9000 International Standards for quality management systems (QMS), a globally-recognized set of standards, widely employed as a vehicle of quality management in manufacturing and service industries. The International Organization for Standardization defined a process for QMS registration against ISO 9001 that is similar in purpose to accreditation. This study's goals were to determine the degree of correspondence between elements of ISO 9001 and quality-related activities within museum environments, and to ascertain the relevance of ISO 9001-2000 as a technique of museum evaluation, parallel to accreditation. A content analysis compared museum activities to requirements specified in the ISO 9001-2000 International Standard. The study examined museum environment surrogates which consisted of (a) web sites of nine museum studies programs in the United States and (b) web sites of two museum professional associations, the AAM and the International Council of Museums (ICOM). Data items consisted of terms and phrases from the web sites and the associated context of each item. Affinity grouping of the data produced high degrees of correspondence to the categories and functional subcategories of ISO 9001. Many quality-related activities were found at the operational levels of museum environments, although not integrated as a QMS. If activities were unified as a QMS, the ISO 9001 Standard has potential for application as ...
Public School Educators' Use of Computer-Mediated Communication
This study examined the uses of computer-mediated communication (CMC) by educators in selected public schools. It used Rogers' Diffusion of Innovation Theory as the underpinnings of the study. CMC refers to any exchange of information that involves the use of computers for communication between individuals or individuals and a machine. This study was an exploration of difficulties users confront, what services they access, and the tasks they accomplish when using CMC. It investigated the factors that affect the use of CMC. The sample population was drawn from registered users on TENET, the Texas Education Network, as of December 1997. The educators were described with frequencies and percentages analyzing the demographic data. For the research, eight indices were selected to test how strongly these user and environmental attributes were associated with the use of CMC. These variables were (1) education, (2) position, (3) place of employment, (4) geographic location, (5) district size, (6) organization vitality, (7) adopter resources, and (8) instrumentality. Two dependent variables were used to test for usage: (1) depth, or frequency of CMC usage and amount of time spent online, and (2) breadth, or variety of Internet utilities used. Additionally, the users' perception of network benefits was measured. Network benefits were correlated with social interaction and perception of CMC to investigate what tasks educators were accomplishing with CMC. Correlations, crosstabulations, and ANOVAs were used to analyze the data for testing the four hypotheses. The major findings of the study, based on the hypotheses tested, were that the socioeconomic variables of education and position influenced the use of CMC. A significant finding is that teachers used e-mail and Internet resources less frequently than those in other positions. An interesting finding was that frequency of use was more significant for usage than amount of ...
Modeling Utilization of Planned Information Technology
Implementations of information technology solutions to address specific information problems are only successful when the technology is utilized. The antecedents of technology use involve user, system, task and organization characteristics as well as externalities which can affect all of these entities. However, measurement of the interaction effects between these entities can act as a proxy for individual attribute values. A model is proposed which, based upon evaluation of these interaction effects, can predict technology utilization. This model was tested with systems being implemented at a pediatric health care facility. Results from this study provide insight into the relationship between the antecedents of technology utilization. Specifically, task time provided significant direct causal effects on utilization. Indirect causal effects were identified in the task value and perceived utility constructs. Perceived utility, along with organizational support, also provided direct causal effects on user satisfaction. Task value also impacted user satisfaction in an indirect fashion. Also, the results provide a predictive model and taxonomy of variables which can be applied to predict or manipulate the likelihood of utilization for planned technology.
A Mythic Perspective of Commodification on the World Wide Web
Capitalism's success, according to Karl Marx, is based on continued development of new markets and products. As globalization shrinks the world marketplace, corporations are forced to seek both new customers and products to sell. Commodification is the process of transforming objects, ideas and even people into merchandise. The recent growth of the World Wide Web has caught the attention of the corporate world, and they are attempting to convert a free-share-based medium into a profit-based outlet. To be successful, they must change Web users' perception about the nature of the Web itself. This study asks the question: Is there mythic evidence of commodification on the World Wide Web? It examines how the World Wide Web is presented to readers of three national publications (Wired, Newsweek, and Business Week) from 1993 to 2000. It uses Barthes' two-tiered model of myths to examine the descriptors used to modify and describe the World Wide Web. The descriptors were clustered into 11 general categories, including connectivity, social, being, scene, consumption, revolution, tool, value, biology, arena, and other. Wired articles did not demonstrate a trend in categorical change from 1993 to 2000; the category of choice shifted back and forth between Revolution, Connectivity, Scene, and Being. Newsweek articles demonstrated an obvious directional shift. Connectivity is the dominant myth from 1994 to 1998, when the revolution category dominates. Similarly, Business Week follows the prevailing myth of connectivity from 1994 to 1997. From 1998 on, the competition-related categories of revolution and arena lead all categories. The study finds evidence of commodification on the World Wide Web, based on the trend in categories in Newsweek and Business Week that move from a foundational myth that presents a perception of cooperation in 1994 to one of competition in 1998 and later. The study recommends further in-depth research of the target publications, ...
The Second Vatican Council and American Catholic Theological Research: A Bibliometric Analysis of Theological Studies: 1940-1995
A descriptive analysis was made of the characteristics of the authors and citations of the articles in the journal Theological Studies from 1940 to 1995. Data were gathered on each author's institutional affiliation, geographic location, occupation, gender, and personal characteristics. The citation characteristics examined were the cited authors, the date and age of the citations, format, language, place of publication, and journal titles. These characteristics were compared across the time periods before and after the Second Vatican Council in order to detect any changes that might have occurred after the council made certain recommendations to theologians. Subject dispersion of the literature was also analyzed, and analyses based on Lotka's Law of author productivity and Bradford's Law of title dispersion were performed for this literature. The profile of author characteristics showed that the share of articles published by women and laypersons has increased since the council's recommendations. The data fit Lotka's Law well for the pre-Vatican II period but not for the period after Vatican II. The data fit Bradford's Law for the predicted number of journals in the nucleus and Zone 2, but the observed number of journals in Zone 3 was higher than predicted for all time periods. Subject dispersion of research from disciplines other than theology is low, but citation of works from the fields of education, psychology, the social sciences, and science has increased since Vatican II. The analysis of citation characteristics showed no significant change after Vatican II in the age, format, languages used, or geographic location of the publisher of the cited works. Citation characteristics showed that authors prefer research from monographs published in English and in U.S. locations for all time periods. Research ...
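For readers unfamiliar with Lotka's Law, the following minimal sketch shows the comparison it implies: the number of authors contributing n papers should be roughly the number of single-paper authors divided by n squared. The author counts here are invented for illustration, not taken from Theological Studies.

```python
# Minimal sketch of an observed-vs-expected check against Lotka's Law.
from collections import Counter

# Hypothetical: number of articles contributed by each author.
papers_per_author = [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 3, 3, 4]
observed = Counter(papers_per_author)          # n -> number of authors with n papers
authors_with_one = observed[1]

print("n  observed  expected (Lotka)")
for n in sorted(observed):
    expected = authors_with_one / n ** 2
    print(f"{n}  {observed[n]:8d}  {expected:16.1f}")
```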
The Role of Information in the Selection Process of a Primary Care Physician
There is a paucity of information about the factors that influence the selection of primary care physicians, and the relative significance of these factors is not known. This makes it difficult to properly address ways to improve the flow of information to patients when they select a primary care physician.
A Study of Graphically Chosen Features for Representation of TREC Topic-Document Sets
Document representation is important for computer-based text processing. A good document representation must include at least the most salient concepts of the document. Documents exist in a multidimensional space that makes it difficult to identify which concepts to include, and a current problem is how to measure the effectiveness of the different strategies that have been proposed to accomplish this task. As a contribution toward this goal, this dissertation studied visual inter-document relationships in a dimensionally reduced space. The same treatment was applied to full text and to three document representations. Two of the representations were based on the assumption that the salient features of a document set follow the chi-distribution across the whole document set. The third document representation identified features through a novel method: a Coefficient of Variability was calculated by normalizing the Cartesian distance of the discriminating value in the relevant and non-relevant document subsets. The local dictionary method was also used. Cosine similarity values measured the inter-document distance in the information space and formed a matrix that served as input to the Multi-Dimensional Scaling (MDS) procedure. A precision-recall procedure was averaged across all treatments to compare them statistically. The treatments were not found to be statistically equivalent, and the null hypotheses were rejected.
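The cosine-similarity-into-MDS step can be sketched generically as below. The sketch uses a plain TF-IDF representation and three toy documents; it does not reproduce the dissertation's feature-selection methods (chi-distribution features, Coefficient of Variability, or local dictionary), only the similarity-matrix-to-projection pipeline.

```python
# Generic sketch: TF-IDF vectors -> cosine similarity matrix -> 2-D MDS projection.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.manifold import MDS

docs = [
    "oil exploration in the north sea",
    "seismic survey of offshore oil fields",
    "library service quality and user satisfaction",
]

tfidf = TfidfVectorizer().fit_transform(docs)
similarity = cosine_similarity(tfidf)          # inter-document similarity matrix
dissimilarity = 1.0 - similarity               # MDS expects distances, not similarities
np.fill_diagonal(dissimilarity, 0.0)           # guard against floating-point noise

# Project the documents into two dimensions for visual inspection.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)
print(coords)
```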
Students' Criteria for Course Selection: Towards a Metadata Standard for Distributed Higher Education
By 2007, one half of higher education students are expected to enroll in distributed learning courses. Higher education institutions need to attract students searching the Internet for courses and need to provide students with enough information to select courses. Internet resource discovery tools are readily available; however, users have difficulty selecting relevant resources, in part because of the lack of a standard for representing Internet resources. An emerging solution is metadata. In the educational domain, the IEEE Learning Technology Standards Committee (LTSC) has specified a Learning Object Metadata (LOM) standard. This exploratory study (a) determined the criteria students think are important for selecting higher education courses, (b) discovered relationships between these criteria and students' demographic characteristics, educational status, and Internet experience, and (c) evaluated these criteria vis-à-vis the IEEE LTSC LOM standard. Web-based questionnaires (N = 209) measured (a) the criteria students think are important in the selection of higher education courses and (b) three factors that might influence students' selections. Respondents were principally female (66%), employed full time (57%), and located in the U.S. (89%). The chi-square goodness-of-fit test identified 40 criteria students think are important, and exploratory factor analysis determined five common factors among the top 21 criteria: three evaluative and two descriptive. Results indicated that evaluation criteria are very important in course selection. Spearman correlation coefficients and chi-square tests of independence determined the relationships between the importance of selection criteria and demographic characteristics, educational status, and Internet experience. Four profiles emerged, representing groups of students with unique concerns. Side-by-side analysis determined whether the IEEE LTSC LOM standard included the criteria of importance to students; the IEEE LOM by itself is not enough to meet students' course selection needs. Recommendations include development of a metadata standard for course evaluation and accommodation of group differences in ...
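Two of the tests named in this abstract, a chi-square goodness-of-fit test and a Spearman rank correlation, can be illustrated as follows. The rating counts and experience values are invented; they stand in for one criterion's importance ratings and respondents' Internet experience, and the even-split null hypothesis is only an assumption for the example.

```python
# Hedged sketch: goodness-of-fit on one criterion plus a Spearman correlation.
from scipy import stats

# Observed counts of "important" vs "not important" for one hypothetical criterion;
# the null hypothesis here is an even split.
observed = [150, 59]
chi2, p = stats.chisquare(observed)
print(f"goodness of fit: chi2 = {chi2:.2f}, p = {p:.4f}")

# Spearman rank correlation between importance ratings (1-5) and
# years of Internet experience for the same hypothetical respondents.
importance = [5, 4, 4, 3, 5, 2, 4, 5]
experience = [6, 5, 4, 2, 7, 1, 3, 6]
rho, p_rho = stats.spearmanr(importance, experience)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")
```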
A Model of Information Therapy: Definition and Empirical Application
This study investigates the basis and validity of considering health information as therapeutic, defines Information Therapy, and asks whether the therapeutic nature of information can be measured empirically. The purpose of the study is to determine whether there are significant differences in the therapeutic effect of Information Therapy between two delivery modes: support groups communicating face-to-face and those using computer-mediated communication on the Internet. The comparison of these groups revealed no significant differences on three measures of health: physical, mental, and social support. Because neither communication medium was found to be advantageous over the other, the use of the computer can extend the benefits of Information Therapy to the home-bound, to those in remote areas, to people with time constraints, and to those who may be shy. The validity of the therapeutic nature of information was verified by participants' reports of the effect of a health information search. Results demonstrated that the primary source of information is the physician, followed by the Internet, and that 77% of participants reported a positive or therapeutic effect when health information was found. These results are significant because individuals who are in positions to deliver Information Therapy can better meet needs by identifying the sources to which people look for information, and they can have a major impact on patient care and the general health of the population. Providing people with information can empower them to take an active role in their health, can increase confidence in self-care, and should provide coping and disease-management skills, thus decreasing the utilization of healthcare resources and preventing costly acute and chronic health complications.
Korean Studies in North America 1977-1996: A Bibliometric Study
This research is a descriptive bibliometric study of the literature of the field of Korean studies. Its goal is to describe the literature quantitatively and to serve as a model for such research in other area studies fields. The study analyzed 193 source articles and the 7,166 citations in those articles in four representative Korean and Asian studies journals published in North America from 1977 to 1996. The journals included in this study were Korean Studies (KS), the Journal of Korean Studies (JKS), the Journal of Asian Studies (JAS), and the Harvard Journal of Asiatic Studies (HJAS). Subject matter and author characteristics of the source articles were examined, along with characteristics such as the form, date, language, country of origin, subject, key authors, and key titles of the literature cited in the source articles. Research in Korean studies falls within fourteen broad disciplines but is concentrated in a few of them. Americans have been the most active authors in Korean studies, followed closely by authors of Korean ethnicity. Monographic literature was used most. The mean age of publications cited was 20.87 years and the median age was 12 years. The Price Index of Korean studies as a whole is 21.9 percent. Sources written in English were cited most (47.1%), and references to Korean-language sources amounted to only 34.9% of all sources. In general, authors preferred sources published in their own countries. Sources on history were cited most by other disciplines. No significant core authors were identified, nor was any significant core literature. This study indicates that Korean studies is still evolving. Ways of promoting research in less-studied disciplines and of facilitating formal communication between Korean scholars in Korea and Koreanists in North America need to be sought in order to promote well-balanced development in the field. This study ...
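The citation-age measures reported here (mean age, median age, and the Price Index, usually taken as the share of citations no more than five years old at the time of citing) can be computed as in the sketch below. The citing year and cited years are illustrative, not the study's data.

```python
# Minimal sketch of mean/median citation age and the Price Index.
import statistics

citing_year = 1990
cited_years = [1988, 1985, 1972, 1989, 1960, 1986, 1978, 1987]

ages = [citing_year - y for y in cited_years]
price_index = sum(1 for a in ages if a <= 5) / len(ages) * 100

print(f"mean age    = {statistics.mean(ages):.2f} years")
print(f"median age  = {statistics.median(ages)} years")
print(f"Price Index = {price_index:.1f}%")
```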
Information Seeking in a Virtual Learning Environment
Replicating a time-series study done by Kuhlthau and associates in 1989, this study examines the applicability of the Information Search Process (ISP) Model in the context of a virtual learning environment. The study confirms that students given an information-seeking task in a virtual learning environment do exhibit the stages indicated by the ISP Model. The six-phase ISP Model is shown to be valid for describing the stages of cognitive, affective, and physical tasks individuals progress through when they must search for information to complete an academic task in a virtual learning environment. The findings further indicate that there is no relationship between the amount of computer experience subjects possess and their demonstration of the patterns of thoughts, feelings, and actions described by the ISP Model. The study demonstrates the ISP Model to be independent of the original physical library environments in which the model was developed. An attempt is made to represent the ISP Model in a slightly different manner, one that conveys more of the sense of motion and interaction among the components of thoughts, feelings, and actions than the current model provides. The study suggests that the development of non-self-reporting data collection techniques would be useful in complementing and furthering research to enhance and refine the representation of the ISP Model. Additionally, expanding the research to include the examination of group interaction is called for, to enhance the ISP Model and to develop further applications that could potentially aid educational delivery in all types of learning environments.
The Effect of Information Literacy Instruction on Library Anxiety Among International Students
This study explored what effect information literacy instruction (ILI) may have on both a generalized anxiety state and library anxiety specifically. The population studied was international students using resources in a community college. Library anxiety among international students begins with certain barriers that cause anxiety (i.e., language and communication barriers, adjusting to a new education and library system, and general cultural adjustments). Library anxiety is common among college students and is characterized by negative emotions including rumination, tension, fear, and mental disorganization (Jiao & Onwuegbuzie, 1999a). It often occurs when a student contemplates conducting research in a library and is due to any number of perceived inabilities about using the library. In order for students to become successful in their information-seeking behavior, this anxiety needs to be reduced. The study used two groups of international students enrolled in credit courses in the English for Speakers of Other Languages (ESOL) program. Each student completed Bostick's Library Anxiety Scale (LAS) and Spielberger's State-Trait Anxiety Inventory (STAI) to assess anxiety level before and after treatment. Subjects were given a research assignment that required them to use library resources. Group 1 (the experimental group) attended several library instruction classes; the instruction used Kuhlthau's information search process model. Group 2 (the control group) worked on the assignment in the library but did not receive any formal library instruction. After the treatment, the researcher and ESOL program instructor(s) measured the level of anxiety in both groups. ANCOVA was used to analyze Hypotheses 1 and 2, which compared pretest and posttest scores for each group. Research assignment grades were used to analyze Hypothesis 3, comparing outcomes between the two groups. The results indicated that ILI was associated with reduced state and library anxiety among international students given an assignment using library resources.
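The pretest/posttest ANCOVA described here can be sketched as a regression of the posttest score on group membership with the pretest score as covariate. The scores, group labels, and group sizes below are invented for illustration; they are not LAS or STAI data from the study.

```python
# Illustrative ANCOVA sketch: posttest anxiety ~ group + pretest anxiety (covariate).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group":    ["ILI", "ILI", "ILI", "ILI", "control", "control", "control", "control"],
    "pretest":  [120, 115, 130, 128, 118, 125, 122, 119],
    "posttest": [ 98, 100, 110, 105, 117, 123, 120, 118],
})

model = smf.ols("posttest ~ C(group) + pretest", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # Type II ANCOVA table
print(anova_table)
```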
The Effects of Task-Based Documentation Versus Online Help Menu Documentation on the Acceptance of Information Technology
The objectives of this study were (1) to identify and describe task-based documentation; (2) to identify and describe any purported changes in users' attitudes when IT migration was preceded by task-based documentation; and (3) to suggest implications of task-based documentation for users' attitudes toward IT acceptance. Questionnaires were given to 150 university students, all of whom participated in the study. The study examined (1) whether favorable pre-implementation attitudes toward a new e-mail system increase as a result of training when users expect it to be easy to learn and use; (2) whether user acceptance of an e-mail program increases as perceived usefulness increases, as delineated by task-based documentation; (3) whether task-based documentation is more effective than standard help menus while learning a new application program; and (4) whether training that requires active student participation increases the acceptance of a new e-mail system. The following conclusions were reached: (1) positive pre-implementation attitudes toward a new e-mail system are not affected by training, even if the users expect it to be easy to learn and use; (2) user acceptance of an e-mail program does not increase as perceived usefulness increases when aided by task-based documentation; (3) task-based documentation is not more effective than standard help menus when learning a new application program; and (4) training that requires active student participation does not increase the acceptance of a new e-mail system.
The Electronic Ranch: the Information Environment of Cattle Breeders
The present study was a longitudinal analysis of the information needs of Red Angus cattle breeders and their use of networked information services. It was based on two surveys. The first, conducted in 1995-96, polled all 1,067 ranches of the Red Angus Association of America. Responses from 192 Red Angus breeders were used to determine the value of different information types and to evaluate perceptions of the greatest barriers to the adoption of networked information services. The second survey, mailed to 41 Red Angus breeders in 1998, focused on early adopters and likely users of network services. Responses from 15 breeders were used to evaluate perceptions of the greatest barriers to the effective use of Web-based information services.
Factors Influencing How Students Value Asynchronous Web Based Courses
This dissertation discovered the factors influencing how students value asynchronous Web-based courses through the use of qualitative methods. Data were collected through surveys, observations, interviews, e-mail correspondence, and chat room and bulletin board transcripts. Instruments were tested in pilot studies in previous semesters. Factors were identified for two class formats: the asynchronous CD/Internet format and the synchronous online Web-based format. Factors were also uncovered for two of the instructional tools used in the course: the WebCT forum and WebCT testing. Factors were grouped as advantages or disadvantages under major categories. For the asynchronous CD/Internet class format, the advantages were Convenience, Flexibility, Learning Enhancement, and Psychology; the disadvantages included Isolation, Learning Environment, and Technology. For the synchronous online Web-based class format, the advantages were Convenience, Flexibility, Human Interaction, Learning Enhancement, and Psychology, whereas the disadvantages included Isolation, Learning Environment, and Technology. Concurrently, the study revealed the following advantages of the WebCT forum: Help Each Other, Interaction, Socialization, Classroom News, and Time Independent; the disadvantages uncovered were Complaints, Technical Problems, and Isolation. Finally, the advantages specified for the WebCT testing tool were Convenience, Flexibility, and Innovations, and its disadvantages were Surroundings Not Conducive to Learning and Technical Problems. Results indicate that not only classroom preference, learning style, and personality type influence how students value a Web-based course but also, most importantly, a student's lifestyle (number of personal commitments, how far away they live, and life's priorities). The WebCT forum (bulletin board) and WebCT testing (computerized testing) were seen by most students as good tools for encouraging classroom communication and testing because of the convenience and flexibility offered. Still, further research is needed, both quantitative and qualitative, to ascertain the true weight of the factors discovered in this study.
Modeling the Role of Boundary Spanners-in-Practice in the Nondeterministic Model of Engineering Design Activity
Boundary spanners-in-practice are individuals who inhabit more than one social world and bring overlapping place perspectives to bear on the function(s) performed within and across each world. Unlike nominated boundary spanners, they are practitioners responsible for the 'translation' of each small world's perspectives, thereby increasing collaboration effectiveness and permitting the small worlds to work synergistically. The literature on Knowledge Management (KM) has emphasized the organizational importance of individuals performing boundary-spanning roles by resolving cross-cultural and cross-organizational knowledge system conflicts, helping teams pursue common goals through the creation of "joint fields": a third dimension that is co-jointly developed between the two fields or dimensions that the boundary spanner works to bridge. The Copeland and O'Connor Nondeterministic Model of Engineering Design Activity was used as the foundation for developing models of communication mechanics and dynamics when multiple simultaneous interactions of the single nondeterministic user model, the boundary spanner-in-practice (BSIP), and two Subject Matter Experts (SMEs) engage during design activity in the Problem-Solving Space. The Problem-Solving Space defines the path through the volumes of plausible answers, or 'solution spaces,' that will satisfice the problem presented to the BSIP and SMEs. Further model refinement was performed to represent expertise-seeking behaviors and the physical and mental models constructed by boundary spanners-in-practice during knowledge domain mapping. This was done by mapping the three levels of communication complexity (transfer, translation, and transformation) to each knowledge boundary (syntactic, semantic, and pragmatic) that must be bridged during knowledge domain mapping.
Virtual Reality for Scientific Visualization: an Exploratory Analysis of Presentation Methods
Humans are very effective at evaluating information visually. Scientific visualization is concerned with presenting complex data in visual form to exploit this capability, and a large array of tools is currently available for visual presentation. This research evaluates the effectiveness of three presentation models that could be used for scientific visualization: two-dimensional perspective rendering, field-sequential stereoscopic three-dimensional rendering, and immersive virtual reality rendering. A large section of a three-dimensional subsurface seismic survey was modeled as four-dimensional data by including a value for seismic reflectivity at each point in the survey. An artificial structure was randomly inserted into this data model, and subjects were asked to locate and identify the structures. A group of seventeen volunteers from the University of Houston student body served as subjects for the study. Detection time, discrimination time, and discrimination accuracy were recorded. The results showed large inter-subject variation in presentation model preference. In addition, the data suggest a possible gender effect: female subjects had better overall performance on the task as well as better task acquisition.
University Students and the Internet: Information Seeking Study
This study explored university students' information needs and information-seeking behaviors on the Internet. A Web-based survey was administered once, and 200 responses were received from the target sample within the two-week period of the study. Data were analyzed with descriptive statistics, factor analysis, and graphical representation. The study explored various issues related to Internet usability, preferences, and activities, such as search tools, e-mail, search engines, and preferred primary sources for everyday-life information needs, as well as students' perceptions of the Internet and the traditional library. Kuhlthau's model of the information-seeking process, which includes six stages and affective components, was adapted in the construction of the Web survey, as was a study by Presno (1998) covering four types of Internet anxiety. With regard to the six stages of the Kuhlthau model, the majority of respondents were in stage 5, information gathering; stage 3 had the next highest number of respondents, and very few respondents were in stages 1 and 2. There was a systematic pattern in which the earlier the stage respondents were in, the more negative adjectives they selected, and vice versa. The feeling-adjectives section showed a difference in behavior between males and females. The results indicated that most students had Internet time-delay anxiety. In general, the study found that students have a great interest in the Internet and consider it an important source of information for their personal, educational, and communication activities.
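The descriptive-statistics and factor-analysis step mentioned here can be sketched generically as below. The rating columns, sample size, and two-factor choice are hypothetical placeholders, not the study's survey items or results.

```python
# Hedged sketch: descriptive statistics plus an exploratory factor analysis
# on invented Likert-style ratings of Internet activities.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
ratings = pd.DataFrame(
    rng.integers(1, 6, size=(200, 6)),
    columns=["email", "search_engines", "news", "chat", "library_web", "databases"],
)

print(ratings.describe())                     # descriptive statistics

fa = FactorAnalysis(n_components=2, random_state=2).fit(ratings)
loadings = pd.DataFrame(fa.components_.T, index=ratings.columns,
                        columns=["factor_1", "factor_2"])
print(loadings.round(2))                      # item loadings on the two factors
```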
Costly Ignorance: The Denial of Relevance by Job Seekers: A Case Study in Saudi Arabia
Job centers aid businesses seeking qualified employees and assist job seekers in selecting and contacting employment and training services. Job seekers are also offered the opportunity to assess their skills, abilities, qualifications, and readiness. Furthermore, job centers ensure that job seekers comply with the requirements they must meet to benefit from job assistance programs such as unemployment insurance. Yet claimants often procrastinate and/or suspend their job search efforts, even though doing so costs them free time and entitlements and, more importantly, may cost them the opportunity to take advantage of the free information, services, training, and financial assistance for getting a job to which they have already made a claim. The current work looks to Chatman's "small worlds" work, Johnson's comprehensive model of information seeking, and Wilson's "costly ignorance" construct for contributions to understanding such behavior. Identification of a particular trait or set of traits of job seekers during periods of unemployment will inform a new Job Seeking Activities Model (JSAM). This study purposely examines job seekers' information behavior and the factors which influence that behavior, in particular family tangible support as a social norm effect. A mixed-methods design, using questionnaires for job-hunting completers and non-completers and interviews with experts, was employed for data collection. Quantitative data analysis provided the Cronbach α coefficient, Pearson's product-moment correlation, an independent-samples t-test, effect sizes, and binary logit regression. The qualitative data generated from the interview transcripts were color coded for each theme and subtheme. Finally, simultaneous triangulation was carried out to confirm or contradict the results from each method. The findings show that social norms, particularly uncontrolled social support provided by their families, are more likely to make job seekers ignore the relevant information about jobs available to them in favor ...
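Two of the quantitative steps listed here, Cronbach's α for a questionnaire scale and a binary logit regression, are sketched below on simulated data. The item names, the "family tangible support" predictor, and the completion outcome are hypothetical stand-ins, not the study's instrument or results.

```python
# Hedged sketch: Cronbach's alpha for a 5-item scale and a binary logit model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
base = rng.normal(size=(60, 1))                             # shared trait
scale = pd.DataFrame(base + rng.normal(scale=0.7, size=(60, 5)),
                     columns=[f"q{i}" for i in range(1, 6)])
print(f"Cronbach alpha = {cronbach_alpha(scale):.2f}")

# Binary logit: job-search completion (0/1) on family tangible support score.
support = rng.normal(size=60)
completed = (support + rng.normal(scale=1.0, size=60) > 0).astype(int)
logit = sm.Logit(completed, sm.add_constant(support)).fit(disp=False)
print(logit.params)                                         # intercept and slope
```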
Building an Understanding of International Service Learning in Librarianship
From the very beginning, library education has been a mixture of theory and practice. Dewey required apprenticeships to be part of the first library school at the University of Chicago as a method of indoctrinating new professionals. Today, acculturation is incorporated into professional education through a large variety of experiential learning techniques, including internships, practica, fieldwork, and service-learning projects, all of which are designed to develop some level of professional skill within an information organization. But what is done for understanding library culture? It is said that one cannot truly recognize the extent of one's own cultural assumptions until one has experienced another culture. This study followed a group of LIS graduate students who took that next step: going to Russia. Employing a critical hermeneutic methodology, the study sought to understand what value students gain from working on an assessment project in an international school library. Using a horizon analysis, the researcher established the worldview of participants prior to their departure, analyzed their experience through post-experience interviews, and constructed an understanding of value. Among other concepts, the researcher looked specifically at whether "library cultural competency," an understanding of library culture in a global context, was developed through working on a service-learning project within an international school library. This dissertation provides feedback for the program leaders and ideas for future research.
The Denial of Relevance: Biography of a Quest(ion) Amidst the Min(d)fields—Groping and Stumbling
Early research on just why it might be the case that “the mass of men lead lives of quiet desperation” suggested that denial of relevance was a significant factor. Asking why denial of relevance would be significant, and how it might be resolved, began to raise issues about the very nature of questions. Pursuing the nature of questions, in light of denial of relevance and Thoreau’s “quiet desperation,” provoked a journey of modeling questions and constructing a biography of the initial question of this research and its evolution. Engagement with literature from philosophy, neuroscience, and retrieval was then combined with deep interviews of successful lawyers to render a thick, biographical model of questioning.