Search Results

The Global Village Playground: A qualitative case study of designing an ARG as a capstone learning experience.
The Global Village Playground (GVP) was a capstone learning experience designed to address institutional assessment needs while providing an integrated, contextualized, and authentic learning experience for students. In the GVP, students work on simulated and real-world problems as a design team tasked with developing an alternate reality game that makes an impact on the United Nations Millennium Development Goals. The purpose of this study was to evaluate the effectiveness of the design of the GVP as a capstone experience. The research design follows a qualitative case study approach to gather and analyze data collected from the instructors and students participating in the pilot implementation of the GVP. Results of the study show predominantly favorable reactions to various aspects of the course and its design. Students reported that they learned the most through interactions with peers and through applying and integrating knowledge in developing the alternate reality game that was the central problem scenario for the course. The learning students demonstrated included knowledge construction, social responsibility, open-mindedness, big-picture thinking, and an understanding of their relationship to the larger society and world in which they live. Challenges that resulted from the design included the amount of time necessary to build consensus and then develop an overarching game concept, the tension between guided and directed instruction, and the need to foster greater interdependence among students while encouraging them to become more self-directed.
Influence of pre and post testing on return on investment calculations in training and development.
When expenses become an issue, training is often one of the first budget items to be cut. There have been a number of evaluation studies about rates of return from training interventions. Most results are based on interviewing participants about the value of the intervention and its effect on their productivity. This often results in quadruple-digit return on investment figures. Decision makers who control the budget often view these kinds of results with skepticism. This study proposes a methodology to evaluate training interventions without asking participants for their opinions. The process involves measuring learning through a series of pre-tests and post-tests and determining whether pre-test scores can be used as predictors of future return on investment results. The study evaluates a series of return on investment scores using analysis of variance to determine the relationship between pre-tests and final return on investment results for each participant. Data are also collected and evaluated to determine whether the organization's financial results during the period of the training intervention correlate with the results of the intervention. The results of the study suggest that the proposed methodology can be used to predict future return on investment from training interventions based on the use of pre-tests. These rates of return can be used as a method of selecting between competing training intervention proposals. It is a process that is easily understood by the key decision makers who control the allocation of financial resources. More importantly, it is a process that can maximize the value of each dollar spent on training.
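As a rough illustration of the kind of calculation this abstract describes, and not the author's actual procedure, the sketch below computes per-participant return on investment with the standard ROI formula and runs a one-way analysis of variance to test whether pre-test score bands explain differences in ROI. All data, thresholds, and variable names are hypothetical assumptions made for the example.

```python
# Minimal sketch (not the study's actual method): per-participant ROI plus a
# one-way ANOVA testing whether pre-test score bands are associated with ROI.
# All figures and groupings below are hypothetical illustrations.
from scipy import stats

# Hypothetical participants: pre-test score (0-100), estimated benefit ($), training cost ($)
participants = [
    {"pre": 35, "benefit": 4200.0, "cost": 1500.0},
    {"pre": 42, "benefit": 5100.0, "cost": 1500.0},
    {"pre": 58, "benefit": 3600.0, "cost": 1500.0},
    {"pre": 63, "benefit": 3900.0, "cost": 1500.0},
    {"pre": 78, "benefit": 2400.0, "cost": 1500.0},
    {"pre": 85, "benefit": 2100.0, "cost": 1500.0},
]

def roi_percent(benefit, cost):
    """Standard ROI formula: net benefit divided by cost, as a percentage."""
    return (benefit - cost) / cost * 100.0

# Group ROI values by pre-test band (low / mid / high); the band is the ANOVA factor.
bands = {"low": [], "mid": [], "high": []}
for p in participants:
    band = "low" if p["pre"] < 50 else "mid" if p["pre"] < 75 else "high"
    bands[band].append(roi_percent(p["benefit"], p["cost"]))

# One-way ANOVA: does mean ROI differ across pre-test score bands?
f_stat, p_value = stats.f_oneway(bands["low"], bands["mid"], bands["high"])
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```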