A Framework Supporting Educational Software Reuse:
Teacher Simulation Creation Environment

Cheryl D. Seals

Department of Computer Science
Virginia Tech

Blacksburg, Virginia
cseals@vt.edu


ABSTRACT

Teachers need more help in the classroom, and computers with the right educational software may be just the help they need. Sadly, in most classrooms the computer is used only for drill and practice. These activities can free the instructor from mundane tasks, but they do little to help students develop higher-order reasoning and problem-solving skills. Studies have shown that students learn substantially more from exploratory learning (e.g., simulations) than from drill-and-practice routines. This research studies educational software simulations and uses the results to create a visual programming environment that supports the creation of simulations by teachers who are novice programmers. The expectation is that this will increase the usability of programming environments and give teachers a tool that empowers them to create and modify their own software.

Keywords

Visual Programming, End User Programming, Program Construction Kits, Software Reuse

INTRODUCTION

Seymour Papert described the design of the Logo programming language as taking the best ideas of computer science about programming language design and "child engineering" them [5]. In the twenty-five years since Logo, there has been much progress in programming language research and in human-computer interfaces. In the area of end-user programming, however, there is still a need for improved usability: systems must be easy to learn, easy to use, flexible, and pleasurable to use [4]. The central tension in end-user programming is the relationship between ease of use and programming power. This has been studied with children as the target audience, but there has been almost no investigation of how end-user programming should be introduced to someone who is mature and skilled, but still a novice programmer [1, 3].

The target population of this work is secondary school teachers. We believe that empowering teachers would greatly improve educational software by enabling them to create and adapt it successfully. Soloway points to the lack of money for creating new educational software; one way to get new software into the classroom is to provide an environment that allows teachers to create it themselves [8]. To capitalize further on this venture, the current work will include a web-based repository that all teachers in the local or virtual community can reuse. The environment will be structured so that the reuse of components, agents, and even entire simulations is easily carried out.

RESEARCH APPROACH

The evaluation plan for this work has several phases. Intrinsic evaluations of several visual programming environments are carried out as a precursor to empirical evaluations. Next, empirical analyses will be performed in both laboratory and field settings. Based on these diverse analyses, a new environment will be built and evaluated in the public schools.

Intrinsic Evaluation with Scenarios and Claims

We have completed intrinsic evaluations of several visual end-user programming environments. The criterion for selecting these languages was that each represent a different type of educational simulation software, with at least one example from each of the following categories: simulation-specific software that is procedure-based, mathematics-based, or rule-based (e.g., StarLogo, ToonTalk, Model-It, AgentSheets, Cocoa); viewable/observational software, i.e., simulation environments whose functionality cannot be modified (e.g., SimCity, SimCalc); and observational software that are construction kits (e.g., ActivChemistry, Geometer's Sketchpad) [6]. An intrinsic evaluation is an analysis of an artifact directed at its functionality or features; Scriven contrasts this with empirical evaluations, which involve observing an artifact in use [7]. Our intrinsic analysis uses scenarios and claims analysis to understand the strengths and weaknesses of existing systems [2]. Assessing system features with claims analysis in a realistic context will provide guidance for system design.

In parallel with the intrinsic analysis, we have begun empirical evaluations of several environments. The empirical analysis is performed initially in a controlled laboratory setting, with two evaluators observing the experiment in progress. Each participant is trained using minimalist instruction and is asked to use the "think aloud" protocol during the session. Participants are trained on the basic use of the environment, how to build simulations in it, and how to reuse simulations. They are also given an interaction guide to aid error recovery and to review helpful shortcuts. One evaluator stays near the participant in case of serious breakdown. The second evaluator is located out of the participant's view, where he or she can record critical incidents without distracting the participant and can control the audio-visual equipment that records the session. The goal of the experiment is for teachers to create realistic simulations quickly, relying on their domain expertise and the new simulation-creation skills they have learned. Some observation-based empirical studies will also be performed in the usability lab and in teachers' classrooms.

Visual Programming Environment

The results of the intrinsic and empirical studies will provide requirements for a new visual programming environment that supports programming by the teacher community and provides a framework for reuse. The claims from the intrinsic analysis, together with information gained during the empirical studies, are being analyzed and will become requirements for a new visual end-user programming environment aimed at science simulations, one that mitigates the problems found in the systems studied.

The rationale for building simulations as educational material is very practical. As Kuyper argues [4], simulations are independent of time and place, which makes them more readily available than real experience. Simulations can also provide a better conceptual model of a situation and can be used to create virtual environments, taking learning into the realm of imagination [4]. The emphasis on reuse is tied to the need for teachers to benefit from one another's work. The current systems we have explored offer some low-level components that can be reused; we would like to explore the reuse of higher-level components, e.g., entire simulations. Finally, we want a system that will support an on-line teacher community. Our initial plan is to support reuse of simulations and their components at various levels.

RESEARCH STATUS

At present, intrinsic evaluations have been performed on three environments, and empirical studies have been carried out with graduate students and with teachers. The teacher studies consisted of two parts: Session One was a learning session to acquaint the user with the system; Session Two focused on reusing material to build new simulations. In general, users' reactions to the system were positive, and the majority of users identified applications for the simulation environment. They had little trouble with the visual aspects of the system, since all users were familiar with drawing programs. However, a number of problems were observed in the following areas: the variety of drawing tools provided, icons, rules/actions, graphical rewrite systems, and general environment issues (e.g., ease of direct manipulation). These findings have been used to develop requirements for the new system. For our initial studies we used a system that depended on graphical rewrite rules; it was very usable and had an antecedent/consequent style of rule making that is familiar to programmers. During our studies, however, the combination of rule creation and graphical rewrite rules confused novice users and led to much frustration. For our final system we are using Squeak, an object-oriented system that relies on message passing between objects instead of specific spatial constraints. System requirements and design are complete; an initial prototype is being created and will be tested during the first quarter of 2002.
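The contrast between the two styles can be sketched informally as follows. This is a purely illustrative sketch in Python pseudostyle; the names and data structures are our own inventions for exposition and are not drawn from Squeak or any of the systems studied.

```python
# Style 1: graphical rewrite rule -- behavior is tied to a spatial
# "before" pattern on a grid that is rewritten to an "after" pattern.
# Novices must get the spatial match exactly right for the rule to fire.
def apply_rewrite_rule(grid, before, after):
    """Scan the grid; wherever a row matches `before`, replace it with `after`."""
    return [after if row == before else row for row in grid]

# Style 2: message passing -- behavior lives inside the object itself,
# so no spatial pattern matching is required to trigger it.
class Raindrop:
    def __init__(self, y):
        self.y = y

    def receive(self, message):
        if message == "tick":  # each clock tick, the drop falls one cell
            self.y += 1
        return self.y

# Rewrite-rule version: "a cloud next to an empty cell produces a drop".
grid = [["cloud", "empty"], ["empty", "empty"]]
stepped = apply_rewrite_rule(grid, ["cloud", "empty"], ["cloud", "drop"])

# Message-passing version: simply tell each object that time has passed.
drop = Raindrop(y=0)
drop.receive("tick")
```

The point of the second style for our purposes is that a teacher edits behavior in one place (the object), rather than maintaining a set of spatially sensitive before/after pictures.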

Interim Conclusions

Among experienced programmers, inheritance is the normal method of reuse, but in a graphical context it can be problematic. For novice programmers, the intuitive approach is to learn by example and to use a copy/paste style of reuse. Our findings showed that novices benefited from reuse: it got them started much more quickly than building from scratch. Since our teachers have limited time, we believe that an environment supporting copy/paste reuse and rule templates, which let users easily create graphical simulations through direct manipulation, can provide a minimalist environment that supports teachers in creating specialized educational software to complement their lessons.
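The copy/paste style of reuse can be illustrated with a minimal sketch. This is hypothetical: the rule-template structure and all names below are our own, used only to show the pattern of duplicating a working example and editing the parts that differ.

```python
import copy

# A working rule template, as a teacher might find it in a shared repository.
predator_rule = {
    "actor": "shark",
    "condition": "sees fish",
    "action": "move toward fish",
}

# Copy/paste reuse: deep-copy the working rule, then change only the slots
# that differ -- no subclassing or inheritance hierarchy required.
hawk_rule = copy.deepcopy(predator_rule)
hawk_rule["actor"] = "hawk"
hawk_rule["condition"] = "sees mouse"
hawk_rule["action"] = "move toward mouse"
```

The deep copy matters: the original template is left untouched, so one teacher's edits cannot corrupt the shared example that others start from.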

FUTURE WORK

The initial requirements of the system are complete, and design and implementation have begun. The current plan is to complete implementation of the programming environment during the fall of 2001 and early spring of 2002. Users need mechanisms that facilitate the translation of their ideas into working simulations; to scope this work, we will concentrate on building an end-user programming environment that supports teachers in one content area. During the spring of 2002, pilot studies of this environment will be performed with graduate students. Finally, in the spring and summer of 2002, summative empirical evaluations will be performed with middle and high school science teachers in the usability lab and in their classrooms.

 

EXPECTATIONS FOR DOCTORAL CONSORTIUM

My expectations for the doctoral consortium are to share my research work and experiences with other doctoral students. This will be a great opportunity to foster research collaborations. It will also allow me to get feedback from a scholarly group that can objectively critique my work and help refine future research directions, and it will give me an opportunity to present my work to a larger community. My experience last year greatly helped my research by providing invaluable feedback.

 

ACKNOWLEDGMENTS

This work was supported by a grant from the National Science Foundation. I am grateful to my advisor, Dr. Mary Beth Rosson, for her intellectual stimulation and support of my work. I would also like to thank the other members of the HCI-Visual Programming Languages Group (Nathan Hamblen, Helena Mentis, and Stephanie Peppard), the graduate students at Virginia Tech who participated in our exploratory study, and the teachers from area schools who participated in the studies that form part of the formative evaluation. I also thank Alexander Repenning, Andri Ioannidou, and Jonathan Phillips of the Center for Lifelong Learning at the University of Colorado, Boulder, for all their help and for the opportunity to review and critique their software, as well as Mark Guzdial and the Squeakers at Georgia Tech.

 

REFERENCES

  1. Brand, C., Radar, C., Carlone, H., and Lewis, C. Prospects and challenges for children creating science models. Presented at the NARST 1998 Annual Meeting, San Diego, CA, April 1998.
  2. Carroll, J.M., and Rosson, M.B. Managing evaluation goals for training. Communications of the ACM, Vol. 38, No. 7, July 1995, 40-48.
  3. Gilmore, D.D., Pheasey, K., Underwood, J., and Underwood, G. Learning graphical programming: An evaluation of KidSim. ESRC Centre for Learning Research, Psychology Department, University of Nottingham.
  4. Kuyper, M. Knowledge Engineering for Usability: Model-Mediated Interaction Design of Authoring Instructional Simulations. Doctoral dissertation, University of Amsterdam, Department of Psychology, 1998.
  5. Papert, S. Mindstorms: Children, Computers, and Powerful Ideas. Basic Books, New York, 1980.
  6. Schmucker, K. A taxonomy of simulation software: A work in progress. Learning Technology Review, Spring 1999, 40-75.
  7. Scriven, M. The methodology of evaluation. In Perspectives of Curriculum Evaluation, R. Tyler, R. Gagne, and M. Scriven, Eds. Rand McNally, Chicago, 1967, 39-83.
  8. Soloway, E. Log on education: No one is making money in educational software. Communications of the ACM, Vol. 41, No. 2, 1998, 11-15.