AECT Handbook of Research

17. Educational games and simulations: Technology in search of a research paradigm

17.7 RECOMMENDATIONS FOR FUTURE RESEARCH

Early research on games and simulations typically compared a particular interactive exercise with regular classroom instruction, using information-oriented achievement tests as the outcome measure.

These "apples and oranges" comparisons did not yield definitive information about the effectiveness of the innovations. Further, key processes in the innovation often were not documented, student characteristics that may interact with the exercise in different ways were not investigated, and outcome measures often were poorly described.

Conducting definitive research on games and simulations, however, requires more than the specification of additional variables and the description of student interactions and outcome measures. Specifically, if an exercise is poorly designed, documentation of implementation and outcomes contributes little to an understanding of the potential of the innovation (Gredler, 1996). A three-step strategy is essential to conducting useful research on games and simulations: (1) document the design validity of the innovation; (2) verify the cognitive strategy and/or social interaction processes executed by students during the exercise in small-group tryout (formative evaluation), and redesign where necessary; and (3) conduct follow-up research on specific processes and effects (see Chapters 39 to 42 for specific approaches to research).

17.7.1 Document Design Validity

Several issues are important to the design validity of both games and simulations: (1) a reliance on a knowledge domain and subject-area expertise, (2) the exclusion of chance or random strategies as a means to success, and (3) the avoidance of mixed-metaphor exercises and zero-sum games.

Particularly important for simulations is the analysis of the mode of delivery and of the causal model for events in the exercise. Specifically, social-process simulations cannot be delivered by the computer, whereas any of the symbolic simulations is a legitimate computer-based exercise.

The causal model for the simulation, whether quantitative or qualitative, should reflect verifiable processes and interrelationships, not random events. Some early computer-based exercises inadvertently established Russian roulette situations for students in which the criteria for successful management are unknown to the participants. Students repeatedly assign values to a limited number of variables in the absence of a knowledge base and await the outcome. In such exercises, the students play against the odds established by the house (the computer program) instead of a real-world causal model (Gredler, 1992a).

Often, careful analysis is required to identify these variable-assignment exercises. An example is Lemonade Stand. In this exercise, students repeatedly assign values to only three variables: (1) the number of glasses of lemonade one wishes to sell, (2) the selling price per glass, and (3) the amount of advertising expenditures. Thus, the same limited number of decisions is made again and again. After students input their selections, the program compares them with its preprogrammed model and informs them of the amount of profit or loss for that day. Sometimes this figure is accompanied by the statement that the weather was cloudy and rainy that day; thus, little or no profit was earned.
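The structure of such a variable-assignment exercise is easy to make concrete. The short Python sketch below is a hypothetical illustration, not the actual Lemonade Stand program: the demand figures, the price ceiling, and the advertising effect are invented stand-ins for whatever weightings the original programmer selected. What it demonstrates is that the student's only path to success is discovering these hidden constants, because the outcome is computed from them and from a random weather draw rather than from a verifiable causal model.

    import random

    # Hypothetical hidden weightings; in the real exercise the programmer's
    # values are concealed from students, which is the design flaw at issue.
    PRICE_CEILING = 0.10    # price above which demand is assumed to collapse
    AD_EFFECT = 5           # assumed extra glasses demanded per advertising dollar
    COST_PER_GLASS = 0.02   # assumed cost of materials per glass

    def run_day(glasses_made, price, ad_spend):
        """One decision cycle: three inputs in, a single profit figure out."""
        weather = random.choice(["sunny", "cloudy", "rainy"])  # unknowable in advance
        base_demand = {"sunny": 100, "cloudy": 40, "rainy": 5}[weather]
        demand = base_demand + AD_EFFECT * ad_spend
        if price > PRICE_CEILING:
            demand //= 2    # hidden penalty for "overpricing"
        sold = min(glasses_made, int(demand))
        profit = sold * price - glasses_made * COST_PER_GLASS - ad_spend
        return weather, profit

    # The student's entire interaction: assign three values, await the verdict.
    weather, profit = run_day(glasses_made=50, price=0.05, ad_spend=2)
    print(f"Weather was {weather}; profit was ${profit:.2f}")

Nothing in this loop asks the student to reason about why demand behaves as it does; repeated play converges on the programmer's constants, not on a model of selling lemonade.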

The inadequacy of the causal model also is noted by Vargas (1986). The exercise omits important considerations, such as the components of the lemonade (how much sugar and lemon to use in the brew), location of the stand, and the fact that few people would set up a stand in rainy weather. Thus, the exercise leads the prospective user to expect more than it delivers (p. 742). Further, the exercise is simply a guessing game for students as they attempt to discover the variable weightings that were selected by the programmer.

A much more useful exercise would be one that engages students in planning for a lemonade stand, with weather data for past years, information on pedestrian traffic patterns, costs of resources, and so on made available. Such an exercise can engage and support students' cognitive strategies rather than reward guessing.

17.7.2 Verification of Cognitive Strategy and/or Social Interaction Processes

A game or simulation that meets design criteria should then be implemented with a small group of students to determine the actual behaviors that it precipitates. This practice is a long-standing and accepted tenet of instructional design. However, many computer products marketed for schools do not undergo this test of effectiveness.

Important information can be obtained in tryouts of a game or simulation with answers to these questions: Do the students become frustrated and lose interest because the exercise is too difficult for them? Does success depend on skills other than those intended by the designers? What unanticipated behaviors do students execute during the exercise? What are learner attitudes toward the game or simulation?

Particularly important is the issue of task difficulty. A symbolic simulation that challenges the learner's naive conceptions or requires sophisticated research strategies beyond the learner's level of expertise places a high cognitive demand on the learner. Such a situation, in which the learner may be thrashing around with few resources for resolution, may generate reactions such as "Why don't you just tell me what you want me to know?" (Perkins, 1991, p. 20).

Small-group tryout, in other words, is essential for determining whether intended processes and effects occur and the nature of unintended effects. Logistical difficulties in implementing the exercise also may be identified.

The researcher or designer then makes adjustments in the context for implementation, support materials for the exercise, or level of prerequisite knowledge and strategy use specified for the exercise, and implements the game or simulation with another small group. One alteration for a symbolic simulation, for example, may be to implement the exercise with a two-member team rather than as an individual exercise. In an experiential simulation, penalties for certain irresponsible actions may be added or the rules altered in order to deter unethical behavior.

17.7.3 Conduct Follow-up

Experiential exercises that meet design and formative evaluation criteria may then be further tested in group implementations. However, the type of research conducted on an exercise depends in part on the nature of the exercise and the purpose for which it was developed. Exercises designed to develop particular skills and capabilities traditionally provided by an existing instructional approach may be compared for effectiveness with that approach. For example, laboratory research simulations were developed as a viable alternative to the traditional "wet-lab" experience. In such a situation, comparisons of student performance in the computer-based and laboratory-based experiments, using laboratory reports and posttests, are logical. Similarly, system simulations that substitute computer-managed videodiscs for learner operation of actual system components may be compared with equipment-based instruction.

Similarly, business colleges often implement both data management simulations and case studies to provide students with experience in managing the finances of a company. Given similar goals, comparison research with these two types of exercises is legitimate. However, an important component of the research is to identify the types of student abilities, attitudes, and skills that interact with the exercises.

Other simulations, in contrast, typically address an instructional need that is not currently met by other forms of instruction. Diagnostic simulations, for example, were developed originally to bridge the gap between course work and hospital internships for medical students. Also, data universe and process simulations provide opportunities for students to conduct extended research and to confront their misconceptions and inadequate mental models. Such opportunities are not available in typical instructional situations.

For such simulations, a series of related exploratory studies is needed to determine the range and variety of reasoning and thinking strategies implemented by students and their effects. This research can make use of both quantitative and qualitative data. Pretests of domain knowledge and reasoning skills may be administered and then matched to the problem-solving strategies used by the students during the simulation, and to other information, to develop profiles of student interactions with these complex exercises.

Qualitative data in the form of analyses of students' problem-solving efforts may be obtained by (1) requesting students to verbalize their thoughts as they work through the exercise and (2) videotaping the sessions. Transcriptions of the videotapes are then analyzed and coded to identify the specific steps implemented by each student. Semistructured individual interviews with students after their session(s) with the simulation can shed light on their noncognitive reactions to the experience.
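As a sketch of how such coded data might be organized, the following Python fragment tallies the strategy codes assigned to each student's transcript and pairs the resulting frequencies with that student's pretest results. The coding scheme, the scores, and the student identifiers are all hypothetical, assumed only for illustration.

    from collections import Counter

    # Hypothetical strategy codes assigned to transcript segments, e.g.,
    # H = states a hypothesis, V = varies one variable systematically,
    # G = unsystematic guess.
    coded_transcripts = {
        "S01": ["H", "V", "V", "H", "V"],
        "S02": ["G", "G", "V", "G", "G"],
    }

    # Hypothetical pretest scores for domain knowledge and reasoning skill.
    pretests = {
        "S01": {"domain": 18, "reasoning": 14},
        "S02": {"domain": 9, "reasoning": 7},
    }

    # Match each student's pretest results to the frequency of each coded
    # strategy, yielding a profile of that student's interaction with the
    # simulation.
    for student, codes in coded_transcripts.items():
        profile = {"pretest": pretests[student], "strategies": Counter(codes)}
        print(student, profile)

Profiles of this kind permit the researcher to relate entering knowledge and reasoning skill to the strategies a student actually executed during the exercise.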

Such research is time consuming and painstaking. However, strategy-based games and experiential and symbolic simulations offer opportunities for learning that are qualitatively different from those of direct instruction. The challenge is to develop the associated knowledge base for complex student-directed learning.

