17. Educational games and simulations: Technology in search of a research paradigm
17.7 RECOMMENDATIONS FOR FUTURE RESEARCH

Early research on games and simulations typically compared the particular interactive exercise with regular classroom instruction on information-oriented achievement tests. These "apples and oranges" comparisons did not yield definitive information about the effectiveness of the innovations. Further, key processes in the innovation often were not documented, student characteristics that may interact with the exercise in different ways were not investigated, and outcome measures often were poorly described.

Conducting definitive research on games and simulations, however, requires more than the specification of additional variables and the description of student interactions and outcome measures. Specifically, if an exercise is poorly designed, documentation of implementation and outcomes contributes little to an understanding of the potential of the innovation (Gredler, 1996).

A three-step strategy is essential to conducting useful research on games and simulations. The steps are: (1) document the design validity of the innovation; (2) verify the cognitive strategy and/or social interaction processes executed by students during the exercise in small-group tryout (formative evaluation), and redesign where necessary; and (3) conduct follow-up research on specific processes and effects (see Chapters 39 to 42 for specific approaches to research).

17.7.1 Document Design Validity

Several issues are important in design validity for both games and simulations: (1) a reliance on a knowledge domain and subject-area expertise, (2) the exclusion of chance or random strategies as a means to success, and (3) the avoidance of mixed-metaphor exercises and zero-sum games.

Particularly important for simulations is the analysis of the mode of delivery and the causal model for events in the exercise. Specifically, social-process simulations cannot be delivered by the computer; however, any of the symbolic simulations are legitimate computer-based exercises. The causal model for the simulation, whether quantitative or qualitative, should reflect verifiable processes and interrelationships, not random events. Some early computer-based exercises inadvertently established Russian roulette situations for students, in which the criteria for successful management are unknown to the participants. Students repeatedly assign values to a limited number of variables in the absence of a knowledge base and await the outcome. In such exercises, the students play against the odds established by the house (the computer program) instead of a real-world causal model (Gredler, 1992a).

Often, careful analysis is required to identify these variable-assignment exercises. An example is Lemonade Stand. First, students repeatedly assign values to only three variables; thus, the same limited number of decisions is made again and again. The three variables are: (1) the number of glasses of lemonade one wishes to sell, (2) the selling price per glass, and (3) the amount of advertising expenditures. After students input their selections, the program compares them with the preprogrammed model and informs them of the amount of profit or loss for that day. Sometimes this figure is accompanied by the statement that the weather was cloudy and rainy that day; thus, little or no profit was earned.
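The variable-assignment structure described above can be illustrated with a brief sketch. The weightings, weather probability, and cost figures below are hypothetical and are not drawn from the actual Lemonade Stand program; the sketch shows only why such an exercise amounts to guessing against odds set by the program rather than reasoning from a causal model.

```python
import random

# Hypothetical sketch of a "variable-assignment" exercise; the weightings,
# weather probability, and cost figures are illustrative only, not those of
# the actual Lemonade Stand program.

HIDDEN_PRICE_WEIGHT = 0.6   # weightings selected by the programmer,
HIDDEN_AD_WEIGHT = 0.3      # unknown to the student
COST_PER_GLASS = 0.05       # assumed cost of producing one glass

def run_day(glasses_made: int, price: float, ad_spend: float) -> float:
    """Return the profit or loss for one day of student decisions."""
    # Demand is computed from weightings the student cannot see.
    demand = int(100 * HIDDEN_PRICE_WEIGHT / max(price, 0.01)
                 + HIDDEN_AD_WEIGHT * ad_spend)

    # A random "cloudy and rainy" day can erase sales no matter how sound
    # the student's decisions were -- the Russian roulette element.
    if random.random() < 0.25:
        demand = demand // 10
        print("The weather was cloudy and rainy today.")

    sold = min(glasses_made, demand)
    revenue = sold * price
    expenses = glasses_made * COST_PER_GLASS + ad_spend
    return revenue - expenses

# Each turn, the student simply guesses three numbers and awaits the outcome.
print(f"Profit or loss: ${run_day(glasses_made=40, price=0.25, ad_spend=1.50):+.2f}")
```

Because the weightings and the chance of rain are hidden, the student's only available strategy is trial and error; success reflects luck and persistence rather than reasoning from verifiable processes and interrelationships.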
The inadequacy of the causal model also is noted by Vargas (1986):

The exercise omits important considerations, such as the components of the lemonade (how much sugar and lemon to use in the brew), location of the stand, and the fact that few people would set up a stand in rainy weather. Thus, the exercise leads the prospective user to expect more than it delivers (p. 742).

Further, the exercise is simply a guessing game for students as they attempt to discover the variable weightings that were selected by the programmer. A much more useful exercise would be one that engages the students in planning for a lemonade stand, with weather data for past years as well as information on pedestrian traffic patterns, costs of resources, and so on made available. In this way, students' cognitive strategies may be facilitated.

17.7.2 Verification of Cognitive Strategy and/or Social Interaction Processes

A game or simulation that meets design criteria should then be implemented with a small group of students to determine the actual behaviors that it precipitates. This practice is a long-standing and accepted tenet of instructional design. However, many computer products marketed for schools do not undergo this test of effectiveness.

Important information can be obtained in tryouts of a game or simulation by answering these questions: Do the students become frustrated and lose interest because the exercise is too difficult for them? Does success depend on skills other than those intended by the designers? What unanticipated behaviors do students execute during the exercise? What are learner attitudes toward the game or simulation?

Particularly important is the issue of task difficulty. A symbolic simulation that challenges the learner's naive conceptions, or that requires sophisticated research strategies beyond the learner's level of expertise, places high cognitive demands on the learner. Such a situation, in which the learner may be thrashing around with few resources for resolution, may generate reactions such as, "Why don't you just tell me what you want me to know?" (Perkins, 1991, p. 20).

Small-group tryout, in other words, is essential for determining whether intended processes and effects occur and for identifying the nature of unintended effects. Logistical difficulties in implementing the exercise also may be identified. The researcher or designer then makes adjustments in the context for implementation, the support materials for the exercise, or the level of prerequisite knowledge and strategy use specified for the exercise, and implements the game or simulation with another small group. One alteration for a symbolic simulation, for example, may be to implement the exercise with a two-member team rather than as an individual exercise. In an experiential simulation, penalties for certain irresponsible actions may be added, or the rules altered, in order to deter unethical behavior.