AECT Handbook of Research

Table of Contents

40: Qualitative Research Issues and Methods: An Introduction for Educational Technologists

  Introduction
40.1 Introduction to Qualitative Research
40.2 Qualitative Research Methods
40.3 Analyzing Qualitative Data
40.4 Writing Qualitative Research Reports
40.5 Ethical Issues in Conducting Qualitative Research
40.6 Criteria for Evaluating Qualitative Studies
40.7 Learning More About Doing Qualitative Research
  References

40.6 CRITERIA FOR EVALUATING QUALITATIVE STUDIES

Criteria for evaluating the quality and rigor of qualitative studies vary somewhat with the methods used, but most concerns apply to most studies. Adler and Adler (1994) note that a primary criticism of observational studies, whether participant or nonparticipant methods are used, is the question of their validity, given the subjectivity and biases of the researcher. These authors contend that this concern is one reason studies based solely on observations are rarely published. They suggest that validity can be increased in three ways. Multiple observers working in teams can continually cross-check data and emerging patterns. The researcher can refine and test propositions and hypotheses throughout the study, as in a grounded theory approach. Finally, the researcher can write with "verisimilitude" or "vraisemblance" (p. 383), that is, writing that makes the world of the subjects real to the reader, so that the reader recognizes the authenticity of the results. Adler and Adler also address the issue of reliability in observational studies. They suggest conducting observations systematically and repeatedly under varying conditions, particularly varying time and place; reliability is verified by the emergence of similar results.

Borg and Gall (1989) listed several criteria for evaluating the quality of participant observation studies, including that:

  1. Involved participant observers are less likely to receive erroneous reports from the individuals or organizations studied.
  2. The researcher should have relatively free access to a broad range of activities.
  3. The observations should be intense, that is, conducted over a long period of time.
  4. In more recent studies, both qualitative and quantitative data are collected.
  5. Using a "triangulation of methodology" (p. 393), researchers can be assured that the picture they present of the reality of a setting or situation is clear and true. Multiple methods may be used to address research questions, but also, in line with Adler and Adler's (1994) recommendations for enhancing reliability, the same data may be collected from other samples at other times and in other places.
  6. Researchers should strive to gain an overall view of the issues and context, and then sample purposely in order to collect data that represent the range of realities of participants in those settings. Borg and Gall, as do others, caution that researchers be sensitive to both what is excluded as well as what is included.
  7. Finally, in all observational studies, they recommend that researchers be ready to observe, record, and analyze not just verbal exchanges but also subtle cues, using unobtrusive measures.

Ethical issues also relate to the quality of a study. Issues specific to conducting interviews are delineated by Fontana and Frey (1994). They add to the general concerns already mentioned the issues of informed consent and the right to privacy and protection. They mention that there is some debate regarding whether covert methods for gathering data are ethical although they may reflect real life. They describe the dilemma a researcher may face in deciding how involved to become with respondents and suggest some degree of situational ethics, cautioning that a researcher's participation may enable or inhibit certain behaviors or responses. Finally, they raise the issue that interviewing itself is manipulative, still treating human beings as objects.

Hammersley (1990) provides additional criteria for assessing ethnographic research, many of which apply to most qualitative studies. He puts forward two main criteria for judging ethnographic studies: validity and relevance. He discusses the validity of a study as meaning its "truth" and suggests three steps for assessing the validity of ethnographic findings or conclusions. He recommends asking, first, whether the findings or claims are reasonable; second, "whether it seems likely that the ethnographer's judgment of matters relating to the claim would be accurate given the nature of the phenomena concerned, the circumstances of the research, the characteristics of the researcher, etc." (p. 61); and, finally, in cases where the claim does not appear to be plausible or credible, examining the evidence for its validity. Clearly, reports of qualitative research studies must provide the reader enough information about the perspective, the sampling and choice of subjects, and the data collected to determine with some confidence the validity or "truth" represented in a study.

With regard to the second criterion, relevance, Hammersley advises that studies have broadly conceived public relevance or value. On a practical level, Nathan (1979, pp. 113-115), in Abt's book on the costs and benefits of applied social research, provides what he calls rules for relevant research. A selection includes:

  1. Be as evenhanded as you can.
  2. Focus on the most policy-relevant effects.
  3. When faced with a choice between the direct and the more elaborate expression of statistics and concepts, choose the former.
  4. Get your hands dirty.
  5. Be interdisciplinary.
  6. Sort out carefully description, analysis, and your opinions.

Lincoln and Guba (1985) describe criteria that are frequently cited for evaluating qualitative studies. They address the criticisms leveled at naturalistic research and determine that quality rests in the trustworthiness of the study and its findings. They agree with others that conventional criteria are inappropriate for qualitative studies, and they propose alternative criteria: (1) credibility, (2) transferability, (3) dependability, and (4) confirmability. These authors go on to recommend activities the researcher may undertake to ensure that these criteria will be inherent in the study. In particular, to make credible findings more likely, they recommend prolonged engagement, persistent observation, and triangulation. Further, they recommend peer debriefing about the study and its methods, opening the researcher and the methods up for review. They also recommend analyzing negative cases in order to revise hypotheses; testing for referential adequacy by building in the critical examination of findings and their accompanying raw data; and checking data, the categories used in analysis, and interpretations and findings with members of the subject audience.

Lincoln and Guba (1985) provide a similar level of helpful suggestions in the area of ensuring confirmability. They recommend triangulation with multimethods and various sources of data, keeping a reflexive journal and, most powerfully, conducting a confirmability audit. In their book, they include detailed descriptions of the steps in conducting an audit and recommend the following categories of data that can be used in the audit, including raw data, products of data analysis, products of the synthesis of data such as findings and conclusions, process notes, personal notes about intentions, and information about how instruments were developed.

In the tradition of Lincoln and Guba, Erlandson et al. (1993) describe these techniques for ensuring the quality of a study:

  • Prolonged engagement
  • Persistent observation
  • Triangulation
  • Referential adequacy
  • Peer debriefing
  • Member checking
  • Reflexive journal
  • Thick description
  • Purposive sampling
  • Audit trail

The Association for Educational Communications and Technology (AECT) has shown strong support for qualitative research in the field. For several years, the ECT Foundation and the Research and Theory Division supported the Special Research Award. The ECT Foundation has recently decided to support the creation of a Qualitative Research Award. Ann De Vaney (1995), currently the chair of this award committee, provided the following criteria, developed by numerous AECT members, which will be used to evaluate the quality of papers submitted for this award:

  1. Is the problem clearly stated? Does it have theoretical value and currency? Does it have practical value?
  2. Is the problem or topic situated in a theoretical framework? Is the framework clear and accessible? Does the document contain competing epistemologies or other basic assumptions that might invalidate claims?
  3. Is the literature review a critique or simply a recapitulation? Is it relevant? Does it appear accurate and sufficiently comprehensive?
  4. Are the theses stated in a clear and coherent fashion? Are they sufficiently demonstrated in an accessible manner? Are there credible warrants to claims made about the theses?
  5. Does the method fit the problem, and is it an appropriate one given the theoretical framework?
  6. Do the data collected adequately address the problem? Do they make explicit the researcher's role and perspective? Do the data collection techniques have a "good fit" with the method and theory?
  7. Are the data aggregates and analysis clearly reported? Do they make explicit the interpretive and reasoning process of the researcher?
  8. Does the discussion provide meaningful and warranted interpretations and conclusions?

Lest it appear that there is universal agreement about these quality criteria, it may be noted that the postmodern trend toward questioning and deconstruction has led to continued debate in this area. Wolcott, in his book about transforming qualitative data (1994, pp. 348-356), argues for rejecting validity in qualitative research, and then describes activities he undertakes to address the challenge of validity. These include "talk a little, listen a lot... begin writing early... let readers 'see' for themselves... report fully... be candid... seek feedback... try to achieve balance... write accurately."


Updated August 3, 2001
Copyright © 2001
The Association for Educational Communications and Technology
