A Rationale for Using Synthetic Designs in Medical Education Research
The extent to which the results of a study can be attributed to the intervention under investigation (i.e., internal validity) is an important consideration in interpreting study findings. There are many threats to the internal validity of designs frequently used in medical education research. Synthetic designs, which integrate two or more weak designs or add design elements, may afford investigators greater control over confounding variables in medical education research. A rationale for using synthetic designs is presented, and two examples of their use in medical education settings are examined. The concluding proposition is that synthetic designs allow investigators flexibility in planning research that is feasible in medical education settings. In addition, they may permit stronger causal inferences between interventions and results than traditional research designs.