Abstract

The framework for the implementation of effective instruction depends upon three areas of competence: educational foundations, methodology, and process. Traditional methodologies have tried to accommodate both foundations (theory) and process, but a serious problem arises when static methodologies are applied to dynamic environments. To emphasize this point, we examine lessons learned by instructional designers in the development of one class of technology-supported learning: computer-based training (CBT). These lessons cover the ‘how’, the ‘what’, the ‘who’ and the ‘why’ of CBT development and delivery. The purpose of reporting them is to describe conditions, not considered in traditional methodologies, that were associated with sub-optimal outcomes; these conditions are the basis for our conclusion that static processes were applied to dynamic environments. The resulting ambiguity among methodologies suggests a need to develop and establish a more dynamic methodology, one that reflects the dynamics of learning processes as well as the dynamics of development processes. When these dynamics inform planning and implementation, the outcome is likely to be more effective learning.

Copyright information

© 2000 Kluwer Academic Publishers

About this chapter

Cite this chapter

Dunnagan, C.B., Christensen, D.L. (2000). Static and Dynamic Environments. In: Spector, J.M., Anderson, T.M. (eds.), Integrated and Holistic Perspectives on Learning, Instruction and Technology. Springer, Dordrecht. https://doi.org/10.1007/0-306-47584-7_4

  • DOI: https://doi.org/10.1007/0-306-47584-7_4

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-0-7923-6705-5

  • Online ISBN: 978-0-306-47584-9
