
Part of the book series: Evaluation in Education and Human Services (EEHS, volume 42)

Abstract

The title Assessment 2000 would have sounded like science fiction a few decades ago, an opportunity to use my imagination in making creative and wild speculations about assessment in a distant future. With less than half a decade remaining before that date, however, this chapter offers more modest and careful speculations, based on contemporary theories and on lessons gained from current practice. It starts by introducing the most generic term currently used in the educational literature with respect to assessment, i.e., alternative assessment. It briefly explains to what, and why, an alternative is sought, and describes the main features of this type of assessment as it is currently viewed. Of the various devices subsumed under the alternative-assessment umbrella, the focus is on the portfolio, describing its various types, uses, and criteria for judgment. Next, criteria for evaluating alternative assessment and lessons to be learnt from current practice are discussed, and finally a rationale for a pluralistic approach to assessment is presented.




Copyright information

© 1996 Springer Science+Business Media New York


Cite this chapter

Birenbaum, M. (1996). Assessment 2000: Towards a Pluralistic Approach to Assessment. In: Birenbaum, M., Dochy, F.J.R.C. (eds) Alternatives in Assessment of Achievements, Learning Processes and Prior Knowledge. Evaluation in Education and Human Services, vol 42. Springer, Dordrecht. https://doi.org/10.1007/978-94-011-0657-3_1


  • DOI: https://doi.org/10.1007/978-94-011-0657-3_1

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-4287-1

  • Online ISBN: 978-94-011-0657-3

  • eBook Packages: Springer Book Archive
