Collaborative knowledge building using the Design Principles Database

Abstract

In this study we describe a mechanism for supporting a community of learning scientists who explore educational technologies, by helping them share and collaboratively build design knowledge. The Design Principles Database (DPD) is intended to be built and used by this community, providing an infrastructure for participants to publish, connect, discuss, and review design ideas, and to use these ideas to create new designs. The potential of the DPD to serve as a collaborative knowledge-building endeavor is illustrated by an analysis of a CSCL study focused on peer-evaluation. The analysis demonstrates how the researchers of the peer-evaluation study used the DPD in three phases. In the first phase, design principles were articulated based on a literature review and contributed to the DPD. In the second phase, a peer-evaluation activity was designed based on these principles, then enacted and revised across a three-iteration study. In the third phase, lessons learned through these iterations were fed back into the DPD. The analysis indicates that such processes can contribute to the collaborative development of design knowledge in the learning sciences community. Readers of ijCSCL are invited to take part in this endeavor and share their design knowledge with the community.

Keywords

Design-based research · Design principles · Collaborative knowledge building · Peer-evaluation

Copyright information

© International Society of the Learning Sciences, Inc.; Springer Science + Business Media, LLC 2006

Authors and Affiliations

Department of Education in Science and Technology, Technion – Israel Institute of Technology, Haifa, Israel