How university teachers design assessments: a cross-disciplinary study
There is a dissonance between educators’ aspirations for assessment design and actual assessment implementation in higher education. Understanding how assessment is designed ‘on the ground’ can help resolve this tension. Thirty-three Australian university educators from a range of disciplines and institutions were interviewed. A thematic analysis of the transcripts indicated that assessment design begins as a response to an impetus for change. The design process itself was shaped by environmental influences, the circumstances surrounding the assessment design, and professional influences, the factors that educators themselves bring to the process. A range of activities was undertaken: those essential to all assessment design, more selective activities that educators chose in order to optimise the assessment process in particular ways, and meta-design processes that educators used to respond dynamically to environmental influences. This qualitative description indicates the complex social nature of interwoven personal and environmental influences on assessment design, and the value of explicit and strategic ways of thinking within the constraints and affordances of a local environment. It suggests that relational forms of professional development which build strategic approaches to assessment may be beneficial. The role of disciplinary approaches may be significant and remains an area for future research.
Keywords: Assessment · Academic context · Academic experiences · Academic practice · Teaching skills
This work was supported by the Office for Learning and Teaching under Grant ID12-2254. The authors have no financial interests in, or benefits arising from, any direct application of this work.