Investigating Heuristic Evaluation as a Methodology for Evaluating Pedagogical Software: An Analysis Employing Three Case Studies

  • Mike Brayshaw
  • Neil Gordon
  • Julius Nganji
  • Lipeng Wen
  • Adele Butterfield
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8523)

Abstract

This paper examines how to develop lightweight methods for evaluating pedagogically motivated software. Whilst we value traditional usability testing methods, this paper looks at how Heuristic Evaluation can be used both as a driving force for iterative refinement during software engineering and as an end-of-project evaluation. We present three case studies in the area of pedagogical software and show how we have used this technique in a variety of ways. The paper presents results and reflections on what we have learned. We conclude with a discussion of how this technique might inform the latest developments in the delivery of distance learning.

Keywords

Heuristic evaluation, pedagogy, pedagogical software, disability, technology enhanced learning, flexible learning



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Mike Brayshaw (1)
  • Neil Gordon (1)
  • Julius Nganji (1)
  • Lipeng Wen (1)
  • Adele Butterfield (1)

  1. Department of Computer Science, University of Hull, Hull, United Kingdom
