Heuristic Evaluation for Serious Immersive Games and M-instruction

  • Neil Gordon
  • Mike Brayshaw
  • Tareq Aljaber
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9753)


Two fast-growing areas for technology-enhanced learning are serious games and mobile instruction (M-instruction or M-learning). Serious games are games that are meant to be more than just entertainment: they have a serious purpose, such as education or the promotion of other kinds of activity. Immersive games frequently involve many players interacting in a shared, rich, and complex (perhaps web-based) mixed-reality world, where their circumstances will be multiple and varied. Their reality may be augmented and often self-composed, as with a user-defined avatar in a virtual world. M-instruction, or M-learning, is learning on the move; much modern computer use happens on smart devices, tablets, and laptops. People use these devices everywhere, so it is a natural extension to want to use them to learn wherever they are. This presents a problem if we wish to evaluate the effectiveness of the pedagogic media being used: we have no way of knowing users' situations, circumstances, educational backgrounds and motivations, or how they may have customised the final software they are using. Reaching the end user at all may also be problematic, since these are learning environments that people dip into at opportune moments. If access to the end user is hard because of location and self-personalisation, then one solution is to examine the software before it goes out. Heuristic evaluation allows us to have User Interface (UI) and User Experience (UX) experts reflect on the software before it is deployed. The effective use of heuristic evaluation with pedagogical software [1] is extended here with existing heuristic evaluation methods, making the technique applicable to serious immersive games and mobile instruction (M-instruction). We also consider how existing heuristic methods may be adapted. The result is a new way of making this methodology applicable to this developing area of learning technology.
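The heuristic evaluation method referred to above follows a concrete protocol: several UI/UX experts independently inspect the software against a checklist of heuristics, assign each problem found a severity rating (Nielsen's 0–4 scale), and the ratings are aggregated to prioritise fixes. The following is a minimal illustrative sketch of that aggregation step; the function name and the example problems are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of aggregating heuristic-evaluation results.
# Each expert rates every usability problem on the 0-4 severity scale
# (0 = not a problem ... 4 = usability catastrophe); problems are then
# ranked by mean severity so the most serious issues are addressed first.
from statistics import mean


def rank_problems(ratings):
    """ratings maps a problem description to a list of per-evaluator
    severity scores; returns (problem, mean severity) pairs, worst first."""
    averaged = {problem: mean(scores) for problem, scores in ratings.items()}
    return sorted(averaged.items(), key=lambda item: item[1], reverse=True)


# Illustrative data only -- not findings from the paper.
ratings = {
    "No undo in avatar editor": [4, 3, 4],
    "Jargon in tutorial prompts": [2, 2, 3],
    "Low-contrast score display": [1, 2, 1],
}

for problem, severity in rank_problems(ratings):
    print(f"{severity:.2f}  {problem}")
```

Averaging over independent evaluators matters because, as Nielsen and Molich report [17], single evaluators find only a fraction of the usability problems; aggregation pools their partial coverage.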


Keywords: Heuristic evaluation · Serious games · M-instruction · M-learning


References

  1. Brayshaw, M., Gordon, N., Nganji, J., Wen, L., Butterfield, A.: Investigating heuristic evaluation as a methodology for evaluating pedagogical software: an analysis employing three case studies. In: Zaphiris, P., Ioannou, A. (eds.) LCT 2014, Part I. LNCS, vol. 8523, pp. 25–35. Springer, Heidelberg (2014)
  2. Zyngier, D.: (Re)conceptualising student engagement: doing education not doing time. Teach. Teach. Educ. 24(7), 1765–1776 (2008)
  3. Gordon, N.: Enabling personalised learning through formative and summative assessment. In: Technology-Supported Environments for Personalized Learning: Methods and Case Studies, pp. 268–283. Information Science Publishing, Hershey (2009)
  4. Susi, T., Johannesson, M., Backlund, P.: Serious games: an overview. Technical report, School of Humanities and Informatics, University of Skövde, Sweden (2007)
  5. Anderson, J.R.: Learning to program in LISP. Cogn. Sci. 8, 87–129 (1984)
  6. O’Shea, T., Self, J.: Learning and Teaching with Computers: Artificial Intelligence in Education. The Harvester Press Limited, Brighton (1983)
  7. Papert, S.: Mindstorms: Children, Computers, and Powerful Ideas. The Harvester Press, Brighton (1980)
  8. Skinner, B.F.: The technology of teaching. Proc. R. Soc. Ser. B 162, 427–443 (1965)
  9. Spector, J.M., Davidsen, P.I.: Designing technology enhanced learning environments. In: Instructional and Cognitive Impacts of Web-Based Education, pp. 241–261 (2000)
  10. Dede, C.: Immersive interfaces for engagement and learning. Science 323(5910), 66–69 (2009)
  11. Quinn, C.: mLearning: mobile, wireless, in your pocket learning. LineZine, Fall 2000 (2000)
  12. Greenfield, A.: Everyware: The Dawning Age of Ubiquitous Computing. New Riders, Berkeley (2010)
  13. Gordon, N.: Flexible Pedagogies: Technology-Enhanced Learning. Higher Education Academy, NIACE (2014)
  14. Nielsen, J.: Usability inspection methods. In: Conference Companion on Human Factors in Computing Systems. ACM (1994)
  15. Nielsen, J.: How to conduct a heuristic evaluation (online) (2015)
  16. Nielsen, J., Mack, R.L.: Heuristic evaluation. In: Usability Inspection Methods, pp. 25–62. Wiley, New York (1994)
  17. Nielsen, J., Molich, R.: Heuristic evaluation of user interfaces. In: Proceedings of ACM CHI 1990 Conference, Seattle, WA, 1–5 April, pp. 249–256 (1990)
  18. Benson, L., Dean, E., Grant, M., Holschuh, D., Kim, B., Kim, H., Lauber, E., Loh, S., Reeves, T.: Heuristic evaluation instrument and protocol for e-learning programs (online) (2001)
  19. Squires, D., Preece, J.: Predicting quality in educational software: evaluating for learning, usability and the synergy between them. Interact. Comput. 11, 467–483 (1999)
  20. Wen, L., Brayshaw, M., Gordon, N.: Personalized content provision for virtual learning environments via the semantic web. Innovations Teach. Learn. Inf. Comput. Sci. 11, 14–27 (2012)
  21. Nganji, J.T., Brayshaw, M.: Designing and reflecting on disability-aware e-learning systems: the case of ONTODAPS. In: IEEE 14th International Conference on Advanced Learning Technologies (ICALT), Athens, Greece (2014). doi:10.1109/ICALT.2014.167
  22. Butterfield, A.M., Brayshaw, M.: A pedagogically motivated guided inquiry based tutor for C#. In: Gordon, N.A., Graham, D. (eds.) Proceedings of the HEA STEM (Computing) Learning Technologies 2014 Workshop, University of Hull (2014)
  23. Michael, D.R., Chen, S.L.: Serious Games: Games that Educate, Train, and Inform. Muska & Lipman/Premier-Trade, New York (2005)
  24. Egenfeldt-Nielsen, S.: Beyond Edutainment: Exploring the Educational Potential of Computer Games (2005)
  25. Gordon, N., Brayshaw, M., Grey, S.: Maximising gain for minimal pain: utilising natural game mechanics. Innovation Teach. Learn. Inf. Comput. Sci. 12(1), 27–38 (2013)
  26. Taylor, J.: Towards a task model for mobile learning: a dialectical approach. Int. J. Learn. Technol. 2(2), 138–158 (2006)
  27. Aljaber, T., Gordon, N., Kambhampati, C., Brayshaw, M.: An evaluation framework for mobile health education software. In: Proceedings of the 2015 Science and Information Conference (2015)
  28. Holzinger, A., Nischelwitzer, A., Meisenberger, M.: Lifelong-learning support by M-instruction: example scenarios. eLearn 11, 2 (2005)
  29. Korn, M., Bødker, S.: Looking ahead: how field trials can work in iterative and exploratory design of ubicomp systems. In: Proceedings of the 2012 ACM Conference on Ubiquitous Computing. ACM (2012)
  30. Silva, J.L., Campos, J., Harrison, M.: Formal analysis of ubiquitous computing environments through the APEX framework. In: Proceedings of the 4th ACM SIGCHI Symposium on Engineering Interactive Computing Systems. ACM (2012)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Computer Science, University of Hull, Hull, UK