Evaluating the Next Generation of Multimedia Software
Developing new multimedia software applications is a fascinating but challenging task, depending as it does on the thorough evaluation of new concepts and prototypes. As complexity and choice increase, future development will become still more challenging for emerging intelligent multimedia and virtual reality applications, including Web 3D. In the past we have been able to rely on generic evaluative heuristics, but are they generalisable to newer generations of software? In this study, participants saw demonstrations of two interactive virtual reality multimedia applications, Second Life and 3D Mailbox, and used them to generate questionnaire items capturing the aspects of such applications that mattered most to them. Participants did not find it difficult to generate questions grounded in their own experience, and the resulting items were validated by significant levels of internal consistency across dependent variables. Surprisingly, however, the new heuristics bore little resemblance to traditional or current cognitive items: the overwhelming influence was the immersive impact of such applications rather than standard design issues or cognitive user factors. Clearly, new software innovations require equally new evaluation techniques in general and context-specific heuristics in particular, yet we do not have a conceptual foundation on which to base them.
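The internal-consistency validation mentioned above is typically assessed with a reliability coefficient such as Cronbach's alpha. The abstract does not specify the statistic used, so the following is only an illustrative sketch of how alpha could be computed for a matrix of questionnaire responses (respondents × items); the sample data are invented.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of items
    item_vars = scores.var(axis=0, ddof=1)        # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical ratings from 5 respondents on 3 related items (e.g. immersion questions)
ratings = np.array([
    [2, 3, 3],
    [4, 4, 5],
    [1, 2, 2],
    [5, 5, 4],
    [3, 3, 3],
])
print(round(cronbach_alpha(ratings), 2))  # high alpha: items vary together
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the kind of evidence the study appeals to when treating participant-generated items as coherent heuristics.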