“How Do I Evaluate THAT?” Experiences from a Systems-Level Evaluation Effort

  • Pardha S. Pyla
  • H. Rex Hartson
  • Manuel A. Pérez-Quiñones
  • James D. Arthur
  • Tonya L. Smith-Jackson
  • Deborah Hix
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5619)

Abstract

In this paper we describe our experience deriving evaluation metrics for a systems-level framework called Ripple that connects software engineering and usability engineering life cycles. The evaluation was conducted with eight teams of graduate students, working under four types of development models, competing in a joint software engineering and usability engineering course to create a software solution for a real-world client. We describe the challenges of evaluating systems-level frameworks and the approach we used to derive metrics given our evaluation context. We conclude with the outcome of this evaluation and the effectiveness of the metrics we employed.
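The metric-derivation approach referred to here follows the goal-question-metric (GQM) paradigm named in the keywords. The sketch below is a minimal, hypothetical illustration of how a GQM hierarchy can be represented and traversed; the goal, questions, and metrics are placeholders invented for illustration and are not taken from the Ripple evaluation itself.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical GQM hierarchy: a goal is refined into questions, and each
# question is answered by one or more measurable metrics. All names and
# strings below are illustrative placeholders, not the paper's actual metrics.

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class Question:
    text: str
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: List[Question] = field(default_factory=list)

goal = Goal(
    purpose="Assess whether an integrated SE-UE life cycle improves team outcomes",
    questions=[
        Question(
            text="Do integrated teams detect requirement mismatches earlier?",
            metrics=[Metric("time_to_first_mismatch", "days")],
        ),
        Question(
            text="How much rework follows usability evaluation?",
            metrics=[Metric("rework_effort", "person-hours")],
        ),
    ],
)

# Walk the hierarchy to list which metric answers which question for the goal.
for question in goal.questions:
    for metric in question.metrics:
        print(f"{question.text} -> {metric.name} ({metric.unit})")
```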

Keywords

Systems-level evaluation · Evaluation metrics · Goal-question metric



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Pardha S. Pyla (1)
  • H. Rex Hartson (1)
  • Manuel A. Pérez-Quiñones (1)
  • James D. Arthur (1)
  • Tonya L. Smith-Jackson (1)
  • Deborah Hix (1)

  1. Center for Human-Computer Interaction, Virginia Tech, Blacksburg, USA
