
A Formative Evaluation of a Comprehensive Search System for Medical Professionals

  • Veronika Stefanov
  • Alexander Sachs
  • Marlene Kritz
  • Matthias Samwald
  • Manfred Gschwandtner
  • Allan Hanbury
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8138)

Abstract

Medical doctors need rapid and accurate answers, which they cannot easily find with current search systems. This paper describes a formative evaluation of a comprehensive search system for medical professionals. The study was designed to guide system development. The system features included search in text and 2D images, machine-translated summaries of search results, query disambiguation and suggestion features, and a comprehensive search user interface. The study design emphasizes qualitative user feedback based on realistic simulated work tasks, with data collected through spontaneous and prompted self-reports, written and spoken feedback in response to questionnaires, as well as audio and video recordings and log files. Results indicate that this is a fruitful approach to uncovering problems and eliciting requirements that would be harder to find in a component-based evaluation testing each feature separately.

Keywords

Pilot Test, Query Text, Task Scenario, Interactive Information Retrieval, Multilingual Environment



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Veronika Stefanov (1)
  • Alexander Sachs (2)
  • Marlene Kritz (2)
  • Matthias Samwald (3)
  • Manfred Gschwandtner (2)
  • Allan Hanbury (1)

  1. Information & Software Engineering Group, Institute of Software Technology and Interactive Systems, Vienna University of Technology, Vienna, Austria
  2. Society of Physicians in Vienna, Vienna, Austria
  3. Section for Medical Expert and Knowledge-Based Systems, Center for Medical Statistics, Informatics, and Intelligent Systems, Medical University of Vienna, Vienna, Austria
