Digital Libraries: A Generic Classification and Evaluation Scheme
Evaluation of digital libraries (DLs) is essential for further development in this area. Whereas previous approaches were restricted to certain facets of the problem, we argue that evaluation of DLs should be based on a broad view of the subject area. To this end, we develop a new description scheme using four major dimensions: data/collection, system/technology, users, and usage. For each of these dimensions, we describe the major attributes. Using this scheme, existing DL test beds can be characterised. For this purpose, we have performed a survey by means of a questionnaire; this effort is now being continued by setting up a DL meta-library.
Keywords: Digital Library, Evaluation Scheme, Description Scheme, Test Collection, Physical Library
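The four-dimensional description scheme lends itself to a simple record representation when characterising individual test beds. The following Python sketch is purely illustrative: the class name, attribute values, and example entries are our own assumptions, not taken from the paper; only the four dimension names come from the abstract.

```python
from dataclasses import dataclass, field

@dataclass
class DLTestBed:
    """Characterisation of a digital-library test bed along the four
    dimensions of the description scheme: data/collection,
    system/technology, users, and usage. Attribute contents are
    free-form here; the paper enumerates concrete attributes for each
    dimension."""
    name: str
    data_collection: dict = field(default_factory=dict)    # e.g. media types, size
    system_technology: dict = field(default_factory=dict)  # e.g. architecture, protocols
    users: dict = field(default_factory=dict)              # e.g. user groups, expertise
    usage: dict = field(default_factory=dict)              # e.g. tasks, access patterns

# Hypothetical example entry for a DL meta-library (values invented):
example = DLTestBed(
    name="Example text collection",
    data_collection={"media": "text", "language": "English"},
    system_technology={"access": "web-based"},
    users={"groups": ["researchers", "students"]},
    usage={"tasks": ["known-item search", "topical search"]},
)
```

Such records could then be collected and filtered to compare test beds along any single dimension, which is essentially what a DL meta-library built on this scheme would support.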