International Journal on Digital Libraries, Volume 17, Issue 4, pp 273–286

Results of a digital library curriculum field test

  • Sanghee Oh
  • Seungwon Yang
  • Jeffrey P. Pomerantz
  • Barbara M. Wildemuth
  • Edward A. Fox
Open Access


Abstract

The DL Curriculum Development project was launched in 2006, responding to an urgent need for consensus on DL curriculum across the fields of computer science and information and library science. Over the course of several years, 13 modules of a digital libraries (DL) curriculum were developed and made ready for field testing. The modules were evaluated in real classroom environments, in 37 DL classes taught by 15 instructors. Interviews with instructors and questionnaires completed by their students were used to collect evaluative feedback. Findings indicate that, in general, the modules were well designed to educate students on important topics and issues in DLs. Suggestions for improving the modules, based on the interviews and questionnaires, are discussed as well. Since the field test, module development has continued, serving not only the DL community but also related communities, such as information retrieval, big data, and multimedia. Currently, 56 modules are readily available for use through the project website or the Wikiversity site.


Keywords: Digital libraries · Curriculum evaluation · Educational resources · Computer science · Information science

1 Introduction

Digital libraries (DLs) have been a perennial topic of research and education in the fields of both Computer Science (CS) and Information and Library Science (ILS). The concept of digital libraries is quite broad, covering a variety of aspects of information theories, technology, and applications, making it difficult to determine what should be taught about digital libraries in higher education [1, 2]. This is a particular concern for graduates in both CS and ILS: DLs present opportunities for graduates to apply new technologies to the development of information systems, or to provide information services to diverse and widely distributed populations who need access to digital information.

The DL Curriculum Development project was launched in 2006, responding to an urgent need for consensus on DL curriculum across the fields [3, 4, 5]; it can serve as an integrative and firm foundation for DL education in both ILS and CS. Two top-ranked programs in DL education, the School of Information and Library Science at the University of North Carolina at Chapel Hill (UNC-CH) and the Department of Computer Science at Virginia Tech, have led this project, with funding from the National Science Foundation. Many experts in the area of DLs have contributed to developing and evaluating the DL curriculum. The goal of this project was to develop and validate a curriculum covering the area of DLs. This led to the development of educational materials for the DL curriculum that are applicable in graduate-level DL classes in both CS and ILS departments. The project was officially completed in 2009, but the DL community has continued to develop the curriculum further.

This project first proposed a framework for a DL curriculum, including 10 core topics and 51 sub-topics in DLs, based on a review of past DL course syllabi obtained from both CS and ILS departments at various institutions. Within the framework, 13 sub-topics were initially selected to be developed into “modules” that would support DL instructors. In this context, modules are focused lesson plans, including learning objectives, a body of knowledge, learning activities, resources, and logistics for managing a class session. The 13 modules were developed collaboratively by researchers at UNC-CH and Virginia Tech, together with the project’s advisory board of about 50 experts and instructors who teach about digital libraries around the world. They have been described elsewhere [6, 7, 8] and are available on the project website and in Wikiversity.

Once module development was completed, DL instructors and their students used the modules in real classrooms and evaluated their effectiveness in enhancing students’ learning about DLs. The purpose of this paper is to present the findings from this field testing of the modules. Through instructor interviews and student questionnaires, 15 instructors and their students in DL courses participated in the field testing and shared their experiences using one or more of the 13 modules. Participants provided positive feedback on the modules’ effectiveness and gave constructive suggestions for improvement.

This paper is organized as follows. Section 2 provides some background on DL education. The project overview is described in Sect. 3. Our methods and study results are presented in Sects. 4 and 5, respectively, followed by further discussions in Sect. 6. Section 7 covers implications, and we present further enhancements and conclusions in Sect. 8.

2 DL education

Research in DLs has been thriving since the early 1990s. A significant amount of funding from the NSF, DARPA, and NASA, in particular through a federal program called the Digital Library Initiatives (DLI), has been invested to support DL research and practice [9]. Many DL research institutions and universities have benefited from this funding and have contributed to developing advanced information systems and applications for DLs [10, 11]. A number of DL research projects and programs have been conducted, not only in the US but also internationally, covering DL services, systems, development, management, implementation, and evaluation [12]. Among these topics, DL education is crucial because it is closely tied to the success or failure of future DLs. Without appropriate training and education, DL developers risk building software that is seriously flawed because they are not fully aware of critical system requirements [13], efficient and effective implementation techniques [14, 15, 16, 17, 18], or key ingredients of success [19]. A poorly designed DL can cause problems in usability and interoperability [20, 21, 22, 23, 24], and this can jeopardize its suitability [25] or long-term viability with regard to digital preservation [26].

Some progress in assessing the status of, and emerging demand for, DL education has been made since the late 1990s. In 1999, Spink and Cool surveyed faculty in ILS and CS schools worldwide [27] and found that about 20 institutions from eight countries (12 in the US and one each from Canada, New Zealand, the UK, Malaysia, Singapore, Australia, and Brazil) offered DL courses. Most were at the graduate level, covering a variety of DL topics, such as theoretical and historical foundations, technical infrastructure, knowledge organization, collection development, information access and utilization, social, economic, and policy issues, and professional issues. Two years later, Saracevic and Dalbello found that, among 56 American Library Association accredited ILS schools in the US and Canada, 15 programs offered independent DL courses and 32 programs included DLs as a topic within other courses to some degree, with a focus on information technology, integrating DLs with the foundations, knowledge presentation, and archives areas of ILS programs [1].

In the 2000s, the need to develop a formal curriculum with integrative and interdisciplinary topics spanning ILS and CS for DL education emerged and was widely discussed [1, 28, 29]. A number of textbooks [30, 31, 32, 33, 34, 35, 36] and other references were published in order to provide appropriate guidance for DL education [37]. The number of schools offering DL courses worldwide increased to 42 in 2003 [38]. Topics pertinent to DLs have been identified and compared across DL courses from various ILS and CS schools [28, 38, 39, 40]. The DL courses in these schools were consistent in covering topics in theory and technology, including organization, resource description, intellectual property, preservation, collaboration, management, and access. DL courses in CS programs focused in particular on metadata, databases, information retrieval, and DL software systems. Course objectives and descriptions were similar in face-to-face and online offerings, but the level of interaction with students (for example, hands-on experience in digitizing or creating digital libraries) differed.

In spite of the increasing number of institutions offering DL courses, most of the courses have been stand-alone, without integrating the wide range of materials available in the field [41]. A variety of approaches to developing DL education have been discussed internationally [29, 42, 43, 44, 45, 46, 47], but there was neither a formal curriculum of study that could be widely applied to achieve a balance between theory and practice in DLs, nor research on effective ways of evaluating a DL curriculum. Most previous studies focused on investigating the trends and major topics covered in DL courses, using surveys and content analysis of existing DL course syllabi or programs [38, 39, 40, 41]. Liu compared aspects such as course outlines, textbooks, and assignments across DL courses, but this study was still limited to analyzing course content [38]. No studies had involved DL instructors and their students in DL curriculum evaluation, nor investigated how DL courses have been taught and how effectively the course content and practices have influenced students’ learning about DLs.

Therefore, a DL curriculum development project was launched in 2006 in order to develop, evaluate/validate, and disseminate curricular and educational materials that are useful for training university graduate students in both ILS and CS. An environmental scan of the DL courses from this project provided an overview of the topics covered in DL courses and led to development of the core topics of the proposed DL curriculum. Modules of these topics have been designed and developed based on 5S theory [48] and practice in DLs. Additionally, the effectiveness of the modules has been thoroughly reviewed and tested by instructors and students of DL courses in the field. This paper first describes the development of the DL curriculum, then reports and discusses the findings from the field tests of specific modules.

3 DL curriculum development project overview

A versatile digital library curriculum was developed through the three phases of the project: (1) DL curriculum framework development, (2) module development and review, and (3) module field testing.

3.1 Phase 1: DL curriculum framework development

The DL curriculum framework was developed considering both theoretical and empirical approaches to DL education. Theoretically, the framework was based on the 5S framework of DLs, which designates five key aspects of DLs: streams, structures, spaces, scenarios, and societies [48]. In addition, the Computing Curriculum 2001 (a joint effort of ACM and IEEE-CS), which included a digital libraries module in Information Management, was carefully examined [49]. For the empirical development of the framework, two types of analyses were conducted. First, the research literature was examined in order to identify research topics of interest; all the papers presented at the ACM International Conference on Digital Libraries and the Joint Conference on Digital Libraries, as well as all the papers published in D-Lib Magazine, were analyzed [53]. In a second study, 25 DL syllabi from ILS and CS courses taught between 2006 and 2007 were collected and analyzed [50, 51]. The weekly topics and readings covered in each course were examined. A total of 1777 reading titles were identified, including books, book chapters, journals, journal articles, reports, and online sources. There was significant consensus on reading assignments, as well as on the topics to be covered in the courses. These analyses led to identifying the 10 core topics (overview, digital objects, collection development, information/knowledge organization, architecture, user behaviors/interactions, services, preservation, management and evaluation, and DL education and research) and the 51 sub-topics in the framework [8]. Details of the DL curriculum framework are available on the project website.

3.2 Phase 2: module development & review

Each of the 51 sub-topics in the DL curriculum framework was a candidate for development as a module. Prior to module development, a template for the modules was designed to specify the minimal critical components of the modules. With input from the project’s advisory board, the final version of the module template was adopted (see Fig. 1).
Fig. 1 Module template (also available on the project website)

Among the 51 sub-topics, 13 topics were carefully selected, considering the frequency of topics discovered through the analysis of DL course syllabi. The modules were primarily developed by the project team members. Once a module was developed, external experts in the field of DLs were invited to conduct a preliminary evaluation of it. Experts were selected from the DL research and teaching community, especially among those with particular expertise in the topic covered by a given module. They inspected the assigned module carefully, considering various aspects, such as: (1) its coverage of the topic, (2) the currency and appropriateness of the readings, and (3) any assignments or exercises associated with the topic. Their comments were recorded and shared on the project’s wiki site (a closed one) [52]. Altogether, 32 different experts evaluated the 13 modules. The results of these reviews were incorporated into updates of the modules.

3.3 Phase 3: module field testing

Once modules had been reviewed by experts and revised accordingly, they were ready for field testing. The field testing was designed with two sources of evaluation in mind: DL instructors and their students. Instructors of DL courses at universities around the world were invited to implement one or more modules within the context of an existing DL course that they teach. After completing class sessions with the modules, instructors were interviewed individually about their perspectives on each module’s usefulness. Compared to the experts’ preliminary evaluation, the field testing emphasized feedback based on instructors’ direct experience implementing the modules in their own class environments.

Additionally, students taught by the instructors in DL courses were invited to complete questionnaires through which they could share their perceptions and experiences of learning through the activities of the modules. The primary challenge of the student questionnaire was to disambiguate their perceptions of the modules from their perceptions of their interactions with the instructors in classes. Therefore, the student evaluation questionnaire focused on students’ evaluations of the module content and their efforts to learn the knowledge and practices presented in the module. This paper reports on the findings from this field testing of the modules provided by instructors and their students.1

4 Methods

4.1 Interviews with instructors

An email invitation to instructors who teach graduate-level courses on DLs at any CS or ILS institution was circulated on the JESSE and ASIS&T SIG DL listservs in late 2007. Fifteen instructors responded, indicating their willingness to participate in the module field test. A total of 15 different modules were tested in 43 implementations. Among the 15 instructors, seven volunteered to participate in semi-structured interviews and to discuss their experiences using the modules. Prior to the field testing, no specific guidelines were given to instructors about how to implement the modules in their class sessions, in order to respect individual instructors’ teaching styles and to test the modules’ flexibility in use. Prior to the interviews, a copy of an informed consent information sheet was sent to participating instructors via email, and they provided oral informed consent at the time of the interview.

During the interviews, the questions focused on the ways in which the instructors implemented the modules in their class sessions, with questions about each module’s learning objectives, body of knowledge, suggested readings, suggested learning activities, implementation logistics, and overall structure. The interview guide is available on the project’s Wikiversity site. The interviews averaged 36.7 min each (ranging from 13 to 88 min); several covered the use of multiple modules. Two of the interviews were conducted face-to-face, and the remaining five by phone/Skype. At the end of the interviews, several instructors provided their lecture notes (e.g., PowerPoint slides) or class resources to show how the modules had been implemented in their class sessions. These materials helped to specify which parts of the modules had been used.
The interviews were audio-recorded and later transcribed for a descriptive analysis for the purpose of identifying important aspects of how the modules were used in class, as well as which aspects of each module were considered useful or could be improved, from the instructor’s perspective. Other information shared by instructors, such as their class environments (e.g., face-to-face, online) and the benefits and challenges of using the modules in class, were marked and reported as well.

4.2 Student questionnaires

Immediately after the completion of classes that utilized modules, students were invited to complete an online questionnaire. The instructors were asked to encourage their students to complete the online questionnaire, using a standard script provided by the researchers. Students’ participation was voluntary and instructors did not know which students participated in the study. An invitation email sent directly to the students included an informed consent sheet which described the study procedure and contained a link to a Web-based questionnaire, hosted by Qualtrics at UNC-CH. Students provided their implicit consent when they accessed the link to the questionnaire. After the invitation email, students received two additional emails as reminders a week apart. After completing the questionnaire, one student from each class was randomly selected and received a $5 Starbucks gift card, delivered via email.
Fig. 2 Overview of field testing participation

The evaluation questionnaire was designed to be completed in two minutes, evaluating the module contents and students’ efforts using the modules in their class sessions. The questionnaire items were drawn from several existing measures. Snare’s end-of-semester questionnaire was used to evaluate whether class lectures, learning activities, and assignments were appropriate [53]. Students’ satisfaction with the module style of learning, course topics, and important concepts was evaluated with questions suggested by McGorry’s quality evaluation tool for online programs [54]. Additionally, the evaluation handbook by Flashlight [55] and course evaluation questionnaires in nursing [56] were reviewed and used to develop the evaluation questionnaire used in the current study. A total of 17 statements were provided, and the students were asked to rate their agreement with each statement on a 5-point Likert scale (1 = strongly disagree to 5 = strongly agree). Their prior knowledge of the module topic was assessed on a 4-point rating scale (1 = no prior knowledge to 4 = full knowledge of the topics covered in the module).

In order to provide a more reliable/stable representation of the students’ views, each module implementation was treated as a distinct case during data analysis. This procedure was especially important because the class sizes and the number of students responding in each class varied dramatically (e.g., from 2 responses from a class of 10, to 25 responses from a class of 36). One implication of this procedure is that only modules for which multiple classes had hosted a field test are included in our current analysis. For each class, a mean score was calculated for each questionnaire item. The class means for all implementations of a particular module were then averaged to obtain an aggregated mean rating on each item for each module. It should be noted that, for some implementations, specific questionnaire items were not applicable to that implementation (e.g., some instructors did not incorporate any of the learning activities in their offering of the module, so items 5–7 were not applicable). These were treated as having missing data in the class means and the aggregated means.
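The two-step aggregation described above can be sketched in a few lines of Python. This is an illustrative sketch with hypothetical function names and made-up data, not the project's actual analysis code; it shows how averaging within each class first, and then across classes, keeps large classes from dominating a module's score, and how inapplicable items are treated as missing.

```python
# Illustrative sketch of the two-step aggregation: ratings are first
# averaged within each class, then the class means are averaged across
# all implementations of a module. Function names and data are
# hypothetical, for demonstration only.

def class_mean(ratings):
    """Mean rating of one questionnaire item for one class.

    Returns None when the item was not applicable in that
    implementation (i.e., no valid ratings were collected).
    """
    valid = [r for r in ratings if r is not None]
    return sum(valid) / len(valid) if valid else None

def module_mean(class_rating_lists):
    """Aggregated mean for one item across all classes that hosted a
    field test of the module, skipping classes where the item was
    missing (treated as missing data, as in the study)."""
    means = [m for m in (class_mean(r) for r in class_rating_lists)
             if m is not None]
    return sum(means) / len(means) if means else None

# Two hypothetical implementations of the same module: a class with
# three respondents and a class of two that skipped this item.
print(module_mean([[4, 5, 3], [None, None]]))  # -> 4.0
print(module_mean([[4, 5, 3], [2, 4]]))        # -> 3.5
```

Note that the unweighted average of class means deliberately gives each implementation equal weight; pooling all 329 individual responses instead would let a class of 36 outweigh a class of 10.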

5 Results

During Spring, Summer, and Fall 2008, as well as Spring 2009, 15 instructors from 14 universities participated in field testing 15 modules. A total of 749 students enrolled in DL classes taught by the 15 instructors were invited to participate in the student surveys. Approximately half of the enrolled students participated in the survey (376 students, 50.20 %).

While 15 different modules were evaluated in the field tests, only nine modules were tested in more than one class. This paper reports findings from the 37 module field tests of those nine modules, including responses from the 329 students who were exposed to those modules. The relationship between the full set of field tests and those presented in this paper is illustrated in Fig. 2.

Among the 37 module field tests, nine were conducted in CS or engineering departments (by two of the participating instructors) and the remaining 28 field tests were conducted in ILS schools or departments.

Of the 14 universities participating in the field tests, 12 were in the US; of the others, one was in Europe and one was in Asia. Of the 12 US universities, nine are public and three are private. The 2010 Carnegie Classification of Institutions of Higher Education was used to describe additional characteristics of the 12 participating US universities. Eight of the nine public universities are research-oriented (classified as having either “very high” or “high” research activity) and one is teaching-oriented (classified as a Master’s College & University). One of the three private universities is research-oriented and two are teaching-oriented. Eight of the 12 universities are classified as “large” institutions, with 7500 or more full-year unduplicated credit headcounts; three are “medium,” with 2500–7500; and one is “small,” with fewer than 2500.

Table 1 shows the number of classes and the number of students participating in the evaluation of the nine modules. Module 3-b: Digitization and Module 4-b: Metadata were the most popular among instructors; each was field tested seven times.
Table 1

Module field test participation overview

(Columns: number of classes in which a field test was conducted; total number of students responding to questionnaires; total number of students enrolled.)

  • 1-b, history of digital libraries and library automation
  • 3-b, digitization
  • 4-b, metadata
  • 5-b, application software
  • 6-a, information needs
  • 6-b, online information seeking and search strategy
  • 6-d, interaction design, usability assessment
  • 9-c, DL evaluation, user studies
  • 9-e, intellectual property

This table includes information on the module field tests that were performed in more than one class

5.1 Interviews with instructors

Seven out of 15 instructors participated in the interviews and shared their teaching experiences with eight different modules. They used the modules in face-to-face, online, or blended (having both face-to-face and online sessions) classes. In general, instructors spent about 3 h per class session (i.e., 3 h were devoted to teaching a single module).

Instructors commented that the modules were appropriate for covering important topics that they taught in their DL courses. They also mentioned that the modules were well structured and very detailed, including learning objectives, topics, and specific guidelines for possible exercises. Additional comments included that the modules could be used as a checklist against instructors’ own materials, for comparing topics, readings, exercises, and assignments. The modules helped instructors think about topics that they could assign for class discussion, and instructors added items listed in the modules to their class materials. An instructor who taught more than one module in her class affirmed that the modules articulated well with one another (i.e., it was easy to move from one module to the next). These modules could also be useful as a teaching guide or resource for instructors teaching the course for the first time. One instructor provided hard copies of the modules to students after class for use during their study time, drawing on the additional resources provided in the modules.

While the instructors commented on many positive aspects of the modules, they also noted that the modules did not always fit perfectly within the scope of their class sessions. Sometimes instructors spread a module across more than one session (usually one and a half or two) due to the time limits of a single class session. Alternatively, they divided the topics in a module into multiple parts and taught them in different sessions. Table 2 gives a brief summary of how each of six modules was used, along with the instructors’ suggestions for improving each module.
Table 2

A summary of instructor interviews on each module


(Columns: how the module was used in class; instructors’ suggestions.)

3-b, digitization

• Instructors agreed that the learning objectives of this module treat the theoretical and practical aspects of digitization as equally important

• Instructors pointed out that some of the readings are outdated and suggested adding digital imaging guidebooks to teach recent techniques, such as resizing images, digital file formats, and software or hardware considerations

• The body of knowledge covers topics from standards and management to advanced issues and challenges associated with digitization processes

• The module was originally designed to be completed in one and a half hours, but instructors often spent three hours or more, or offered two sessions, in order to cover an introduction to digitization and the cultural, technical, and legal issues associated with it

• The learning activity of this module is to build a small-scale image collection by taking photos and creating deliverables. Most instructors gave similar assignments to their students as a course project rather than as a class activity

4-b, metadata

• This module includes two learning objectives: (1) to explain the basic principles and design of metadata schemas and (2) to design a metadata schema and assign values to materials in a DL

• This module does not call for a prerequisite course. Instructors noted, however, that students in their DL courses had probably taken an introductory Information Organization course in advance, so they preferred to cover advanced metadata standards such as CDWA (Categories for the Description of Works of Art) and VRA (Visual Resources Association), which are useful when students build their project DLs

• One instructor taught the basics and issues of metadata before students built their own metadata schemas for their group projects. After the metadata building, the instructor led a class discussion about metadata applications and the students’ experiences of building one

• A class activity in which metadata is assigned to a physical object (e.g., a book, a newspaper, a coffee mug, a plant) was new, entertaining, and useful for stimulating students’ learning, but the 15–20 min assigned in the module was too short to accommodate students’ discussions; the activity took about twice as long in class

• One instructor pointed out that the body of knowledge is too broad to cover in one session, so the instructor taught topics such as metadata harvesting in a separate session, “Protocol and Standard Issues in DL”, along with other standards involved in building a digital library in general

5-b, application software

• Instructors agreed that this module was challenging for ILS students; understanding the technical functions and features of DL software was difficult, since students were used to scanning and choosing systems without fully understanding how the systems work

• The readings of this module were too extensive, since there were at least two or three readings about each type of DL software. Instructors assigned those associated with the one or two applications they mainly use in their DL courses and then demonstrated how to use other DL software in class

• Instructors agreed that this module was useful and well structured for gaining an overview of different types of DL software (e.g., EPrints, DSpace, Greenstone, CONTENTdm) and learning their architecture, technical requirements and features, standards, and content

• Instructors did not use the learning activities, i.e., a group presentation or writing a short paper on DL applications. Instead, they preferred to have their students focus on reviewing the DL application chosen for the students’ DL projects

• One instructor pointed out that students use only CONTENTdm for building a project DL during coursework. This module provided an opportunity for them to learn about other types of DL software and compare their functions

• The assigned 2 h of class time was too short to cover everything. One instructor suggested dividing the module into two sessions, one for installing and the other for configuring DL application software

6-b, online information seeking and search strategy

• Overall, this module includes two topics: (1) an overview of online information seeking behavior theories and models and (2) case studies of search strategy development in DLs

• One instructor used the readings from this module selectively, focusing on DL settings and excluding those about general theories of information behavior

• One instructor pointed out that this module covers important fundamental concepts, methods, and strategies related to online information seeking behavior but does not deeply consider the practical aspects of how to use search systems in DLs

• One instructor suggested using search exercises in DLs with specific guidelines for why students have to learn those search strategies and how to complete the exercises effectively. This instructor demonstrated a learning activity in which students design a DL system for better searching, using software such as Dreamweaver

6-d, interaction design, usability assessment (

\(\bullet \) This module covers the basic concepts and processes related to designing interfaces for digital libraries and developing strategies for evaluating DL interfaces with the methods of usability testing (e.g., heuristic evaluation). One instructor pointed out that the body of knowledge of this module mostly discusses DL interface evaluation, focusing less on interface design. Learning activities of this module pertained to developing user personas and tasks for the DL interface evaluation, as well

\(\bullet \) One instructor suggested including in-class exercises or activities in order to design and develop the functions and features of different levels of DL interfaces, discussing what should be included in the main pages, search interfaces, or interfaces for collection or item levels in DLs

\(\bullet \) The topic of accessibility appeared in the module, but the instructor wanted the module to address DL interface design for disabled populations in greater depth

9-c, DL evaluation, user studies

\(\bullet \) This module covers the basics of DL evaluation and a discussion comparing strengths and weaknesses of multiple approaches to DL evaluations, with learning activities relevant to developing evaluation plans and analyzing evaluation reports

\(\bullet \) One instructor suggested adding topics to the DL evaluation criteria. In her class, she gave an assignment to students in which they were to identify what they considered to be the most important criterion of DL evaluation, justify it, and apply their criterion in evaluating a DL that they chose. The instructor also asked students to develop a proposal with which the DLs that they had developed throughout the semester could be evaluated, and to include it in the final report of the project

\(\bullet \) One instructor indicated that the body of knowledge is good enough to cover the basic and critical information, such as definitions, components, procedures, and constructs of DL evaluation

\(\bullet \) The instructor, however, did not teach the parts related to data collection, sampling or analysis of user studies in the module because students had already learned them from a required course on Research Methods prior to enrolling in the DL course

The three additional modules in Table 1 (1-b history of digital libraries and library automation, 6-a information needs, and 9-e intellectual property) were excluded from Table 2, since none of the instructors who field tested those three modules participated in an interview

5.2 Student questionnaires

The nine modules shown in Table 1 were taught in 37 classes. A total of 329 students provided their feedback on these modules using a 5-point Likert scale (1 \(=\) strongly disagree and 5 \(=\) strongly agree). The mean ratings on each statement in the questionnaire were aggregated across classes and are presented in Table 3. Grand means in the right-end column of Table 3 were calculated by averaging the individual class mean ratings across the nine modules. Grand means in the bottom row of the table were calculated by averaging the individual class mean ratings across the questionnaire items. The student responses are discussed below, first by comparing the ratings across modules and across questionnaire items, and then by examining student perceptions of each module.
Table 3  Aggregated mean ratings (standard deviations) of the students’ questionnaire responses

| Questionnaire items | 1-b, hist. of DL | 3-b, digitization | 4-b, metadata | 5-b, app software | 6-a, info needs | 6-b, info seek, search strat. | 6-d, interaction design | 9-c, DL eval., user studies | 9-e, intel. property | Grand mean per item |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1. Clearly outlined objectives and outcomes were provided\(^{\mathrm{b}}\) | 4.2 (0.59) | 4.1 (0.82) | 3.8 (0.90) | 2.8 (1.23) | 3.6 (1.20) | 4.1 (0.52) | 3.8 (0.70) | 4.2 (0.50) | 3.7 (1.03) | 3.8 (0.44) |
| 2. The module was well organized\(^{\mathrm{d}}\) | 4.2 (0.58) | 4.1 (0.91) | 3.9 (0.78) | 3.0 (1.10) | 3.1 (0.97) | 4.0 (0.60) | 3.6 (0.81) | 4.1 (0.53) | 3.6 (1.00) | 3.7 (0.44) |
| 3. The amount of work required for this module was appropriate\(^{\mathrm{d}}\) | 3.9 (0.70) | 3.7 (0.92) | 3.9 (0.84) | 3.1 (1.13) | 3.4 (1.54) | 4.1 (0.80) | 3.5 (0.87) | 4.0 (0.58) | 3.2 (1.00) | 3.6 (0.36) |
| 4. The assigned readings helped me better understand the subject matter\(^{\mathrm{d}}\) | 4.0 (0.90) | 4.1 (0.89) | 3.8 (0.97) | 2.8 (1.21) | 3.3 (0.97) | 4.2 (0.84) | 3.5 (0.87) | 3.9 (1.01) | 3.3 (1.10) | 3.7 (0.46) |
| 5. Given the module’s objectives, the learning activities and/or assignments were appropriate\(^{\mathrm{d}}\) | 3.8 (1.01) | 4.0 (0.84) | 4.0 (0.75) | 2.9 (1.20) | 3.4 (1.34) | 3.9 (0.78) | 3.7 (0.70) | 4.2 (0.52) | 3.5 (1.17) | 3.7 (0.40) |
| 6. The learning activities and/or assignments required thinking and understanding\(^{\mathrm{a}}\) | 4.1 (0.66) | 4.1 (0.78) | 4.2 (0.67) | 2.7 (1.20) | 3.4 (1.30) | 4.0 (0.71) | 3.9 (0.70) | 4.4 (0.50) | 3.6 (1.34) | 3.8 (0.51) |
| 7. The learning activities and/or assignments were stimulating\(^{\mathrm{c}}\) | 3.6 (0.84) | 3.8 (0.95) | 4.0 (0.74) | 2.9 (1.10) | 3.3 (1.34) | 3.4 (0.88) | 4.1 (0.88) | 4.2 (0.93) | 3.3 (1.61) | 3.6 (0.44) |
| 8. Assignments for this module helped me understand what will be expected of me as a professional\(^{\mathrm{c}}\) | 3.5 (0.84) | 3.8 (0.89) | 3.4 (0.94) | 3.1 (0.97) | 3.9 (1.32) | 3.3 (0.71) | 3.1 (0.83) | 4.1 (0.69) | 3.7 (1.30) | 3.5 (0.35) |
| 9. I learned useful professional skills from this module\(^{\mathrm{a}}\) | 3.5 (0.85) | 3.8 (0.87) | 3.6 (0.84) | 2.9 (0.90) | 3.6 (0.98) | 3.6 (0.79) | 3.4 (0.81) | 4.0 (0.65) | 3.4 (1.14) | 3.5 (0.30) |
| 10. I know significantly more about this subject than before I took this module\(^{\mathrm{d}}\) | 4.1 (0.95) | 3.8 (1.10) | 3.8 (0.96) | 2.9 (1.18) | 4.0 (1.03) | 3.9 (0.67) | 3.5 (0.93) | 3.9 (1.10) | 3.3 (1.13) | 3.7 (0.39) |
| 11. Class lectures added to my understanding of the subject\(^{\mathrm{a}}\) | 3.6 (0.95) | 3.7 (0.96) | 3.8 (0.86) | 2.9 (1.13) | 3.2 (1.10) | 3.9 (0.79) | 3.5 (0.87) | 3.7 (1.25) | 3.5 (1.20) | 3.5 (0.31) |
| 12. I gained a good understanding of the basic concepts related to this subject\(^{\mathrm{b}}\) | 4.1 (0.70) | 4.0 (0.89) | 3.9 (0.77) | 2.8 (1.14) | 3.1 (1.23) | 4.0 (0.43) | 3.9 (0.66) | 3.9 (0.57) | 3.5 (1.15) | 3.7 (0.46) |
| 13. I learned to interrelate important issues related to this subject\(^{\mathrm{b}}\) | 3.9 (0.65) | 3.8 (0.98) | 3.8 (0.70) | 3.5 (1.02) | 4.1 (0.79) | 3.8 (0.87) | 3.8 (0.70) | 3.8 (0.78) | 3.3 (1.04) | 3.8 (0.23) |
| 14. This module stimulated me to think critically about the subject matter\(^{\mathrm{d}}\) | 3.9 (0.66) | 3.6 (0.89) | 3.9 (0.74) | 3.6 (1.09) | 3.9 (0.93) | 3.6 (0.90) | 3.7 (1.02) | 3.8 (0.75) | 3.6 (1.28) | 3.7 (0.14) |
| 15. I feel that this learning module served my needs well\(^{\mathrm{b}}\) | 3.9 (0.73) | 3.7 (0.93) | 3.7 (0.80) | 3.5 (1.02) | 3.9 (0.60) | 3.8 (0.84) | 3.3 (1.02) | 4.0 (0.74) | 3.6 (0.89) | 3.7 (0.22) |
| 16. I was very satisfied with this learning module\(^{\mathrm{b}}\) | 3.9 (0.83) | 3.6 (1.08) | 3.6 (0.86) | 3.2 (0.83) | 3.9 (0.60) | 3.9 (0.79) | 3.3 (1.02) | 3.8 (1.05) | 3.4 (1.12) | 3.6 (0.27) |
| 17. Overall, considering its content, design, and structure, this module was effective\(^{\mathrm{d}}\) | 4.1 (0.71) | 3.8 (0.99) | 3.8 (0.86) | 3.6 (0.93) | 4.0 (0.70) | 4.0 (0.60) | 3.4 (0.87) | 4.1 (0.57) | 3.7 (0.87) | 3.8 (0.24) |
| Grand mean per class | 3.9 (0.23) | 3.9 (0.18) | 3.8 (0.18) | 3.1 (0.30) | 3.6 (0.34) | 3.9 (0.25) | 3.6 (0.26) | 4.0 (0.19) | 3.4 (0.16) | 3.7 (0.29) |

Sources of statements: \(^{\mathrm{a}}\) from Snare [53], \(^{\mathrm{b}}\) from McGorry [54], \(^{\mathrm{c}}\) from Flashlight [55], \(^{\mathrm{d}}\) from Neal [56]
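The grand means in Table 3 are simple averages of class-level mean ratings: across modules for the right-end column, and across questionnaire items for the bottom row. A minimal sketch of this aggregation in Python (the ratings dictionary holds hypothetical example values, not the study's raw data, and the variable names are illustrative):

```python
# Sketch of the Table 3 aggregation; the ratings below are hypothetical examples.
from statistics import mean

# item -> {module: mean rating for that module, aggregated over its classes}
ratings = {
    "item_1":  {"1-b": 4.2, "3-b": 4.1, "9-c": 4.2},
    "item_17": {"1-b": 4.0, "3-b": 3.8, "9-c": 4.2},
}

# Grand mean per item (right-end column): average the module means for that item
grand_mean_per_item = {
    item: round(mean(by_module.values()), 1)
    for item, by_module in ratings.items()
}

# Grand mean per module (bottom row): average that module's ratings across items
modules = {m for by_module in ratings.values() for m in by_module}
grand_mean_per_module = {
    m: round(mean(r[m] for r in ratings.values()), 1) for m in modules
}
```

With per-class raw scores one extra averaging step (over the classes that taught each module) would precede this; the table already reports those class-level means.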

Overall, 9-c: DL evaluation and user studies received the highest ratings (\(M = 4.0\)), followed by 1-b: History of digital libraries, 3-b: Digitization, and 6-b: Information seeking and search strategies (each \(M = 3.9\)). Individual class means for these modules ranged from 3.5 to 4.2. 5-b: Application software received the lowest ratings (\(M = 3.1\)); individual class means for this module ranged from 2.7 to 3.6.

It is also useful to look at students’ ratings on item 17, “Overall, considering its content, design, and structure, this module was effective.” On this item, 1-b: History of digital libraries and 9-c: DL evaluation and user studies received the highest average rating, 4.1. Individual classes rated these modules on this item as high as 4.4 and as low as 3.5. 6-d: Interaction design received the lowest mean rating on this item (\(M = 3.4\)); individual classes rated this module on this item as high as 3.9 and as low as 3.1.

The mean ratings (across multiple offerings of each module) on each item ranged from 3.5 to 3.8. The mean across all items and all modules was 3.7, and responses on item 17, “Overall, considering its content, design, and structure, this module was effective,” averaged 3.8. Thus, we can conclude that students generally found the modules to be at least moderately effective. Student perspectives on each of the modules are discussed below; the content of all the modules is available on the project website.

Class means for 1-b: History of DL and library automation ranged from 3.5 to 4.2 across all the questionnaire items, averaging 3.9. It was among the highest-rated modules, as noted above. The lowest ratings related to the assignments associated with the module and whether students would learn useful professional skills. The assignments involved focused in-class discussions and writing a case study of a project, so they could likely be augmented with learning activities of a more applied nature.

Class means for 3-b: Digitization ranged from 3.6 to 4.1 across all the questionnaire items, averaging 3.9. Like module 1-b, it was among the highest-ranked modules. The lowest ratings were for item 14, “This module stimulated me to think critically about the subject matter,” and item 16, “I was very satisfied with this learning module.” Students gave particularly high ratings for the items related to clearly outlined objectives and outcomes, module organization, readings, learning activities, and their ability to learn basic concepts from the module. The learning activity for this module was to build a digital image collection, and it is likely that the hands-on nature of this activity was appreciated by the students.

Class means for 4-b: Metadata ranged from 3.6 to 4.2 across all the questionnaire items, averaging 3.8. As with module 1-b, students did not have confidence that the module helped them understand what would be expected of them as professionals. However, they were quite satisfied with the appropriateness and the intellectual demands of the accompanying learning activities (in-class exercises that involved assigning metadata to physical objects). Instructors, however, commented that the learning activities were entertaining but used up more class time than they expected.

Class means for 5-b: Application software ranged from 2.8 to 3.6, averaging 3.1. It was the lowest-rated module among those involved in the field testing. The students did not find the assigned readings to be helpful. In addition, the student ratings indicated that improvements could be made by clarifying the module’s objectives and rethinking the amount of work required to complete the module. Instructors commented in their interviews that this module provides a good overview of technical aspects of software but it is too comprehensive, covering many different applications. This perspective is consistent with students’ perceptions that they had difficulty carrying out all the activities and learning all the important topics included in the module.

Class means for 6-a: Information needs/relevance ranged from 3.1 to 4.1 across all the questionnaire items, averaging 3.6. Responses to two items were the lowest: one related to the organization of the module and the other related to students’ beliefs that they gained a good understanding of relevant concepts from this module. The students did believe they learned to interrelate important issues by completing this module, and the overall effectiveness of the module was rated more highly than almost any single aspect of it.

Class means for 6-b: Online information seeking and search strategy ranged from 3.3 to 4.2 across all the questionnaire items, averaging 3.9. These ratings place it among the highest-ranked modules involved in the field test. The lowest-rated items pertained to the learning activities (not stimulating enough), the inability of the module to help students understand their future roles as professionals, and their ability to learn professional skills by completing the module. One of the instructors criticized this module, saying that the learning activities, which included exercises in transaction log analysis, interviewing DL users, and conducting think-aloud protocols, need to be more practical.

Class means for 6-d: Interaction design and usability assessment ranged from 3.1 to 4.1 across all questionnaire items, averaging 3.6. The student ratings were lowest in relation to their beliefs that the module helped them understand what would be expected of them as a professional; they also gave low ratings to items related to their satisfaction with the module, whether it served their needs well, and its overall effectiveness. In contrast, they did find the module’s learning activities (developing user personas, developing user tasks, and conducting a heuristic evaluation) to be stimulating. One of the instructors pointed out that it may help students’ learning if the module includes learning activities for practical and professional skill development, such as how to design interfaces with specific guidelines and how to use software to develop an effective design for a DL interface.

Class means for 9-c: DL evaluation, user studies ranged from 3.7 to 4.4 across all questionnaire items, averaging 4.0; this was the highest-rated module involved in the field test. While all the ratings were high, the items receiving the very highest ratings (4.2–4.4) indicated that the students found the learning activities to be appropriate and to require thinking and understanding and the module’s objectives to be clear. The instructors agreed that this module covers the basic and important topics in DL evaluation, though one suggested that content and learning activities related to DL evaluation criteria would be a useful addition to this module.

Class means for 9-e: Intellectual property ranged from 3.2 to 3.7 across all questionnaire items, averaging 3.4. It was one of the lowest-rated modules among those involved in the field testing. The students did not find the learning activities to be stimulating, and did not believe that they significantly increased their knowledge of intellectual property issues by completing the module. The item related to the overall effectiveness of the module was rated higher than any specific aspect of the module.

6 Discussion

Field testing of the modules in this project provided a rare opportunity to implement the modules in real classrooms and to receive critical evaluations on a variety of aspects from DL instructors who are experts with many years of experience teaching such courses. In addition, students’ feedback provided information about how the modules influenced students’ perceptions of ways to learn about important topics related to DLs. Surveying students was useful for obtaining their overall understanding and experiences of the learning modules in classrooms. Interviews with instructors provided details about the classroom environments that we were unable to learn from the student questionnaires. Instructors’ experiences with and suggestions for the modules were very helpful for improving the modules in five areas: the learning objectives, the body of knowledge, readings, learning activities, and logistics for offering each module.
  • Learning objectives: Instructors agreed that most of the learning objectives assigned to each module were reasonable. They also made suggestions to tweak them in a few cases, covering not only theoretical but practical aspects of the modules as they apply to the context of DLs, in particular.

  • Body of knowledge: Instructors liked the format of lecture notes designed to list important topics and provide additional information with bullet points. They believed that the body of knowledge helped students to achieve the class objectives by discussing important issues and problems associated with the topics of each module. In several cases, however, the subjects covered in the body of knowledge were too comprehensive to be taught in one class session, so instructors had to reorganize the contents and often divided them into multiple sessions.

  • Readings: Instructors used the assigned readings in the modules selectively, based on what they considered important or on the number of readings they could reasonably cover in one session, since there were sometimes too many readings in one module. Instructors also provided readings that they used in class but that had not been included in the modules. They emphasized the importance of including recent publications in the readings so as to inform students about up-to-date technology.

  • Learning activities: Some instructors liked that there are learning activities in each module that can help students apply what they learned in class to practical cases. The detailed instructions for the activities were useful, but it seemed to take longer for the students to complete the activities than the assigned time allowed in the modules. There were other instructors who did not choose to use the activities in the modules since they already offered similar activities in the format of a course project, which was covered not in one session but throughout the semester; instructors used their own activities as well, which were different from those in the modules.

  • Logistics, e.g., feasibility, level of effort required in class, prerequisites, class time, etc.: In general, instructors agreed that the modules were desirable for use in their current DL courses, and that the level of effort required for students to prepare class readings and to learn and demonstrate knowledge gained in the modules was appropriate. A couple of instructors indicated that most of the prerequisite knowledge for the DL course had been covered in the foundation courses, such as Information Organization and Research Methods, in their programs. In terms of the class time assigned in the modules, instructors noted that it needs to be modified to reflect the situations in classroom environments; for example, one and half hours of class time was sometimes not enough to cover the material in one module.

Instructors reviewed the body of knowledge and readings within the modules especially thoroughly, and provided evaluative comments on each module’s topics, addressing whether the module covered the basic concepts and the important features and functions related to DL building or DL evaluation. This emphasis on topics is in line with previous studies of DL curriculum evaluation and review, most of which examined and compared the primary topics covered across DL courses [38, 39, 40]; there are certain topics that DL instructors consider so important as to be fundamentals of digital librarianship. DL courses are also known for offering many creative class assignments, including DL building projects [38].

In addition to the comments on the five areas above, several instructors suggested that discussion questions be developed on which students can work in and out of class, so that they can study critical issues related to the modules and present what they learned in their own words, individually, in small groups, or in class. This could be particularly important in online classrooms. One instructor indicated that he requires students to learn class materials on their own in asynchronous online class environments and to post answers to discussion questions every week.

The modules were originally designed to satisfy the needs of interdisciplinary education for graduate-level DL courses in both ILS and CS. The application of the modules in CS classes could differ from that in the ILS classes tested here, depending on the core areas of knowledge on which each discipline focuses [29]. One instructor who has a CS background but taught DL courses in an ILS program commented that the current module contents could better serve ILS students than CS students, and suggested customized assignments for CS students, such as opportunities to fully install, configure, and contribute to developing software in open source communities.

Each module in this project includes lecture-style notes to be used in a class session to teach an assigned topic. This approach was intended to enhance flexibility; instructors can choose any number of modules from the framework and implement them for their own class purposes. Therefore, modules are independent from one another, although we noted relationships with other modules in the module template to suggest the order in which the modules can be taught when an instructor chooses to use multiple modules. In real class environments, however, it may not be wise to utilize modules independently all the time. For example, most DL instructors assigned a semester-long group project to students, to build a small-scale DL. It could be important to consider the big picture in module development, looking for ways to connect class lectures and activities with the assigned semester-long class projects.

For effective module field testing, we used both instructor interviews and student questionnaires to collect data, but there were limitations to our findings. Fifteen instructors taught one or more modules in their DL courses, but interview data were collected from only seven instructors due to conflicts in scheduling the interviews. Therefore, the interview data may not represent the opinions of all DL instructors who used the modules. Due to the limited data collection through the interviews, there were cases in which students’ responses on the questionnaires could not be fully interpreted.

7 Implications

Results from this module field test have both theoretical and practical implications for DL curriculum development and evaluation. The module field tests were effective in validating the contents and the practicality of the modules by obtaining constructive and critical feedback from instructors and students who had experience using the modules. The modules have been updated based on the findings from the module field tests. Furthermore, comments related to overall aspects of the module design have been reflected in the development of new modules. Given the results of the formative evaluations reported here and the additional improvements to the curriculum modules, it is clear that this project has made a substantive contribution to DL education.

In addition, the procedures and methods used for the module field tests can be replicated in evaluating new educational materials in DL education and other areas of CS and ILS curricula. Interviews with the instructors were extremely useful in further development of the modules. Those curricula that offer multiple sections of the same course may want to conduct debriefing interviews or meetings with the participating instructors as part of their curriculum improvement efforts. In addition, the student questionnaires focused on the value of the modules to the students, rather than on more generic aspects of students’ reactions to this instruction. Such feedback from students could be gathered in other areas of the curriculum and used to improve modules or courses in those areas.

Overall, the DL project contributed to the research and practice of DL education in several ways. The DL project was designed and developed to have a broader impact on educating DL professionals—graduate-level students as well as current DL designers and administrators—through improved learning experiences. The DL module framework identifies core topics that need to be taught for training current and future DL professionals. The modules are readily usable in DL classes and have been shared, through a variety of channels, with DL instructors and with those who would like to study DLs independently; all of the modules and resources are available from the project website for public use. In the long term, graduates and learners of DL courses will have a stronger background in DL development and management, and the DLs they create will benefit their stakeholders and users by providing access to valuable resources and critical information.

Moreover, during the project’s three-year period, this project put effort into developing a DL community and raising its visibility through a variety of outreach programs, such as DL tutorials and workshops for module review and development at international conferences on DLs (e.g., TPDL, ICADL, and JCDL). Thanks to the DL community, module development has continued through the Wikiversity site.

8 Further work and conclusion

This paper mainly reports on the findings from a field test of a number of DL modules, which was the final phase of the three-year NSF-funded project for DL curriculum building. The modules are the essential outputs of this project. We put a great deal of effort into designing and developing the modules in collaboration with a number of DL experts. Prior to distributing the modules to the DL community, we wanted to test the modules’ functions and their capability to deliver the primary content of DL knowledge and practice to students in classrooms. Thanks to cooperation and support from many DL instructors and students, we were able to test our modules in a variety of class environments. We obtained fruitful feedback from the participants for improving the current and future modules. In general, this project is meaningful not only in building a DL curriculum and developing modules as core resources for DL education, but also for initiating a collaborative project involving many experts in DLs and for developing a DL community as part of the goal for better education. The project would not have been successful without the continuous guidance from our advisory board members, DL experts’ constructive feedback on the modules, and instructors who volunteered to use the modules in their classrooms and participated in the field testing.

Although funding for this project has ended, module development, review, and field testing have been continued by the DL community. First, in the past few years, nine more modules have been developed and are available for use; therefore, a total of 24 modules are currently available for use in DL courses. Second, the modules developed from this project have been used as resources not only for DL courses and the DL community, but also for the Information Retrieval (IR) community. Ten of the DL modules have topics that overlap with IR, and each of these is about a particular IR software package, such as Hadoop Map-Reduce, Lemur, NLTK, R, Solr, Weka, or WordNet. These modules were developed by teams of graduate students studying IR at Virginia Tech, and were field tested and refined in two successive years of offerings of that class. We expect that the modules field tested during the project will also continue to evolve, based on the feedback received. Third, new modules also contribute to research in Big Data. Six modules for the LucidWorks Big Data software were proposed. These modules were developed in 2012 in the above-mentioned IR class by project teams, and field tested and refined by other student teams. LucidWorks generously supported these efforts, and students thus gained both IR and Big Data experience with an integrated suite of relevant software.

Furthermore, new modules focusing on particular software packages related to multimedia have been developed and field tested in two offerings of the Multimedia, Hypertext, and Information Access course at Virginia Tech. One package connects with the software used in a popular undergraduate course on Media Computation, originally developed at Georgia Tech. Two others relate to music software, in particular Audacity and PureData. The final module in the set discusses the fingerprint analysis software from the National Institute of Standards and Technology Biometric Image Software team. Therefore, a total of 56 modules have been developed, more than three times the number initially proposed; they are now available for public use from the Wikiversity site. We hope the broader digital libraries, information retrieval, and multimedia communities will find these to be useful, and will help us refine and enhance them further, so they can be helpful for both teachers and learners across the information disciplines.


  1. This module field test, involving both DL instructors and their students, was approved by the Institutional Review Boards from both UNC-CH and Virginia Tech.



This project was funded by NSF through Virginia Tech (VT) Grant IIS-0535057 and University of North Carolina at Chapel Hill (UNC-CH) Grant IIS-0535060. We would like to thank the DL experts, advisory board members, module reviewers, students, and DL instructors for participating in this project and providing us constructive feedback. Work with the fingerprint analysis software was funded by National Institute of Justice Grant No. 2009-DN-BX-K229. We also thank LucidWorks for its support of our work with their software, and Kiran Chitturi (formerly at VT, now at LucidWorks), for his assistance with the modules, software, and data involved.


References

  1. Saracevic, T., Dalbello, M.: A survey of digital library education. In: Proceedings of the American Society for Information Science and Technology, vol. 38, pp. 209–223 (2001)
  2. Spink, A., Cool, C.: Developing digital library education: international perspective on theory and practice. In: Aparac, T., et al. (eds.) Digital Libraries: Interdisciplinary Concepts, Challenges and Opportunities. Proceedings of the Third International Conference on Conceptions of Library and Information Science (CoLIS3), pp. 55–62 (1999)
  3. Fox, E.A., Marchionini, G.: Digital libraries: extending traditional values: guest editor’s introduction. Commun. ACM 44, 30–32 (2001). doi: 10.1145/374308.374329
  4. Marchionini, G., Fox, E.A.: Progress toward digital libraries: augmentation through integration; guest editor’s introduction to special issue on digital libraries. Inf. Process. Manag. 35, 219–225 (1999)
  5. Fox, E.A., Suleman, H., Madalli, D., Cassel, L.: Chapter 4: digital libraries. In: Singh, M. (ed.) Practical Handbook of Internet Computing. Chapman Hall/CRC Press, Baton Rouge (2004)
  6. Pomerantz, J., Oh, S., Yang, S., Fox, E.A., Wildemuth, B.M.: The core: digital library education in library and information science programs. D-Lib Mag. 12(11) (2006)
  7. Pomerantz, J., Wildemuth, B.M., Oh, S., Yang, S., Fox, E.A.: Digital library education in computer science programs. In: Proceedings of the 7th ACM/IEEE-CS Joint Conference on Digital Libraries, pp. 177–178. ACM, New York (2007). doi: 10.1145/1255175.1255208
  8. Yang, S., Fox, E., Wildemuth, B.M., Pomerantz, J., Oh, S.: Core topics in digital library education. In: Theng, Y.L., Foo, S., Goh, D.H.L., Na, J.C. (eds.) Handbook of Research on Digital Libraries: Design, Development and Impact, pp. 493–505. IGI Global, Hershey (2009)
  9. Fox, E.A.: The digital libraries initiative: update and discussion. Guest editor’s introduction to special section. In: Bulletin of the American Society of Information Science, vol. 26, pp. 7–11 (1999)
  10. Lesk, M.: Perspectives on DLI-2—growing the field. D-Lib Mag. 5(7/8) (1999)
  11. Zia, L.L.: The NSF National Science, Mathematics, Engineering, and Technology Education Digital Library (NSDL) program. Commun. ACM 44, 83 (2001)
  12. Saracevic, T., Covi, L.: Challenges for digital library evaluation. In: Proceedings of the 63rd Annual Meeting of the American Society for Information Science (ASIS), vol. 37, pp. 341–350 (2000)
  13. Gladney, H., Fox, E.A., Ahmed, Z., Ashany, R., Belkin, N., Zemankova, M.: Digital library: gross structure and requirements. In: Schnase, J., Leggett, J., Furuta, R., Metcalfe, T. (eds.) Report from a March 1994 Workshop, in Digital Libraries ’94, College Station, TX, pp. 101–107 (1994)
  14. Bhargava, B., Annamalai, M.: Communication costs in digital library databases. In: Database and Expert Systems Applications (DEXA ’95). Lecture Notes in Computer Science (LNCS), vol. 978, pp. 1–13. Springer, Heidelberg (1995)
  15. French, J.C., Viles, C.L.: Ensuring retrieval effectiveness in distributed digital libraries. J. Vis. Commun. Image Represent. 7, 61–73 (1996)
  16. Moffat, A., Witten, I.: A compression-based digital library. In: DESIDOC Bulletin of Information Technology, vol. 17, pp. 31–41 (1998)
  17. Goncalves, M.A., et al.: The effectiveness of automatically structured queries in digital libraries. In: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries (JCDL 2004), Tucson, AZ, June 7–11, pp. 98–107 (2004)
  18. Fox, E.A., Mather, P.: Chapter 12: scalable storage for digital libraries. In: Feng, D., Siu, W.C., Zhang, H. (eds.) Multimedia Information Retrieval and Management, pp. 265–288. Springer, Berlin (2003)
  19. Suleman, H., Fox, E.A., Abrams, M.: Building quality into a digital library. In: Proceedings of the Fifth ACM Conference on Digital Libraries, DL ’00, June 2–7, 2000, San Antonio, TX. ACM Press, New York (2000)
  20. Paepcke, A., Chang, C.C.K., Garcia-Molina, H., Winograd, T.: Interoperability for digital libraries worldwide. Commun. ACM 41, 33–43 (1998)
  21. Payette, S., Blanchi, C., Lagoze, C., Overly, E.A.: Interoperability for digital objects and repositories: the Cornell/CNRI experiments. D-Lib Mag. 5(5) (1999)
  22. Miller, P.: Interoperability. What is it and why should I want it? Ariadne 24 (2000)
  23. Paepcke, A., et al.: Towards interoperability in digital libraries: overview and selected highlights of the Stanford Digital Library Project. Technical Report, Stanford University, CA (1997)
  24. Lynch, C., Garcia-Molina, H.: Interoperability, scaling, and the digital libraries research agenda. A report on the May 18–19, 1995 IITA Digital Libraries Workshop, IITA, Reston, VA (1995)
  25. 25.
    NSDL: NSDL sustainability standing committee home page. (2004)
  26. 26.
    Waugh, A., Wilkinson, R., Hills, B., Dell’oro, J.: Preserving digital information forever. In: Proceedings of the Fifth ACM Conference on Digital Libraries, DL ’00, June 2–7, 2000, San Antonio, TX, pp. 175–184. ACM Press, New York (2000)Google Scholar
  27. 27.
    Spink, A., Cool, C.: Education for digital libraries. D-Lib Mag. 5(5) (1999).
  28. 28.
    Coleman, A.: Interdisciplinarity: the road ahead for education in digital libraries. D-Lib Mag. 8(7/8). (2002)
  29. 29.
    Weech, T.L.: Multidiscplinarity in education for digital librarianship. In: Proceedings of the 2007 Informing Science and IT Education Joint Conference, pp. 11–21 (2007)Google Scholar
  30. 30.
    Arms, W.Y.: Digital Libraries. MIT Press, Cambridge (2000)Google Scholar
  31. 31.
    Borgman, C.L.: From Gutenberg to the Global Information Infrastructure: Access to Information in the Networked World. MIT Press, Cambridge (2000)Google Scholar
  32. 32.
    Chowdhury, G.G., Chowdhury, S.: Introduction to Digital Libraries. Facet Publishing, London (2003)Google Scholar
  33. 33.
    Deegan, M., Tanner, S.: Digital Futures: Strategies for the Information Age. Facet Publishing, London (2002)Google Scholar
  34. 34.
    Lesk, M.: Understanding Digital Libraries, 2nd edn. Elsevier, Boston (2005)Google Scholar
  35. 35.
    Levy, D.M.: Scrolling Forward: Making Sense of Documents in the Digital Age. Arcade, New York (2001)Google Scholar
  36. 36.
    Tennant, R.: Managing the Digital Library. Reed Press, New York (2004)Google Scholar
  37. 37.
    Bearman, D.: Digital libraries. Annu. Rev. Inf. Sci. Technol. 41(1), 223–272 (2007)CrossRefGoogle Scholar
  38. 38.
    Liu, Y.Q.: Is the education on digital libraries adequate? New Libr. World 105, 60–68 (2004)Google Scholar
  39. 39.
    Ma, Y., Clegg, W., O’Brien, A.: A review of progress in digital library education. In: Handbook of Research on Digital Libraries: Design, Development, and Impact, pp. 533–542. IGI Global, Hershey (2009)Google Scholar
  40. 40.
    Shuva, N.Z., Audunson, R.A.: Curriculum contents of digital library education (DLE) in Europe. In: Collaboration in International and Comparative Librarianship, pp. 273–296. IGI Global, Hershey (2014)Google Scholar
  41. 41.
    Ma, Y., Clegg, W., O’Brien, A.: Digital library education: the current status. In: Proceedings of the 6th ACM/IEEE-CS Joint Conference on Digital Libraries, Chapel Hill NC (2006)Google Scholar
  42. 42.
    Bakar, A.B.A., Bakeri, A.: Education for digital libraries in Asian countries, In: Asia-Pacific Conference on Library & Information Education & Practice, pp. 458–463 (2009)Google Scholar
  43. 43.
    Baro, E.E.: A survey of digital library education in library schools in Africa. OCLC Syst. Serv. 26, 214–223 (2010)CrossRefGoogle Scholar
  44. 44.
    Blummer, B.: Graduate and post-MLS study in digital libraries. J. Access Serv. 3, 53–60 (2005)CrossRefGoogle Scholar
  45. 45.
    Koltay, T., Boda, I.: Digital library issues in Hungarian LIS curricular: examples from three library schools. Libr. Rev. 57, 430–441 (2008)CrossRefGoogle Scholar
  46. 46.
    Tammaro, A.M.: A curriculum for digital librarians: a reflection on the European debate. New Libr. World 108, 229–246 (2007)CrossRefGoogle Scholar
  47. 47.
    Terry, W.: Analysis of courses and modules: education for digital librarianship. In: Proceedings of Digital Library Education, Villa Morghen, Firenze, 24–25 March. (2006)
  48. 48.
    Fox, E.A., Goncalves, M.A., Shen, R.: Theoretical Foundations for Digital Libraries: The 5s (societies, Scenarios, Spaces, Structures, Streams) Approach. Synthesis Lectures on Information Concepts, Retrieval, and Services. Morgan & Claypool Publishers, San Francisco. doi: 10.2200/S00434ED1V01Y201207ICR022. (2012)
  49. 49.
    The Joint Task Force on Computing Curricular, IEEE Computer Society, Association for Computing Machinery: Computing curricular 2001 computer science: final report. (2001)
  50. 50.
    Pomerantz, J., Oh, S., Yang, S., Fox, E.A., Wildemuth, B.M.: The core: digital library education in library and information science programs. D-Lib Mag. 12(11) (2006).
  51. 51.
    Pomerantz, J., Wildemuth, B.M., Oh, S., Yang, S., Fox, E.A.: Digital library education in computer science programs. In: Proceeding of the 7th ACM/IEEE-CS Joint Conference on Digital Libraries, pp. 177–178. ACM, New York (2007). doi: 10.1145/1255175.1255208
  52. 52.
    Oh, S., Wildemuth, B.M., Pomerantz, J., Yang, S., Fox, E.A.: Using a Wiki as a platform for formative evaluation. In: Proceedings of the 72nd Annual Meeting of the American Society for Information Science and Technology, vol. 46 (2009)Google Scholar
  53. 53.
    Snare, C.E.: An alternative end-of-semester questionnaire. PS Polit. Sci Polit. 33, 823–825 (2000)CrossRefGoogle Scholar
  54. 54.
    McGorry, S.Y.: Measuring quality in online programs. Internet High. Educ. 6, 159–177 (2003)CrossRefGoogle Scholar
  55. 55.
    Ehrmann, S.C., Zúñiga, R.E.: The flashlight\(^{{\rm TM}}\) evaluation handbook, including the flashlight\(^{{\rm TM}}\) current student inventory version 1.0. teaching, learning, and technology group, American Association for Higher Education (1997)Google Scholar
  56. 56.
    Neal, E.: Nursing course evaluation questionnaire (2004, unpublished)Google Scholar

Copyright information

© The Author(s) 2015

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • Sanghee Oh (1)
  • Seungwon Yang (2)
  • Jeffrey P. Pomerantz (3)
  • Barbara M. Wildemuth (3)
  • Edward A. Fox (4)
  1. School of Information, Florida State University, Tallahassee, USA
  2. School of Library and Information Science, Center for Computation and Technology, Louisiana State University, Baton Rouge, USA
  3. School of Information and Library Science, University of North Carolina at Chapel Hill, Chapel Hill, USA
  4. Department of Computer Science, Virginia Tech, Blacksburg, USA