
Viewing and exploring the subject area of information literacy assessment in higher education (2000–2011)


Abstract

This pioneering approach to the subject area of Information Literacy Assessment in Higher Education (ILAHE) aims to gain further knowledge about its scope from a terminological-spatial perspective and to weight and categorize relevant terms on the basis of their levels of similarity. Through a retrospective and selective search, bibliographic references to the scientific literature on ILAHE were obtained from the most representative databases (LISA, ERIC and WOS), covering the period 2000–2011 and restricted to English-language publications. Keywords in the titles, descriptors and abstracts of the selected items were labelled and extracted with the Atlas.ti software. The main research topics in this field were determined through a co-word analysis and graphically represented with the VOSviewer software. The results showed two areas of differing density and five clusters involving the following issues: evaluation-education, assessment, students-efficacy, learning-research, and library. This method has facilitated the identification of the main research topics in ILAHE and their degree of proximity and overlap.
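The mapping step described above reduces to two operations: counting how often pairs of keywords occur together in the same record, and normalizing those counts by each keyword's overall frequency before mapping and clustering. The Python sketch below illustrates this with invented keyword sets (the record contents and term names are hypothetical, not taken from the ILAHE corpus); the normalization shown is the association-strength similarity that VOSviewer applies by default, standing in here for the full co-word analysis reported in the paper.

```python
from itertools import combinations
from collections import Counter

# Hypothetical keyword sets, one per bibliographic record, standing in for the
# terms labelled in titles, descriptors and abstracts (extracted with Atlas.ti
# in the study; the lists below are invented for illustration only).
records = [
    {"information literacy", "assessment", "higher education"},
    {"information literacy", "students", "self-efficacy"},
    {"assessment", "library instruction", "higher education"},
    {"information literacy", "assessment", "rubrics"},
]

# Count term occurrences and pairwise co-occurrences across the corpus.
occurrences = Counter()
co_occurrences = Counter()
for keywords in records:
    occurrences.update(keywords)
    for a, b in combinations(sorted(keywords), 2):
        co_occurrences[(a, b)] += 1

# Normalize each co-occurrence count by the product of the two terms'
# occurrence frequencies (the association-strength measure that VOSviewer
# uses by default before producing its maps).
similarity = {
    (a, b): c / (occurrences[a] * occurrences[b])
    for (a, b), c in co_occurrences.items()
}

for (a, b), score in sorted(similarity.items(), key=lambda kv: -kv[1]):
    print(f"{a} -- {b}: {score:.3f}")
```

In the study itself, the resulting pairwise similarities feed VOSviewer's mapping and clustering routines, which yield the density view and the five clusters reported in the results.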


Acknowledgments

This research was funded by the Spanish Research Program I+D+I (Research, Development and Innovation) through the project "Information Competencies Assessment of Spanish University Students in the Field of Social Sciences" (EDU 2011-29290). I am grateful to Professor Joaquin Granell for his help in translating the manuscript.

Author information

Correspondence to María Pinto.

About this article

Cite this article

Pinto, M. Viewing and exploring the subject area of information literacy assessment in higher education (2000–2011). Scientometrics 102, 227–245 (2015). https://doi.org/10.1007/s11192-014-1440-2
