Development and Evaluation of a Web-Based E-Learning System

L. Schmidt

Conference paper


This contribution describes the development of a web-based e-learning system and introduces a method for its evaluation, using the e-learning portal INTEGRAL II as an example. The proposed approach is based on system monitoring of user interaction via data capture in a server log-file, and it additionally integrates external data. It is thus a user-based approach, which typically applies to prototyping or early stages of the product life-cycle. Due to its non-reactive character, biasing effects are expected to be minimal. Server log-files in particular are a means of gathering objective, quantitative data that cannot be obtained in other ways. Moreover, the procedure is not very complex: it is easy to conduct, and data collection does not take much time. Log-files are records generated automatically by the system that contain information about entering and leaving the system, the user's location, time spent in the system, and actions within it. As illustrated in this contribution, these data can be analyzed and interpreted in seven steps. The analysis yields specific results for the INTEGRAL II system as well as more general suggestions for usability testing. Log-file analyses combined with other measures are a powerful means of evaluating a web-based system. What information do log-files provide? What technical framework must be considered when interpreting them? How can these implicit data be used to evaluate web-based systems? These questions are considered in detail in this contribution.
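The kind of non-reactive log-file analysis described above can be sketched in a few lines: raw entries are parsed, grouped into per-user sessions, and summarized into time-on-system and action counts. This is a minimal illustration only; the tab-separated log layout and field names below are assumptions for the sketch, not the INTEGRAL II format or the paper's seven-step procedure.

```python
# Minimal sketch of server log-file analysis: parse entries, group them
# per user, and derive time spent with the system and actions within it.
# The log format (ISO timestamp, user id, action, tab-separated) is an
# illustrative assumption, not the format used by INTEGRAL II.
from collections import defaultdict
from datetime import datetime


def parse_line(line):
    """Split one log entry into (timestamp, user, action)."""
    ts, user, action = line.strip().split("\t")
    return datetime.fromisoformat(ts), user, action


def sessionize(lines):
    """Group entries per user and report duration and action count."""
    per_user = defaultdict(list)
    for line in lines:
        ts, user, action = parse_line(line)
        per_user[user].append((ts, action))
    report = {}
    for user, events in per_user.items():
        events.sort()  # order by timestamp
        duration = (events[-1][0] - events[0][0]).total_seconds()
        report[user] = {"actions": len(events), "seconds": duration}
    return report


log = [
    "2007-03-01T10:00:00\tu1\tlogin",
    "2007-03-01T10:05:30\tu1\topen_module",
    "2007-03-01T10:20:00\tu1\tlogout",
]
print(sessionize(log))  # → {'u1': {'actions': 3, 'seconds': 1200.0}}
```

Such summaries supply the objective, quantitative raw material; interpreting them still requires the external data and technical context the contribution emphasizes.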





Copyright information

© Springer 2007

Authors and Affiliations

  • L. Schmidt
  1. Research Institute for Communication, Information Processing and Ergonomics, Neuenahrer Straße 20, Germany
