E-Assessment Systems and Online Learning with Adaptive Testing

Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 528)

Abstract

In this paper we explain how a systematic approach to designing Assessment as a Service on a cloud with a service-oriented architecture (SOA) can support add-on functionalities such as online learning with interactive assessment, a form of formative and integrative assessment for the learning process. The idea is realized by a computer-based system that systematically asks questions and leads students towards knowledge construction and discovery. The system should be as simple as possible, yet intelligent enough to realize this idea. This online learning tool with adaptive testing uses software agents, whose behavior is defined by several strategies. We observed that the strategy “3 correct answers in a row” performs best.
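The “3 correct answers in a row” strategy mentioned above can be read as a staircase rule: the agent raises the question difficulty only after three consecutive correct answers and lowers it on any wrong answer. A minimal sketch of such a rule follows; the class and method names are illustrative assumptions, not taken from the authors' system.

```python
# Hypothetical sketch of a "3 correct answers in a row" adaptive strategy.
# The class name, method names, and level bounds are illustrative only.

class ThreeInARowStrategy:
    """Raise the difficulty level after three consecutive correct
    answers; drop one level on any wrong answer."""

    def __init__(self, min_level=1, max_level=5):
        self.min_level = min_level
        self.max_level = max_level
        self.level = min_level
        self.streak = 0  # consecutive correct answers so far

    def record_answer(self, correct):
        if correct:
            self.streak += 1
            if self.streak == 3:
                # Three in a row: move up one level and reset the streak.
                self.level = min(self.level + 1, self.max_level)
                self.streak = 0
        else:
            # Any wrong answer resets the streak and moves down one level.
            self.streak = 0
            self.level = max(self.level - 1, self.min_level)
        return self.level


# Example: three correct answers raise the level to 2,
# then one wrong answer drops it back to 1.
agent = ThreeInARowStrategy()
for answer in [True, True, True, True, False]:
    agent.record_answer(answer)
print(agent.level)  # → 1
```

Requiring a streak before promoting makes the estimate of the student's level more conservative than a simple up/down rule, which fits the formative use described in the abstract.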

Keywords

E-testing · E-learning · Computer-based testing


Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. Faculty of Information Sciences and Computer Engineering, Ss. Cyril and Methodius University, Skopje, Macedonia