A Framework for e-Assessment on Students’ Devices: Technical Considerations

  • Bastian Küppers
  • Ulrik Schroeder
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 829)


This paper presents FLEX, a framework for electronic assessment on students’ devices. Basic requirements for such a framework and potential issues associated with these requirements are discussed, along with the current state of the art. Afterwards, the client-server architecture of FLEX, which is designed to meet all previously identified requirements, is presented. The FLEX client and the FLEX server are discussed in detail, with a focus on the technologies and programming languages used. The results of initial trials with the existing prototype are discussed in relation to the identified basic requirements. Thereafter, the assessment of programming courses is discussed as a use case of FLEX that makes use of the extensibility of client and server. The paper closes with a summary and an outlook.


Keywords: Computer-based examinations · Computer-aided examinations · e-Assessment · Bring Your Own Device (BYOD) · Reliability · Equality of Treatment



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. IT Center, RWTH Aachen University, Aachen, Germany
  2. Learning Technologies Research Group, RWTH Aachen University, Aachen, Germany
