A Framework for e-Assessment on Students’ Devices: Technical Considerations

  • Bastian Küppers
  • Ulrik Schroeder
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 829)

Abstract

This paper presents FLEX, a framework for electronic assessment on students’ devices. Basic requirements for such a framework and potential issues arising from these requirements are discussed, together with the current state of the art. Afterwards, the client-server architecture of FLEX, which is designed to meet all previously identified requirements, is presented. The FLEX client and the FLEX server are discussed in detail, with a focus on the technologies and programming languages used. The results of initial trials with the existing prototype are discussed in relation to the identified basic requirements. Thereafter, the assessment of programming courses is discussed as a use case of FLEX that makes use of the extensibility of both client and server. The paper closes with a summary and an outlook.
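The abstract outlines a client-server architecture without fixing a concrete protocol; the paper itself details the technologies and languages used. Purely as an illustration of the general idea, the following TypeScript sketch shows one way an exam client could submit an answer to an assessment server over HTTP. All names here (the `/submissions` endpoint, the `FlexSubmission` shape, the port) are assumptions made for this sketch and are not taken from FLEX.

```typescript
// Hypothetical sketch only: endpoint, types, and port are illustrative, not from the paper.
import * as http from "http";

interface FlexSubmission {
  examId: string;    // which exam the answer belongs to
  studentId: string; // pseudonymous student identifier
  answer: string;    // the submitted answer text
}

// Server side: accept answer submissions and acknowledge them.
const server = http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/submissions") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const submission: FlexSubmission = JSON.parse(body);
      console.log(`Received answer for exam ${submission.examId} from ${submission.studentId}`);
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ status: "stored" }));
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});
server.listen(3000);

// Client side: send one answer, as a locked-down exam client might do after each edit.
async function submitAnswer(submission: FlexSubmission): Promise<void> {
  const response = await fetch("http://localhost:3000/submissions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(submission),
  });
  console.log("Server replied:", await response.json());
}

submitAnswer({ examId: "exam-1", studentId: "student-42", answer: "My answer" });
```

In practice, a BYOD assessment system would additionally authenticate the student and protect submissions in transit and at rest; this sketch omits all of that for brevity.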

Keywords

Computer based examinations · Computer aided examinations · e-Assessment · Bring Your Own Device · BYOD · Reliability · Equality of Treatment

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. IT Center, RWTH Aachen University, Aachen, Germany
  2. Learning Technologies Research Group, RWTH Aachen University, Aachen, Germany
