
EvoParsons: design, implementation and preliminary evaluation of evolutionary Parsons puzzle

  • A. T. M. Golam Bari
  • Alessio Gaspar
  • R. Paul Wiegand
  • Jennifer L. Albert
  • Anthony Bucci
  • Amruth N. Kumar

Abstract

The automated design of a set of practice problems that co-adapts to a population of learners is a challenging problem. Fortunately, coevolutionary computation offers a rich framework for studying interactions between two co-adapting populations of teachers and learners. This framework is also relevant to scenarios in which a population of students solves practice exercises synthesized by an evolutionary algorithm. In this study, we propose to leverage coevolutionary optimization to evolve a population of Parsons puzzles (a relatively recent type of practice exercise for novice computer programmers). To this end, we begin by experimenting with successive simulations that progressively introduce the characteristics we anticipate finding in our target application. Using these simulations, we refine a set of guidelines that capture insights into how to successfully coevolve Parsons puzzles. These guidelines are then used to implement the proposed “EvoParsons” software, with which we conduct preliminary evaluations on real students enrolled in an introductory Java programming course at the University of South Florida. We also propose several quantitative metrics to assess the quality of the puzzles produced by EvoParsons. Both simulations and experiments establish the feasibility of evolving pedagogically relevant practice problems that cover most of the dimensions underlying the interactions between problems and students. In addition, a detailed generation-by-generation analysis of the evolving population of Parsons puzzles confirms the occurrence of incremental improvements that can be explained in pedagogical terms.
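The coevolutionary setup described in the abstract — a population of puzzles evaluated against a population of learners, with puzzles rewarded for being informative rather than merely hard — can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the EvoParsons implementation: puzzles and learners are abstracted as vectors over skill dimensions (echoing the dimension-extraction framing), `solves` is a toy interaction rule, and fitness favors puzzles that discriminate among learners.

```python
import random

random.seed(0)

# A Parsons puzzle asks a learner to reorder shuffled program lines.
# Here a puzzle is abstracted as a difficulty vector over a few skill
# dimensions (one per programming concept); a learner is a skill vector.
N_CONCEPTS = 4

def random_puzzle():
    return [random.random() for _ in range(N_CONCEPTS)]

def random_learner():
    return [random.random() for _ in range(N_CONCEPTS)]

def solves(learner, puzzle):
    # Toy interaction rule: a learner solves a puzzle iff their skill
    # meets the puzzle's demand on every concept dimension.
    return all(s >= d for s, d in zip(learner, puzzle))

def fitness(puzzle, learners):
    # Reward discriminating puzzles: solved by some learners but not
    # all (neither trivial nor impossible), i.e. "informatively hard".
    solved = sum(solves(l, puzzle) for l in learners)
    return min(solved, len(learners) - solved)

def mutate(puzzle):
    # Perturb one concept's difficulty, clamped to [0, 1].
    child = puzzle[:]
    i = random.randrange(N_CONCEPTS)
    child[i] = min(1.0, max(0.0, child[i] + random.uniform(-0.2, 0.2)))
    return child

learners = [random_learner() for _ in range(20)]
puzzles = [random_puzzle() for _ in range(10)]

for gen in range(30):
    puzzles.sort(key=lambda p: fitness(p, learners), reverse=True)
    survivors = puzzles[:5]  # truncation selection
    puzzles = survivors + [mutate(random.choice(survivors)) for _ in range(5)]

best = max(puzzles, key=lambda p: fitness(p, learners))
print(fitness(best, learners))
```

In the real system the learner population is not simulated: student interactions with puzzles served through Epplets.org take the place of `solves`, which is what makes the process coevolutionary rather than a fixed-objective optimization.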

Keywords

Evolutionary algorithms · Coevolutionary algorithms · Coevolutionary dimension extraction · Introductory programming education · Concept inventory · Computer-aided learning · Parsons puzzles

Notes

Acknowledgements

This material is based in part upon work supported by the Association for Computing Machinery’s SIGCSE Special Projects 2015 award, and the National Science Foundation under awards #1504634, #1502564, and #1503834. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. USF Information Technology students contributed to the software development efforts. Paul Burton implemented the original proof of concept software during his IT Senior Project in spring 2015, and refined it under OPS contract during summer 2015. Stephen Kozakoff extended the prototype and connected it to Epplets.org as part of his MSIT graduate practicum in fall 2015 and spring 2016.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. University of South Florida, Tampa, USA
  2. Institute for Simulation & Training, University of Central Florida, Orlando, USA
  3. The Citadel, Charleston, USA
  4. Cambridge, USA
  5. Ramapo College of New Jersey, Mahwah, USA
