Measuring Secondary School Students’ Competence in Computational Thinking in ICILS 2018—Challenges, Concepts, and Potential Implications for School Systems Around the World

  • Birgit Eickelmann
Open Access


Focusing on the increasing relevance of research into teaching and learning computational thinking, this chapter elaborates on the international study ICILS 2018 (International Computer and Information Literacy Study, second cycle). Within this international comparative study, a research module on computational thinking is, for the first time, being realized as an international option. Education systems taking part in ICILS 2018 were able to choose whether they wanted to participate in this additional module. The option comprises computer-based tests in the domain of computational thinking, two test modules for each Grade 8 student, as well as additional computational-thinking-related questions and items in the study’s questionnaires for students, teachers, school principals, and IT coordinators. This chapter introduces the research approach of the study and especially its approach to computational thinking from the perspective of school-related educational research. Since findings of the study will not be available until the end of 2019, this chapter illustrates the study’s theoretical and empirical approach and outlines what kind of results will feature for the first time within the scope of an international large-scale assessment. With regard to the study’s aim of providing, beyond basic research knowledge toward an in-depth understanding of computational thinking, information on the current situation and future perspectives of education systems around the world, examples of potential implications for schools and school systems are also given.


Keywords: Computational thinking · ICILS 2018 · Students’ competences · International comparison · Implications for school systems

4.1 Introduction: The Relevance of Researching Teaching and Learning Computational Thinking in Schools

Comparatively new in the discussion about what kind of competences young people need in order to participate effectively in the digital society and to be prepared for work as well as for everyday life, competences related to computational thinking are attracting increasing attention. The corresponding research can be situated within a broader understanding of ICT literacy research (e.g., Ainley, Schulz, & Fraillon, 2016; Siddiq, Hatlevik, Olsen, & Throndsen, 2016; ETS, 2007). From this perspective, computational thinking adds to a new understanding of computer-related problem-solving. It not only broadens the previous definitions of ICT literacy but indeed opens up a new perspective. In this scope, teaching and learning how to solve problems and how computer systems work means that competence in computational thinking can be applied in different contexts (Ainley, Schulz, & Fraillon, 2016). Accordingly, the current discussion increasingly centers on the question of where these competences should and could be taught. Answering this question requires developing an understanding of teaching and learning computational thinking and establishing common ideas of computational concepts, practices, and perspectives within a school system (Kong, 2016). Currently, computational thinking challenges the work of education systems all over the world, especially with regard to the development of competence models, teacher education, and the curriculum integration of computational thinking (Kafai, 2016; Bescherer & Fest, 2018).
Looking at current developments, three approaches to supporting the acquisition of competences in computational thinking and improving students’ achievement in computational thinking can be identified: (1) Cross-curricular approach: The first approach is to understand computational thinking as a cross-curricular competence which can be taught within different subjects, acknowledging that each subject has a particular view on, and contribution to, students’ acquisition of competences in computational thinking (Barr & Stephenson, 2011). (2) Computer science approach: The second approach refers to the understanding of computational thinking as a substantial part of computer science (e.g., Kong, 2016). In this understanding, the thought processes related to computational thinking and involved in formulating and solving problems can be represented as computational steps and algorithms (Aho, 2011). From this perspective, computational thinking is best promoted in the context of teaching computer science and is closely tied to modeling, programming, and robotics. Following on from this school of thought, Kong (2016) proposes a computational thinking framework, based on a framework by Brennan and Resnick (2012), to develop a K-12 curriculum that promotes computational thinking through programming. A notable feature of this framework is that, although computational thinking is assumed to draw “on the fundamental knowledge and skills of computer science” (Kong, 2016, p. 379), it is supposed to be broader than computer science, referring to problem-solving, system design, and human behavior. On this basis, computational thinking is promoted in a separate 3-year curriculum. This points to a third approach, which is to develop a new learning area in terms of a separate subject.
(3) Computational thinking as an individual subject/learning area: From this perspective, computational thinking is seen as a key competence in the field of using new technologies competently and reflectively. This understanding can be elaborated on and realized in different ways (Rich & Hodges, 2017). A number of countries have already implemented a computational thinking learning area either as a compulsory subject (e.g., in England; Department for Education, 2013) or as part of an optional subject (e.g., in Denmark; EMU, 2017).

Regardless of the approach taken, including computational thinking in formal education is already being realized, has been embarked on, or is planned in many systems, while knowledge about the nature of computational thinking and about the factors and conditions related to its acquisition in the school context is still lacking. This kind of knowledge can be understood as meta-knowledge, which is also important for improving learning in schools and for the development of school systems, the latter especially when it comes to ICT-related educational policies. In this understanding, research knowledge related to computational thinking, its implementation in schools and school systems, and the effectiveness of different approaches seems to be crucial for decision-making that brings education systems into the digital age (Eickelmann, 2018). The urgent questions arising from this are what exactly should be taught in schools in the future in the context of computational thinking, and which conditions in classrooms, schools, and school systems facilitate and contribute to students’ acquisition of competences in the field of computational thinking. Against this background, the international comparative study ICILS 2018 (International Computer and Information Literacy Study, second cycle), which for the first time closes the aforementioned research gap on the basis of representative data from school systems around the world, is introduced below.

4.2 Researching Students’ Achievement in Computational Thinking in the Context of ICILS 2018

In the following section, the theoretical framework and the empirical approach to measuring students’ achievement in computational thinking in the context of the international comparative large-scale study ICILS 2018 are presented. The first subsection provides information on the study and its scope. The second subsection describes the approach to researching computational thinking in the context of the study. It provides information on the research design and on the study’s understanding of computational thinking as a construct that can be measured with computer-based tests. Furthermore, the research questions addressed in ICILS 2018 as well as information on the relevant context factors are presented. The last subsection offers insights into national extensions implemented by individual countries participating in the computational thinking option of ICILS 2018.

4.2.1 ICILS 2018—Assessing Students’ Readiness for the Digital World in the Scope of an International Comparative Study

With ICILS 2018 (International Computer and Information Literacy Study), the IEA (International Association for the Evaluation of Educational Achievement) is completing the second cycle of ICILS. The study is an international comparative assessment with participating countries from all around the world. As with ICILS 2013, the international study center of ICILS 2018 is located at the Australian Council for Educational Research (ACER). ICILS 2013 for the first time focused on computer and information literacy (CIL) as a competence area measured in international comparison by conducting computer-based student tests for Grade 8 in 21 education systems around the world (Fraillon, Ainley, Schulz, Friedman, & Gebhardt, 2014). After having successfully run this first cycle of ICILS, the IEA decided to conduct a second cycle (ICILS 2018). Acknowledging the rapid changes affecting ICT in teaching and learning and the aspiration to conduct a future-oriented study, ACER suggested adding computational thinking as an extension of the study. The core and trend part of both ICILS 2013 and ICILS 2018 comprises student tests for CIL and questionnaires on teaching and learning with ICT and on individual, classroom, and school factors with regard to the acquisition of CIL. Within the scope of ICILS 2018, nine education systems (Denmark, France, Germany, Luxembourg, Portugal, the U.S., the Republic of Korea, and the benchmarking participants Moscow (Russia) and the German federal state of North Rhine-Westphalia) are making use of the international option and are participating in the additional module focusing on computational thinking. Each student of the representative student sample takes, in addition to two 30-min CIL tests, two 25-min computer-based tests on computational thinking.
From the research perspective, the development of the computer-based tests, covering all aspects of computational thinking while remaining workable for Grade 8 students, has probably been the most challenging part of the current cycle of the study. The aforementioned student tests are complemented by questionnaires addressing the tested students, the teachers in the participating schools, and the school principals and ICT coordinators of the schools selected for participation in the study. Questionnaire items of particular interest in the context of computational thinking are added to the student and teacher questionnaires. Furthermore, all participating countries are asked to provide data about their education system and its approach to teaching and learning with ICT by filling in a so-called national context survey. This country-level questionnaire refers, for instance, to aspects of educational goals, curricula, and teacher education related to the scope of the study.

Data collection took place in spring 2018 for the Northern Hemisphere and in autumn 2018 for countries from the Southern Hemisphere. All education systems participated with a representative school sample, comprising representative teacher and student samples. Therefore, the results of the study allow for interpreting the status quo of Grade 8 student achievement in CIL and in addition, for those education systems which are taking part in the international option, also for the domain of computational thinking.

4.2.2 Computational Thinking as Part of ICILS 2018

In the following, in-depth information is provided on the international option on computational thinking in the context of ICILS 2018. This includes a description of the theoretical understanding of computational thinking in terms of the definition of the construct used as a basis for developing the student tests for ICILS 2018. Furthermore, initial insights into the questionnaires are given to provide examples of which factors and aspects are assumed to be relevant for students’ acquisition of computational thinking from different perspectives.

The Construct of Computational Thinking and How Student Achievement in Computational Thinking Is Measured

The study starts where most research on computational thinking begins: with an adaptation of Wing’s (2006) statements on computational thinking. In her understanding, computational thinking comprises fundamental skills which allow individuals to solve problems by using computers: “Computational thinking is a way humans solve problems; it is not trying to get humans to think like computers” (Wing, 2006, p. 35). Wing’s idea relates to Papert (1980), who developed essential features of computational thinking. In recent years, the field of computational thinking has been continuously developed (e.g., Dede, Mishra, & Voogt, 2013; Mannila et al., 2014; Voogt, Fisser, Good, Mishra, & Yadav, 2015).

In the context of ICILS 2018, computational thinking is defined as “an individual's ability to recognize aspects of real-world problems which are appropriate for computational formulation and to evaluate and develop algorithmic solutions to those problems so that the solutions could be operationalized with a computer” (Fraillon, Schulz, Friedman, & Duckworth, 2019). This understanding of computational thinking and its relevance for future generations leads to new tasks for schools and school systems in order to offer every child the possibility to participate effectively in the digital world. In this context, it is stressed that this approach sees young people not only as consumers in a digital world but also as reflective creators of content who need the corresponding competences (IEA, 2016).

Beyond this broader understanding of computational thinking, the study works with a more detailed definition of the construct “computational thinking.” This has been developed by taking previous research findings, relevant approaches, and understandings of computational thinking into account. The study’s understanding of computational thinking skills also corresponds to international standards such as the ISTE standards for students (2016). These standards describe “Computational Thinkers” as students who “develop and employ strategies for understanding and solving problems in ways that leverage the power of technological methods to develop and test solutions” (p. 1), which includes skills such as problem formulation, data collection and analysis, abstraction, modeling, algorithmic thinking, solution finding, use of digital tools, representation of data, decomposition, and automation. The construct as addressed in ICILS 2018 consists of two strands, each subdivided into subareas (Fraillon, Schulz, Friedman, & Duckworth, 2019).

Strand I: Conceptualizing problems: The first strand refers to the conceptualization of problems. Conceptualizing problems acknowledges that before solutions can be developed, problems must first be understood and framed in a way that allows algorithmic or system thinking to assist in the process of developing solutions. The strand includes three aspects as subareas: 1. Knowing about and understanding computer systems; 2. Formulating and analyzing problems; and 3. Collecting and representing relevant data. A task that provides evidence of an individual’s ability to know about and understand computer systems includes, for example, operating a system to produce relevant data for analysis or explaining why simulations help to solve problems. Formulating problems entails the decomposition of a problem into smaller, manageable parts and specifying and systematizing the characteristics of the task so that a computational solution can be developed, possibly with the help of a computer. Analyzing consists of making connections between the properties of, and the solutions developed for, previously experienced problems and new problems, in order to establish a conceptual framework underpinning the process of breaking down a large problem into a set of smaller, more manageable parts. Collecting and representing relevant data comprises making effective judgements about problem-solving within systems. This requires knowledge and understanding of the characteristics of the relevant data and of the mechanisms available for the collection, organization, and representation of the data for analysis. This could, for instance, involve creating or using a simulation of a complex system to produce data that may show specific patterns or characteristics.
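
To make the decomposition aspect of this strand concrete, a small Python sketch (a hypothetical example constructed for illustration, not an actual ICILS 2018 test item) shows how a larger task is broken into smaller, manageable parts that together produce a computational solution:

```python
# Hypothetical illustration of decomposition: the task "report the
# warmest city from raw temperature readings" is split into sub-steps.

def parse_readings(lines):
    """Step 1: turn raw 'city,temperature' lines into (city, float) pairs."""
    return [(city, float(temp)) for city, temp in
            (line.split(",") for line in lines)]

def average_by_city(readings):
    """Step 2: collect the readings per city and average them."""
    totals = {}
    for city, temp in readings:
        totals.setdefault(city, []).append(temp)
    return {city: sum(temps) / len(temps) for city, temps in totals.items()}

def warmest_city(averages):
    """Step 3: pick the city with the highest average temperature."""
    return max(averages, key=averages.get)

raw = ["Oslo,4.0", "Lisbon,17.5", "Oslo,6.0", "Lisbon,18.5"]
print(warmest_city(average_by_city(parse_readings(raw))))  # Lisbon
```

Each sub-step is small enough to be understood, tested, and reused on its own, which is the point of decomposing a problem before operationalizing a solution.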

Strand II: Operationalizing solutions: The second strand concerns operationalizing solutions. Operationalizing solutions comprises the processes associated with creating, implementing, and evaluating computer-based system responses to real-world problems. It includes the iterative processes of planning for, implementing, testing, and evaluating algorithmic solutions to real-world problems. The strand includes an understanding of the needs of users and their likely interaction with the system under development. The strand comprises two aspects: 1. Planning and evaluating solutions and 2. Developing algorithms, programs, and interfaces. Examples of tasks that provide evidence of an individual’s ability to develop algorithms, programs, and interfaces include creating a simple algorithm or modifying an existing algorithm for a new purpose.
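
As a hypothetical illustration of the task type “modifying an existing algorithm for a new purpose” (again an invented example, not an ICILS 2018 item), the following Python sketch adapts an algorithm that finds the largest value in a list so that, by reversing a single comparison, it finds the smallest:

```python
# Illustrative only: adapting an existing algorithm to a new purpose.

def find_largest(values):
    """Original algorithm: scan the list, keeping the largest value seen."""
    best = values[0]
    for v in values[1:]:
        if v > best:   # keep the larger of the two
            best = v
    return best

def find_smallest(values):
    """Modified algorithm: the same scan, with the comparison reversed."""
    best = values[0]
    for v in values[1:]:
        if v < best:   # keep the smaller of the two
            best = v
    return best

print(find_largest([3, 9, 1, 7]))   # 9
print(find_smallest([3, 9, 1, 7]))  # 1
```

Recognizing which single element of an algorithm must change for a new purpose, while the overall structure stays intact, is the kind of reasoning this aspect of the construct targets.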

This understanding of computational thinking served as the basis for the development of the student tests. Each aspect is covered in at least one of the two computational thinking test modules. The student test data is analyzed using IRT scaling, and student achievement data is analyzed in relation to the context data gathered via the various types of questionnaires. The analyses are guided by the research questions, which are presented in the following section.

Research Questions Related to Computational Thinking in the Context of ICILS 2018

Taking the aforementioned research gap and the aims of the study into account, the following three overarching research questions are addressed in the study. The questions refer to different levels within education systems: the school system level, the school and classroom level, and the individual student level.
  1. What variations exist in and across different countries in student achievement in computational thinking, and what aspects of students’ personal and social background are related to it?

This question is answered by gathering data on student achievement in computational thinking, using the computer-based computational thinking tests. The student test data enables the compilation of national averages as well as the comparison of student achievement between countries. As in other international comparative studies, the student achievement data also allows for in-depth analyses within countries, e.g., comparing student achievement between boys and girls, between students with and without a migration background, or between students from different socioeconomic backgrounds. If countries have chosen to stratify the student sample to differentiate between school types or tracks, analysis can also reveal differences and similarities between groups of students from different school types. These are just a few examples of potential analyses that are useful for obtaining information on student achievement in computational thinking and for describing the efforts made in teaching and learning computational thinking.
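
The subgroup comparisons described above essentially amount to computing group statistics over achievement scores. A minimal sketch with invented scores may illustrate the idea; the operational ICILS analyses additionally rely on plausible values, sampling weights, and replication-based standard errors, all of which are omitted here:

```python
# Sketch of a subgroup comparison on invented achievement scores.
# Real ICILS analyses use plausible values, sampling weights, and
# replication-based standard errors; none of that is shown here.

def group_means(records):
    """Average score per group, e.g., per gender or per school type."""
    sums, counts = {}, {}
    for group, score in records:
        sums[group] = sums.get(group, 0.0) + score
        counts[group] = counts.get(group, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}

scores = [("girls", 512), ("boys", 488), ("girls", 530), ("boys", 502)]
print(group_means(scores))  # {'girls': 521.0, 'boys': 495.0}
```
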
  2. What aspects of education systems, schools, and classroom practice explain variation in student achievement in computational thinking?

This question focuses on the way in which computational thinking is implemented in education systems, in schools, and in classroom practice. Data relevant to this question is collected via the questionnaires. These data and results, for instance, enable the interpretation and framing of the findings addressing the first question. Furthermore, information on educational practice and settings is gathered, providing insights into teaching and learning with ICT in the context of computational thinking.
  3. How is student achievement in computational thinking related to students’ computer and information literacy (CIL) and to their self-reported proficiency in using computers?


This research question connects the core part of the study, referring to CIL, and the additional part, referring to computational thinking. By applying both the student test on computational thinking and the student test on CIL within the same student sample, initial correlations between the two constructs can be examined (for a detailed overview of the underlying construct of CIL in ICILS 2013, see Fraillon, Ainley, Schulz, Friedman, & Gebhardt, 2014; the exact constructs of CT and CIL in ICILS 2018 can be found in the study’s assessment framework, which also explains how the study envisages the differences between these two areas). This also leads to new fundamental theoretical knowledge as well as to important information on how the teaching of both competences can be combined and how student learning might be better supported.
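
The relationship addressed by this question can be illustrated by a simple correlation between two sets of scale scores. The following sketch uses invented scores and a plain Pearson coefficient; the study’s actual analyses rest on its plausible-value methodology rather than raw scores:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ct_scores = [480, 510, 530, 560, 600]    # invented CT scale scores
cil_scores = [470, 500, 545, 550, 610]   # invented CIL scale scores
print(round(pearson_r(ct_scores, cil_scores), 2))  # 0.98
```

A coefficient near 1 would indicate that students who score high on one construct tend to score high on the other; whether and how strongly this holds empirically is exactly what the study’s data make it possible to examine.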

A comprehensive and more detailed list of research questions as well as further details on instrument preparation and content can be found in the study’s assessment framework (Fraillon, Schulz, Friedman, & Duckworth, 2019). A detailed overview of all instruments and objects will be published by ACER (Australian Council for Educational Research) in 2020 with the so-called ICILS 2018 user guide for the international database.

Insights into the Structure and Content of the Study’s Instruments

As mentioned above, ICILS 2018 applies computer-based student tests for Grade 8 students and adds questionnaires for students, teachers, school principals, and IT coordinators. Furthermore, a so-called national context survey is filled in by the national study centers of the countries participating in the study.

In the scope of the study, assessing students’ achievement in computational thinking with computer-based tests means developing tests that have an authentic real-world focus in order to capture students’ imagination in an appropriate way. At the core of the test modules, “authoring tasks” draw on authentic computer software applications (Fraillon, Schulz, & Ainley, 2013). The actual content of the test modules will be published in the report of the study in 2019. Additionally, with the aim of exploring classroom practices regarding student use of computational thinking tasks, ICILS 2018 gathers information from students via the student questionnaires. The parts of the questionnaires that relate to computational thinking are only applied in those countries that have chosen to include the computational thinking module. Students are, for example, asked to specify the extent to which they have learned different computational thinking tasks in school. These tasks refer to the study’s computational thinking construct described above. Furthermore, with respect to teacher attitudes toward teaching computational thinking, teachers are asked about the value they attach to teaching skills in the field of computational thinking; these skills also refer to the abovementioned construct and are part of the teacher questionnaires. It is of note that the experts from the national centers, together with the international study center located at ACER in Melbourne, decided to include teachers from all school subjects. Going beyond involving only computer science teachers was a consensual decision of the experts involved in developing the study’s instruments. Based on this decision, the study will allow for comparing teachers’ practice as well as attitudes between different subject areas.

Making Full Use of the International Option: National Extensions Toward Computational Thinking

In line with the current relevance of computational thinking, education systems in several countries are applying national extensions. In Denmark, for example, the Danish national research center for ICILS 2018 is conducting an additional quantitative study addressing school principals (Caeli & Bundsgaard, 2018). This extension aims to examine how schools already support, or plan to support, competences in the field of computational thinking in realizing the new Danish curriculum, which was implemented in 2017 and gives schools the opportunity to teach “technology understanding” (EMU, 2017). In Germany, several extensions are being conducted, including additional reading tests and tests of cognitive abilities (Eickelmann, 2017; Heller & Perleth, 2000). A closer examination of the third research question, in particular the relation between student achievement in computational thinking and students’ self-reported proficiency in using computers, suggests that their self-reported proficiency with computational thinking tasks also be taken into account; the latter will therefore be added as a national extension in Germany. Furthermore, items focusing on computer science and its practice in schools are added, as well as items aiming to examine the correlation between computational thinking and general problem-solving skills (Labusch & Eickelmann, 2017; see also Chap. 5 by Labusch, Eickelmann, & Vennemann, in this book). With regard to the computer science part, the German national consortium of the study was expanded to include a leading expert in the field of computer science and research on computer science in schools.

4.3 Relevance and Potential Outcomes for Educational Systems Around the World

Considering the increasing relevance of the next generation’s competences in the field of computational thinking, several education systems around the world have already decided to implement computational thinking as an obligatory subject in school curricula. Although the approaches taken to implementing computational thinking may differ between countries, it becomes clear that supporting computational thinking processes and competences is considered a future-oriented part of school education that adds to the traditional subjects and learning areas (Labusch & Eickelmann, 2017). Developing an understanding of computational thinking that facilitates teaching it in schools, however, seems to be challenging. While various perspectives on computational thinking and its relevance for school learning exist, empirical knowledge based on a sound database is very limited. Adding to the many studies currently being conducted around the world, the international comparative large-scale assessment ICILS 2018 for the first time provides empirical findings on student achievement in computational thinking in different education systems. Combining questionnaire data with the results of the computer-based student tests enables information to be gathered about the incorporation of computational thinking into classroom teaching and learning and into school curricula, as well as about teachers’ attitudes and students’ backgrounds. In terms of research, one of the main outcomes of ICILS 2018 is the development of a theoretical model explaining student competences. Furthermore, the development of a sound understanding of the construct, as well as an empirically based competence model differentiating between competence levels, is to be acknowledged and will be included in the report on the results of the study.
For the education systems—countries or regions—participating in the optional computational thinking part of ICILS 2018, in-depth information will be available on Grade 8 student achievement in computational thinking and on the relevance of school and individual factors, such as gender, migration background, or socioeconomic background. Beyond findings at the national and regional levels, the added value of the study lies in the international comparison and the opportunity for education systems to learn from one another.

In summary, the study will close a research and knowledge gap in the field of teaching and learning computational thinking and its implementation in schools and school systems. Apart from reflecting the status quo of student achievement in computational thinking, with the publication of the study’s reports by the end of 2019, future directions for education systems can be drawn from the study’s national and cross-national results. Furthermore, the study provides a starting point for developing students’ computational thinking competences on the basis of an international comparative approach. Applying and realizing an international approach underpins the importance of the research field related to computational thinking.


  1. Aho, A. V. (2011). Computation and computational thinking. Computer Journal, 55(7), 832–835.
  2. Ainley, J., Schulz, W., & Fraillon, J. (2016). A global measure of digital and ICT literacy skills. Background paper prepared for the 2016 Global Education Monitoring Report. Paris: UNESCO.
  3. Barr, V., & Stephenson, C. (2011). Bringing computational thinking to K-12: What is involved and what is the role of the computer science education community? ACM Inroads, 2(1), 48–54.
  4. Bescherer, C., & Fest, A. (2018). Computational thinking in primary schools: Theory and causal model. In A. Tatnall & M. Webb (Eds.), Tomorrow’s learning: Involving everyone. IFIP advances in information and communication technology. Springer.
  5. Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. In A. F. Ball & C. A. Tyson (Eds.), Proceedings of the 2012 Annual Meeting of the American Educational Research Association (pp. 1–25). Vancouver: American Educational Research Association.
  6. Caeli, E. N., & Bundsgaard, J. (2018). Computational thinking initiatives in Danish grade 8 classes: A quantitative study of how students are taught to think computationally. Paper presented at ECER 2018 (European Conference on Educational Research), Bolzano, Italy.
  7. Dede, C., Mishra, P., & Voogt, J. (2013). Advancing computational thinking in 21st century learning. Presented at EDUsummIT (International Summit on ICT in Education), Dallas, TX.
  8. Department for Education (2013). National curriculum in England: Computing programmes of study. Retrieved January 08, 2018, from
  9. Eickelmann, B. (2017). Computational Thinking als internationales Zusatzmodul zu ICILS 2018—Konzeptionierung und Perspektiven für die empirische Bildungsforschung [Computational thinking as an international option in ICILS 2018—The perspective of educational research]. Tertium Comparationis. Journal für International und Interkulturell Vergleichende Erziehungswissenschaft, 23(1), 47–61.
  10. Eickelmann, B. (2018). Cross-national policies on information and communication technology in primary and secondary schools—An international perspective. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International handbook of information technology in primary and secondary education. Singapore: Springer.
  11. EMU (2017). Teknologiforståelse [Technology understanding] valgfag (forsøg) – Fælles Mål og læseplan [Danish curriculum]. Retrieved January 08, 2018, from
  12. ETS (Educational Testing Service) (2007). Digital transformation: A framework for ICT literacy. A report of the international ICT literacy panel. Princeton: Center for Global Assessment. Retrieved January 09, 2018, from
  13. Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2014). Preparing for life in a digital age: The IEA International Computer and Information Literacy Study international report. Amsterdam: Springer.
  14. Fraillon, J., Schulz, W., & Ainley, J. (2013). Assessment framework of ICILS 2013. Amsterdam: IEA.
  15. Fraillon, J., Schulz, W., Friedman, T., & Duckworth, D. (2019). Assessment framework of ICILS 2018. Amsterdam: IEA.
  16. Heller, K. A., & Perleth, C. (2000). KFT 5–12+R. Kognitiver Fähigkeitstest für 5. bis 12. Klassen, Revision [Cognitive abilities test for Grade 5 to 12 students]. Göttingen: Beltz.
  17. IEA (2016). The IEA’s International Computer and Information Literacy Study (ICILS) 2018. What’s next for IEA’s ICILS in 2018? Retrieved January 09, 2018, from
  18. ISTE (2016). ISTE standards for students. Retrieved April 03, 2018, from
  19. Kafai, Y. (2016). From computational thinking to computational participation in K–12 education: Seeking to reframe computational thinking as computational participation. Communications of the ACM, 59(8), 26–27.
  20. Kong, S. C. (2016). A framework of curriculum design for computational thinking development in K-12 education. Journal of Computers in Education, 3(4), 377–394.
  21. Labusch, A., & Eickelmann, B. (2017). Computational thinking as a key competence—A research concept. In S. C. Kong, J. Sheldon, & K. Y. Li (Eds.), Conference Proceedings of International Conference on Computational Thinking Education 2017 (pp. 103–106). Hong Kong: The Education University of Hong Kong.
  22. Mannila, L., Dagiene, V., Demo, B., Grgurina, N., Mirolo, C., Rolandsson, L., & Settle, A. (2014). Computational thinking in K-9 education. Presented at the Conference on Innovation & Technology in Computer Science Education, Uppsala.
  23. Papert, S. (1980). Mindstorms. Children, computers and powerful ideas. New York, NY: Basic Books.Google Scholar
  24. Rich, P., & Hodges, C. B. (Eds.) (2017). Emerging research, practice, and policy on computational thinking. Springer Publishing Company.Google Scholar
  25. Siddiq, F., Hatlevik, O. E., Olsen, R. V., & Throndsen, I. (2016). Taking a future perspective by learning from the past—A systematic review of assessment instruments that aim to measure primary and secondary school students’ ICT literacy. Educational Research Review, 19(1), 58–84.CrossRefGoogle Scholar
  26. Voogt, J., Fisser, P., Good, J., Mishra, P., & Yadav, A. (2015). Computational thinking in compulsory education: Towards an agenda for research and practice. Education and Information Technologies, 1–14.Google Scholar
  27. Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35.CrossRefGoogle Scholar

Copyright information

© The Author(s) 2019

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Institute for Educational Science, Paderborn University, Paderborn, Germany