The presented test instrument opens up a wide range of avenues for further didactic research and offers valuable implications for university teacher education; however, it must not be considered in isolation from the findings of the above discussion or from the limitations set out there.

Teaching mathematical modelling is a cognitively demanding activity for (pre-service) teachers (Blum, 2015), which is why quality development in teacher education requires a detailed examination of the professional competence to teach mathematical modelling. Analysing this competence requires theoretical models (see Sect. 2.4) that describe the demands placed on teachers in detail, as well as measuring instruments (see Sect. 3.5) that are suitable for adequately measuring the skills and abilities in question. As shown in the course of this book, both can be used profitably for the conceptualisation as well as the operationalisation of specific professional competence.

The measuring instrument covers many substantial components of modelling-specific professional competence and, with regard to the constructs recorded, has been extensively examined in order to meet established test quality criteria (see Sect. 4). The finding that the conceptualised domain-specific competence can be empirically recorded and the corresponding knowledge facets can be described in a Rasch-homogeneous manner thus indicates an added value for further didactic research on the teaching of mathematical modelling, since, for example, a wide range of university courses and concepts can be evaluated in a more targeted and differentiated way. Competence developments can also be analysed in more detail in order to obtain an informed basis for modelling possible competence levels.

The extent to which there are correlations between individual aspects of professional competence for teaching mathematical modelling could be demonstrated with regard to the cognitively oriented as well as the affective-motivational components. In subsequent research projects, it would be desirable to relate the aspects of professional competence for teaching mathematical modelling to other specific competencies. For example, Klock (2020) focuses on the correlations between intervention competency and diagnostic competency, while Wess (2020) examines the links between task competency and diagnostic competency for teaching mathematical modelling. Examinations of further correlations, for example between specific task and intervention competency or between these and other constructs, are still pending.

However, in the light of the above, it should also be noted that these findings relate on the one hand to pre-service teachers and on the other to individual universities in Germany. Accordingly, the results obtained primarily represent site-specific empirical confirmations of the structures and correlations shown. Further work aimed at adapting the conceptualised structural model, as well as the test instrument used, to practising teachers on the one hand and to international contexts on the other therefore remains to be done.

Finally, against the background of the COACTIV study, which serves as the basis for the conceptual considerations of the structural model as defined in Sect. 2.4, it would be of particular interest to gain insights into further facets of modelling-specific professional knowledge. In this context, combining this instrument with the test developed by Haines et al. (2001) to capture modelling-specific content knowledge suggests itself as a profitable way to conduct future analyses. However, such a combination requires either the compilation of extensive test booklets or the conception of a balanced rotation design. Since both instruments ensure the Rasch homogeneity of the constructs considered, the latter option in particular proves to be an economical way of recording a broader competence structure.
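The idea of such a balanced rotation design can be sketched in a few lines of code. The following minimal example is purely illustrative: the block labels and the choice of two blocks per booklet are assumptions for the sketch, not properties of the instruments discussed. Each item block appears in exactly two booklets, once in first and once in second position, so that no participant works on the full item pool while all blocks remain linked through shared booklets.

```python
# Minimal sketch of a cyclic, position-balanced booklet rotation.
# The block labels are hypothetical placeholders for item blocks of
# the two instruments; they do not reproduce actual test content.

def rotated_booklets(blocks):
    """Pair each block with its cyclic successor: every block occurs in
    exactly two booklets, once in first and once in second position."""
    n = len(blocks)
    return [[blocks[i], blocks[(i + 1) % n]] for i in range(n)]

blocks = ["PCK-A", "PCK-B", "CK-A", "CK-B"]
for k, booklet in enumerate(rotated_booklets(blocks), start=1):
    print(f"Booklet {k}: {booklet[0]} + {booklet[1]}")
```

Because consecutive booklets share a common block, all blocks can in principle be placed on a common scale, for instance via concurrent calibration; this linking is what makes the rotation economical compared with administering the full item pool to every participant.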

In general, it is appropriate for subsequent research projects to use this preparatory work as a starting point to further investigate the genesis, structure and relevance of the professional competence of (pre-service) teachers in the field of mathematical modelling. For example, it would be particularly desirable to demonstrate the extent to which modelling-specific content knowledge, modelling-specific pedagogical content knowledge and other affective-motivational components of practitioners' professional competence to teach mathematical modelling are predictively valid for the quality of their teaching and the learning progress of their students. For a more global design addressing this question, the COACTIV study drew on the longitudinal component of PISA in Germany (Bruckmaier et al., 2018); for a local, modelling-specific design, on the other hand, it is advisable to use proven and valid modelling competence tests, such as those developed by Zöttl et al. (2010), Kaiser and Brand (2015) or Hankeln et al. (2019).

Overall, the effectiveness of (further) developed elements, structures and teaching formats in the context of teacher education must always be measured in terms of the teacher competences developed. On the basis of the findings presented, it also seems desirable to consider further process-related competences, such as problem solving or reasoning, in the context of teaching–learning laboratories, thereby contributing to a holistic, practice-related mathematics teacher education.