Is there a link between Preparatory Course Attendance and Academic Success? A Case Study of Degree Programmes in Electrical Engineering and Computer Science

  • Gilbert Greefrath
  • Wolfram Koepf
  • Christoph Neugebauer

Abstract

In recent years, universities have increasingly complained that the basic mathematics skills of new students barely satisfy the requirements of many degree courses. They criticise the high dropout rates, especially in mathematics and natural sciences degrees, and link them to university entrants' lack of basic skills. Many universities are reacting to this problem by setting up or reorganising preparatory courses in subjects with mathematical content. These preparatory courses usually repeat school mathematics. Their orientation is, however, very heterogeneous and ranges from the teaching of skills to the development of general competences. The type of preparatory course (e-learning or classroom-based) also varies. The effect of such preparatory courses is being examined at many universities via preliminary and follow-up tests. Using Kassel University as an example, we examine possible correlations between examination results in mathematics, attendance of a preparatory course and test results at the start of the course. We also ask whether correlations can be established between the type of preparatory course, classroom-based or e-learning, and the exam results, and whether it is possible to predict course progress with the help of a test at the start of the course. The article shows the possibilities and the limits of deriving information from data on test and exam results. This could be an interesting perspective for many universities.

Keywords

Preparatory course · Academic progress · Bridge course · Academic success · Admissions test

Introduction

There are big hurdles to overcome in the transition from school to university, particularly in degree programmes with a large portion of mathematics (Gueudet 2008; de Guzman et al. 1998; Holton 2001; Tall 1991). For years, statistics have been published regularly on the increase in attrition and drop-out rates at German universities. Heublein et al. (2012) report that the student drop-out rate in bachelor's degree programmes at universities is 39 % in the field of mathematics/natural sciences. As for the causes of dropping out of university, it is necessary to consider both external factors (school preparation, academic requirements, financial situation, etc.) and internal ones (psychological/physical stability, ability to perform, motivation, etc.) (cf., among others, Heublein et al. 2008; Dieter 2012). Universities have reacted to this with numerous measures in recent years. In order to counter the difficulties in transition, tests and preparatory courses are held at many universities at the start of a degree programme in order first to determine the mathematical skills of the new students and then to close any gaps that are discovered. The short-, medium- and long-term effect that preparatory courses have on students' success is controversial, however: some judge them to have only a very minor effect, others a significantly larger one (cf. Lagerlöf and Seltzer 2009; Ballard and Johnson 2004).

In many locations, including Kassel (Germany), there are extensive data records on the test and exam results of new students. We would like to look into the question of what information can be obtained from such data, for students as well as for researchers and universities. Thanks to the diverse possibilities of the Internet, preparatory courses are frequently no longer offered as pure classroom-based courses, but are supplemented by material provided online. In this article, therefore, we will also examine the effect that two different preparatory course formats (classroom-based and e-learning) have on success in the degree programmes of Electrical Engineering and Computer Science at the University of Kassel. In particular, we are interested in possible differences between the fields and preparatory course formats.

Theoretical Background

Tests at the Start of the Degree Programme and Forecasting Academic Success

Tests at the start of studies can have distinct functions (see Fig. 1). A test may aim to record the current performance level of the students or to generate a prediction of how successful their studies will be. Tests can serve the purpose of student selection (for example, to allocate a limited number of study places), can help provide student support (by directing students to additional offerings such as a preparatory course) or can serve research purposes. It is interesting for research, for example, to see how well academic success can be predicted through testing at the start of the degree programme or through other parameters. The question concerning the quality of prediction cannot be clearly answered; calls are frequently made to optimise the tests in order to improve the prediction of study success.

Registration of Performance Level

The correct answer rates for mathematical tests at the start of studies are worryingly low for the most part and may be falling. They are around 40 % (cf. Knospe 2008, 2011; Abel and Weber 2014; Haase 2014). According to Knospe (2011), the reason is that the content is not addressed and taught appropriately at school. Bruder et al. (2010) note that universities see deficiencies above all in the subject matter handled in grades 5–10, and observe shortcomings in new students' general competences such as self-organisation, self-assessment or their willingness to make an effort. Some of these tests at the course outset have been conducted for many years. At Esslingen University, the correct answer rate for the same test declined over a period of 20 years from just under 60 % to roughly 45 % (Abel and Weber 2014, p. 2). Knospe (2011) makes similar observations.

Naturally, it is necessary to consider that many factors have an influence on the acquisition of mathematical competence (in the sense of the German educational standards, KMK 2012) in a degree programme, and not all of these factors can be reflected in tests given at the start of the course or in school marks. Here, for example, one could name cognitive and motivational characteristics of students or the learning environment at the university and in the family (see Lingel et al. 2014, p. 59).

Test Development and Prediction of Study Success

The ability of tests at the course outset to forecast a student’s future performance at university – particularly if these tests are used to select students – is frequently questioned (see, e.g., Geiser and Studley 2002). There is no clear answer to the question of how well tests at the course start forecast academic success. Calls are often made to perfect the tests in order to improve their ability to forecast (see Wilmot et al. 2011). It is also discussed whether such tests are even capable of forecasting academic success at all, or whether there are better indicators of future academic success than such test results, and whether such tests are fair.

Other research has also addressed the accuracy of forecasts based on tests at the start of the course. In a comparison of tests at the start of the course and school marks at the end of the school period, studies showed that school marks are the better predictor. Precisely for fields of study such as mathematics, the natural sciences and engineering, both the average mark and the mark in mathematics at school can be viewed as good predictors (Trapmann et al. 2007, p. 24). This also applies to degree programmes containing a lot of mathematics at universities of applied sciences [Fachhochschule] even if the correlation between the school mark and the final mark in the degree programme is weaker for students with a university access authorisation to a university of applied sciences than it is for those with a university access authorisation to a general university (Hell et al. 2008, p. 137). An interesting aspect is that in the case of a degree programme in engineering a mathematics test or a school mark in the field of mathematics is usually able to predict academic success better than a specific aptitude test for engineering (Hell et al. 2008, p. 163).

However, there are also studies showing that a mathematics test at the start of the course is better able to forecast performance in mathematics examinations in the first few years of studying than the mark in mathematics at the end of the school period. For example, a study by Greefrath and Hoever (2016) at the University of Applied Sciences in Aachen analysed 809 admissions tests in the field of mathematics taken from 2009 to 2013 by students entering degree programmes in Electrical Engineering and Computer Science. The test consisted of 16 items that probe the future students' abilities at secondary level. The students achieved average correct answer rates of 40 to 50 %, depending on the requirements. Looking at correlations with the results of examinations in higher-level mathematics at the end of the first year of studying, the score of the test correlates more strongly with the mark in the mathematics examination than the A-level marks (Abitur marks) in mathematics do. In this arrangement, the test appears to be a good predictor of academic success in mathematics courses, at least a better predictor than the mark in mathematics at school or the average mark on a school graduation certificate. That is an interesting result, particularly in light of the brevity of the test (Greefrath and Hoever 2016). Ballard and Johnson (2004) consider a test on fundamental mathematical concepts at the start of the course to be an important factor for forecasting academic success in an introductory course on microeconomics. The admission tests usually consisted of simple mathematics tasks from lower and upper secondary school mathematics. Hence, it is not clear that these tests are valid for forecasting performance in university mathematics examinations. However, there is an important difference from tests at school: no calculators were allowed in the university tests.
This same restriction applies to exams at the university, which may be a clue as to why the test results correspond well to the exam results.
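The comparison of predictors described above can be sketched in a few lines. The following Python snippet is illustrative only: the score vectors are hypothetical, not data from any of the cited studies; it merely shows how the Pearson coefficients for the two candidate predictors (entry-test score vs. school mark in mathematics) would be computed and compared against the first-year exam mark.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.corrcoef(x, y)[0, 1])

# Hypothetical data, one entry per student (not from the cited studies):
entry_test  = [12, 9, 15, 7, 11, 14]          # points in the entry test
abitur_math = [2.0, 3.0, 1.0, 3.0, 2.0, 2.0]  # school mark (1 = best)
exam_mark   = [2.0, 3.3, 1.3, 4.0, 2.7, 1.7]  # first-year exam mark

# With German marks, lower is better, so a good point-based predictor
# shows a strongly negative correlation, a mark-based one a positive one.
r_test = pearson_r(entry_test, exam_mark)
r_abi  = pearson_r(abitur_math, exam_mark)
stronger = abs(r_test) > abs(r_abi)  # the comparison at issue in the studies
```

Comparing absolute values of the coefficients is the natural reading of "correlates more strongly"; with real data one would additionally check sample sizes and significance before drawing conclusions.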

Some research has also raised doubts about the fairness of tests at the start of the course for the selection of students, if the number of places for the study programme is limited. It has been seen that the performance of women in tests at the course outset is underestimated. Furthermore, the use of A-level marks (Abitur marks) together with admission tests appears to be fairer with regard to gender-specific disadvantages (Fischer et al. 2013, p. 2).

Preparatory Courses

Many universities prepare new students for their degree programmes by offering an annual preparatory course in mathematics. Preparatory courses in the field of mathematics are offered for all mathematically rich degree programmes and at all kinds of universities. These preparatory courses are often intended for new students in all disciplines that use mathematics, above all engineering, mathematics and the natural sciences. The criteria for student participation also differ: attending a preparatory course can be voluntary or offered as a recommended option (cf. Reimpell et al. 2014). One of the aims of these courses is to go over school mathematics in order to achieve a common and adequate level of knowledge at the start of studies. The preparatory courses also help with the development of mathematical techniques and structures that will be part of the daily toolkit in the future degree programme. However, the preparatory courses should not introduce the content of the basic lectures, but rather treat school mathematics in more depth and offer an impression of the later degree programme. Furthermore, they should adopt the more abstract ways of speaking, writing and arguing, compared to school, that characterise mathematics courses at university. Even pupils with good marks, independently of their selection of a basic or advanced course at school, often have great difficulties meeting the requirements of first-year lecture courses at university. Preparatory courses in mathematics should help to resolve these problems and even out the different levels of knowledge before the commencement of a degree programme. They should give everyone who considers it useful for his or her studies, particularly new students who took a break after their A-level (Abitur), the opportunity to freshen up their knowledge and skills.

The teaching of mathematical competences in preparatory courses takes place with different accentuation. An important question is whether the knowledge gained should be more process-related, therefore with a focus on general mathematical competences as in the German educational standards (KMK 2012) such as problem solving, modelling and argumentation as well as communicating, or more content-related, therefore more structured according to mathematical subject areas. A preparatory course can also be designed to convey general competences for a degree programme, such as generic learning methods or organising studies including work methods.

The way a preparatory course is conducted ranges from purely online courses, some with virtual tutors such as the online mathematics bridge course (OMB) at TU Berlin (cf. Roegner et al. 2014), through combined blended-learning courses (e.g., Ebner et al. 2016), right through to fully classroom-based courses. Online courses allow flexible, mobile learning at a student's own pace.

Some preparatory courses concentrate on content-related mathematical competences. The success of these courses can be examined with preliminary and follow-up tests that record the students' mathematical competences. At Esslingen University, for example, students on technical and business degree courses who took part in the compact course achieved on average around 10 % more correct answers (out of 31 multiple-choice questions) in the test in the first lecture week than a group who did not attend the compact course. Participation in the compact course was based on the students' self-assessment (Abel and Weber 2014).
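As a minimal sketch, the comparison behind the Esslingen figure can be expressed as a difference in mean correct-answer rates between participants and non-participants. The counts below are hypothetical; only the 31-item test length is taken from the text.

```python
import numpy as np

N_ITEMS = 31  # multiple-choice items in the Esslingen test (from the text)

def mean_correct_rate(correct_counts):
    """Mean share of correct answers over a group of students."""
    return float(np.mean(np.asarray(correct_counts, dtype=float) / N_ITEMS))

# Hypothetical counts of correct answers per student:
with_course    = [20, 18, 22, 19]
without_course = [16, 15, 18, 14]

gap = mean_correct_rate(with_course) - mean_correct_rate(without_course)
# A gap of roughly 0.10 would match the "around 10 % more correct
# answers" reported by Abel and Weber (2014).
```

Since participation was self-selected, such a raw gap mixes the course effect with selection effects, which is exactly the caveat raised later in this article.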

Other preparatory courses focus more on process-related or general competences. In cooperation between the chairs for didactics of mathematics at the Ludwig Maximilian University (LMU) of Munich and the Technical University of Munich (TUM), a two-week bridge course concept was developed, consisting of lectures and exercises in small groups. A focal point of the Munich bridge course is teaching learning strategies that are necessary or helpful for university mathematics. Furthermore, new students should become familiar with the precise usage of mathematical language. A third focal point of the Munich bridge course is to introduce new students to authentic working methods in mathematics and thus to emphasise the procedural character of mathematics. Evaluations of the preparatory course in Munich show significantly better results from students who took part in the bridge course, and that the students grasped the priorities and thus the goals pursued (cf. Reichersdorfer et al. 2014). Figure 2 shows an overview of decisions as part of a preparatory course concept (see Greefrath et al. 2015).
Fig. 1 Functions of tests at the start of studies

Fig. 2 Decisions as part of a preparatory course concept (cf. Kürten et al. 2014)

Another large project is being conducted at the universities of Kassel, Darmstadt, Paderborn and Lüneburg, among others (Biehler et al. 2012). The group project, called VEMINT (Virtual Entry tutorial for Science, Technology, Engineering and Mathematics), offers both learning material in a modular format and electronic preliminary and follow-up tests in order "to support students in their ability to self-manage and self-assess" (Biehler et al. 2014, p. 261). Bausch et al. (2014) describe in more detail the actual organisation of preparatory courses in the VEMINT project, which are based on interactive learning material developed by Biehler et al. (2014). In this project, preparatory courses and bridge courses have been developed and researched in the area of mathematics for some years now. At numerous German locations, multimedia teaching and learning material (covering subjects from secondary level I and the basics of logic, proofs, analysis and vector calculus) has been designed for various scenarios at both general and applied-science universities, and evaluated; course and material have received very positive feedback from the participants. The basic knowledge needed for successful study that new students in STEM subjects bring to university is very heterogeneous, and this heterogeneity needs to be levelled off by the preparatory course. At the Technical University of Darmstadt, the VEMINT learning material has been used for new students in the disciplines of Civil Engineering, Computer Science, Mathematics, Mechanical Engineering and Mechanics since 2007. At the end of the four-week preparatory course, the students rated the course very positively in an online evaluation, and also gave it good marks in terms of academic progress. In the preparatory mathematics course at the University of Kassel, students choose between e-courses with a higher proportion of e-learning and classroom-based courses with a higher proportion of classroom time.
Participants were also divided into course groups based on the specific requirements of their degree programme. In Kassel, Fischer (2014) found wide variability in the changes in the students' mathematical competences after the preparatory course. However, considerably better results were observed in the e-courses. Neither a more sophisticated examination of the effects of the distinct course options on academic success nor an examination of the long-term effects of preparatory courses has yet been completed at all of the universities involved in the VEMINT project. In Kassel – unlike at most other universities – successful participation in a test at the beginning of the degree programme is obligatory for the two programmes Electrical Engineering and Computer Science. The students of Electrical Engineering and Computer Science have the option to participate in a preparatory course. They can select between a preparatory course with high attendance lasting six weeks and an electronic course with low attendance lasting four weeks. Neither of these preparatory courses is mandatory. However, the test, which takes place between the preparatory course and the lecture period, is mandatory for all beginning students. The mathematical topics covered in the test are topics that are also covered in the preparatory courses. Both preparatory courses use the VEMINT materials (Biehler et al. 2014). These materials cover the following topics: arithmetic; manipulation of terms; powers; functions (linear, quadratic, polynomial, exponential, logarithmic and trigonometric); calculus, in particular differential and integral calculus; and vectors, lines and planes. The topics are presented on a CD or in Moodle under the menu items Summary, Introduction, Explanation, Applications, Errors, Exercises, Information, Visualisation and Supplement, and the material is intended for self-study. The students can select intermediate exam questions and are given feedback about their success.

Participation in most of the preparatory courses is voluntary. These courses are well attended, and the evaluations have been positive. Few universities restrict themselves solely to the repetition and consolidation of school material; most also provide strategies to manage anticipated difficulties in the degree programme. Research has not come to a uniform conclusion on the benefits of such preparatory courses. Whilst, as suggested above, many universities have circulated only surveys without any statistical analysis of their preparatory courses, preliminary and follow-up tests in connection with preparatory courses are being given increasingly frequently in order to examine the effects more precisely. However, caution is often required with regard to the results, since selection effects must be taken into account due to the voluntary participation in many of the conducted tests and preparatory courses. Furthermore – as we will see in this study – there are other parameters that also have an influence on academic success, so it is not possible to make a clear statement on the influence of a preparatory course.

Basic Education in Mathematics as Background for Test Design

In this article the terms competence, knowledge and skills are differentiated. Knowledge refers to the image of reality that is anchored in social consciousness. Skills consist of abilities and capabilities; the latter are automatic components that cannot be controlled consciously (Pippig 1988). The frequently used term competence includes both knowledge and skills according to this interpretation (Weinert 2001). Competence comprises not only knowledge, abilities and capabilities, but also motivation and will. Of particular significance for the tests at the course outset examined in the following is the general mathematical competence "to handle symbolic, formal and technical elements in mathematics" as stated in the German educational standards (KMK 2012). Universities attach particular importance to these abilities at the beginning of students' studies, and they are usually checked in tests at the course start. "This competence consists primarily of carrying out operations with mathematical objects such as numbers, quantities, variables, terms, equations and functions as well as vectors and geometric objects. The spectrum here ranges from simple and transparent routine processes to complex processes and reflective analysis of them. This competence also includes factual knowledge and fundamental rule knowledge for the systematic and efficient processing of mathematical tasks…" (KMK 2012, p. 16). In the following we use knowledge, capabilities and skills, that is, a cognitive notion of competence, as selected aspects of competence. Weinert (2001) also differentiates between cognitive and non-cognitive components of competences.

There are hints in the literature that the general mathematical competence "to handle symbolic, formal and technical elements in mathematics" and the development of complex mathematical competences are connected. Longitudinal studies at elementary school level have examined the extent to which the ability to perform arithmetic operations falling within this competence is connected with the solution of complex mathematical tasks. Here it was seen, for example, that text tasks are solved better if the pupil has a greater command of arithmetic operations (Fuchs et al. 2006), and it was also observed that an early understanding of numbers benefits the development of mathematical competence (Krajewski and Schneider 2009). It is therefore suspected that a good command of operations with mathematical objects in terms of the aforementioned competence improves the development of complex mathematical competence at university as well. This assumption normally underlies the design of tests at the start of the course and of preparatory courses.

Mathematical expertise can be described using various models. Krauss et al. (2008) based their work on Shulman (1986) and made a proposal for four different levels to describe mathematical expertise. They differentiate between:
  1. everyday mathematical knowledge,

  2. command of school material,

  3. deeper understanding of specialist content at secondary level,

  4. university knowledge (see Krauss et al. 2008, p. 237).
Building on this differentiation, we shall provide a more detailed description of mathematical competence that can be used for making distinctions in competence based on the four levels above. By using the term competence instead of knowledge, we are emphasising that not only knowledge, but also abilities and capabilities will be included in our model of mathematical expertise.

The classification by Krauss et al. (2008) begins with a first level of mathematical knowledge that all adults must possess. The first level is called everyday mathematical competence. It can be described as the ability "to obtain, use and interpret everyday mathematical information and thus to successfully handle the various mathematical requirements in daily life" (Rammstedt 2013, p. 12). This includes, for example, the evaluation of an offer for an everyday object from a mathematical point of view. With regard to everyday mathematical competence, the last PIAAC study showed a value in Germany that was slightly but statistically significantly above the OECD average (Rammstedt 2013, p. 14).

Krauss et al. (2008) refer to the second level as school knowledge. We use this level to signify the competence to use the knowledge usually taught in school. This school-mathematical competence goes beyond everyday mathematical competence and includes, for example, the competence that is required for an A-level examination at school [Abiturprüfung]. The levels of competence in Germany are outlined in the education standards for the general university access authorisation [Hochschulreife]. They specify the general and content-related mathematical competence that pupils should have acquired by the end of their time in school. The general mathematical competence includes: mathematical reasoning, solving problems mathematically, modelling mathematically, using mathematical illustrations, handling symbolic, formal and technical elements in mathematics, and communicating mathematically. This competence should be acquired by pupils in their work with mathematical content that is divided up into main concepts. These main concepts should establish relationships between different mathematical areas, that is, counter an isolated perspective in mathematical disciplines (KMK 2012). Whilst Germany’s PISA study results in mathematics improved from 2003 to 2012 producing a lead of roughly a half-school year relative to the OECD average (OECD 2014), universities are complaining about clear deficiencies in competence at the end of the school period (e.g., Cramer and Walcher 2010). We shall assume that the everyday mathematics competence is contained in the school mathematics competence.

The third level describes specialist knowledge in mathematics that is required for a deeper understanding of the content in level 2. We call this level in-depth competence in school mathematics. This also includes elementary mathematics from a higher standpoint (see Klein 1924), that is, as it is taught at university. In turn, the previous level is contained in this level. The fourth level will be called competence in university mathematics. This means specialist knowledge in mathematics, which is taught at university, usually without any connection to the school curriculum, for example algebraic number theory (cf. Krauss et al. 2008).

Depending on the degree programme, in-depth competence in school mathematics or competence in university mathematics is acquired at university in a subject with mathematical components. In the design of preparatory courses and tests at the course outset, the focus is usually on competence in school-level mathematics, including everyday mathematical competence. This is also the case in the mathematics test conducted here. Fischer (2014) observed, amongst other aspects, the context, the students' self-assessment, the process quality of the preparatory course and data on learning activities in an extensive model. The mathematics test used there also only uses items from competence areas taught in school (Fischer 2014, p. 210), so it should be classified at the second level according to Krauss et al. (2008).

In our study we are concentrating on the cognitive aspects of mathematical competence. The reasons for this are the data situation of the tests used, the available information on the preparatory course carried out and a view to the possible use of our results and tools at other universities.

Research Questions

For the project at the University of Kassel, a preparatory course with high attendance lasting six weeks and an electronic course with low attendance lasting four weeks were chosen in the framework of the VEMINT project (see Preparatory Courses), and a mathematics test of competence in school-level mathematics was developed (see Test Construction for the Mathematics Test). It is therefore interesting to examine what effects attendance of the preparatory course has on future academic success in mathematics, and what forecast the test at the start of the course provides for future academic success in mathematics. Exam results from the first degree year in mathematics were available to us as a measure of study success in mathematics. At the University of Kassel both classroom-based courses and extensive e-learning courses have been developed. This raises the question of whether this effort is justified, and which variant should be recommended to students. In connection with the type of preparatory course, consideration must be given to whether the two groups that decide on each type of preparatory course are comparable at all. One might expect that in comparable groups the classroom-based environment and the personal exchange have a positive effect on success (cf. Fischer 2014). In light of the specific framework conditions at the University of Kassel, three research questions were developed; they are discussed and checked on the basis of the first results in the following:

Question 1: Can we find a relation between the results of the mathematics test and participation in a preparatory course or the selected type of preparatory course (classroom-based or e-learning)?

Question 2: Can we establish a link between the exam results in the subject of mathematics in the first year of study and participation in a preparatory course or the selected type of preparatory course (classroom-based or e-learning)?

The mathematics test is usually taken at the start of the first year, shortly after the preparatory course. The obvious question is therefore whether one can establish the effects of the preparatory course with the help of the test. Of course, the question of whether the effects of the preparatory course can still be measured later on is even more interesting. These questions arise since a preparatory course is set up in the hope that it will have a positive effect on students. The aforementioned research results show that preparatory courses are assessed very positively by students and teachers. Their effect might nevertheless be minor.

Question 3: Can a test at the start of the degree programme provide a prediction for the course progress in mathematics in the first semesters of a degree programme?

In Kassel – unlike at most other universities – successful participation in a test at the beginning of the degree programme is obligatory for Electrical Engineering and Computer Science. The question of how well these tests can forecast performance is particularly interesting here. In examining the forecasting options with regard to the first few semesters, it is necessary to address the question of the fairness of the tests and to examine gender-specific discrimination, as some research has shown.
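To make Question 3 concrete, the simplest form of such a prediction is a least-squares fit of first-year exam marks on entry-test scores. The sketch below uses hypothetical calibration pairs and is not the model used in this study; it only illustrates how a test score would be turned into a predicted mark.

```python
import numpy as np

def fit_linear_predictor(test_scores, exam_marks):
    """Least-squares fit of: predicted exam mark = a * test score + b."""
    X = np.vstack([np.asarray(test_scores, dtype=float),
                   np.ones(len(test_scores))]).T
    (a, b), *_ = np.linalg.lstsq(X, np.asarray(exam_marks, dtype=float),
                                 rcond=None)
    return lambda score: a * score + b

# Hypothetical calibration data: entry-test points vs. first-year exam
# mark on the German scale (1.0 best, 5.0 fail):
predict = fit_linear_predictor([6, 9, 12, 15], [4.0, 3.0, 2.0, 1.0])
```

On exactly linear calibration data such as this, predict(12) returns the mark 2.0; real data would of course scatter around the fitted line, and the strength of that relationship is precisely what Question 3 asks about.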

Design of Study

Creation of the Study

In the winter semester of 2010/2011, the University of Kassel started collecting data for a classroom-based preparatory course and an e-learning preparatory course for students in the bachelor's degree programmes of Electrical Engineering and Computer Science.

The "classroom-based" option is a six-week preparatory course held entirely in the classroom with a lecturer (alternating "lecture"/exercise/self-learning); the "e-learning" option is a four-week preparatory course with roughly four classroom-based units, the rest of which is completed at home electronically via Moodle. The students are free to choose either option. The preparatory courses are intended, on the one hand, to repeat, consolidate and reflect on school mathematics and, on the other, to teach academic ways of thinking and working in mathematics at the university level. For this reason, the content of the preparatory course mainly refers to school knowledge in mathematics. Only a little content from the lectures on Linear Algebra and Analysis is included.

Whilst the students may decide whether to attend a preparatory course and may select the type of preparatory course (classroom-based or e-learning), they are all required to take the subsequent mathematics test, as this is anchored in the examination policy of the university. Students can also take this test without attending a preparatory course. Only after passing this test are students allowed to attend lectures in later semesters. If students fail the test or do not take it, they must take a preparatory course in mathematics alongside their degree programme, and only after passing this accompanying course are they allowed to repeat the mathematics test. Under these regulations, students who failed the test cannot in all cases be clearly classified as having attended or not attended a preparatory course.

The lectures entitled Linear Algebra and Analysis in the first two semesters are attended by students in the bachelor's degree programmes in Electrical Engineering and Computer Science, among others. The courses in the first semesters in particular repeat much school-level content, such as elementary functions in Analysis and vector calculus in Linear Algebra.

The quasi-experimental study was conducted from the winter semester of 2010/2011 to the winter semester of 2013/2014 and was carried out in the same way in each academic year. The chronological sequence of the study per academic year is outlined in Fig. 3. The sample consists of a total of 1,376 students, of whom 639 were in the Electrical Engineering degree programme and 737 in the Computer Science degree programme. In total, 1,243 students are male and 133 female. In addition to the results of a mathematics test, the results of the examinations in Analysis and Linear Algebra were collected. Altogether, 1,052 students participated in the mathematics test, with 674 taking examinations in Analysis and 794 taking them in Linear Algebra.
Fig. 3

Chronological sequence of study in the first year

Research Instruments

Test Construction for the Mathematics Test

The mathematics test at the course outset covers the school-level mathematical competence that is necessary for building up both in-depth school-level competence in mathematics and the competence in university mathematics that should be acquired in the first few semesters. Particular importance is attached to carrying out operations, since these are critical for building up greater competence (see "Basic education in mathematics as background for test design" section).

The mathematics test at the beginning of the course consists of a total of 33 items with the same degree of difficulty as in the preparatory course. The following mathematical subject areas are part of the test: term conversions (6 items), powers (2 items), equations (3 items), elementary functions (3 items), differential calculus (6 items), integral calculus (4 items), geometry (5 items), calculating the limit of a function (2 items) and functional examinations (2 items). The choice of content is orientated towards the mathematical prerequisites from school mathematics that universities frequently demand. Content from stochastics is generally not required, even though it is part of school mathematics skills. In contrast, algebra, geometry and analysis are specifically mentioned (see Dürr et al. 2016, p. 216 et seq.). Tests at other locations frequently use comparable test contents (see e.g., Greefrath and Hoever 2016) and relate to comparable preparatory course contents (see e.g., Hoever 2014).

The test does not consider competence in all areas of school mathematics, but rather capabilities and skills that can be classified under the general competence "to handle symbolic, formal and technical elements in mathematics" (Basic Education in Mathematics as Background for Test Design). The focus is on skills that are frequently required in mathematics lessons. These skills are, for example, in the area of mathematical term rewriting: "Simplify the following term as far as possible." (6 items) or in the area of differential calculus: "Calculate the first derivative of each of the following functions." (6 items). The terms and functional equations used are part of the standard scope of school mathematics skills (see level 3 of Krauss et al. 2008).

Fig. 4 shows an example of an item from the field of integral calculus.
Fig. 4

Example of an item

Almost all tasks are set in an open response format. Only a number or term is usually asked for as an answer. There is also one multiple-choice question. No aids are allowed during the 90-min test.

The quality of the research instrument was checked using the data of 233 students from the winter semester of 2013/14. For these tests, detailed results on individual items are available, so that statements can be made on the reliability of the test. The items were each coded with one point or zero points. Furthermore, a factor analysis was conducted in order to determine the underlying latent variables. On the basis of the scree plot, it was decided that the factor analysis revealed only one major factor. A total of 27 of the 33 items can be characterised by this one factor, which explains 35 % of the total variance. The reliability analysis of the items for this factor shows a good value for Cronbach's alpha. This factor includes tasks from arithmetic, algebra and analysis. The reliability analysis of the entire test (33 items) also provides a very good value for Cronbach's alpha (α = .93). From a psychometric point of view, a successful test design was thus achieved.
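The reliability coefficient reported here can be reproduced for any matrix of dichotomously coded item scores. The following sketch uses invented data for six hypothetical students and four items, not the study's dataset, and computes Cronbach's alpha from its standard definition:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_items) 0/1 score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative data: 6 students, 4 dichotomously coded items
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
])
print(round(cronbach_alpha(scores), 2))  # → 0.83
```

With the real 233 × 33 score matrix, the same function would return the reported α = .93.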

Collection of Additional Data

In addition to the tests of mathematical competence, further person-related information was collected: attendance of an advanced course in mathematics at school, the type of university entrance qualification, gender, year of graduation from school, and whether a preparatory course, and which course option, was selected.

Examinations

Academic success was operationalised by the results of the mathematics examinations in Analysis and Linear Algebra after the first semesters of study (see Fig. 3). These examinations were used for reasons of economy. The structure of the exams is the same every year; the tasks differ only slightly between years. The focal point of the examination in Linear Algebra is tasks on vector calculus, complex numbers, matrix computations and linear mappings. It consists of 13 items in 4 tasks in which real and complex equations must be solved, a system of linear equations is examined, the representation matrix of a linear mapping must be determined and matrix eigenvalues and diagonalizability are investigated. Fig. 5 shows an example of an examination task on Linear Algebra.
Fig. 5

Example of an examination task in Linear Algebra

The exam for Analysis consists of 17 items in 6 questions in which, among other things, sequences and functions are examined; limits, gaps in the domain and root functions of various types must be determined; derivatives and extreme points must be found; and 2-dimensional integrals have to be calculated. Some types of functions occurring in the examination were already addressed in the preparatory course.

The group of computer scientists receives 3 of these 6 tasks. An example of an examination task in Analysis is shown in Fig. 6.
Fig. 6

Example of an examination task in Analysis

Results

From the winter semester of 2010/2011 to the winter semester of 2013/2014, a total of 722 students attended the preparatory courses, which were offered every winter semester; 320 of them selected e-learning, whilst 397 chose the classroom-based option. The others did not provide any information about the selection of the preparatory course. It must be considered that the preparatory course could be selected voluntarily and that information about the motivation for taking a preparatory course was not collected. This could mean that the groups with and without a preparatory course are not directly comparable. 235 students stated that they had taken an advanced course in mathematics in school; 762 had not. Unfortunately, too few students provided information about the type of university entrance qualification. In addition, the students' results in the examinations in Analysis and Linear Algebra were collected.

Table 1 shows that predominantly students without an advanced course in mathematics attended a preparatory course.
Table 1 Attendance of a preparatory course

| Gender | Advanced course in mathematics | Preparatory course | N |
|---|---|---|---|
| Male | Yes | Yes | 151 |
| | | No | 70 |
| | No | Yes | 491 |
| | | No | 185 |
| Female | Yes | Yes | 11 |
| | | No | 3 |
| | No | Yes | 54 |
| | | No | 32 |
| All | Yes | Yes | 162 |
| | | No | 73 |
| | No | Yes | 545 |
| | | No | 217 |
| Missing data | | | 379 |

Table 2 Preparatory course option in relation to an advanced course in mathematics

| Advanced course in mathematics | Preparatory course option | N |
|---|---|---|
| Yes | e-learning | 102 |
| | Classroom-based | 60 |
| No | e-learning | 212 |
| | Classroom-based | 328 |
| Missing data | | 674 |
| All | e-learning | 320 |
| | Classroom-based | 397 |
| Missing data | | 659 |

Students with an advanced course in mathematics predominantly select an e-learning course (63.0 %); students without one predominantly select a classroom-based course (60.7 %) (see Table 2).

Table 3 shows the composition of the groups that chose a preparatory course and those that did not. As one can see, there are great differences within and between the groups. Similar observations can be made about the groups classified by preparatory course option (Table 4).
Table 3 Composition of the groups with and without a preparatory course

| Preparatory course | Degree programme [N] | Gender [N] | Advanced course in mathematics [N] |
|---|---|---|---|
| Yes | Electrical Engineering: 356 | Male: 656 | Yes: 162 |
| | Computer Science: 366 | Female: 66 | No: 545 |
| | Not specified: 0 | Not specified: 0 | Not specified: 15 |
| No | Electrical Engineering: 283 | Male: 587 | Yes: 73 |
| | Computer Science: 371 | Female: 67 | No: 217 |
| | Not specified: 0 | Not specified: 0 | Not specified: 364 |

Table 4 Composition of the groups in relation to the preparatory course option

| Preparatory course option | Degree programme [N] | Gender [N] | Advanced course in mathematics [N] |
|---|---|---|---|
| e-learning | Electrical Engineering: 167 | Male: 289 | Yes: 102 |
| | Computer Science: 153 | Female: 31 | No: 212 |
| Classroom-based | Electrical Engineering: 187 | Male: 362 | Yes: 60 |
| | Computer Science: 210 | Female: 35 | No: 328 |
| Missing data | 659 | 659 | 674 |

Mathematics Test and Attendance of Preparatory Course

In total, just under 53.0 % (n = 722) of the students attended a preparatory course (55.7 % of the electrical engineers and 49.7 % of the computer scientists). 26.1 % of all electrical engineers decided in favour of the e-learning option, whilst 29.3 % chose a classroom-based course.1 (46.9 % of the electrical engineers who attended a preparatory course decided in favour of the e-learning option, whilst 52.5 % chose a classroom-based course; 0.6 % did not provide data concerning their preparatory course option.) Among all computer scientists, 20.8 % decided in favour of the e-learning option, whilst 28.5 % opted for the classroom-based course. (41.8 % of the computer scientists who attended a preparatory course decided in favour of the e-learning option, whilst 57.4 % chose a classroom-based course; 0.8 % did not provide data concerning their preparatory course option.)

As Table 5 shows, students in Electrical Engineering performed significantly better than computer scientists on the mathematics test, both with and without the preparatory course. Electrical engineers probably have a higher level of prior knowledge in mathematics than computer scientists when they enter university, or students with a higher level of prior knowledge are more likely to choose a degree programme in Electrical Engineering than in Computer Science. Without attending the preparatory course, the pass rate in the mathematics test is 28 % (Electrical Engineering) and 23 % (Computer Science); for students who attended the preparatory course, the rates rise to 69 % and 55 %, respectively.
Table 5 Results from the mathematics test

| Degree programme | Attended preparatory course | Mean value [%] | N | Cohen's effect size d (with regard to attendance of a preparatory course) |
|---|---|---|---|---|
| Electrical Engineering | No | 48.0 | 132 | 0.49 (small) |
| | Yes | 59.9 | 325 | |
| Computer Science | No | 39.7 | 174 | 0.33 (small) |
| | Yes | 47.1 | 340 | |

The differences in their mean values are significant on the level of 0.01

Both groups, however, achieve better results in the mathematics test after they have attended a preparatory course. This may be accounted for by a suitable matching of the preparatory course content and the content of the mathematics test. If one examines the effect sizes, then one observes only minor effects in the difference between those with and without a preparatory course for the respective degree programme. For the interpretation of the results, we followed Cohen (1988).
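The effect sizes reported here follow Cohen's d, the difference between the group means divided by their pooled standard deviation. The sketch below recovers a value close to Table 5's d = 0.49 for Electrical Engineering under an assumed common standard deviation of 24 points; the article does not report standard deviations, so this figure is purely illustrative:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d using the pooled standard deviation of two independent groups."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Means and group sizes from Table 5 (Electrical Engineering);
# the standard deviation of 24 points is an assumption.
d = cohens_d(59.9, 24, 325, 48.0, 24, 132)
print(round(d, 2))  # → 0.5
```

Following Cohen (1988), values around 0.2 are read as small, around 0.5 as moderate and around 0.8 as large.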

If we differentiate the mathematics test results by the type of preparatory course (cf. Table 6) – the total number of students in this table who attended a preparatory course is smaller than 722 because not all students took part in the test (see above) – this results in a moderate effect only for students studying Electrical Engineering. Students who attended the e-learning preparatory course performed significantly better in the mathematics test than students in the classroom-based option.
Table 6 Results from the mathematics test in relation to preparatory course option

| Degree programme | Preparatory course option | Mean value [%] | N | Cohen's effect size d (with regard to preparatory course option) |
|---|---|---|---|---|
| Electrical Engineering | e-learning | 64.2 | 150 | 0.55 (moderate) |
| | Classroom-based | 56.6 | 173 | |
| Computer Science | e-learning | 46.4 | 144 | 0.32 (minor) |
| | Classroom-based | 47.7 | 195 | |
| Total | e-learning | 55.5 | 294 | 0.42 (minor) |
| | Classroom-based | 51.2 | 368 | |

The differences in their mean values (only Electrical Engineering) are significant on the level of 0.01

For students in Electrical Engineering, the decision for or against a preparatory course does not depend on having taken an advanced course in mathematics in school. By contrast, students in Computer Science are more likely to decide in favour of a preparatory course if they did not have an advanced course in school. If no advanced course was taken in school, both student groups are more likely to prefer the classroom-based option.

An advanced course in mathematics has a great influence on the results of the mathematics test (cf. Table 7). Students of Electrical Engineering who took an advanced course in school performed significantly better than those without one. The mean values of achieved points are 75 % and 48 %, respectively, whilst among computer scientists they are 64 % (with an advanced course in school) and 40 % (without). Taking all students of both degree programmes into account, an effect size of 1.23 results; having taken an advanced course in school thus has a major effect on the results of the mathematics test.
Table 7 Results of the mathematics test in relation to an advanced course in mathematics in school

| Degree programme | Advanced course in mathematics | Mean value [%] | N | Cohen's effect size d (with regard to advanced course in mathematics) |
|---|---|---|---|---|
| Electrical Engineering | No | 48.5 | 300 | 1.20 (large) |
| | Yes | 74.7 | 133 | |
| Computer Science | No | 39.9 | 407 | 1.09 (large) |
| | Yes | 64.0 | 94 | |
| Total | No | 43.6 | 707 | 1.23 (large) |
| | Yes | 70.3 | 227 | |

The differences in their mean values are significant on the level of 0.01

Regarding question 1, we can therefore establish that, amongst the examined students, attendance of a preparatory course and the type of course evidently have less influence on the results of the mathematics test than attendance of an advanced mathematics course in school. This suggests that previous knowledge could have a much bigger influence on the test results than the preparatory course. As Table 8 shows, however, attending both a preparatory course and an advanced course in mathematics leads to the best results in the mathematics test, whereas having attended neither is probably a bad precondition for good results in the mathematics test.
Table 8 Results of the mathematics test in relation to an advanced course in mathematics in school and in relation to a preparatory course

| Degree programme | Preparatory course | Advanced course in mathematics | N | Mean value [%] |
|---|---|---|---|---|
| Electrical Engineering | Yes | Yes | 99 | 76.8 |
| | | No | 223 | 52.2 |
| | No | Yes | 34 | 68.4 |
| | | No | 77 | 37.6 |
| Computer Science | Yes | Yes | 56 | 69.4 |
| | | No | 282 | 42.5 |
| | No | Yes | 38 | 56.1 |
| | | No | 125 | 34.3 |
| All | Yes | Yes | 155 | 74.1 |
| | | No | 505 | 46.8 |
| | No | Yes | 72 | 61.9 |
| | | No | 202 | 35.5 |

Attending Preparatory Course, Preparatory Course Option and Examination Results

The following table presents the examination results in Analysis and Linear Algebra in relation to attendance of a preparatory course (cf. Table 9).
Table 9 Examination results in Analysis and Linear Algebra in relation to attendance of a preparatory course

| Degree programme | Attended preparatory course | | Analysis | Linear Algebra | Cohen's effect size d (with regard to attendance of a preparatory course) |
|---|---|---|---|---|---|
| Electrical Engineering | Yes | Mean value | 53.5 | 53.5 | 0.088 (no effect, Analysis); 0.19 (minor effect, Linear Algebra) |
| | | N | 218 | 261 | |
| | No | Mean value | 51.5 | 49.1 | |
| | | N | 158 | 159 | |
| Computer Science | Yes | Mean value | 51.0 | 42.5 | 0.20 (minor effect, Analysis); 0.00 (no effect, Linear Algebra) |
| | | N | 140 | 176 | |
| | No | Mean value | 46.0 | 42.6 | |
| | | N | 159 | 197 | |

The differences in their mean values are significant on the level of 0.05 (only Computer Science, examination results in Analysis)

Whilst attending a preparatory course evidently has only a minor relation to the examination results in Electrical Engineering (a 2 % improvement in the Analysis examination and a 4 % improvement in Linear Algebra), and the situation is similar in Computer Science for Linear Algebra (no improvement), students of Computer Science did benefit from a preparatory course in Analysis (an improvement of 6 %). This may be because students of Computer Science – as mentioned above – enter university with less knowledge than students of Electrical Engineering and therefore profit in particular from the good match between the preparatory course offer and the examination in Analysis. However, as the table shows, the effect here is also minor (d = 0.2).

As Table 10 shows, the results in Analysis and Linear Algebra differ greatly in relation to the preparatory course option. In the group of electrical engineers, students with the e-learning option perform roughly 10 % better than students in the classroom-based course in both examinations. An explanation for the different results in the Linear Algebra examination (Electrical Engineering) could be the composition of these preparatory courses: students who took an advanced course in school are more likely to prefer the e-learning option. Accordingly, we should see comparable results among computer scientists. Yet here the preparatory course option had no influence on the results of the examination in either Analysis or Linear Algebra.
Table 10 Examination results in Analysis and Linear Algebra in relation to preparatory course option

| Degree programme | Preparatory course option | | Analysis [%] | Linear Algebra [%] | Cohen's effect size d (with regard to preparatory course option) |
|---|---|---|---|---|---|
| Electrical Engineering | e-learning | Mean value | 58.4 | 59.6 | 0.45 (minor effect, Analysis); 0.51 (moderate effect, Linear Algebra) |
| | | N | 111 | 125 | |
| | Classroom-based | Mean value | 48.4 | 47.9 | |
| | | N | 107 | 136 | |
| Computer Science | e-learning | Mean value | 50.4 | 41.6 | 0.05 (no effect, Analysis); 0.074 (no effect, Linear Algebra) |
| | | N | 51 | 69 | |
| | Classroom-based | Mean value | 51.7 | 43.2 | |
| | | N | 87 | 107 | |

The differences in their mean values are significant on the level of 0.01 (only Electrical Engineering)

If we consider the examination results in relation to an advanced mathematics course in school (cf. Table 11), then we see again a significant effect in both groups.
Table 11 Examination results in Analysis and Linear Algebra in relation to an advanced mathematics course in school

| Degree programme | Advanced course in mathematics | | Analysis [%] | Linear Algebra [%] | Cohen's effect size d (with regard to advanced course in mathematics) |
|---|---|---|---|---|---|
| Total | Yes | Mean value | 62.7 | 61.5 | 0.55 (moderate effect, Analysis); 0.74 (moderate effect, Linear Algebra) |
| | | N | 144 | 183 | |
| | No | Mean value | 47.3 | 42.3 | |
| | | N | 285 | 372 | |

The difference in their mean values is significant on the level of 0.01

Mathematics Test and Examinations

A possible connection between the results in the mathematics test and in the examinations is of interest. A detailed analysis – as illustrated in Tables 12 and 13 – shows significant correlations between the results of the mathematics test and the examinations. A test at the course start can therefore be regarded as a forecast of the student's future exam performance in the first two semesters.
Table 12 Correlations between the mathematics test and the examinations

| Degree programme | | | Mathematics test | Analysis | Linear Algebra |
|---|---|---|---|---|---|
| Electrical Engineering | Mathematics test | Correlation according to Pearson | 1 | .494** | .595** |
| | | Significance (2-sided) | | .000 | .000 |
| | | N | 457 | 248 | 322 |
| Computer Science | Mathematics test | Correlation according to Pearson | 1 | .528** | .540** |
| | | Significance (2-sided) | | .000 | .000 |
| | | N | 514 | 163 | 228 |

**. The correlation is significant on the level of 0.01 (2-sided)

The correlations were also examined in terms of possible gender-specific differences (see Table 13). The correlations are somewhat stronger for female students, but they do not differ significantly from those of male students.
Table 13 Correlations between the mathematics test and the examinations, by gender

| Gender | | | Mathematics test | Analysis | Linear Algebra |
|---|---|---|---|---|---|
| Female | Mathematics test | Correlation acc. to Pearson | 1 | .557** | .662** |
| | | Significance (2-sided) | | .000 | .000 |
| | | N | 100 | 36 | 41 |
| Male | Mathematics test | Correlation acc. to Pearson | 1 | .511** | .582** |
| | | Significance (2-sided) | | .000 | .000 |
| | | N | 889 | 376 | 522 |

**. The correlation is significant on the level of 0.01 (2-sided)
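The coefficients reported in these tables are Pearson product-moment correlations between paired test and exam scores. The following sketch computes the coefficient from its definition, using invented scores for five hypothetical students rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative data: mathematics test scores vs. exam scores for five students
test = [40, 55, 60, 75, 90]
exam = [35, 50, 58, 70, 80]
print(round(pearson_r(test, exam), 2))  # → 0.99
```

Values around .5 to .6, as in Tables 12 and 13, indicate a substantial but far from deterministic linear relationship.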

A linear regression was calculated to predict examination results in Linear Algebra and Analysis, respectively, based on gender, advanced mathematics course in school, degree programme, attendance of a preparatory course and results in the mathematics test. When examination results in Linear Algebra were predicted, gender (Beta = .091, p < .01), advanced mathematics course in school (Beta = .119, p < .01), degree programme (Beta = −.128, p < .001) and results in the mathematics test (Beta = .534, p < .001) were significant predictors. Attendance of a preparatory course was not a significant predictor (Beta = −.036, n.s.). The overall model fit was R² = .39. When examination results in Analysis were predicted, gender (Beta = .088, p < .05) and results in the mathematics test (Beta = .524, p < .001) were significant predictors. Advanced mathematics course in school (Beta = .068, n.s.), degree programme (Beta = −.014, n.s.) and attendance of a preparatory course (Beta = −.081, n.s.) were not. The overall model fit was R² = .31. In both cases the amount of explained variance is rather low, but the influence of the mathematics test results is noteworthy.
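A regression design of this kind can be sketched as ordinary least squares on dummy-coded predictors. The data below are simulated and the coefficients are illustrative assumptions; they do not reproduce the article's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Dummy-coded predictors: gender, advanced course, degree programme,
# preparatory course attendance (all 0/1) and the mathematics test score (0-100).
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
    rng.uniform(0, 100, n),
])
# Simulated exam score: driven mainly by the test score, as in the study.
y = 20 + 5 * X[:, 1] + 0.5 * X[:, 4] + rng.normal(0, 5, n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination R^2.
y_hat = A @ coef
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(r2 > 0.7)
```

In the simulated data the test score dominates the fit; in the study the explained variance was considerably lower (R² = .31 to .39).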

Summary and Discussion of the Results

Important factors identified in this study are the prior knowledge of the students and the selected degree programme. Students who did not attend a preparatory course but had taken an advanced mathematics course in school performed significantly better in the mathematics test than students who neither attended a preparatory course nor took an advanced mathematics course; on average, they solved twice as many tasks correctly. Attending a preparatory course also leads to significantly better test results for students who had taken an advanced mathematics course in school (p < .001). The pre-university education of the students is thus of great significance for the results of the mathematics test at the beginning of their studies. This result indicates that the potential impact of preparatory courses is small.

New Electrical Engineering students evidently arrive at the University of Kassel with significantly better prior knowledge than Computer Science students: the correct answer rates in the mathematics test were 48 % and 39 %, respectively, for the two student groups. After attending the preparatory course, the pass rate in the mathematics test rises clearly, but more so for the Electrical Engineering students. This shows that students in Electrical Engineering benefit more substantially from a preparatory course. Only slightly better examination results are seen after attending a preparatory course; by contrast, those who had previously taken an advanced mathematics course in school obtained significantly better examination results (p < .001). The correlation between the test and examination results shows clearly that the selected test gives a forecast of future academic success. Regarding the questions addressed, the following picture emerges:

Attending a preparatory course has no general effect on academic success. One sees, however, that attending a preparatory course is related to significant short-term success regarding the results of the mathematics test. The effect on long-term success is hardly seen. Other factors such as attending an advanced mathematics course in school have considerably greater influence on academic success than participation in a preparatory course. If one considers, however, that the school education (e.g., taking an advanced course in school) has already been completed at the time the student starts the degree programme, then attending a preparatory course is an important possibility for students, even if it only has a minor effect.

A relationship between the type of preparatory course (classroom-based or e-learning) and the ongoing course of study could only be shown for students of Electrical Engineering, not for students of Computer Science. Students of Electrical Engineering also have better prior knowledge when they enter the university. Possibly, they consciously selected their field and also the preparatory option. For this reason, it may not be the type of preparatory offer, but rather the selective behaviour of the students that is the determining factor.

The results of a mathematics test at the course start provided a good forecast of future exam performance in the early semesters (see the linear regression analysis in "Mathematics Test and Examinations"). It was not possible to find clear indications of gender-specific differences in this study.

The goal of the article was to present some results of quantitative research on preparatory courses in mathematics offered at the University of Kassel in the fields of Electrical Engineering and Computer Science from 2010 to 2013. The correct answer rates were remarkably low, but relative to the admissions test of Knospe (2008), which has been given for many years, comparable correct answer rates (50 %) are seen at the beginning of the preparatory course at the University of Kassel. As opposed to Knospe's test, only intra-mathematical tasks, including tasks from upper secondary level (grades 10–12), were used in Kassel. Some deficiencies in mathematical competence without calculator use can for the most part be resolved over the short term by attending a suitable preparatory course. According to the first evaluations, short tests of basic capabilities without calculator use at the secondary level are good indicators of future exam performance in mathematics courses at the researched universities.

In general, these results can be included in a discussion on the purpose and content of preparatory and bridge courses. They confirm, on the one hand, other studies (Bausch et al. 2014) and show, on the other, which factors must be considered when assessing the individual courses, for example, the degree programme and prior education of the students. In the literature there is also no uniform picture of the effect that preparatory courses have. Whilst Lagerlöf and Seltzer (2009) or Di Pietro (2012), among others, have found that preparatory courses have no positive effect or only a minor one, other researchers such as Espey (1997), Ballard and Johnson (2004) as well as Engelbrecht (1997) have seen significantly more positive effects. This inconsistent picture may be due to the fact that individual courses must be regarded separately, as the examination of the different degree programmes in Kassel has shown. Furthermore, it must be considered in our research that the decision for or against a preparatory course is made by the students themselves.

The analyses carried out do not allow us to make any statements about causal relations. However, there are many hints on relationships between relevant variables related to preparatory courses. Furthermore, it has to be considered that the students were not randomly divided into preparatory course and non-preparatory course groups and that the course option was chosen by the students. We cannot rule out that this may cause shifts in the results, but the large number of examined students still allows interesting statements to be made.

Firstly, the results of other universities are confirmed and, secondly, this study yields the important insight that general judgements cannot be made. Students on different degree courses differ considerably, and there are also huge differences in previous knowledge within the degree programmes, e.g., due to attending an advanced course at school. It is interesting that attending a preparatory course only seems to have minor effects on exam success. Despite this, these courses are offered at almost all universities. This should of course be welcomed, because it represents an initial step in student support at universities; however, because the effects are so small, further support measures should be planned. The correlation between the maths test and exam results is also interesting: it offers the opportunity to identify students "at risk" at an early stage and provide targeted support. The fact that this test generally addresses school topics shows that offering preparatory courses and support measures that start at this stage (cf. Krauss et al. 2008, p. 237) makes sense. The extent to which other aspects of preparatory courses (such as learning methods and study organisation, cf. Fig. 1) might be helpful cannot be analysed using our data, because only the technical competences ("to handle symbolic, formal and technical elements in mathematics", see Basic Education in Mathematics as Background for Test Design) have been included. For the testing of such a hypothesis, the influence of other factors must be taken into account (e.g., prior knowledge, willingness to learn, taking an advanced mathematics course in school).

Conclusion

The study shows that universities can obtain interesting information about various groups of students and the preparatory courses they offer from the data already available to them through examinations and mathematics tests. The varied results of our study suggest that a separate assessment for each degree course involving mathematics could be worthwhile.

An interesting aspect of this study is the possibility of making statements about future academic success on the basis of a short test at the start of the course. This may make it possible to identify groups “at risk” early on, and thus to support and advise them appropriately.
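The idea of flagging at-risk students from an entry test can be sketched in a few lines. The code below is purely illustrative: the scores, the cutoff, and the function names are invented for demonstration and are not the study's data or method; it merely shows how a correlation between entry-test and exam scores, together with a calibrated threshold, could be used for early identification.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical entry-test scores (0-30 points) and later exam scores (0-100).
test_scores = [12, 25, 8, 19, 22, 5, 28, 15]
exam_scores = [45, 78, 30, 60, 70, 25, 85, 55]

# A strong positive correlation would justify using the test predictively.
r = pearson_r(test_scores, exam_scores)

# Flag students whose entry-test score falls below a chosen cutoff.
AT_RISK_CUTOFF = 10  # assumed threshold; would need calibration per programme
at_risk = [i for i, s in enumerate(test_scores) if s < AT_RISK_CUTOFF]

print(f"correlation r = {r:.2f}, at-risk students (indices): {at_risk}")
```

In practice the cutoff would be chosen per degree programme, e.g. by inspecting where the risk of failing the first mathematics exam rises sharply.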

There is a need for further research in the ongoing development and evaluation of the preparatory material, primarily in the area of e-learning, in order to improve support for independent learning. There are studies that have examined the effect of preparatory courses, but only very few that have examined this effect in relation to the course type or the materials used (e.g., Fischer 2014; Biehler et al. 2014). It would be desirable to have not only the results of an obligatory mathematics test for students, but also additional detailed information about the students' subsequent performance. It would then be possible to make more precise statements about the composition of the subgroups.

Footnotes

  1. The differences in the reported numbers can be explained by the fact that the datasets are not always complete.

References

  1. Abel, H., & Weber, B. (2014). 28 Jahre Esslinger Modell – Studienanfänger und Mathematik. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse: Konzepte, Probleme und Perspektiven (pp. 9–19). Wiesbaden: Springer Spektrum.
  2. Ballard, C. L., & Johnson, M. F. (2004). Basic math skills and performances in an introductory economics class. Journal of Economic Education, 35(1), 3–23.
  3. Bausch, I., Fischer, P. R., & Oesterhaus, J. (2014). Facetten von Blended Learning Szenarien für das interaktive Lernmaterial VEMINT – Design und Evaluationsergebnisse an den Partneruniversitäten Kassel, Darmstadt und Paderborn. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse: Konzepte, Probleme und Perspektiven (pp. 87–102). Wiesbaden: Springer.
  4. Biehler, R., Fischer, P. R., Hochmuth, R., & Wassong, T. (2012). Self-regulated learning and self assessment in online mathematics bridging courses. In A. A. Juan, M. A. Huertas, S. Trenholm, & C. Steegman (Eds.), Teaching Mathematics Online: Emergent Technologies and Methodologies (pp. 216–237). Hershey: IGI Global.
  5. Biehler, R., Fischer, P. R., Hochmuth, R., & Wassong, T. (2014). Eine Vergleichsstudie zum Einsatz von Math-Bridge und VEMINT an den Universitäten Kassel und Paderborn. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse: Konzepte, Probleme und Perspektiven (pp. 103–122). Wiesbaden: Springer Spektrum.
  6. Biehler, R., Bruder, R., Hochmuth, R., Koepf, W., Bausch, I., Fischer, P. R., & Wassong, T. (2014). VEMINT – Interaktives Lernmaterial für mathematische Vor- und Brückenkurse. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse: Konzepte, Probleme und Perspektiven (pp. 261–276). Wiesbaden: Springer Spektrum.
  7. Bruder, R., Elschenbroich, J., Greefrath, G., Henn, H.-W., Kramer, J., & Pinkernell, G. (2010). Schnittstelle Schule-Hochschule. In A. Lindmeier & S. Ufer (Eds.), Beiträge zum Mathematikunterricht 2010 (pp. 75–82). Münster: WTM.
  8. Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale: Erlbaum.
  9. Cramer, E., & Walcher, S. (2010). Schulmathematik und Studierfähigkeit. Mitteilungen der DMV, 18(2), 110–114.
  10. De Guzman, M., Hodgson, B., Robert, A., & Villani, V. (1998). Difficulties in passage from secondary to tertiary education. In G. Fischer & U. Rehmann (Eds.), Proceedings of the International Congress of Mathematicians (Vol. III) (pp. 747–762). Berlin: Documenta Mathematica.
  11. Di Pietro, G. (2012). The short-term effectiveness of a remedial mathematics course: evidence from a UK university. IZA Discussion Paper Series, No. 6358, Bonn.
  12. Dieter, M. (2012). Studienabbruch und Studienfachwechsel in der Mathematik: Quantitative Bezifferung und empirische Untersuchung von Bedingungsfaktoren. Dissertation. Duisburg: Universität Duisburg-Essen.
  13. Dürr, R., Dürrschnabel, K., Loose, F., & Wurth, R. (Eds.) (2016). Mathematik zwischen Schule und Hochschule. Den Übergang zu einem WiMINT-Studium gestalten – Ergebnisse einer Fachtagung, Esslingen 2015. Wiesbaden: Springer Spektrum.
  14. Ebner, B., Folkers, M., & Haase, D. (2016). Vorbereitende und begleitende Angebote in der Grundlehre Mathematik für die Fachrichtung Wirtschaftswissenschaften. In A. Hoppenbrock, R. Biehler, R. Hochmuth, & H.-G. Rück (Eds.), Lehren und Lernen von Mathematik in der Studieneingangsphase (pp. 149–164). Heidelberg: Springer.
  15. Engelbrecht, J. C. (1997). Academic support in mathematics in a Third World environment. Journal of Computers in Mathematics and Science Teaching, 16(2), 323–333.
  16. Espey, M. (1997). Testing math competency in introductory economics. Review of Agricultural Economics, 19(2), 484–491.
  17. Fischer, P. R. (2014). Mathematische Vorkurse im Blended-Learning-Format. Wiesbaden: Springer.
  18. Fischer, F. T., Schult, J., & Hell, B. (2013). Sex-specific differential prediction of college admission tests: A meta-analysis. Journal of Educational Psychology, 105(2), 478–488.
  19. Fuchs, L. S., Fuchs, D., Compton, D. L., Powell, S. R., Seethaler, P. M., Capizzi, A. M., Schatschneider, C., & Fletcher, J. M. (2006). The cognitive correlates of third-grade skill in arithmetic, algorithmic computation, and arithmetic word problems. Journal of Educational Psychology, 98(1), 29–43.
  20. Geiser, S., & Studley, R. (2002). UC and the SAT: Predictive validity and differential impact of the SAT I and SAT II at the University of California. Educational Assessment, 8(1), 1–26.
  21. Greefrath, G., & Hoever, G. (2016). Was bewirken Mathematik-Vorkurse? Eine Untersuchung zum Studienerfolg nach Vorkursteilnahme an der FH Aachen. In A. Hoppenbrock, R. Biehler, R. Hochmuth, & H.-G. Rück (Eds.), Lehren und Lernen von Mathematik in der Studieneingangsphase (pp. 517–530). Heidelberg: Springer.
  22. Greefrath, G., Hoever, G., Kürten, R., & Neugebauer, C. (2015). Vorkurse und Mathematiktests zu Studienbeginn – Möglichkeiten und Grenzen. In J. Roth, T. Bauer, H. Koch, & S. Prediger (Eds.), Übergänge konstruktiv gestalten. Ansätze für eine zielgruppenspezifische Hochschuldidaktik Mathematik (pp. 19–32). Wiesbaden: Springer Spektrum.
  23. Gueudet, G. (2008). Investigating the secondary-tertiary transition. Educational Studies in Mathematics, 67(3), 237–254.
  24. Haase, D. (2014). Studieren im MINT-Kolleg Baden-Württemberg. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse: Konzepte, Probleme und Perspektiven (pp. 123–136). Wiesbaden: Springer Spektrum.
  25. Hell, B., Linsner, M., & Kurz, G. (2008). Prognose des Studienerfolgs. In M. Rentschler (Ed.), Studieneignung und Studierendenauswahl – Untersuchungen und Erfahrungsberichte (pp. 132–177). Aachen: Shaker.
  26. Heublein, U., Schmelzer, R., Sommer, D., & Wank, J. (2008). Die Entwicklung der Schwund- und Studienabbruchquoten an den deutschen Hochschulen. Statistische Berechnungen auf der Basis des Absolventenjahrgangs 2006. Hannover: HIS.
  27. Heublein, U., Richter, J., Schmelzer, R., & Sommer, D. (2012). Die Entwicklung der Schwund- und Studienabbruchquoten an den deutschen Hochschulen. Statistische Berechnungen auf der Basis des Absolventenjahrgangs 2010. Hannover: HIS Forum Hochschule.
  28. Hoever, G. (2014). Vorkurs Mathematik. Theorie und Aufgaben mit vollständig durchgerechneten Lösungen. Heidelberg: Springer Spektrum.
  29. Holton, D. (Ed.) (2001). The teaching and learning of mathematics at university level. An ICMI study. Dordrecht: Kluwer.
  30. Klein, F. (1924). Elementarmathematik vom Höheren Standpunkte aus. Erster Band. Arithmetik, Algebra, Analysis. Berlin: Springer.
  31. KMK (2012). Bildungsstandards im Fach Mathematik für die Allgemeine Hochschulreife (Beschluss der Kultusministerkonferenz vom 18.10.2012). http://www.kmk.org. Accessed 1 September 2015.
  32. Knospe, H. (2008). Der Mathematik-Eingangstest an Fachhochschulen in Nordrhein-Westfalen. In D. Schott (Ed.), Proceedings 6. Workshop Mathematik für Ingenieure, Wismarer Frege-Reihe, 3/2008 (pp. 6–11). Hochschule Wismar.
  33. Knospe, H. (2011). Der Eingangstest Mathematik an Fachhochschulen in Nordrhein-Westfalen von 2002 bis 2010. In D. Schott (Ed.), Proceedings 9. Workshop Mathematik für ingenieurwissenschaftliche Studiengänge, Wismarer Frege-Reihe, 02/2011 (pp. 8–13). Hochschule Wismar.
  34. Krajewski, K., & Schneider, W. (2009). Exploring the impact of phonological awareness, visual–spatial working memory, and preschool quantity–number competencies on mathematics achievement in elementary school: Findings from a 3-year longitudinal study. Journal of Experimental Child Psychology, 103(4), 516–531.
  35. Krauss, S., Neubrand, M., Blum, W., Baumert, J., Brunner, M., Kunter, M., & Jordan, A. (2008). Die Untersuchung des professionellen Wissens deutscher Mathematik-Lehrerinnen und -Lehrer im Rahmen der COACTIV-Studie. Journal für Mathematik-Didaktik, 29(3–4), 233–258.
  36. Kürten, R., Greefrath, G., Harth, T., & Pott-Langemeyer, M. (2014). Die Rechenbrücke – ein fachbereichsübergreifendes Forschungs- und Entwicklungsprojekt. Zeitschrift für Hochschulentwicklung, 9(4), 17–38.
  37. Lagerlöf, J. N. M., & Seltzer, A. J. (2009). The effects of remedial mathematics on the learning of economics: evidence from a natural experiment. Journal of Economic Education, 2009, 115–136.
  38. Lingel, K., Neuenhaus, N., Artelt, C., & Schneider, W. (2014). Der Einfluss des metakognitiven Wissens auf die Entwicklung der Mathematikleistung am Beginn der Sekundarstufe I. Journal für Mathematik-Didaktik, 35(1), 49–77.
  39. OECD (2014). PISA 2012 results: What students know and can do – Student performance in mathematics, reading and science (Volume I, revised edition, February 2014). PISA: OECD Publishing.
  40. Pippig, G. (Ed.) (1988). Pädagogische Psychologie. Berlin: Volk und Wissen.
  41. Rammstedt, B. (Ed.) (2013). Grundlegende Kompetenzen Erwachsener im internationalen Vergleich: Ergebnisse von PIAAC 2012. Münster: Waxmann.
  42. Reichersdorfer, E., Ufer, S., Lindmeier, A., & Reiss, K. (2014). Der Übergang von der Schule zur Universität: Theoretische Fundierung und praktische Umsetzung einer Unterstützungsmaßnahme am Beginn des Mathematikstudiums. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse, Konzepte und Studien zur Hochschuldidaktik und Lehrerbildung Mathematik (pp. 37–53). Wiesbaden: Springer Spektrum.
  43. Reimpell, M., Hoppe, D., Pätzold, T., & Sommer, A. (2014). Brückenkurs Mathematik an der FH Südwestfalen in Meschede – Erfahrungsbericht. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse, Konzepte und Studien zur Hochschuldidaktik und Lehrerbildung Mathematik (pp. 165–180). Wiesbaden: Springer Spektrum.
  44. Roegner, K., Seiler, R., & Timmreck, D. (2014). Exploratives Lernen an der Schnittstelle Schule/Hochschule. Didaktische Konzepte, Erfahrungen, Perspektiven. In I. Bausch, R. Biehler, R. Bruder, P. R. Fischer, R. Hochmuth, W. Koepf, S. Schreiber, & T. Wassong (Eds.), Mathematische Vor- und Brückenkurse, Konzepte und Studien zur Hochschuldidaktik und Lehrerbildung Mathematik (pp. 181–196). Wiesbaden: Springer.
  45. Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14.
  46. Tall, D. (1991). Advanced mathematical thinking. Dordrecht: Kluwer.
  47. Trapmann, S., Hell, B., Weigand, S., & Schuler, H. (2007). Die Validität von Schulnoten zur Vorhersage des Studienerfolgs – eine Metaanalyse. Zeitschrift für Pädagogische Psychologie, 21(1), 11–27.
  48. Weinert, F. E. (2001). Concept of competence: A conceptual clarification. In D. S. Rychen & L. H. Salganik (Eds.), Defining and selecting key competencies (pp. 45–65). Göttingen: Hogrefe.
  49. Wilmot, D. B., Schoenfeld, A., Wilson, M., Champney, D., & Zahner, W. (2011). Validating a learning progression in mathematical functions for college readiness. Mathematical Thinking and Learning, 13(4), 259–291.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Institut für Didaktik der Mathematik und der Informatik, Westfälische Wilhelms-Universität Münster, Münster, Germany
  2. AG Computational Mathematics, Fachbereich 10 Mathematik und Naturwissenschaften, Institut für Mathematik, Universität Kassel, Kassel, Germany
