
Approaches to studying in first-year engineering: comparison between inventory scores and students’ descriptions of their approaches through interviews


This combined interview and survey study explored the relationship between interview data and data from an inventory describing engineering students’ ratings of their approaches to studying. Using the 18-item Approaches and Study Skills Inventory for Students (ASSIST), students were asked to rate their approaches to studying in relation to particular statements. A subsample of nine first-year engineering students participated in subsequent interviews exploring their experiences of studying and learning. The students’ views were examined and interpreted into inventory scores, which were compared to the students’ actual ratings. The interviews confirmed the scales measured in the inventory and provided illustrations of them. While students who were extreme in either approach were easier to interpret, others provided good examples of the complex combination of approaches that can manifest itself in one individual. The study illustrates how combined data sets can contribute to achieving a holistic understanding of student learning in its context.


Research on students’ approaches to studying and learning patterns in general has contributed considerably to our understanding of variations in the ways in which students set about studying in different higher education settings (Entwistle 2009). Using both self-reported measures (e.g. Richardson 2013; Vermunt et al. 2014) and qualitative interviews (e.g. Entwistle and McCune 2013; McCune and Rhind 2014), this strand of research has managed to describe how student learning is affected by a complex interaction of influences that involves a combination of how the teaching-learning environment is organised in different course settings and students’ perception of this teaching-learning environment (Entwistle 2009). In order to develop powerful learning environments (Vermetten et al. 2002) that can help students develop personal understandings of learning material that match the target understanding (Entwistle and Smith 2002) of the courses they are taking, we need to understand more about how students prefer to study in particular educational settings.

A particularly interesting question is how students’ approaches to studying, as described in inventory studies, relate to students’ personal descriptions of how they set about studying a particular subject area. In discussing how data analyses at different levels can contribute to achieving a more holistic understanding of student learning in its context, Entwistle et al. (2006) note:

[R]esearch at different levels of analysis serves different purposes. The group-level analyses offer guidance on general improvements to the system and also show the extent of any problems being experienced by students, while a narrower focus makes the nature of those problems much clearer and can lead directly to suggestions for improving the situation. (p. 146).

While research has recognised that the combination of group-level analyses and more individualised fine-grained analyses can contribute to enhancing our understanding of students’ learning processes, investigations looking more precisely at students’ varying ways of interpreting and tackling coursework in a particular discipline are still scarce (e.g. Creswell et al. 2003; Lindblom-Ylänne 2003). In particular, analyses at different levels can give complementary pieces of information for understanding the varying ways in which students go about studying and learning. Investigating how individual students’ descriptions of how they experience and set about studying relate to their self-rated approaches to studying as measured using a particular inventory may yield valuable information about how different dimensions of students’ studying and learning can be conceptualised and brought together to clarify the impact of higher education on student learning (Entwistle 2009; Entwistle and McCune 2013; Mogashana et al. 2012).

In this paper, we present a study relating data from inventories, given to all students taking a university course, to data from individual interviews with a subsample of the students. Through a fine-grained analysis of the individual interviews, we can complement the data from the inventories. The study explores the relation between students’ ratings of their approaches to studying and how they in interviews described their ways of studying.

Theoretical framework

This paper draws on an established theoretical framework focusing on students’ approaches to learning in higher education (Biggs 1993). This framework, originally developed in British and Swedish research (Marton et al. 1997), emphasises the functional relationship between the approaches that students adopt in dealing with learning tasks in different settings, and the understanding achieved as a result of the approaches taken. In particular, three types of approaches to studying have been emphasised in the research (Entwistle 2009): deep, surface and strategic. While a deep approach is characterised by an intention to seek meaning and understand for oneself, a surface approach is typically characterised by an intention to commit detailed pieces of information to memory in order to reproduce information accurately in, for instance, an upcoming exam. The strategic approach is an organised form of approach and describes students’ intention to maximise study achievement by selectively meeting perceived requirements of the teaching-learning environment to achieve high grades (Entwistle 2009; cf. Miller and Parlett 1974). It should be pointed out that when approaches to learning are discussed, it is easy to oversimplify: the three approaches are not mutually exclusive (Diseth 2003). Students express all three of them, but to varying degrees. An individual cannot be placed in one single category, but displays a mix of them.

While the original research on approaches drew on in-depth interviews with students, focusing on study habits and the ways in which students differed in tackling coursework, later research has made use of inventory methodologies to investigate approaches among groups of students in different higher education settings (Richardson 2004). A driving force in this particular research strand is the recognition that students’ personal learning experiences, communicated in in-depth interviews, may provide valuable data for exploring central influences on learning in higher education. An interesting, but less explored, question is how students’ approaches, as described in self-report inventories, relate to the experiences that students describe in qualitative interviews.


In the present study, we used inventory data collected in a larger research project exploring students’ approaches to studying in both campus-based and web-based settings (Bälter et al. 2013). We also collected interview data with the explicit aim of exploring the relationship between the inventory data describing university students’ ratings of their approaches to studying and the interview data on how they orally described their ways of studying in first-year engineering at a Swedish university.


Data collection

Inventory data were collected during the summer within the context of a self-paced online preparatory course in mathematics, using the short version of the Approaches and Study Skills Inventory for Students (ASSIST; see Entwistle et al. 2000). The inventory was translated into Swedish and cross-culturally validated: an independent bilingual researcher performed a back-translation of the Swedish version into English to ensure correspondence in meaning and consistency with the original version of the instrument (see Öhrstedt 2009; Öhrstedt and Lindfors 2016). The inventory was also adapted to the specific online course setting (Bälter et al. 2013) by removing one statement that did not apply to the context (item 16, subscale surface approach: ‘I’m not really sure what’s important in lectures, so I try to get down all I can.’). Table 1 shows the items used in each subscale, taken from the original English version of the ASSIST, which in its long form includes 44 items (see Tait and Entwistle 1996). Students rated themselves on a Likert-type scale ranging from 1 to 5, where 5 stands for complete agreement. The aggregated score ranges from 6 to 30 for the deep and strategic approaches and from 5 to 25 for the surface approach. An invitation letter was sent via e-mail to all students in the course. The e-mail contained a link to the inventory and informed the students of the purpose of the study as well as of the voluntary nature of participation. Students who had neither participated nor declined participation were sent reminders 1, 2 and 3 weeks later.
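As described above, each subscale score is a simple sum of the 1–5 item ratings. A minimal sketch of this aggregation, using made-up responses and a purely illustrative item-to-subscale grouping (the actual ASSIST key is the one given in Table 1):

```python
# Sketch of ASSIST subscale scoring as described in the text (hypothetical data).
# Deep and strategic subscales have 6 items each (score range 6-30);
# surface has 5 items after item 16 was removed (score range 5-25).
# The item-to-subscale mapping below is illustrative, NOT the actual ASSIST key.

def subscale_score(responses, items):
    """Sum the 1-5 Likert ratings for the items belonging to one subscale."""
    return sum(responses[i] for i in items)

# Hypothetical responses for one student: item number -> Likert rating (1-5)
responses = {1: 5, 2: 4, 3: 5, 4: 2, 5: 1, 6: 4, 7: 5, 8: 2, 9: 5,
             10: 4, 11: 1, 12: 5, 13: 4, 14: 2, 15: 5, 17: 4, 18: 2}

deep_items      = [1, 3, 7, 9, 12, 15]    # illustrative grouping
strategic_items = [2, 6, 10, 13, 17, 18]  # illustrative grouping
surface_items   = [4, 5, 8, 11, 14]       # item 16 removed, as in the study

deep = subscale_score(responses, deep_items)            # in 6..30
strategic = subscale_score(responses, strategic_items)  # in 6..30
surface = subscale_score(responses, surface_items)      # in 5..25
print(deep, strategic, surface)  # -> 30 22 8
```

With all deep items rated 5, the deep score reaches the scale maximum of 30, consistent with the ranges stated above.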

Table 1 The ASSIST items used (see Tait and Entwistle 1996)

In the beginning of the autumn, 43 of the summer students enrolled in a five-year Master of Science in Engineering programme (in Computer Science or Media Technology). These now on-campus students were invited to participate in an interview exploring their experiences of studying and learning. Of the 43 invited students, nine volunteered for interviews. The only significant difference between the interviewed students and the rest of the invited students was in the score for surface approach; see Table 2. The interviews were carried out without knowledge of the students’ scores on the ASSIST inventory. They were unstructured and conversational in style and invited the students to talk about how they went about studying. The students were encouraged to talk about their everyday studying, how they usually structured their work on a typical day at the university and how they made use of the different teaching sessions (lectures, tutorials, laboratory work sessions, etc.). They were also prompted to describe how they viewed their studies in relation to what was involved in understanding particular concepts brought to the fore in the teaching, and what they saw as most important to focus on in their studies, for instance in relation to reading the course literature. The interviews lasted between 30 and 60 min (40 min on average) and were, with the students’ informed consent, audio-recorded and transcribed in full. The names used in the result presentation are pseudonyms chosen to reflect the students’ genders. In the invitation, students were offered two movie theatre tickets as a token of appreciation for participating.

Table 2 Mean value with standard deviation in parenthesis for students who were interviewed and the rest of the invited students

Data analysis

The analysis of the interviews was carried out in two steps. The first step involved independently reading through the interview transcripts several times, searching for utterances that indicated study activities that could be connected to students adopting a deep, surface or strategic approach. The analysis was theoretically grounded in Entwistle’s (2009) characterisation of each approach. After having summarised the approaches taken by individual students, two authors met to discuss their interpretations, focusing on instances in the interviews that indicated particular approaches (deep, surface and strategic) for each student, ending up with a preliminary profile for each student. The second step of the analysis focused on the specific items in the inventory. In particular, drawing on the analysis of utterances that indicated particular approaches, each of the two authors tried, independently, to estimate how individual students, given what came out of the interviews, would presumably have rated themselves in the ASSIST. It should be noted that neither of them was aware of how the students had actually rated themselves in the inventory. These independent ratings were then compared, and the two authors agreed on a particular rating for each item for all of the students. The final step of the analysis was to compare these author ratings with the students’ actual ratings in the inventory.


The analysis of the interviews made it clear that students varied in their descriptions of how they set about studying and learning. In the following, examples from the interviews are presented to illustrate in what ways students gave evidence of deep, surface and strategic approaches. This description is followed by a more fine-grained picture of how individual students described their approaches to studying in the interviews. The ratings produced as a result of the interview analysis are then presented and compared to the measures from the ASSIST inventory.

Examples of deep, surface and strategic approaches

Evidence of a deep approach was given by Alice when she described the importance of understanding the definitions:

You need to understand the definitions because there are a lot of definitions that they go through and it’s not just about learning by heart but it’s about understanding and then you have to do the problems again and again until you know them, the biggest difference now is that we’re not allowed to use formula sheets so you really have to know what you’re doing and not just look it up.

Indications of a surface approach were revealed by Ben in the interview. He said that he usually does a lot of tasks, and also that, due to the fast pace, you get behind schedule if you miss just one study hour. He also emphasised that it is not possible to understand everything.

I usually just go home and study by myself […] I do all the recommended tasks, and also the recommended previous quizzes and also previous exams that one can try on […] but it is a terribly fast pace, it is not possible to understand everything. […] You need to study lots of hours every day, and if you lose just one hour then you really get behind schedule.

A strategic approach was evident in the interview with Danny. In particular he described how he usually set about planning his work.

To plan my work I use a paper where I note the big tasks and their deadlines. And then I also have another paper where I note subtasks like studying mathematics […] so when I have studied mathematics for one hour then I can cross that out, in that way I break it down.

Individual students’ approaches

Apart from analysing the interviews with a focus on the approaches apparently adopted by students, the analysis went on to produce a more fine-grained picture of individual students’ approaches as these were demonstrated in the interviews. In particular, this part of the analysis sought to clarify the relationship between individual students’ personal ratings of approaches as measured by the ASSIST inventory and the ratings produced as a result of analysing the interview statements. In the following, we present findings elaborating on students’ approaches to studying in relation to deep, surface and strategic approaches as expressed in the ASSIST inventory. This is done through examples from the cases of Eric and Hannah, followed by a description of how the co-authors’ independently given ratings correspond, and how the jointly decided rating relates to the students’ actual scores on the ASSIST inventory.


Eric is a student at the Computer Science engineering program. In the interview, Eric told us that he likes the lectures as they give him ‘a glimpse’ of what they are supposed to learn, and he can then go back to his notes and look through what has been brought to the fore in the lecture. Before the lecture, he usually goes through the assigned learning material. When studying, he works with both his notes and the literature.

If I have a lecture in the morning and I can go home in the afternoon, then I sit down directly and go through what I have learnt. And also before the lecture I usually go through the chapter that we are to learn from, I think that it is very effective. […] I get the lecture in the morning, then in the afternoon at home I go through it once more.

At the beginning of the week, Eric makes a schedule. Taking the schedule for lectures and tutorial sessions as a starting point, he makes a detailed plan for his studying. He plans what subject to study and when to take breaks.

In the beginning of each week I make a schedule for school activities and then I adapt my studies to that. For example, today I have an empty day, so this afternoon I have planned to sit two hours with mathematics and then a half-hour break and then two hours of something else and so on, so that’s no problem at all.

So, Eric has a tightly structured way of studying, and he was also able to give arguments for the approach to studying that he described. From the interview, we get to know him as a student who adopts a highly strategic approach to studying, and our interpretation is that he would have rated himself at 30 in the ASSIST, the highest level of strategic approach. In the inventory, he rated himself at the highest level on each item related to strategic approach; in total he scores 30, the same as our interpretation from the interview.

Eric also told us in the interview that he tries to understand the concepts and the ideas behind what is written. He does nearly all of the tasks, even though the teacher responsible for the course had singled out some of them as suggested priorities, but he does not want to do them like a robot.

What I usually do is to understand, like not just sit there and do the assignments like a robot. I try to understand the concepts and the idea behind what is written, so if I understand how, like a formula, how it works and how they developed it, then I understand everything much better and can relate to that when I use it. That is so much more effective than just sitting down doing lots of assignments and then remembering how to do it.

Eric’s attempts to understand, along with his careful way of approaching the literature and the study notes from the lectures, suggest that Eric, to a great extent, adopts a deep approach to studying. Our interpretation of the interview is that he would rate himself at 26 on deep approach, a high level on the scale (6–30). His own rating in the inventory sums up to 27 on the items related to deep approach.

Looking for signs in the interview to interpret the level of surface approach is a bit harder. Eric solves many problem tasks, more than needed. However, from his way of describing his studies, he approaches these tasks in a way that suggests that he tries to understand the concepts and ideas behind what is written. So, our interpretation is that he would rate himself quite low on the surface approach scale in the ASSIST; we presume that his answers would sum up to 6 on surface approach (scale 5–25). His own rating is exactly the same as our interpretation.


Hannah is a student in the Media Technology engineering program. She has always liked mathematics but has been worried about studying mathematics at university level, since she has heard stories of how difficult it can be. However, for her, it has turned out well. Hannah talks about mathematics as something that was concrete in previous school years, but in the present course, algebra and geometry, she finds it difficult to connect what she is studying to possible applications in real life. She tells us that she has tried to regard mathematics as a way of thinking, and that the studying is aimed at learning this particular way of thinking.

In elementary school [mathematics] was mostly that it was something concrete, it was like one apple plus one apple and then you have two apples, but now it becomes, as I feel now I don’t have any concrete connection to what we are doing. […] What I try to do is to think that mathematics is more like a way of thinking, to learn problem solving, and to let go of trying to understand exactly what I am doing.

It is problematic for Hannah to make this change in view on mathematics. She tries to accept that she does not understand everything.

I guess it is something you have to accept, that you don’t have to understand everything, you just have to know it, a bit. […] I don’t have any understanding of why [a subspace] is [a subspace], why you should have it and what to use it for. […] well, I’ll just have to wait and see, accept that this is a subspace and then just move on with it.

Hannah talked about how she tries to read the literature in the course but sometimes has difficulty in understanding it. She tries to explain to herself what is meant, step by step, reformulating the text and connecting it to examples given in the lectures.

I often read [the textbook] but sometimes I feel that I don’t really understand what I am reading, and sometimes I can understand it in one way but not apply it. […] But I often try, so I sit down thinking, I try to explain to myself what they mean by this step, and so on.

Attending lectures also helps her recognise the steps through examples other than those in the textbook. She always attends the lectures and takes notes even when she does not understand what is taught; she thinks that maybe if she looks at the notes afterwards she will understand. Hannah does not usually make plans for her studies. She takes part in the lectures and tutorial sessions, but apart from that she primarily engages in private study. In the beginning of the semester, she often did the tasks at the very last minute, but she has now started to work together with peers on the tasks they are to present every week in seminars.

From the interview, we can conclude that Hannah really wants to understand, but also struggles to realise this aim. She uses internal monologues to help overcome problematic passages in texts and she goes back to notes taken in the lectures. She wants to connect mathematics to something concrete but tries to remedy the perceived lack of such connections by trying to accept that this is mostly about learning a new way of thinking. Our interpretation is that she would rate herself as 21 on deep approach (scale 6–30) and as 11 on surface approach (scale 5–25). She is moderately strategic in her approach to studying: she does not make explicit plans for her studies, but she has started to work together with peers to prepare for the seminars. She also tries to find strategies to overcome her problems of not understanding, such as internal monologues and note taking. Drawing on these interpretations of the interview data, we predict that she would rate herself as 19 on the scale for strategic approach (scale 6–30). In the inventory, her answers sum up to 19 on deep approach, 14 on surface approach and 15 on strategic approach. So, we have put her a bit high on both deep and strategic approach and a bit low on surface approach.

Ratings generated from the interview analyses

Drawing on the interview analyses, the two authors tried, independently, to estimate how individual students, given what came out of the interviews, would presumably have rated themselves in the ASSIST. The ratings made independently by the two researchers were then compared (see Table 3). Of 153 ratings, 90 were identical and 52 were one step removed, yielding a correspondence of 142 of 153. Items 6, 12 and 15, all belonging to a deep approach to learning, showed the largest deviations. The differences between the two authors’ ratings were both positive and negative; the mean was −0.37 for deep approach, +0.16 for surface and +0.02 for strategic. So, largely, there was a good match between the two authors’ ratings.

Table 3 The number of deviations between the researchers’ estimates

The interview ratings compared to actual measures in the ASSIST inventory

The ratings given by the two authors were discussed and a common rating was agreed, producing the following pattern of approaches (see Table 4). The table also gives the measures from the ASSIST inventory for each of the interviewed students. As can be seen, for some students, e.g. Eric, the rating from the interview analysis and the measure given by ASSIST are nearly the same. For other students, e.g. Hannah, the deviation is higher, but the correspondence between the interview analysis and ASSIST is still good. The student with the highest total deviation is George. It could be noted that George, like Hannah, does not score as extremely as Eric. From Table 4, it is clear that the deep approach ratings from the interviews are a bit low for most of the students (six of nine). For surface approach it is the other way around: for five of the students, the rating from the interview analysis is higher than the measure from ASSIST. The average differences are −0.6 for deep, +0.8 for strategic and +1.3 for surface approach.

Table 4 Ratings from interview analysis and students’ answers on the ASSIST inventory

In sum, the findings include several examples of how students described their ways of studying in the interviews and how these descriptions can be connected to deep, strategic and surface approaches. It has also been shown how individual students gave evidence of various levels of each approach, illustrating the complex combination of all three approaches that builds up a single student’s way of tackling the study situation. The combination of ASSIST scores and the more fine-grained data provided by individual students in qualitative interviews has given complementary pieces of information for understanding student learning patterns. In conclusion, the interviews largely confirm the measures revealed in the ASSIST inventory. Additionally, we can conclude that the interviews provide rich illustrations of what a specific score on the ASSIST scale signifies in students’ self-reported ways of approaching their studies.


The aim of the study was to explore the relationship between data from the ASSIST inventory describing university students’ ratings of their approaches to studying and data on how they described their ways of studying in an interview. By linking individual students’ descriptions of how they set about studying in higher education to the broader pattern of approaches to studying measured in ASSIST, this study sought to produce a more complex understanding of students’ learning processes. Such a combination of analyses involving considerations at different levels is still scarce but is generally regarded as valuable (Creswell et al. 2003), as it yields complementary pieces of information for understanding student learning patterns. The findings from the present study suggest that the ASSIST inventory provides profiles that correspond well with the substantial qualitative information coming out of the interviews, and so together inventory data and interview data can potentially create a stronger basis for clarifying students’ learning processes in relation to particular higher education settings.

So, the interviews seemed to validate the inventory profiles but, perhaps more interestingly, they also provided a basis for elaborating and expanding our understanding of the constructs measured in the inventory. As seen in the example of Eric, students whose ASSIST scores end up in the extremes are relatively easy to categorise in terms of deep, surface or strategic approaches, and here we produced an almost perfect estimation of the score. The scores for Hannah and George, on the other hand, were slightly misaligned with the aggregated estimate. These students had more mixed and moderate approaches, which made their answers more open to speculation.

The estimated scores, compared to the students’ answers, were generally a little low for a deep approach to learning and a little high for a surface approach. The best agreement was for strategic approach, even though the estimate here was also slightly higher than the answers. One reason for this mismatch could be that students were answering the questions from the point of view of what they thought would be expected of an ideal student, thus interpreting the questions rather than responding spontaneously. Another reason could be that the inventory was administered during an online course and the students were told to think about how they approach their studies in that specific environment. Since approaches to learning are said to be context dependent (Entwistle 2009), this could account for part of the mismatch: it would then not be linked to the researchers’ misinterpretation but rather to the students exhibiting a different behaviour in a campus environment. A third reason could be the amount of time that elapsed between the inventory and the interview; however, other studies have found that individuals do not easily change their learning approaches (Entwistle 2009; Entwistle and McCune 2013; Kann et al. 2015).

It has been argued by Mogashana et al. (2012) that it is important to be sensitive to the context, both the cultural and the linguistic context, in which an inventory such as ASSIST is administered. In a similar study they concluded that students, in responding to the inventory, had confronted difficulties of different kinds: sometimes particular items (statements) confused students, sometimes the wording caused problems and there were instances where the students’ response depended on a particular context rather than the context defined by the single item. In our study, no such problems were encountered and there was a remarkably good match between estimated scores based on the authors’ independent analysis of the interview data and the ASSIST scores.

So to conclude, the present study illustrates how a combination of data analyses can contribute to achieving a more holistic understanding of student learning in its context. Recognising the value of combining broader inventory studies with fine-grained analyses of individual testimonies of approaches to studying opens up a pathway to exploring how different dimensions of students’ studying and learning can be conceptualised and brought together to clarify the impact of higher education on student learning in a particular educational setting.


References

  • Bälter, O., Cleveland-Innes, M., Pettersson, K., Scheja, M., & Svedin, M. (2013). Student approaches to learning in relation to online course completion. Canadian Journal of Higher Education, 43(3), 1–18.


  • Biggs, S. (1993). Understanding ageing: images, attitudes and professional practice. Buckingham: Open University Press.


  • Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240). Thousand Oaks: Sage.


  • Diseth, Å. (2003). Personality and approaches to learning as predictors of academic achievement. European Journal of Personality, 17(2), 143–155.


  • Entwistle, N. (2009). Teaching for understanding at university: deep approaches and distinctive ways of thinking. London: Palgrave Macmillan.


  • Entwistle, N., & McCune, V. (2013). The disposition to understand for oneself at university: integrating learning processes with motivation and metacognition. British Journal of Educational Psychology, 83(2), 267–279.


  • Entwistle, N., & Smith, C. (2002). Personal understanding and target understanding: mapping influences on the outcomes of learning. The British Journal of Educational Psychology, 72, 321–342.


  • Entwistle, N. J., Tait, H., & McCune, V. (2000). Patterns of response to an approaches to studying inventory across contrasting groups and contexts. European Journal of Psychology of Education, 15(1), 33–48.


  • Entwistle, N., McCune, V., & Scheja, M. (2006). Student learning in context: understanding the phenomenon and the person. In L. Verschaffel, F. Dochy, M. Boekaerts, & S. Vosniadou (Eds.), Instructional psychology: past, present, and future trends. Sixteen essays in honour of Erik De Corte (pp. 131–148). Amsterdam: Elsevier.


  • Kann, V., Bälter, O., Svedin, M., & Colarieti Tosti, M. (2015). Lärstrategier på längden och tvären [Learning strategies lengthwise and crosswise]. Paper presented at 5:e Utvecklingskonferensen för Sveriges Ingenjörsutbildningar, Uppsala University.

  • Lindblom-Ylänne, S. (2003). Broadening an understanding of the phenomenon of dissonance. Studies in Higher Education, 28(1), 63–77.


  • Marton, F., Hounsell, D., & Entwistle, N. (Eds.). (1997). The experience of learning: implications for teaching and studying in higher education. Edinburgh: Scottish Academic Press.


  • McCune, V., & Rhind, S. (2014). Understanding students’ experiences of being assessed: the interplay between prior guidance, engaging with assessments and receiving feedback. In C. Kreber, C. Anderson, N. Entwistle, & J. McArthur (Eds.), Advances and innovations in university assessment and feedback (pp. 246–263). Edinburgh: Edinburgh University Press.


  • Miller, C. M., & Parlett, M. (1974). Up to the mark: a study of the examination game. Guildford: Society for Research into Higher Education.


  • Mogashana, D., Case, J. M., & Marshall, D. (2012). What do student learning inventories really measure? A critical analysis of students’ responses to the Approaches to Learning and Studying Inventory. Studies in Higher Education, 37(7), 783–792.


  • Öhrstedt, M. (2009). Studieapproach, stress, studieresultat och förmåga att bedöma egen prestation [Approaches to studying, stress, academic achievement and the ability to assess own performance]. Master’s thesis, Stockholm University.

  • Öhrstedt, M., & Lindfors, P. (2016). Students’ adoption of course-specific approaches to learning in two parallel courses. European Journal of Psychology of Education, 31, 209–223.


  • Richardson, J. T. E. (2004). Methodological issues in questionnaire-based research on student learning in higher education. Educational Psychology Review, 16(4), 347–358.


  • Richardson, J. T. E. (2013). Approaches to studying across the adult life span: evidence from distance education. Learning and Individual Differences, 26, 74–80.


  • Tait, H., & Entwistle, N. (1996). Identifying students at risk through ineffective study strategies. Higher Education, 31(1), 97–116.


  • Vermetten, Y. J., Vermunt, J. D., & Lodewijks, H. G. (2002). Powerful learning environments? How university students differ in their response to instructional measures. Learning and Instruction, 12(3), 263–284.


  • Vermunt, J. D., Richardson, J. T. E., Donche, V., & Gijbels, D. (2014). Students’ learning patterns in higher education: dimensions, measurement and change. In D. Gijbels, V. Donche, J. T. E. Richardson, & J. D. Vermunt (Eds.), Learning patterns in higher education: Dimensions and research perspectives (pp. 295–310). New York: Routledge.




The authors would like to acknowledge that this research was made possible through a generous grant from the Faculty of Science at Stockholm University.

Author information



Corresponding author

Correspondence to Kerstin Pettersson.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.



Cite this article

Pettersson, K., Svedin, M., Scheja, M. et al. Approaches to studying in first-year engineering: comparison between inventory scores and students’ descriptions of their approaches through interviews. High Educ 75, 827–838 (2018).



Keywords

  • Approaches to studying
  • Qualitative interviews
  • Inventory
  • Engineering students