Introduction

Our approaches to teaching research methods as part of Politics and International Relations (IR) undergraduate degree programmes have changed significantly over the past decade. We have seen a growing debate on research methods in the literature, and a significant push by national quality control bodies and research councils to improve the provision of methods training. As a result, dedicated research methods courses have risen to prominence across Politics and IR curricula. At the same time, there has been growing interest in different ways of embedding research methods in substantive courses and, thus, of moving beyond stand-alone research methods classes.

This article reflects on our experiment with a systematic engagement with research methods in the teaching of an undergraduate optional course on “Contemporary Russian Politics” at Newcastle University in the UK. We demonstrate that by integrating an explicit discussion and practice of research methods in a Comparative Politics course, we can achieve three important objectives. First, we can support our students in practicing a variety of research methods. Due to their focus on different aspects of the political system and processes of a state or a region, Comparative Politics courses are particularly well suited for engaging with a wide range of qualitative and quantitative methods. Second, we can encourage our students to become more critically aware data users—a learning outcome that is increasingly important in the context of rising populism and anti-expert sentiments. Finally, we can assist our students in becoming more reflexive learners. Comparative Politics courses provide us with ample opportunity to reflect on identity construction and positionality: How do we interpret the Other—whether it is a state or a region? How do different data shape our understanding and knowledge of the Other? And how are our understandings of Self linked to our interpretations of the Other?

The article is structured as follows. The first section locates our approach to engaging with research methods in the existing literature by undertaking a brief overview of three debates: on research methods, citizen education and positionality in learning and teaching. The second section introduces our course on “Contemporary Russian Politics” and discusses the ways in which we have embedded research methods in the course, the types of data we have used, and the ways in which we have encouraged students to question their positionality. The third section presents our reflections on the outcome of our experiment by comparing students’ essays submitted in 2017/18 (before the experiment) and 2019/20 (during the experiment) and by tracing any changes in students’ answers to the same essay questions. Finally, the conclusion proposes possible ways of strengthening the research methods component of Comparative Politics courses.

Key debates: practicing skills, questioning data, reflecting on positionality

Over the past decade, the literature on learning and teaching in Politics and IR has proposed innovative ways of improving the provision of research methods training in undergraduate programmes. Yet, the research methods debate has largely developed in parallel to two other important strands of the literature—on citizen education and on positionality. We suggest that a more systematic approach to methods training would benefit from an explicit engagement with these debates. We begin by briefly discussing some of the key points in these literatures before proposing how we can incorporate their insights into the teaching of a Comparative Politics course.

Politics and IR scholars increasingly agree that methods training should be a crucial component of undergraduate teaching because of its role in helping students “learn how we go about answering important questions in the discipline” and “how it is that we have come to know what we know” (Bernstein 2021: 3). Yet, there has been less agreement on how to integrate research methods into curricula (Adriaensen et al. 2021). Importantly, the research methods agenda has largely been dominated by concerns with the provision of quantitative methods training. A reason for this dominance is astutely summarised by Gunn (2017a: 244) in relation to the UK, though it applies to many other educational contexts: a “dearth of quantitative skills among students has ramifications within and outside the academy”. While acknowledging the urgent need to improve methods training across the social sciences, numerous publications have highlighted difficulties in achieving this goal (Murtonen 2015; Scott Jones and Goldring 2015). Earley’s (2014: 245) analysis of 89 studies has identified some common perceptions of methods classes: students often feel “anxious or nervous about the course”, they “fail to see the relevance of the course to their major and their lives”, and they are often “uninterested and, therefore, unmotivated to learn the material”. A growing number of scholars have responded to this challenge by proposing innovative approaches to teaching stand-alone methods classes (again dominated by quantitative methods). Their key objective is captured in the title of a recent article: “Teaching Statistics: Going from Scary, Boring, and Useless to, Well, Something Better” (Bailey 2019). Welcome innovations include introducing research methods through an “apprenticeship model” (Page 2015), using a “blended flipped classroom” (Van der Zwan and Afonso 2019) or mini-games (Asal et al. 2018), or even resorting to “the metaphor of a murder investigation and trial” (McCarty 2021: 623).

These examples are certainly promising in maintaining students’ attention and alleviating anxiety about the technical difficulties of the course. Yet, they are less likely to be effective in overcoming students’ perceptions of research methods’ irrelevance for their degree programmes or for their lives. To address the latter challenge, the rapidly developing literature has proposed adopting “a discipline-embedded approach” to methods training (Leston-Bandeira 2013: 207) that would allow students to appreciate the links between research methods and substantive knowledge. Advocates of the discipline-embedded approach have generally opted for one of two models. The first model prioritises incorporating aspects of substantive topics into the teaching of stand-alone methods classes. Examples include structuring a research methods course around the topic of political assassinations (Oldmixon 2018), replicating a qualitative study on resource curse and civil conflict (Becker 2020), or more broadly adopting a problem-based approach to methods training (Dyrhauge 2014). As emphasised by Leston-Bandeira (2013: 210), such embedding implies that “methods are a means towards an end, rather than an end in itself”.

The second model envisages embedding research methods training into the teaching of substantive courses (Adriaensen et al. 2014; Dickovick 2009). Again, this approach is particularly popular with experts in quantitative methods, who have successfully used topics in substantive courses as pedagogic hooks, defined by Lewthwaite and Nind (2016: 422) as “things that are non-threatening, non-technical, even enjoyable”, to alleviate students’ anxiety and to demonstrate the relevance of research methods to students’ degree programmes. For instance, Williams et al. (2021: 336) have advocated “[b]uilding data science modules into substantive political science courses”, while Gunn (2017b) has trialled embedding quantitative methods in a course on psephology. Another example is Clough’s project on “Integrating Quantitative Methods into the Politics Curriculum”, funded by the UK Economic and Social Research Council, which has produced “a set of teaching materials, based on academic journal articles using quantitative methods” for courses in American Politics, British Politics and Introduction to International Politics (UKRI n.d.).

Both models represent a significant improvement in the provision of methods training. Yet, they risk remaining isolated encounters with research if they are not systematically supported throughout the rest of the curriculum. Thus, we need to find more effective ways of not simply introducing students to research methods, but of integrating the reflection on and the practice of research in the majority of substantive Politics and IR courses. Our argument echoes the wider call across the social sciences to “[f]amiliarize students with the process of research by integrating it more fully throughout the curriculum” (Markle 2017: 113), as well as the call for “incorporating and scaffolding more multi-method research methodology training” (Clancy and Bauer 2021: 161, emphasis added). Instead of employing isolated (often single-method and mostly quantitative) research exercises in relation to particular topics, we advocate integrating a variety of research methods—both qualitative and quantitative—into every aspect of a substantive course: from lectures to independent reading, seminars and assessment. This approach, we argue and demonstrate in our experiment, is more effective in normalising the research process. It encourages our students to see research as key to both their formal learning (as part of their degree programme) and their everyday learning. While scholars disagree on what constitutes everyday learning, in this article we use the concept as synonymous with “informal learning”, defined by Livingstone (1999: 5) as “any activity involving the pursuit of understanding, knowledge or skill which occurs outside the curricula of educational institutions, or the course or workshops offered by educational or social agencies”. By supporting our students to practice their research skills on a regular basis and in relation to every topic, we aim to encourage them to accept research as essential whenever they want to gain a better understanding of any issue—whether at university, at the workplace or in the media.

The idea of everyday learning is central to another strand of the literature—the literature on citizen education—which has recently gained new momentum in response to political developments across the world, particularly the rise of populism and the spread of various conspiracy theories and anti-expert sentiments, facilitated by the growth of social media (Gatt et al. 2021). As argued by Blair and Stockemer (2020: 224), “as a discipline we need to be better at ensuring our curriculum reflects the contemporary landscape and that we provide our students with the tools and the ability to engage in contemporary political debates”. While the renewed emphasis on citizen education may entail developing courses on populism or authoritarianism (as well as explicitly addressing these topics in wider Politics and IR courses), it also entails more critical engagement with data more broadly. As summed up by Lupia and McCubbins (2019: 657), the abundance of misleading information and “fake news” “has prompted most present-day observers to be concerned not about a lack of information but rather about how citizens will sort through and use all of the information that is available”.

In response to these concerns, a growing number of scholars have emphasised our responsibility to more proactively support our students in becoming more critically aware data users, both in their formal education and in their everyday lives. To quote McCartney (2020: 238), “we need to address the information wars by placing more emphasis on the quality of information and sources of evidence, regardless of our sub-discipline or the title of the course”. In a similar vein, Harden and Harden (2020: 344) have called for greater attention to information literacy: “When students can critically evaluate sources, they better recognize the characteristics of the information ecosystem in which they engage on a daily basis”. Yet, despite the best efforts of libraries and academic departments to curate the quality of sources within their collections, improve student access to databases, and offer guidance on evaluating sources, questionable sources remain a regular occurrence in students’ submitted work.

The literature has proposed a number of recommendations to address this persistent problem. For example, McCartney (2020: 240–241) has advocated incorporating a discussion of sources, their biases and credibility in lectures and seminars of substantive courses, as well as developing assignments that ask students to assess “the quality of evidence that authors use”. Similarly, Harden and Harden (2020) have proposed regular information literacy exercises, such as filling in questionnaires on sources. Focusing on media consumption, Wender and D’Erman (2021) have trialled news journals, with students and lecturers documenting their media use and reflecting on bias. We suggest that this emphasis on critical engagement with evidence throughout substantive curricula is particularly important for Comparative Politics courses. Covering a wide range of topics related to the political system or processes in a state or a region, these courses are particularly well suited for encouraging reflection on different types of data (broadly understood and encompassing both quantitative and qualitative data) and their (often strategic) use by political actors. Moreover, a systematic focus on both the quality of data and on their strategic use can contribute to normalising the research process and making it an integral part of students’ approach to both formal and everyday learning, as discussed above.

The third strand of literature that, we argue, is relevant for the methods debate is the literature on positionality, which is premised on the well-established argument about “the tight coupling of educational practices and the formation of geopolitical subjects” (Muller 2011: 15). Indeed, the role of education in reproducing or disrupting understandings of identity and processes of othering has been extensively studied in the literatures on nationalism, identities and identifications (Vickers and Jones 2005; Zajda 2017). Surprisingly, the pedagogic literature has been less reflexive about our own contribution to these processes. A welcome intervention is a special section on “Teaching Africa” published in Politics in 2016. As emphasised by Routley (2016: 482), “positionality and representations profoundly shape engagement with Africa”. Discussing her experience of teaching African Politics at a UK university, Routley writes about the need to address students’ assumptions of “their alignment with Western actors who will ‘solve’ Africa’s problems”. Similarly, Gallagher et al. (2016: 443) reflect on students’ motivations for choosing an African Politics course “because they hope to work for relief or development organisations in the future, and so they are implicitly invested in the continuing equivalence of Africa and disaster or development”. While students’ assumptions about other states or regions may vary, we agree with the “Teaching Africa” authors that we should take positionality seriously.

We propose that we can more proactively support our students in reflecting on their assumptions by weaving questions of positionality, identity and othering into the course, and by making these questions more visible for every topic. Our approach is similar to what Soedirgo and Glas (2020) describe as an “active reflexivity posture”. While scholars working in different methodological traditions widely acknowledge the importance of reflexivity, Soedirgo and Glas argue that in practice reflexivity often remains static: researchers tend to discuss their assumptions at the start of the research process rather than maintaining a continuous focus on positionality throughout the project. To make reflexivity more dynamic, Soedirgo and Glas (2020: 529) propose documenting one’s reflections on a regular basis. While their recommendations are aimed at researchers, they are equally appropriate for students as part of their engagement with research in substantive Politics and IR courses. An “active reflexivity posture” may vary from encouraging students to record their reflections as part of formative assessment, to supporting them in regularly articulating their reflections in class without putting them on paper. We can further contribute to normalising their continuous reflexivity by sharing our own reflections.

Moreover, a consistent focus on positionality and othering can reinforce for our students the importance of engaging with a greater variety of voices from a state or a region. For example, instead of talking about Russia’s view of the “West”, we should encourage our students to examine sometimes overlapping and sometimes conflicting understandings of the “West” as they are articulated by Russia’s decision-makers, oppositional politicians and cultural figures, as well as understandings of the “West” in Russia’s popular culture or public opinion. Gibert (2016: 495), for instance, writes about her experience of using popular culture artefacts to help students “consider the very wide range of voices and views on Africa, its politics and international relations”. While this unpacking of multi-vocality is essential for any Comparative Politics course, it is particularly important for courses on those states or regions in relation to which students are likely to have strong views, for example following a war. Writing about their experience of teaching a class on the Arab–Israeli conflict, Yakter and Tessler (2018: 434) observe that “American students may have ‘otherness’ bias, predispositions brought from home, and resistance to opinions they do not share”. This observation echoes our experience of teaching Russian Politics at a UK university. Moreover, we expect that students’ views of Russia will be more rigid in response to Russia’s war against Ukraine.

The task of introducing a variety of voices and experiences brings us back to the question of data and methods. By supporting our students to use a wide range of data—from public opinion surveys, media broadcasts, official documents and images, to official statistics—and a wide range of methods, we can more effectively encourage them to unpack the complexity of the Other. We do not suggest that practicing different research methods and working with different types of data would automatically lead our students to reflect on identity construction and to question their positionality. Rather, we argue that by doing this we are better placed to open up discussions of positionality, and to demonstrate how our engagement with different voices and experiences can transform our understandings of a state or a region. Thus, by consistently engaging with research methods throughout a Comparative Politics course, we can support our students in practicing research skills, questioning data and reflecting on their positionality as part of their formal and everyday learning.

Engaging with research methods in a “Contemporary Russian Politics” course

In 2019/20, we incorporated discussions and activities engaging with research methods, different types of data and questions of positionality into the structure of an optional second-year undergraduate course on “Contemporary Russian Politics” at Newcastle University in the UK. This course attracts around 80 students, the majority of whom pursue Politics, Politics and Economics, Politics and History, or Politics and Sociology degree programmes. A small minority come from other subjects, such as Geography or Law. Most students are from a UK background, with a small minority coming from the EU or other backgrounds, including Russia and other post-Soviet states. Most students do not speak Russian, and only a few have experience of living in Russia or travelling there.

This course is well suited for building on the skills that students have developed during the compulsory first-year course on “Becoming a Political Analyst” and the compulsory second-year course on “Becoming a Political Researcher”. While the former aims to “enable students to assess critically the quality of data, evidence and analyses produced by others” (Newcastle University 2021a), the latter “introduces students to key concepts used in both quantitative and qualitative research” and covers a range of research methods, including surveys, interviews and focus groups, content and discourse analysis, and ethnography (Newcastle University 2021b). “Contemporary Russian Politics” is taught over 12 weeks, with weekly two-hour lectures and one-hour seminars. Our topics include the historical background to Russia’s post-Soviet trajectory, Russia’s political culture, various aspects of Russia’s political system, and Russia’s foreign and security policies. The summative assessment includes a written essay (students choose one of seven questions) and an unseen written exam. Running since 2011, the course has always included questions of identity and positionality in its syllabus. Equally, we have frequently used various types of data in lectures, while encouraging students to use appropriate evidence in their essays. However, in 2019/20 we introduced three significant changes to the ways in which we linked substantive topics with reflections on research methods and data, and in which we positioned the systematic engagement with research methods and continuous reflection on positionality as key to our joint effort to understand Russian politics.

First, compared to previous years, we made the questions of identity and positionality significantly more visible. Previously, we had focussed on identity construction in isolated topics, such as nation-building or foreign policy. In 2019/20, we explicitly emphasised throughout the entire course that to gain a deeper understanding of Russian politics, we need to engage with multiple voices from Russia and analyse different types of data that reflect these voices and their experiences. We began with a conceptual discussion of identities and positionality. Most students were already familiar with the idea of othering, having studied social constructivist, poststructuralist, feminist and postcolonial perspectives in their first-year courses. For others, this conceptual introduction was a necessary first step before they could reflect on the ways in which Russia had been historically imagined both in Russia and in the “West”. In particular, we discussed common themes in the writings about Moscow by Western travellers from the sixteenth century onwards (see Cross 1971), the idea of the “East” in European identities (Neumann 1999), and the ideas of “Europe” in Russia (Neumann 1996; White and Feklyunina 2014). During the first week, the teaching team reflected on our own understandings of Russia and how they had changed over the years. While we did not single out students from post-Soviet states, some of them volunteered to share their reflections on how their understandings had been shaped by their family memories, education, mass media or commemorative practices. Throughout the rest of the course, we adopted an “active reflexivity posture” by encouraging our students to reflect on their changing assumptions.

Second, we incorporated an explicit discussion and practice of various research methods in every component of the module. Whereas previously we had mostly discussed research findings of relevant studies of Russian politics in our lectures and seminars (without focusing on the research process or questioning the types of data used in these studies), in 2019/20 we shifted our focus to the research process. Each lecture included a discussion of a particular research method and type of data that scholars of Russian politics had employed in their studies. We then encouraged students to reflect on advantages and limitations of using this method and data. When preparing for their seminar, students were asked to pay particular attention to the method and the type of data used in their reading. Finally, in seminars, following the discussion of the reading, students worked in small groups to practice their research skills in analysing a set of research materials that we had prepared for this topic. We then encouraged students to reflect on whose voices or experiences these methods and data allowed us to examine, and how the research process helped us gain a deeper understanding of a particular aspect of Russian politics.

For example, when discussing the USSR’s collapse and the 1990s reforms, we considered the significance of individual and collective memories of this period for Russia’s political culture, particularly for attitudes towards democracy. We engaged with the Nobel Prize-winning author Svetlana Alexievich’s (2016) Secondhand Time: The Last of the Soviets. An oral history in which ordinary Russians talk about their experiences, Alexievich’s book not only helped us discuss the method of oral history, but also brought conflicting Russian voices into the classroom. In the lecture, we watched an interview with Alexievich and considered the advantages and limitations of oral history. In the seminar, we worked with extracts from the book to discuss popular attitudes to Gorbachev’s Perestroika and their significance for contemporary Russia. Another example is our discussion of quantitative methods, “hooked” to a question about the extent to which we can describe Russian elections as free and fair. As part of our topic on “Elections and Voting”, we discussed the research process and findings of a study by Lukinova et al. (2011) that employed quantitative methods to establish the magnitude of suspiciously cast ballots during Russia’s 2008 presidential elections.

Finally, we introduced a significantly greater variety of data. In previous iterations of the course, we had used statistical data (for example, the data on Russia’s economic growth or defence spending) and public opinion data in isolated lectures. In contrast, in 2019/20 we worked with different types of data in all lectures and seminars. For example, we included a discussion of public opinion data in most topics of the course. Having reflected upon limitations of public opinion surveys in authoritarian states, we worked with surveys of the Moscow-based independent Levada Centre. When covering the evolution of Russia’s political system, we looked at the results of a survey asking “which political system is best for Russia” (Levada Centre 2019: 27). When discussing the role of the President in the Russian political system, we examined changing attitudes towards Putin over the past two decades: “Why does Putin appeal to you?” (Levada Centre 2019: 71). Other examples include a selection of Putin’s official photographs that allowed us to reflect on advantages and limitations of visual analysis; statistical data published by the Russian Federal Statistics Service; President Putin’s (2014) “Crimea Speech”; an article on “Liberalism in Crisis” written by Putin’s critic Khodorkovsky (2004); and a news broadcast by the state-sponsored media outlet RT.

While consistently engaging with research methods throughout the course, we decided to keep the assessment arrangements from the previous years (an essay and an unseen exam), without introducing an explicit research methods assignment. By avoiding explicit instructions to engage with primary or secondary data, we were better positioned to evaluate the extent to which students chose to do so without being prompted. We also kept essay questions from the previous iteration of the course, thus allowing us to compare student engagement with data across two years. While our course was affected by the Covid-19 pandemic, most teaching had been delivered before the transition to remote learning. Unable to administer an unseen written exam during the lockdown, we replaced it with a timed open-book exam in May 2020. Because of this change in assessment, the following section focuses entirely on comparing student essays in 2017/18 and 2019/20.

Does engagement with research methods make a difference?

As our teaching progressed throughout the semester, we noticed that many students became noticeably more confident in discussing various research methods and types of data, and in employing them in small group work as part of their seminars. Many of them also became more proactive in identifying different Russian voices and experiences, and in reflecting on how their engagement with these voices was changing their assumptions. Their active reflexivity posture was particularly evident in our frequent discussions of reports about Russia in the British media. Students often volunteered to reflect on how those reports (which they encountered independently as part of their everyday learning) linked to our course, what types of data they used, and whose voices or experiences they represented. To evaluate the impact of our approach on students’ engagement with research methods in greater depth, we undertook both a quantitative and a qualitative analysis of their essays. We began by coding all essays submitted in 2017/18 and 2019/20 according to the types of data they had used, including Russian official statistics, official documents, the President’s public statements, statements by other Russian politicians or public figures, public opinion surveys, and the Russian media. We then compared both sets of essays by focusing on four questions. First, how many essays went beyond the recommended literature to engage with any data? Second, how many different types of data did they use? Third, what types of data did they use? And finally, how did students analyse the data?
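
Once essays have been hand-coded in this way, the comparison itself is straightforward to tabulate. The sketch below is a minimal Python illustration of how our first three questions can be answered from coded records; the essay entries and counts are invented for demonstration only (they are not our dataset), and the fourth question is addressed through qualitative reading rather than counting.

```python
from collections import Counter

# Categories mirroring the coding scheme described above.
DATA_TYPES = [
    "official statistics",
    "official documents",
    "President's statements",
    "statements by other politicians/public figures",
    "public opinion surveys",
    "Russian media",
]

# Invented records for illustration only: each essay is mapped to the
# set of data types it engages with (an empty set = no data used).
essays = {
    "2017/18": {
        "essay_01": {"President's statements"},
        "essay_02": set(),
        "essay_03": {"President's statements", "official statistics"},
    },
    "2019/20": {
        "essay_01": {"public opinion surveys", "Russian media"},
        "essay_02": {"statements by other politicians/public figures"},
        "essay_03": set(),
    },
}

def summarise(cohort):
    """Answer the first three questions: the share of essays using any
    data, the distribution of the number of types used, and the share
    of data-using essays engaging with each type."""
    users = [types for types in cohort.values() if types]
    share_any = len(users) / len(cohort)
    n_types = Counter(len(types) for types in users)
    type_shares = {t: sum(t in types for types in users) / len(users)
                   for t in DATA_TYPES}
    return share_any, n_types, type_shares

for year, cohort in essays.items():
    share_any, n_types, type_shares = summarise(cohort)
    print(f"{year}: {share_any:.0%} of essays used any data")
    print("  essays by number of types used:", dict(n_types))
    print("  share of data-users per type:",
          {t: f"{s:.0%}" for t, s in type_shares.items()})
```

Keeping the coding scheme as an explicit list makes the categories easy to audit and to extend in later iterations of the course.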

Somewhat counterintuitively, the percentage of students who used any data in 2019/20 was slightly lower than in 2017/18: 53% compared to 59%. This difference can be explained by the fact that the course had always encouraged students to engage with various kinds of data, even before our effort to make this engagement more systematic and more salient. More interestingly, however, the 2019/20 essays displayed a different pattern of engaging with data. As Table 1 shows, in 2017/18, 70% of those who used any data engaged with only one type, while only 10% engaged with three or more types of data. In 2019/20, on the other hand, as many as 20% engaged with three or more types of data, while another 34% engaged with two types. Thus, a noticeably larger share of students felt sufficiently confident and motivated to identify and explore a greater range of evidence.

Table 1 The number of types of data used in students’ essays in 2017/18 and 2019/20

A closer examination of the types of data also points to a significantly greater variety of Russian voices in the 2019/20 set of essays—an outcome that we were particularly happy to achieve. First, the most common type of data used in the 2017/18 essays was the President’s statements. In 2019/20, students’ use of this type of data noticeably decreased, to 13%. At the same time, students’ use of statements or articles by other Russian politicians or public figures increased from only 3% in 2017/18 to 17% in 2019/20. Finally, the overall share of essays engaging with non-official Russian voices (i.e. statements by Russian politicians or public figures, public opinion surveys and the Russian media) grew from only 21% in 2017/18 to an impressive 42% in 2019/20. Thus, in 2019/20 students both engaged with a greater number of different types of data and were more likely to bring in Russian non-official voices, which suggests a greater openness to unpacking a more complex and more multi-vocal picture of Russia. Examples of such multi-vocality in the 2019/20 essays include engagement with publications by oppositional politicians, such as Mikhail Khodorkovsky (essay 61), Boris Nemtsov (essay 7) or Vladimir Ryzhkov (essay 4); Russian NGOs, such as the Russian LGBT Network (essay 77); and pro-Kremlin intellectuals, including Sergei Karaganov (essay 37) and Fyodor Lukyanov (essays 6; 11). Compared to the 2017/18 essays, in 2019/20 students also engaged with a greater variety of Russia’s official documents, including the Constitution of the Russian Federation, Russia’s Foreign and Security Policy Concepts and Russia’s Strategy of Social and Economic Development (essays 12, 14, 22) (Table 2).

Table 2 Types of data used in students’ essays in 2017/18 and 2019/20 (as percentage of essays among those essays that used any data at all)

More importantly, our analysis of students’ essays points to a noticeable change in how our students identified appropriate data and analysed them in their essays. Compared to the 2017/18 essays, in 2019/20 students were more careful in identifying reliable sources. For example, most discussions of Russia’s public opinion relied on the data of the independent Moscow-based Levada Centre, while most references to Russia’s official statistics were attributed to Russia’s Federal State Statistics Service, the World Bank or other inter-governmental bodies. Moreover, in the 2017/18 essays, students generally used only the type of data that we had engaged with in class in relation to the particular topic of their essay. In 2019/20, students were noticeably more willing to engage with types of data that we had used for other topics, as well as to draw links between topics—a skill that was repeatedly modelled and practiced in the 2019/20 seminars. For example, while in 2017/18 most references to public opinion were limited to the discussion of public support for Putin, in 2019/20 students used public opinion data to explore the basis of and limits to such public support—for example, by linking it to public attitudes towards Russia’s economic performance (essay 6), to examine public attitudes towards LGBT rights (essay 53), or to look at public expectations regarding Russia’s foreign policy (essay 26). Some of the strongest essays used public opinion data to evaluate claims in the literature—for example, by discussing whether public opinion data from the 1990s supported an argument about a demand for the restoration of Russia’s great power status even before Putin’s rise to power (essay 26).

While in 2017/18 most references to Putin’s statements (as well as rare references to statements by other Russian politicians and public figures) were limited to brief direct citations without any attempt to use discourse or narrative analysis, in 2019/20 we found some excellent instances of students employing these methods. Examples include an investigation of how the Russian authorities interpreted the “coloured revolutions” in the post-Soviet space (essay 22), and an analysis of how President Putin and Russia’s Foreign Minister Lavrov interpreted Russia’s international position, and how their narrative of the lost stability of the Cold War can contribute to our understanding of Russia’s support for the Assad regime in Syria (essay 14). Other essays traced the idea of “restoring” Russia’s past glory in Putin’s statements (essay 46), or discussed debates on Russia’s post-Soviet identity in the wider society (essay 1).

Predictably, students’ engagement with methods and different types of data was uneven across the essays. Some 2019/20 essays demonstrated limited awareness of the quality of their sources or used lengthy direct citations from President Putin without any attempt to situate these quotes in wider narratives. Thus, our experiment suggests that a consistent engagement with research methods in substantive courses cannot be relied upon to replace stand-alone methods courses or to fill substantial gaps in students’ research methods skills. Yet, our approach has supported a large number of students in at least beginning to accept engagement with research methods as an essential part of their learning.

Conclusion

Our experience in the classroom and our analysis of students’ essays point to a significant positive impact of our approach on students’ learning. However, they also highlight some limitations that we would need to address in our future teaching. By excluding an explicit research methods component from our assessment, we missed an opportunity to encourage a greater number of students to pay conscious attention to their research skills throughout the course. Knowing that they would need to employ research skills as part of their assessment, students might have paid more attention to these skills during seminars and been more proactive in seeking feedback. Although excluding a research methods component left us better placed to evaluate students’ willingness to engage with research methods without being prompted, we will redesign our assessment in future.

We also expect that the impact would be significantly greater if our efforts were more effectively coordinated both with the stand-alone research methods courses and with other substantive courses on our undergraduate programmes. As a way forward, we are planning to develop research exercises for other Comparative Politics courses in our undergraduate programmes that draw explicit links between individual courses. For example, students taking a course on “Government and Politics of the USA” may be encouraged to compare findings of public opinion surveys conducted in the USA and in those states that we study in other Comparative Politics courses. Finally, we could redesign the structure of the course (for example, by reducing the number of lectures while increasing the number of seminars or adding workshops) to create more space for research-focussed discussions and exercises. Our activities could include, for example, a more in-depth engagement with research methods by replicating influential studies in the field of Russian politics, or an analysis of the strategic use of the same data by different political actors. While these research exercises may vary greatly across Politics and IR courses, they can ultimately serve the same purpose of supporting our students in practicing their research skills, in learning to question any data or evidence, and in reflecting on their positionality.