Introduction

Until very recently, only a limited number of public opinion surveys conducted across Central Asia were available to researchers, and those that existed had limited reach, such as the Life in Kyrgyzstan study (The Life in Kyrgyzstan Study, 2023), the polls by the International Republic Institute’s Centre for Insights in Survey Research (CISR, 2023), the Asia Barometer Survey (2005), or the rounds of the World Values Survey (Haerpfer & Kizilova, 2020). These large-scale public opinion surveys often do not cover all five Central Asian countries and are not conducted on a regular basis. In addition, many polls covered only a handful of topics considered safe in light of the political sensitivities of these predominantly autocratic societies; e.g., the Listening to Citizens of Uzbekistan project (World Bank, 2023), which focusses mostly on economic indicators. Furthermore, most public opinion polls previously conducted in Kazakhstan, Kyrgyzstan and Tajikistan were financed by international organisations (IOs) such as the UN agencies, or by global companies like Coca-Cola. Such survey data remains closed to the public due to inertia and a lack of incentive on the side of the IOs and global companies.

Since the countries of Central Asia are considered to be more “closed” societies with an underdeveloped survey research infrastructure (Haerpfer & Kizilova, 2020), most researchers continue to rely on qualitative research methods such as (digital) ethnography, participant observation, individual interviews and focus groups to obtain up-to-date opinion data from the wider Central Asian public. Recently, however, there has been growing interest in survey data on Central Asia among both policy analysts and academics. Some researchers are not only using existing survey data in their studies but are also venturing to create their own surveys with the help of local partner organisations such as the Central Asia Barometer (hereafter CAB). CAB is an applied social research centre that conducts opinion polls across the five Central Asian countries of Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan and Uzbekistan. It was founded in Bishkek, Kyrgyzstan, by a group of opinion polling enthusiasts in 2012. Since 2017, CAB has organised 13 multiwave public opinion surveys in all five Central Asian countries (CAB, 2023), funded by subscriptions and post-factum purchases of the survey data. CAB’s ultimate goal is to run its surveys across the region on a monthly basis and to publish the polling data in a free and open-access manner soon after; currently, however, CAB only has the means to publish the opinion data in open-access format after two years.

In this chapter, I will reflect on my experience in running public opinion polls in my role as the director of CAB. In discussing the challenges that my team and I have encountered when conducting public opinion research in the region, I seek to provide some useful guidance and tips for both organisations and individual scholars that plan to conduct public opinion research in Central Asia. The remainder of this chapter is structured as follows. In the next section, I discuss the various barriers that survey companies and researchers encounter when trying to access Central Asia to conduct public opinion research. I then discuss the mechanisms and methods opinion polling centres have previously adopted in order to run polls on sensitive topics in the region. Finally, I summarise the main takeaways for other scholars and institutes that plan to run public opinion polls in Central Asia.

Entering the Central Asian Field

I would like to start with a very personal observation about the region. Despite the five countries’ shared historical ties, my experience working in the opinion polling industry suggests that it is very problematic to assume that public opinion does not vary across Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan and Uzbekistan. The authorities’ mechanisms of survey censorship vary across the region, and the rules for survey companies are informal, fluid, human-dependent and partly negotiable, which makes the collection of public opinion data an even more uncertain venture. As such, opinion polling centres’ and researchers’ access to the field varies across the region.

To date, Kyrgyzstan and Kazakhstan are more accessible to scholars and survey companies than Tajikistan, Turkmenistan and Uzbekistan. To conduct opinion research in some Central Asian countries, opinion polling centres need to obtain an official letter from the respective authorities. The process of obtaining the necessary permission letters to conduct opinion research in Turkmenistan, Tajikistan and Uzbekistan is challenging and time-consuming. For that reason, my team and I rely on the help of our local partner organisations, which usually have to submit our detailed survey questionnaire to the respective authorities for pre-approval. Depending on the nature of the topic, questionnaires can be rejected. For example, amidst the border disputes between Tajikistan and Kyrgyzstan (Najibullah, 2023), Tajik authorities denied requests to conduct research in the border areas of the two countries.

While in Tajikistan survey companies usually have to obtain the necessary approval letter from the security services, in Uzbekistan the required permission may have to come from any of a number of ministries. Opinion polling centres are required to obtain these permission letters for their local partner institutions that conduct the interviews in the respective mahallas (Uzbek for “community”). The local interviewers then need to show this letter to each mahalla committee. Thus, while we have witnessed a boom in survey research under Mirziyoyev’s reign, opinion polls that feature questions about politics and other sensitive topics continue to be closely monitored by the Uzbek authorities (Dall’Agnola, 2023a). For example, it is hard for our contractors to conduct individual interviews because a representative of the mahalla committee may want to stand nearby to check whether the interviewer is really asking the questions outlined in the permission letter. Under the watchful gaze of this representative, respondents often feel pressured to self-censor and answer questions in line with the Uzbek government’s stance on a particular topic or issue. Such self-censorship, together with the “autocratic bias” it produces (Tannenberg, 2022, 592), is very common among respondents in all five Central Asian countries. Central Asians have little trust in survey providers, and most believe that the opinion polling centre is conducting the research on behalf of the local government. This can lead to high rates of systematic non-response and/or biased answers, resulting in poor-quality data (Chia, 2014).

While Uzbekistan has somewhat softened its stance on opinion polling over the last five years, inquiries into public opinion in Turkmenistan are even more difficult for survey companies and foreign researchers. This is because it is almost impossible for them to enter the country, let alone to conduct opinion research there (Dall’Agnola, 2023a). CAB conducts polls in Turkmenistan through a local partner organisation that needs to remain anonymous for safety and security reasons. The rules for obtaining permission letters in Turkmenistan are even more complicated and vague than in Uzbekistan or Tajikistan.

While opinion polling without official permission letters can lead to the closure of opinion research centres, Central Asian authorities are also known for retroactively interfering in the transfer and dissemination of survey data. For example, the Uzbek authorities are currently blocking the transfer of public opinion data that was collected in Uzbekistan by a local polling centre for the World Values Survey Secretariat in 2022 (Dall’Agnola, 2023a). Furthermore, while opinion polling centres in Kyrgyzstan and Kazakhstan may face fewer problems in data collection and transfer than in the other Central Asian countries, they face similar difficulties in disseminating the polls’ results. For example, in March 2022, CAB conducted a snap poll in Kyrgyzstan on public attitudes towards the war in Ukraine, with the intent to publish the data as soon as data collection was finished. Out of fear of potential governmental sanctions against our organisation, we were only able to publish the poll, and only in part, in September 2022, when the data was no longer relevant. Having families and livelihoods in the region contributes to greater self-censorship among both local researchers and survey companies. I am convinced that it would have been much easier for CAB to disseminate the data freely if my team and I were not predominantly people who were born in, and live in, Central Asia.

Finally, this does not mean that Central Asian regimes see no benefit in opinion polls; some authorities even run polls themselves. In Uzbekistan, for example, public opinion polls are frequently conducted by government-organised organisations such as Yuksalish, while in Kazakhstan governmental think tanks such as the Kazakh Institute of Strategic Studies conduct polls for internal use. Kazakhstani authorities are also known for running snap polls before elections and referendums to legitimise the regime (Sorbello, 2023). Comparing the survey results of government-affiliated and independent survey providers, we observe stark differences in the level of support for political parties. For instance, while both Paper Lab’s and CAB’s opinion polls found that 20 per cent of Kazakhstan’s population was planning to vote for the Amanat party (Sorbello, 2023), the Public Opinion Foundation, in its study for the TV channel KTK, claimed that 60 per cent were planning to vote for the same party (Zakon.kz, 2023). In short, whether polls conducted by the local authorities really capture the pulse of the public remains highly questionable.

The challenges outlined above not only contribute to the self-censorship of survey companies and the networks they work with; they also influence the questions opinion polling centres can ask in the region, as the next section shows.

Asking Sensitive Questions in an Authoritarian Context

Even surveyors in democratic societies carefully assess the sensitivity of their questionnaires before commissioning the full survey (Tourangeau & Yan, 2007). In the Central Asian context, however, even questions that seem unproblematic at first can be censored by local authorities or contracted partner institutions at a later stage (Dall’Agnola, 2023a; Haerpfer & Kizilova, 2020). For example, CAB has encountered situations where interviewers were arrested and their tablets confiscated by local authorities, even though they had only conducted surveys on English language learning preferences. In Tajikistan and Turkmenistan, opinion poll centres cannot ask questions regarding Islam, anything relating to the LGBTQ+ community or related issues (Dall’Agnola, 2023a), or domestic or international politics, let alone questions about the president or other authorities. Moreover, although CAB has been able to conduct polls in Turkmenistan on respondents’ preferences towards other Central Asian countries, it is impossible to ask the same question in Tajikistan. The rules regarding which questions can be asked and which are off-limits, both across the region and within specific countries, are fluid and constantly changing. For example, in Kazakhstan, CAB did not face any specific problems with survey questions about domestic politics until the Qantar events and Russia’s war in Ukraine in 2022.

To conduct opinion polls on sensitive topics, CAB regularly consults its local contracted partner institutions about national political events and the applicability of new survey questions in their respective countries. Depending on the country, CAB has to slightly rephrase new questions, making them less specific or placing them in another part of the questionnaire. The incorporation of a new question into a Central Asia Barometer Survey wave can thus take up to several weeks, even when the initial research question submitted to CAB was well formulated. For example, in 2022, CAB was asked to include a question on people’s attitudes towards CCTV cameras in public spaces. Several local vendors expressed concerns about the political sensitivity of this question. To guarantee a high response rate, local contractors proposed using more specific and positive language and limiting the focus of the question to CCTV cameras’ role in reducing crime. However, this rephrasing slightly distorted the original aim of the question, which was to capture public attitudes towards various types of CCTV cameras, including those used to control the wider public. Unsurprisingly, more than 90 per cent of respondents surveyed in the five Central Asian countries were in favour of CCTV cameras used to reduce crime (Central Asia Barometer, 2022).

The positionality of the researcher and the research team is another important factor that influences questionnaire development. For example, when coordinating CAB’s surveys on Russia’s war in Ukraine, we noticed that the very naming of the conflict was a matter of debate within our extended network of survey providers. Audiences in Central Asia consume a vast amount of Russian propaganda, and for some of the team members the right way to name the conflict was “Russia’s special military operation in Ukraine”, while others demanded the use of “Russia’s attack on Ukraine”. To remain as impartial as possible, CAB ended up using “the conflict between Russia and Ukraine”, which seemed to be the most neutral term available at that moment. The same wording was rejected in later surveys I coordinated in the Caucasus by our local partner organisations in Georgia and Armenia, as it was deemed to bias the question and encourage respondents to answer in favour of Russia’s decision to attack Ukraine. One possible way to measure respondents’ self-censorship and “social desirability bias” (Kalinin, 2016, 191) is to run so-called “list experiments” (Frye et al., 2023, 213). “The premise of list experiments is that if a sensitive question is asked in indirect fashion, respondents may be more willing to offer a truthful response even when social norms encourage them to answer the question in a certain way” (Blair & Imai, 2012, 48). This method is currently being used by scholars (Frye et al., 2023) and opinion polling centres (Levada Centre, 2022) to capture Putin’s popularity among the Russian public since the outbreak of the war. Researchers who plan to commission public opinion surveys on sensitive topics in Central Asia should consider this method.
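For readers unfamiliar with the technique, the short sketch below illustrates how list-experiment data are typically analysed: the prevalence of the sensitive attitude is estimated as the difference in mean item counts between the treatment group, which sees the sensitive item, and the control group, which does not. The respondent counts are invented for illustration and the code is a minimal sketch, not CAB’s own analysis pipeline.

```python
# Minimal sketch of a list-experiment (item count) analysis on hypothetical data.
import statistics

# Number of items each respondent endorsed (illustrative values only).
control_counts = [1, 2, 0, 3, 2, 1, 2, 1]    # list WITHOUT the sensitive item
treatment_counts = [2, 3, 1, 3, 2, 2, 3, 2]  # list WITH the sensitive item added

# Difference-in-means estimator of the share endorsing the sensitive item.
estimated_prevalence = statistics.mean(treatment_counts) - statistics.mean(control_counts)

# Rough standard error under the usual independent-samples assumption.
se = (statistics.variance(treatment_counts) / len(treatment_counts)
      + statistics.variance(control_counts) / len(control_counts)) ** 0.5

print(f"Estimated prevalence of the sensitive attitude: {estimated_prevalence:.2f}")
print(f"Approximate standard error: {se:.2f}")
```

Because respondents only report how many items they endorse, never which ones, the design gives them cover to answer truthfully, which is precisely why it appeals in settings where direct questions invite self-censorship.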

A survey company’s chosen mode of data collection can also influence respondents’ self-censorship and social desirability biases and, therefore, the quality of the collected data. Despite the fact that our interviewers are frequently monitored, verbally abused and, from time to time, even arrested by local authorities and the police, until very recently CAB surveys mainly relied on tablet-assisted personal interviews (hereafter TAPI). TAPI is a face-to-face data collection method in which the interviewer uses a tablet to record the answers given during the interview. Only when the COVID-19 pandemic hit the region did CAB start to use phone polls in its opinion research activities.

The switch to random sampling of telephone numbers for phone polls, a probability-based sampling method, happened only recently in the region. Quota and convenience sampling methods, which often lack the solid scientific and theoretical basis needed to make inferences about the entire target population, are now used less by survey providers (Langer, 2018). Moreover, while we still have to obtain official permission letters from local authorities to run phone polls, these polls are usually administered via in-home stations or computer-assisted telephone interview studios, which enhance interviewer safety and make data collection less costly and less time-consuming. That said, telephone polls also have limitations: for example, interviewers are less likely to build trust with their interviewees, who tend to be more suspicious of having their answers recorded over the telephone than in a face-to-face interview.
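To make the contrast with quota and convenience sampling concrete, the sketch below shows one way a probability-based telephone sample can be generated by random digit dialling. The country code, operator prefixes and subscriber-number length are hypothetical placeholders, not a description of any Central Asian numbering plan or of CAB’s actual sampling frame; a survey provider would substitute the valid prefixes for the target country.

```python
# Minimal sketch of random digit dialling (RDD) with placeholder numbering rules.
import random

HYPOTHETICAL_PREFIXES = ["700", "701", "702"]  # placeholder mobile prefixes (not real)
SUBSCRIBER_DIGITS = 7                          # placeholder subscriber-number length

def draw_sample(n: int, seed: int = 42) -> list[str]:
    """Draw n distinct telephone numbers uniformly at random from the frame."""
    rng = random.Random(seed)
    sample = set()
    while len(sample) < n:
        prefix = rng.choice(HYPOTHETICAL_PREFIXES)
        subscriber = rng.randrange(10 ** SUBSCRIBER_DIGITS)
        sample.add(f"+000 {prefix} {subscriber:0{SUBSCRIBER_DIGITS}d}")
    return sorted(sample)

if __name__ == "__main__":
    # Every number in the frame has a known, equal chance of selection,
    # which is what makes the resulting phone poll probability-based.
    for number in draw_sample(5):
        print(number)
```

The design choice that matters here is not the code itself but the sampling frame: as long as every working number has a known, non-zero probability of selection, the resulting estimates can be generalised to the telephone-owning population, which quota and convenience samples cannot guarantee.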

Moreover, since asking respondents for their verbal or written consent often increases their suspicion and unwillingness to participate in a phone or in-person interview, some survey providers do not ask people for their consent at all, which is a clear breach of ethical standards. This is something a responsible researcher may want to discuss explicitly with their survey provider. Central Asians’ reluctance to give their consent to an interview, even with the interviewer’s assurance that their statements will be treated anonymously, is a well-known issue (Dall’Agnola, 2023b; Heathershaw & Mullojonov, 2020; Skriptaite, 2023) that ethics committees need to take into account when reviewing research proposals.

While there is a growing body of literature highlighting the advantages of online polls over telephone and in-person interviews in authoritarian contexts (Heerwegh, 2009), CAB has thus far not used online polls to capture public opinion, for two reasons. First, most online polls do not use probability-based sampling methods and are therefore not representative of the wider population (Langer, 2018). Second, since access to the Internet and social media remains poor in Central Asia, especially in rural areas and among older generations (Dall’Agnola & Wood, 2022), online opinion polls mainly capture the views of urban youth.

Concluding Thoughts

As I have tried to show, it is important for the survey researcher to acknowledge and consider the diverse barriers to accessing the Central Asian field. Access to the different Central Asian countries varies and depends not only on the political and social developments in the respective country, but also on the researcher’s positionality. Moreover, conducting opinion polls without permission letters from the local authorities in Tajikistan, Turkmenistan and Uzbekistan is punishable and can lead not only to the closure of local opinion research centres, but also to the arrest of interviewers by the local police. The self-censorship bias in authoritarian contexts is an important issue for researchers to consider while working on their survey questionnaires, as is the dynamic list of sensitive questions pertinent to each country.