LISS
The Longitudinal Internet Studies for the Social sciences (LISS; see www.lissdata.nl) is a panel representative of the population of the Netherlands (16 years and older). Participation rates in the Netherlands, together with those in the US, Canada and the Scandinavian countries, are among the highest in the world (e.g. Curtis et al. 2001). This makes it a good case for our current purpose: if schools of democracy are widespread anywhere, it should be in these countries.
The LISS panel is designed to follow changes in the life course and living conditions of the panel members. The yearly retention rate is about 90 percent, and refreshment samples are drawn to maintain the representativeness of the panel (see Leenheer and Scherpenzeel 2013 for more information). Questionnaires are answered online, and households were equipped with a computer and/or Internet access when necessary. Panel members answer monthly questionnaires, which take 15–30 min in total. Most of our data come from the yearly modules Politics and Values, Social Integration and Leisure, and Background Variables. The latter are provided (on a monthly basis) by the household head. Respondents are paid (at a rate of 15 euros per hour) for each completed questionnaire.
Political Discussion, Interest, Efficacy, and Action
Our measure of political discussion has two dimensions and is based on a name generator in the survey. First, respondents were asked: “Most people discuss important things with other people. If you look back on the last 6 months, with whom did you discuss important things?” (max. 5 persons). Next, respondents were asked how often they discussed political issues with these personal contacts (known as alters in social network studies), with the following answer categories: (1) almost every day, (2) once or twice a week, (3) a few times per month, (4) about once a month, (5) a number of times per year, (6) about once a year, and (7) never. These scores were reversed and recoded to a quasi-continuous measure (number of discussions per year). We then took the maximum score across the alters; i.e. when a respondent talks about politics every day with one alter and never with another alter, the “every day” score is assigned. Our frequency measure should therefore be interpreted as: the respondent talks about politics X times per year with at least one alter. If respondents indicated that an alter was a fellow member of an association or organization, that alter was not included in the count, in order to avoid tautology problems. The second dimension of political discussion is the number of alters (sum) with whom politics is discussed (regardless of frequency). Table 1 provides descriptive statistics of the political indicators.
Table 1 Descriptive statistics of political indicators (pooled dataset)
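To make the construction of these two measures concrete, the following sketch (in Python/pandas, with hypothetical column names and an illustrative recoding of the seven answer categories into discussions per year) shows one way the frequency and alter-count measures could be derived; it is not the authors’ actual code.

```python
# Minimal sketch, assuming pandas and hypothetical column names such as
# "polfreq_alter1" (answer category 1-7) and "fellow_member_alter1" (0/1).
import pandas as pd

# Illustrative recoding of the reversed answer categories into discussions per year;
# the exact per-year values are an assumption, not taken from the article.
FREQ_TO_PER_YEAR = {1: 365, 2: 78, 3: 30, 4: 12, 5: 4, 6: 1, 7: 0}


def discussion_measures(df: pd.DataFrame, n_alters: int = 5) -> pd.DataFrame:
    """Return the frequency (maximum across alters) and the number of discussion alters."""
    per_year = pd.DataFrame(index=df.index)
    for i in range(1, n_alters + 1):
        freq = df[f"polfreq_alter{i}"].map(FREQ_TO_PER_YEAR)
        # Exclude alters who are fellow members of an association or organization
        # (set their frequency to missing) to avoid the tautology problem noted above.
        freq = freq.where(df[f"fellow_member_alter{i}"] == 0)
        per_year[f"alter{i}"] = freq

    out = pd.DataFrame(index=df.index)
    # Dimension 1: discussions per year with at least one alter (row-wise maximum).
    out["discussion_frequency"] = per_year.max(axis=1)
    # Dimension 2: number of alters with whom politics is discussed at all.
    out["discussion_alters"] = (per_year > 0).sum(axis=1)
    return out
```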
Our measure of political interest is a scale measured through two attitudinal components (self-reported interest in news and in politics) and two behavioral components (reading national and international political news in newspapers). The attitudinal variables’ answer categories ranged from 1 (not interested) to 3 (very interested); the behavioral components ranged from 1 (seldom or never) to 4 (almost always). These four variables form very strong and significant Mokken scales in each individual panel wave: H-coefficients range from .70 to .74, and even the weakest item (reading international news) has an Hi value of at least .67.
Political efficacy is a combination of six items: (1) “I am well capable of playing an active role in politics”, (2) “I have a clear picture of the most important political issues in our country”, (3) “Politics sometimes seems so complicated that people like me can hardly understand what is going on”, (4) “Parliamentarians do not care about the opinions of people like me”, (5) “Political parties are only interested in my vote and not in my opinion”, and (6) “People like me have no influence at all on government policies”. The answer options were yes (1) or no (0). The scale’s internal consistency (α = .65) is somewhat downwardly biased because the items are dichotomous (auxiliary analyses showed that the tetrachoric inter-item correlations are considerably higher than the Pearson’s r correlations used in calculating Cronbach’s alpha).
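For reference, Cronbach’s alpha follows directly from the item and sum-score variances; the sketch below illustrates the calculation reported above, assuming hypothetical variable names and that the negatively worded items (3–6) have been reverse-coded first.

```python
# Minimal sketch of the reliability check: Cronbach's alpha for the six 0/1
# efficacy items (assuming the negatively worded items are reverse-coded).
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of 0/1 scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    scale_variance = items.sum(axis=1).var(ddof=1)
    # Alpha based on Pearson covariances understates reliability for
    # dichotomous items, consistent with the tetrachoric check in the text.
    return (k / (k - 1)) * (1.0 - item_variances.sum() / scale_variance)
```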
To measure political action (or “unconventional” political participation), respondents were asked the following question: “There are different ways of raising a political issue or of influencing politicians or government. Can you indicate which of the following ways you have exercised over the past 5 years?”. We counted the affirmative responses to the following items (yes/no): (1) called in radio, television or a newspaper, (2) called in a political party or organization, (3) participated in a government-organized public hearing, discussion or citizens’ participation meeting, (4) contacted a politician or civil servant, (5) participated in a protest action, protest march or demonstration, (6) participated in a political discussion or campaign by Internet, e-mail or SMS, and (7) something else.
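As a hedged illustration (hypothetical column names), this measure is simply the row-wise count of “yes” answers across the seven items:

```python
# Minimal sketch: number of political actions as the count of 'yes' (1) answers
# across the seven items listed above; column names are hypothetical.
import pandas as pd


def political_action_count(wave: pd.DataFrame) -> pd.Series:
    action_cols = [f"action_{i}" for i in range(1, 8)]  # 1 = yes, 0 = no
    return wave[action_cols].eq(1).sum(axis=1)
```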
Comparing the means of the political indicators across the different waves, we find evidence of period effects on political interest and political activism: both show a modest downward trend. These effects are remarkable, as elections took place in June 2010, directly before the final panel wave. If anything, the elections should have boosted interest and activism compared to the pre-election waves. The occurrence of these period effects demonstrates the importance of a research design that incorporates within-person changes and a relevant control group. It should be noted that, in the presence of period effects, the interpretation of socialization or schools-of-democracy effects may change: an effect of active involvement in associations could also mean ‘less decline’ rather than a ‘boost’ (i.e. an absolute increase) in interest and activism.
Active Involvement in Voluntary Associations
The question about (active) associational participation was similar to the question in the 2002 European Social Survey and the 1999/2000 survey Citizenship, Involvement and Democracy. Respondents were asked to indicate whether they had participated in a list of different voluntary associations in the past 12 months. For each association, respondents reported yes or no to five modes of involvement: (1) no connection, (2) donated money, (3) participated in an activity, (4) member, and (5) performed voluntary work. Whereas modes 2 through 5 all count as associational involvement, we include only (3) participated in an activity and (5) performed voluntary work in our measure of active involvement.
The list of voluntary associations in our analyses consisted of the following: (1) a sports club or club for outdoor activities, (2) a cultural association or hobby club, (3) a trade union, (4) a business, professional or agrarian organization, (5) a consumers’ organization or automobile club, (6) an organization for humanitarian aid, human rights, minorities or migrants, (7) an organization for environmental protection, peace organization or animal rights organization, (8) a religious or church organization, (9) a science, education, teachers’ or parents’ association, (10) a social society; an association for youth, pensioners/senior citizens, women; or friends’ clubs, and (11) other organizations that you can freely join. We left out political parties to avoid circular reasoning (Footnote 2). The final measure we used is a categorical variable that captures different participation trajectories or transitions in these associations (see “Analytical Strategy” section).
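As a hedged sketch (assuming a hypothetical data layout with one yes/no column per association type and mode), the per-wave indicator of active involvement could be derived as follows:

```python
# Minimal sketch, assuming columns such as "assoc3_mode5" coded 1 = yes, 0 = no
# for the 11 association types and 5 modes of involvement listed above.
import pandas as pd

ACTIVE_MODES = (3, 5)   # 3 = participated in an activity, 5 = performed voluntary work
N_ASSOCIATIONS = 11     # political parties are excluded from the list


def actively_involved(wave: pd.DataFrame) -> pd.Series:
    """1 if the respondent reports mode 3 or 5 for at least one association type, else 0."""
    active_cols = [
        f"assoc{a}_mode{m}" for a in range(1, N_ASSOCIATIONS + 1) for m in ACTIVE_MODES
    ]
    return wave[active_cols].eq(1).any(axis=1).astype(int)
```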
Control Variables
Income is the monthly net household income, divided by 1000 (mean 2.96; SD 4.72). We used the imputed version provided by the coordinators of the survey (the calculations can be found at http://www.lissdata.nl/dataarchive/hosted_files/download/1579). Education is a 6-point scale, ranging from primary education to university degree (mean 3.57; SD 1.48). Occupational status was one of the following three categories: high (higher academic, independent, or supervisory profession), intermediate (intermediate academic, independent, supervisory, or commercial profession), and low (other mental work or manual work). Those without a job were asked about their last job; if respondents never worked, their scores are missing. Furthermore, our analyses included controls for age (mean 46.97; SD 15.98) and gender. Finally, personality was measured with the well-known inventory of the “Big Five” personality traits (Goldberg 1992). The control variables were measured at the start of the panel. When values were missing, we used the nearest non-missing value (where possible).
Analytical Strategy
Our analytical strategy is similar for all four political indicators. Figure A.1 (online Appendix) provides an overview of the measurement of our variables. P1 represents the first wave of the questionnaire about politics and values; this module contained the political interest, internal efficacy, and political action variables. S1 represents the first wave of the questionnaire about social integration; this module contained the civic participation variables as well as the indicators of political discussion. We divided respondents into groups according to the participation transitions they reported (questionnaires S1, S2, and S3). The groups that are of substantive interest are listed and labeled in Figure A.1. For example, “incidental joiners” (pattern 010) are respondents who did not participate actively in a voluntary association at S1 (0 meaning not involved), then joined (at least) one between S1 and S2 (1 meaning involved), and subsequently quit between S2 and S3 (0 again).
Our main focus in the interpretation of the results is on the difference between non-participants (pattern 000) and joiners (011) for the entry effects, and on the difference between persistent participants (111) and leavers (100) for the exit effects. See Van Ingen and Bekkers (2015) for a similar strategy, as well as a discussion of modelling longitudinal effects of multiple memberships. The largest groups are in a stable state, i.e. they are either non-participants (“000”; 29 %) or persistent participants (“111”; 26 %). Among the respondents who joined (at least) one voluntary association between waves 1 and 2, around 50 % quit participating again between waves 2 and 3: the groups of early joiners (011) and incidental joiners (010) are of roughly equal size (both around 8 % of the sample). The late joiners (001) constitute 7 % of the sample.
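A hedged sketch of this grouping step (hypothetical function and label names; the complete list of patterns of interest is given in Figure A.1) could look as follows:

```python
# Minimal sketch: concatenate the three binary involvement indicators (S1-S3)
# into a pattern string and map it to a group label; labels follow the text above.
import pandas as pd

GROUP_LABELS = {
    "000": "non-participants",
    "111": "persistent participants",
    "011": "early joiners",
    "010": "incidental joiners",
    "001": "late joiners",
    "100": "leavers",
}


def participation_group(s1: pd.Series, s2: pd.Series, s3: pd.Series) -> pd.Series:
    """Assumes complete 0/1 indicators; respondents with other patterns stay unlabeled."""
    pattern = (
        s1.astype(int).astype(str)
        + s2.astype(int).astype(str)
        + s3.astype(int).astype(str)
    )
    return pattern.map(GROUP_LABELS)  # unlisted patterns become NaN and are excluded later
```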
We apply latent growth curve models to examine our data (using Stata 13). Based on the four-wave measurement of the political indicators, we estimate a (latent) intercept and a (latent) slope of the individual political socialization trajectories of respondents: the slope represents the amount of growth; the intercept represents the initial score (wave 1) on the political indicator. Respondents with participation transitions other than the ones listed in Figure A.1 (e.g. because of missing values) were not included in our models. We estimated our structural equation models using “maximum likelihood with missing values” (see StataCorp 2013; Footnote 3). The advantage of this method is that respondents did not have to complete all four questionnaires to be included in the analyses.
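Formally, a minimal specification consistent with this description (the notation, including the linear time scores 0–3 and the group indicator g_i, is ours and not taken from the article) is:

```latex
% y_{it}: score of respondent i on a political indicator in wave t (P1-P4)
\begin{aligned}
y_{it}    &= \eta_{0i} + \lambda_t \, \eta_{1i} + \varepsilon_{it}, \qquad \lambda_t = 0, 1, 2, 3,\\
\eta_{0i} &= \alpha_0 + \gamma_0 \, g_i + \zeta_{0i},\\
\eta_{1i} &= \alpha_1 + \gamma_1 \, g_i + \zeta_{1i},
\end{aligned}
```

where η_0i is the latent intercept (initial score), η_1i the latent slope (linear growth), and g_i a dummy (or set of dummies) for the participation-transition group. Read this way, group differences in γ_0 correspond to selection (pools-of-democracy) effects and group differences in γ_1 to socialization (schools-of-democracy) effects, which is how the comparisons below proceed.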
In order to test our pools-of-democracy hypotheses, we need to find out whether members were already more politically socialized before they entered an association. This can be tested with our models by comparing the intercept of the “joiners” with that of the “non-participants”: as can be seen in Figure A.1, P1 precedes the moment of joining an association between S1 and S2. Additionally, a selection effect may exist with regard to remaining involved in a voluntary association. This can be tested by comparing the intercepts of the “persistent participants” (111) with those of the “leavers” (100) (the intercepts represent the scores prior to the decision to exit).
In order to test our schools-of-democracy hypotheses, we need to find out whether members became more politically socialized after they joined an association. This can be tested by comparing the growth “curve” (estimated as a linear trend in our models) of the joiners with the growth curve of the non-participants. Similarly, desocialization effects are tested by comparing the growth curves of the “persistent participants” (111) with those of the “leavers” (100).
By comparing participation transition groups, we avoid bias caused by period effects (which are present in our data): the non-participants (000) act as a quasi-control group. We note, furthermore, that the group of “persistent participants” (111) is also of interest, but the interpretation of their scores is not straightforward. If their intercept differs from that of the non-participants, this can reflect both selection and causation (just as in cross-sectional research); if their growth curve differs, it may represent a late or lagged effect (since they had already been participating for an unknown period before we started monitoring them at S1).