Introduction

There is a large body of evidence documenting the importance of adult literacy and numeracy for a broad range of economic and social outcomes (Dinis da Costa et al. 2014; Hanushek et al. 2015; Hanushek and Woessmann 2015; OECD 2013, 2016). Literacy and numeracy proficiencies have been linked to employment, earnings, health status, social trust, political efficacy and civic engagement. Even after taking educational attainment into account, strong relationships are apparent between proficiency levels and economic and social outcomes. At the macro level, growth of a country’s national gross domestic product (GDP) has been associated with increasing levels of literacy and numeracy proficiency (Schwerdt and Wiederhold 2018). Franziska Hampf et al. (2017) have provided various kinds of converging evidence that the observed relationships between proficiencies and economic outcomes are causal in nature.

Beyond supporting important economic and social outcomes, literacy and numeracy proficiency are essential for sustainable development. High levels of adult literacy and numeracy are an integral part of United Nations Sustainable Development Goal 4 (SDG 4), being both prerequisites for and by-products of lifelong learning. Policies and programmes that foster increased literacy and numeracy proficiency thus effectively support SDG 4, which calls on Member States to “ensure inclusive and equitable quality education and promote lifelong learning opportunities for all” (UIS 2018, p. 7).

Forecasts of – and policy frameworks for increasing – future literacy and numeracy levels of adult populations and workforces are generally based on changes in the literacy and numeracy levels of children leaving school in the future (Hanushek and Woessmann 2015; Vézina and Bélanger 2019). Although these future schooling outcomes are undeniably a key component of future adult proficiency levels, not enough attention is generally paid to changes over time in the proficiency levels of adults who are already beyond the reach of the school system. The majority of the 2030 workforce, for example, has already completed its formal education, so changes in these adults’ proficiency levels over time will form a major component of the change in proficiency levels in the overall population between now and then. To take this component of future change into account, we need to know much more about how adults’ proficiencies change across the lifespan and how various activities, policies and incentives may shape these changes.

Our theoretical framework for this article is practice engagement theory (PET) (Reder 1994; Sheehan-Holt and Smith 2000), which provides a testable account of how proficiency may change during adulthood. It holds that frequent engagement in reading, writing and maths activities fosters the growth of underlying literacy and numeracy proficiencies. As we will demonstrate, PET has important implications for lifelong learning and SDG 4, providing a broad framework which is suitable for designing effective policy and programmatic interventions that foster proficiency growth.

PET has been rigorously tested with data from a longitudinal study of a random sample of a population with a low level of education in a metropolitan area in the United States (Reder 2009a). It also has empirical support from large-scale cross-sectional studies of national adult populations (Jonas 2018; Sheehan-Holt and Smith 2000). However, due to data limitations, PET has not yet been tested longitudinally with a broad, nationally representative population sample. The study we present in this article contributes to the empirical literature on proficiency development in adulthood and offers the first longitudinal test of PET in a nationally representative population sample.

Proficiency development in adults

A large, multi-disciplinary body of research has investigated how literacy and numeracy proficiencies develop across the adult lifespan, examining proficiency development from diverse perspectives and identifying a number of possible mechanisms underlying observed changes across individuals, countries and time (Desjardins and Warnke 2012; Green 2013). Other researchers (Barrett and Riddell 2016; Green and Riddell 2013) have examined proficiency change and ageing. Scholars of work-based learning (e.g. Billett 2004; Skule 2014) have focused on the characteristics of jobs and workplaces that foster learning and proficiency development. Marco Paccagnella (2016) examined the decline of proficiencies in older adults and conceptualised how various employment practices and policies may affect proficiency decline. Several researchers have examined proficiency development in terms of skills obsolescence, and have identified a variety of underemployment situations and career interruptions that foster proficiency loss (Bynner and Parsons 1998; de Grip and van Loo 2002; Edin and Gustavsson 2008).

The effects of a variety of interventions in adult life on proficiency development have also been studied. The second and third authors of this article (Gauly and Lechner 2019; Gauly et al. forthcoming) have examined the impact of work-related training on the development of literacy and numeracy proficiency. They found that the frequently reported positive association between training and proficiency results from a selection effect (i.e., those with higher proficiencies are more likely to enter training) rather than from a causal effect of training on proficiency. Others have studied the impact of adult basic education on proficiency development in adults with low levels of proficiency or education (Brooks et al. 2001; Reder 2009b, 2019a; Sheehan-Holt and Smith 2000; Wolf and Jenkins 2014) and have produced inconsistent findings about whether programme participation has a significant impact on proficiency development.

Practice engagement theory

We argue that PET can help make this body of research on adult proficiency development more coherent. PET specifies how engagement in reading, writing and maths activities in everyday life (whether at work or outside of work contexts) affects literacy and numeracy proficiency development over the adult lifespan. PET was initially based on cross-cultural and cross-situational qualitative research on literacy practices and proficiencies. It posits that individuals’ literacy proficiencies develop as a by-product of their engagement in everyday reading and writing practices and, reciprocally, that literacy proficiencies affect levels of engagement in reading and writing practices (Reder 1994). Quantitative modelling of PET became possible as large-scale surveys started to measure literacy and numeracy proficiencies along with the use of those proficiencies in everyday practices at work and outside of the workplace (Sheehan-Holt and Smith 2000; Smith 1996, 2009). The work of Nicolas Jonas (2018) is among the few studies using these data to apply path models, identifying significant reciprocal influences between numeracy proficiency and engagement in maths practices, a pattern he terms a “virtuous circle”.

More robust support for PET came from a longitudinal study of an adult population with a low level of education that followed individuals in the United States over eight years with repeated measurements of both literacy proficiency and engagement in reading and writing practices (Reder 2009b, 2019a). Cross-lagged structural equation models were fitted to these panel data, showing statistically significant positive effects of practice engagement on proficiency change and reciprocal positive effects of proficiency on changes in practice engagement.

The reciprocal linkage between proficiency and practice engagement that characterises PET enables practice engagement to mediate relationships between experiences in adult life and proficiency development. This is theoretically and practically important because there is considerable evidence that practice engagement is more malleable than proficiencies. This means that programmes, policies and interventions can target short-term changes in practice engagement levels, with the expectation that, through the mechanism of PET, these will effect longer-term changes in proficiency. For example, in analysing the impact of adult basic education programmes on students’ proficiency trajectories, the first author of this article (Reder 2009b) and Janet Sheehan-Holt and Cecil Smith (2000) found that programmes have a short-term impact on students’ levels of practice engagement in the first year after programme exit, but no significant short-term effect on proficiency change. Over time, however, programmes have a substantial impact on proficiency change when assessed five years after programme exit, due to the long-term mediating effects of programme-generated change in practice engagement (Reder 2019a). This helps explain why other studies of the impact of adult basic education programmes have produced inconsistent results: when programmes are assessed at relatively short follow-up intervals, the assessments tend to show non-significant effects (e.g. Wolf and Jenkins 2014); follow-ups after three years (e.g. Brooks et al. 2001) show small but significant effects; and five-year follow-ups show substantial effects of participation on proficiency growth (Reder 2019a).

In testing PET with the longitudinal data collected in the German national extension study, PIAAC-Longitudinal (PIAAC-L), we asked two research questions:

  1. Is adults’ engagement in reading practices associated with the development of their literacy proficiency over time? How about their engagement in maths practices or writing practices?

  2. Is adults’ engagement in maths practices associated with the development of their numeracy proficiency over time? How about their engagement in reading practices or writing practices?

Methods

The PIAAC-L survey

For our analysis, we used data from Round 1 of the First Cycle of the Programme for the International Assessment of Adult Competencies (PIAAC), conducted in 23 countries, including Germany, in 2011–2012. Initiated by the Organisation for Economic Co-operation and Development (OECD), PIAAC is an international “adult skills” survey that measures general proficiencies, including literacy and numeracy, in the adult population (aged 16–65 years). We focused on the German PIAAC sample in order to use additional data available from the national extension study, PIAAC-Longitudinal (PIAAC-L). The target population of the German PIAAC studies was adults randomly selected from local population registers in randomly selected municipalities in Germany.

PIAAC-L comprises three waves of data collection in 2014, 2015 and 2016, with the second wave in 2015 also containing a repeated proficiency assessment. This enabled us to investigate the effects of practice engagement on proficiency change. We combined data from the years 2012 and 2015 because these contained the proficiency assessments as well as the information on engagement with reading, writing and maths tasks we required for our analyses. We considered only the native German-speaking population. Our sample consisted of 2,989 individuals for whom we had complete information in 2012 and 2015 on all analytical variables.

Variables used in modelling

Dependent variables

The dependent variables in our models are literacy and numeracy, each assessed in PIAAC 2012 and again in PIAAC-L 2015. PIAAC(-L)’s assessment of literacy and numeracy is based on respondents’ answers to sets of cognitive items of varying difficulty. Based on these responses, the OECD derived ten plausible values for literacy proficiency and ten for numeracy proficiency, all on 0–500-point scales (OECD 2013). Further information about the literacy and numeracy assessment frameworks, scaling methodology and sample cognitive items used in PIAAC are available in OECD (2013), PIAAC Literacy Expert Group (2009) and PIAAC Numeracy Expert Group (2009).

Practice engagement variables

PIAAC collected self-reported frequencies of performing specific tasks involving reading, writing and maths.

The eight reading tasks were:

Read …

  • directions or instructions

  • letters, memos or e-mails

  • articles in newspapers or magazines

  • articles in professional journals or publications

  • books

  • manuals or reference materials

  • bills, invoices, bank statements or other financial statements

  • diagrams, maps or schematics


The four writing tasks were:

  • Write letters, memos or e-mails

  • Write articles

  • Write reports

  • Fill in forms


The six maths tasks were:

  • Calculate prices, costs or budgets

  • Use or calculate fractions, decimals or percentages

  • Use a calculator, either hand-held or computer-based

  • Prepare charts, graphs or tables

  • Use simple algebra or formulas

  • Use more advanced mathematics or statistics such as calculus, complex algebra, trigonometry or regression techniques


Respondents indicated, on a five-point Likert scale, how often they performed each task:

  • Never

  • Less than once a month

  • Less than once a week but at least once a month

  • At least once a week but not every day

  • Every day

All respondents were asked how often they performed each task outside of work and, if they were currently working, how often they performed each task at work. We created new derived practice engagement (skill use) variables which merged activities across the at-work and outside-of-work contexts. For each task, the greater of the two reported frequencies (at work and outside of work) was used as the frequency for the merged item. For example, if an individual reported reading newspapers or magazines “every day” at work and “at least once a week but not every day” outside of work, the merged frequency would be “every day”. For individuals who were not employed at the time of the interview, the merged frequency was simply the outside-of-work frequency. We then used the partial credit model of item response theory (Hamel et al. 2016; Masters 1982) to scale these merged frequency items into an overall index of the breadth and frequency of reading engagement (RE). We used the same procedure to create merged writing engagement (WE) and maths engagement (ME) indices. These index variables were scaled with means set to 0.
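To make the merging rule concrete, the sketch below applies it to a single reading task, coding the response categories 0 (“never”) to 4 (“every day”). The variable names are illustrative rather than actual PIAAC-L codebook names, and the subsequent partial-credit-model scaling is only indicated in a comment rather than implemented.

```python
import numpy as np
import pandas as pd

# Hypothetical frequency codes: 0 = never, ..., 4 = every day.
# Column names such as "read_news_work" are illustrative only.
df = pd.DataFrame({
    "employed":       [1, 0, 1],
    "read_news_work": [4.0, np.nan, 2.0],   # missing for non-employed respondents
    "read_news_home": [2.0, 3.0, 4.0],
})

def merge_contexts(work: pd.Series, home: pd.Series, employed: pd.Series) -> pd.Series:
    """Greater of the at-work and outside-of-work frequencies for employed
    respondents; the outside-of-work frequency for everyone else."""
    merged = np.fmax(work, home)                      # element-wise max, NaN-tolerant
    return pd.Series(np.where(employed == 1, merged, home), index=home.index)

df["read_news_merged"] = merge_contexts(
    df["read_news_work"], df["read_news_home"], df["employed"]
)
print(df)

# Repeating this for all eight reading tasks and scaling the merged items with a
# partial credit model (e.g. via an IRT package) would yield the RE index;
# the WE and ME indices are built analogously from the writing and maths tasks.
```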

Covariates

In order to control for confounding factors in the relationship between practice engagement and proficiency, we included the following set of covariates in our analyses:

  • Age: Respondents’ age in years at time of 2012 interview.

  • Gender: Binary flag = 1 if female, 0 if male.

  • Education: Years of schooling corresponding to the highest level of educational attainment in 2012.

  • Employed: Binary flag = 1 if employed at time of 2012 interview, otherwise 0.

  • Educational Gain: Binary flag = 1 if the educational attainment reported in 2015 is higher than in 2012, otherwise 0.

Analytical methods

We analysed the growth of literacy proficiency and of numeracy proficiency between 2012 and 2015 using ordinary least squares (OLS) regression models of the difference between proficiency at the two points in time. For each proficiency, we examined eight models, all of which used the 2012–2015 difference in the assessed proficiency as the dependent variable and a common set of independent variables comprising the respective proficiency assessed in 2012 and the five covariates specified in the preceding section. The eight models differ in which, if any, measures of practice engagement are included as additional independent variables.
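Written out in our own notation (the symbols below are ours, not taken from the original tables), the most general literacy specification is

$$ L_{i,2015} - L_{i,2012} \;=\; \beta_0 + \beta_1 L_{i,2012} + \beta_2 RE_i + \beta_3 WE_i + \beta_4 ME_i + \boldsymbol{\gamma}'\mathbf{x}_i + \varepsilon_i , $$

where $\mathbf{x}_i$ collects the five covariates (age, gender, education, employment and educational gain). The baseline model constrains $\beta_2 = \beta_3 = \beta_4 = 0$, the intermediate models set one or two of these coefficients to zero, and the numeracy models replace the literacy score $L$ with the numeracy score $N$.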

The baseline model for each proficiency has no added practice engagement measures. The additional seven models of proficiency growth comprise three that contain one of the practice engagement measures (RE, WE or ME), three that contain two of those measures, and one that contains all three. In all of our models we accounted for both the complex sample design of PIAAC-L and the measurement error present in the literacy and numeracy assessments at each point in time, with the latter taking into account the ten plausible values for each proficiency domain.
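As an illustration of how the ten plausible values can be handled in such a regression, the sketch below estimates the gain model once per plausible value and pools the results with Rubin’s combination rules. It is a simplified illustration with hypothetical column names and a single weight variable; the full PIAAC-L replicate-weight machinery needed for design-based standard errors is not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def pooled_gain_model(df: pd.DataFrame, domain: str = "lit", n_pv: int = 10) -> pd.DataFrame:
    """Fit the proficiency-gain regression once per plausible value and pool
    the estimates with Rubin's rules (simplified sketch, hypothetical names)."""
    coefs, variances = [], []
    for pv in range(1, n_pv + 1):
        d = df.copy()
        # Hypothetical plausible-value columns, e.g. pvlit1_2012 ... pvlit10_2015
        d["gain"] = d[f"pv{domain}{pv}_2015"] - d[f"pv{domain}{pv}_2012"]
        d["prof_2012"] = d[f"pv{domain}{pv}_2012"]
        fit = smf.wls(
            "gain ~ prof_2012 + RE + WE + ME + age + female + educ_years"
            " + employed + educ_gain",
            data=d,
            weights=d["weight"],
        ).fit()
        coefs.append(fit.params)
        variances.append(fit.bse ** 2)
    coefs, variances = pd.DataFrame(coefs), pd.DataFrame(variances)
    point = coefs.mean()                               # pooled point estimates
    within = variances.mean()                          # mean within-imputation variance
    between = coefs.var(ddof=1)                        # between-imputation variance
    total = within + (1 + 1 / n_pv) * between          # Rubin's total variance
    return pd.DataFrame({"coef": point, "se": np.sqrt(total)})
```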

A key question in modelling the growth of literacy proficiency is whether RE is a significant positive predictor of the gain in literacy proficiency from 2012 to 2015, with literacy proficiency in 2012 and the five covariates controlled. We examined the statistical significance of the RE coefficient that is added to the baseline model. We also compared how well the literacy growth model fits with and without RE. Since the baseline model is nested within the model with RE added, we used a likelihood ratio test to assess whether the added RE term generated a statistically significant improvement in model fit (Wilks 1938) compared to the baseline model. The null hypothesis tested is that the two models fit the data equally well.
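In symbols, writing $\ell_0$ and $\ell_1$ for the maximised log-likelihoods of the baseline model and of the model with RE added, the test statistic is

$$ LR = 2\,(\ell_1 - \ell_0), $$

which under the null hypothesis is asymptotically distributed as $\chi^2$ with one degree of freedom, since the two models differ by a single parameter.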

We used a similar approach to examine the estimated models of numeracy growth between 2012 and 2015. We were particularly interested in whether ME was a significant positive predictor of numeracy proficiency gain between 2012 and 2015, with numeracy proficiency in 2012 and the five covariates controlled. In addition to testing the significance of the ME coefficient, we used the likelihood ratio test to compare how well the numeracy growth model fits, with and without ME included.

Results

Descriptive statistics for proficiency, practice engagement and covariate variables

Table 1 displays the means and standard deviations for the proficiency and practice engagement variables, as well as for the five covariates. It is worth noting that the literacy and numeracy variables were assessed in both 2012 and 2015, whereas the practice engagement variables RE, WE and ME and the covariates age, gender, education and current employment were measured only in 2012. One covariate, educational gain, served as a binary flag for an increase in an individual’s total years of education between 2012 and 2015.

Table 1 Descriptive statistics for proficiency, practice engagement and covariate variables

Table 1 shows little overall change in either the population’s literacy proficiency or its numeracy proficiency between 2012 and 2015. We will return to this point below. The practice engagement variables are each scaled to have mean 0, as shown in the table. The average age of the target population in 2012 was 42. Nearly half (49%) of the population was female. The average number of years of schooling in 2012 was 13.6. A small percentage (16%) reported completing more years of education (i.e. educational gain) in 2015 than in 2012. The vast majority (82%) of the population was employed in 2012.

Proficiency gains from 2012 to 2015

We found the mean change in literacy proficiency from 2012 to 2015 to be 1.2 scale points (on the 0–500-point scale) with a standard deviation of 25.4 scale points, which is not statistically different from zero (z = 1.43, p > 0.05). Similarly, the mean change in numeracy proficiency over the same period was 0.6 scale points with a standard deviation of 31.3 scale points, also not statistically different from zero (z = 0.61, p > 0.05). Thus, at the population level, neither literacy nor numeracy proficiency changed significantly over the three years between 2012 and 2015. At the same time, we found considerable heterogeneity in individuals’ proficiency gains, even though the population’s average gain was essentially zero. For example, as noted in our literature review above, participants’ age in 2012 should be negatively related to their proficiency gains, with younger adults tending to have positive gains and older adults smaller and even negative gains (proficiency decline).
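For clarity, the test statistic here is the usual

$$ z = \frac{\bar{\Delta}}{\widehat{\mathrm{SE}}(\bar{\Delta})}, $$

where $\bar{\Delta}$ is the mean proficiency change and its standard error is estimated taking the complex sample design and the plausible-value measurement error into account (as described under Analytical methods above), so it is larger than the naive $s/\sqrt{n}$ computed from the reported standard deviation would be.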

Figure 1 displays the mean proficiency gain for adults of different initial ages. As expected from previous research, we found a negative relationship between proficiency gain and age, with younger adults showing the largest average gains and older adults showing the smallest and even negative gains. The youngest age group, the 16–24-year-olds in 2012, had mean literacy and numeracy gains of 8.7 and 11.4 scale points, respectively. This is consistent with overall findings from the longitudinal and other studies mentioned above in our literature review. Because other experiences known to influence proficiency, such as education and employment, are usually correlated with age, the effects of age and other demographic variables are best examined in the multivariate setting of regression models, which we turn to next.

Fig. 1 Proficiency gains, 2012–2015, by age in 2012

Regression models of gain in literacy proficiency

Table 2 displays eight OLS regression models for literacy proficiency gain. Each model includes literacy proficiency in 2012 and the five covariates as independent variables – age, gender, education, employment and educational gain between 2012 and 2015. The eight models differ with respect to the combination of practice engagement variables – RE, WE and ME – included as independent variables. Model (1) is the baseline model that includes no practice engagement variables.

Table 2 Literacy proficiency gain, 2012–2015, regressed on literacy proficiency in 2012, practice engagement measures and covariates

In the baseline model of literacy proficiency gain, we find statistically significant negative effects of 2012 literacy proficiency and positive effects of years of education (as of 2012), as well as a statistically significant negative effect of age. The effects of gender, current employment and educational gain between 2012 and 2015 are not statistically significant. In general, we found this pattern of effects to be the same across the other models in the table. The negative effect of age on literacy gain from 2012 to 2015 was expected, both based on previous research and on the data shown in Figure 1.

It is noticeable that the R2 values shown for the regression models are quite small, accounting for only 11 to 13 per cent of the variance in literacy proficiency gain. These small values contrast with the larger R2 values that accompany regressions using the same covariates to predict literacy proficiency at either point in time. The difference between individuals’ proficiency at the two points in time is more difficult to predict, partly because of the overall stability of proficiencies across the lifespan and partly because of the increased measurement error accompanying the difference between two repeated measurements.

Model (2) adds the reading engagement measure RE to the baseline model. We found RE to be a statistically significant positive predictor of literacy gain with the other variables controlled. With literacy proficiency in 2012 and other variables controlled, literacy proficiency in 2015 averages 4.69 scale points higher per unit of RE. Since RE is scaled with a standard deviation of 0.86 (Table 1), the RE increment is equivalent to about five literacy scale points per standard deviation of reading engagement.

Because of the small R2 values involved in both models being compared, we were interested to find out whether adding RE to the baseline model would significantly improve the overall fit of Model (2) compared to the baseline model. Since Model (1) is nested in Model (2), we were able to use a likelihood ratio test to compare the two models’ fit to the data. The test statistic of twice the difference in the two models’ log likelihoods is asymptotically distributed as chi-squared (χ2) with one degree of freedom for the extra parameter RE estimated in Model (2): χ2 = 63.3, df = 1, p < 0.001. We therefore rejected the null hypothesis that the two models fit the data equally well and concluded that adding reading engagement to the predictive model significantly improved the overall fit.
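As a quick check of the reported significance level, the upper-tail probability of a χ2 statistic with one degree of freedom can be computed directly; any value above roughly 10.8 already corresponds to p < 0.001.

```python
from scipy.stats import chi2

# Upper-tail (survival) probability of the reported likelihood-ratio statistics
print(chi2.sf(63.3, df=1))   # literacy model, far below 0.001
print(chi2.sf(37.8, df=1))   # numeracy model (reported further below), also < 0.001
```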

When we look at the effects of other practice engagement variables and combinations of practice engagement variables in Models (3)–(8) in Table 2, several points stand out. First, RE remains a statistically significant and positive predictor of gain in literacy proficiency with WE and/or ME also in the model. Second, WE and ME are statistically significant predictors in some specifications (Models 3, 4 and 7), but are not as strong as RE.

These findings are consistent with the idea that the closer a set of practices is to the cognitive proficiency being assessed, the stronger the effect its practice engagement measure will have on the growth of that proficiency. Since the literacy proficiency assessed in PIAAC-L involves mastery of everyday reading tasks, we expected and found RE to be the strongest predictor of literacy proficiency growth (compared to the effects of WE and ME). We will return to this important point after examining the corresponding numeracy proficiency results.

Regression models of gain in numeracy proficiency

Table 3 is parallel to Table 2, displaying eight OLS regression models for numeracy proficiency gain between 2012 and 2015. Overall findings are similar to those for literacy proficiency. The baseline model of numeracy proficiency gain has statistically significant negative effects of 2012 numeracy proficiency and positive effects of years of education (as of 2012), as well as a statistically significant negative effect of age. The effects of gender, current employment and educational gain between 2012 and 2015 turned out not to be statistically significant.

Table 3 Numeracy proficiency gain, 2012–2015, regressed on numeracy proficiency in 2012, practice engagement measures and covariates

This pattern of covariate effects is the same that we found for literacy and holds across all the other models in Table 3. The R2 values are in the same range as those in Table 2 for literacy proficiency, accounting for 11 to 12 per cent of the variance.

Model (2) adds the maths engagement measure ME to the baseline model. We see that ME is a statistically significant positive predictor of numeracy gain with the other variables controlled. With numeracy proficiency in 2012 and other variables controlled, numeracy proficiency gain averages 4.23 scale points more per unit of ME. Since ME is scaled with a standard deviation of 0.90 (Table 1), the ME increment is equivalent to about five numeracy scale points per standard deviation of maths engagement.

As with the literacy proficiency models, we were also interested to find out whether adding ME to the baseline model would significantly improve the overall fit of Model (2) compared to the baseline model. Since Model (1) is nested in Model (2), we were able to use a likelihood ratio test to compare how well the two models fit the data. The test statistic of twice the difference in the two models’ log likelihoods is asymptotically distributed as chi-squared with one degree of freedom for the extra parameter ME estimated for Model (2): χ2 = 37.8, df = 1, p < 0.001. We therefore rejected the null hypothesis that the two models fit the data equally well. We found that adding maths engagement to the predictive model significantly improved the overall fit.

Considering the effects of other practice engagement variables and combinations of practice engagement variables in Table 3, several points stand out. First, ME remains a statistically significant and positive predictor of gain in numeracy proficiency with RE and/or WE also in the model. Second, RE and WE are not statistically significant predictors of numeracy gain. These findings are consistent with the idea that the closer a set of practices is to the cognitive proficiency being assessed, the stronger the effect its practice engagement measure has on the growth of that proficiency. Since the numeracy proficiency assessed in PIAAC-L is framed around everyday maths, we expected and found ME to be the strongest predictor of numeracy proficiency growth (compared to the effects of RE and WE).

Summary and discussion

Our findings provide longitudinal support for PET using nationally representative data and proficiency measures from national and international surveys. The findings are consistent across the literacy and numeracy proficiency domains. In terms of the specific questions we asked in testing PET, we found that engagement in reading practices is positively associated with the development of individuals’ literacy proficiency over time. Likewise, engagement in maths practices is positively associated with the development of individuals’ numeracy proficiency over time. We found considerable specificity in which set of everyday practices best predicts the growth of these proficiencies. The closer a set of practices is to the cognitive proficiency being assessed, the more strongly its practice engagement measure predicts the growth of that proficiency. For literacy, engagement in reading practices is the strongest predictor of proficiency growth. For numeracy, engagement in maths practices is the strongest predictor of proficiency growth. In each proficiency domain, we found the magnitude of the practice engagement effect to be relatively small – about five proficiency scale points over a three-year period per standard deviation of practice engagement. Other longitudinal research suggests that these practice-based proficiency gains will continue to grow as practice engagement and associated proficiency development continue over longer time intervals (Reder 2009a). Additional longitudinal research that incorporates repeated measures of both practice engagement and proficiencies at multiple points in time can help clarify and quantify these relationships.

Concerning the covariates in our models, we found that age and education significantly affect the observed gains, while neither gender nor employment status proved to be a statistically significant predictor in the multivariate models. In line with previous research, we found age to be negatively related to both literacy and numeracy gains, with younger adults averaging positive gains and older adults averaging negative gains. This trend is summarised by the statistically significant negative coefficient of age in the multivariate regression models predicting proficiency gains.

Much previous research, of course, has established education as a strong positive predictor of literacy and numeracy proficiencies in adult populations at a given point in time. Our results for predicting proficiency gain between two points in time, however, appear somewhat different. We found years of education completed by 2012 to be a statistically significant positive predictor of proficiency gain between 2012 and 2015, indicating that education not only predicts adult proficiencies at a given point in time, it also predicts changes in adult proficiencies over time. Educational gain between 2012 and 2015, however, is not a significant predictor of proficiency gain over that period. These findings suggest that adult learners’ enrolment in formal education may not serve to foster the development of either literacy or numeracy proficiency. Previous research has also indicated that job-related training does not foster proficiency gains either (Gauly and Lechner 2019; Gauly et al. forthcoming).

It will be helpful in future research to have more refined measures of practice engagement in order to obtain sharper differentiation of how specific engagement measures are related to the growth of specific proficiencies such as literacy and numeracy. The practice engagement measures we used in this study appear somewhat arbitrary in how some tasks are mapped onto specific practice engagement indices. For example, “reading financial statements” is a constituent task of the reading engagement index RE, but not of the maths engagement index ME, even though both literacy and numeracy proficiencies may well be involved in performing the task. Although research organisations have invested heavily in developing proficiency assessment frameworks and instruments, much less effort has been made to develop corresponding frameworks and instruments for measuring practice engagement (skill use). Jonas (2018, pp. 69–70) offers a number of specific methodological improvements that would be very helpful in this regard. Progress in this area may well require iterative cycles of research and development that support better theory and understanding of the relationships between specific proficiencies and practices.

Future research, examining the PIAAC-L or other longitudinal data sets, should also look more closely at the role of employment in proficiency development. Although current employment status (at the time of interviews conducted in 2012) did not have a significant effect on proficiency gain, it is quite possible that employment activities between 2012 and 2015 might have had an impact on proficiency growth. There are indications in other research that skill development may be embedded in complex interactions between characteristics of workers and their workplaces (Felstead et al. 2019; Inanc et al. 2015).

Our findings have several important implications for programme design in adult education and lifelong learning. Victoria Purcell-Gates et al. (2002) studied adult education programmes that focus on helping students successfully engage with personally meaningful reading and writing practices. They found that these programmes increase levels of everyday practice engagement. This was observed not only during students’ enrolment in the programmes but also after the programmes had ended. Analyses of data from a longitudinal panel study that collected repeated measures of adults’ proficiencies and levels of practice engagement found that basic skills programmes foster short-term increases in practice engagement that over time lead to longer-term changes in proficiency (Reder 2009a). Together with this research, our present findings suggest that practice-centred formal and non-formal instructional programmes may point in a promising direction for innovation in adult education and lifelong learning (Reder 2009b; Sheehan-Holt and Smith 2000).

Our study also has important implications for policymaking. Policies that foster increased adult engagement in everyday reading, writing and maths practices will support lifelong learning and proficiency growth and should broaden access to continuing education and vocational training. These outcomes are central to the fourth United Nations Sustainable Development Goal (SDG 4), which focuses on education.

There is a growing base of evidence suggesting that increased practice engagement may also have wider benefits for individuals and societies. Previous research has found that key social outcomes measured in PIAAC in numerous countries – social trust, general health, political efficacy and volunteerism – are all positively associated with practice engagement, even with proficiency, education and other variables controlled (Jonas 2018; Reder 2017). These relationships hold for general adult populations, for low-proficiency adult populations and for other vulnerable populations such as incarcerated adults (Reder 2019b). Further research is needed that tracks these and other outcomes longitudinally along with levels of practice engagement in order to improve our understanding of the causalities that may be involved, and the potential impact of practice engagement-centred policies and programmes.