
Are Spanish Children Taking Advantage of their Weekly Classroom Time?

Abstract

There is a common belief in Spain that a large amount of classroom time indicates better academic achievement, owing to children’s prolonged exposure to the teaching-learning process. Nevertheless, international evidence does not support this belief: the number of weekly instruction hours that children receive in Spain is well above that provided in other countries which clearly outperform Spain in international assessments. This research therefore analyses two issues regarding weekly instruction time: first, whether instruction time per week affects the academic achievement of Spanish children; second, whether this potential effect differs across Spanish regions (Autonomous Communities). To do so, we exploit student fixed effects between subjects to estimate the potential causal effect of weekly instruction time on students’ academic achievement. The main results indicate that, for Spain as a whole, weekly instruction time does not seem to affect children’s academic achievement. However, this lack of influence may reflect offsetting effects of weekly instruction time on students’ academic achievement in some Spanish Autonomous Communities, namely Catalonia, Navarre and the Basque Country. In view of these results, we propose some policy interventions. We also highlight the importance of studying each country’s particular case with respect to instruction time, as its effect may differ across countries.


Notes

  1. A complete review of the literature on instruction time and its effect on students’ academic achievement can be found in Gromada and Shewbridge (2016).

  2. These Autonomy Statutes were approved (for each AACC, the date of the last reform is indicated between brackets) in 1979 for the Basque Country and Catalonia (2006); 1981 for Andalusia (2007), Asturias (1999), Cantabria (1998) and Galicia; 1982 for Aragon (2007), Canary Islands (1996), Castile-La Mancha (2014), La Rioja (1999), Murcia (2013), Navarre (2010) and Valencia (2006); 1983 for Balearic Islands (2007), Castile and Leon (2007), Extremadura (2011) and Madrid (1998); 1995 for Ceuta and Melilla.

  3. To check this, we performed a literature analysis of 5 relevant economics journals (Economics of Education Review, The Economic Journal, Education Economics, Empirical Economics and The Journal of Human Resources), together with the Revista de Educación, looking for publications using PISA data for Spain since 2005. Only 24 papers in these journals dealt with Spanish PISA data, and only 4 of them studied or mentioned the effect of instruction time on students’ academic achievement in some way, though merely as a control variable rather than as the main focus, and the effect obtained for this variable could not be considered causal. This literature analysis has not been included for reasons of space, but will be provided by the authors upon request.

  4. Specifically, the questions asked in PISA 2015 are: ST059 (Q01TA-Q03TA) “How many <class periods> per week are you typically required to attend for the following subjects?”, which students can answer separately for reading, mathematics and science, and ST61 “How many minutes, on average, are there in a < class period>?”.

  5. Thus, Ceuta and Melilla are not included in this analysis.

  6. Information about the content of the booklets for PISA 2015 is available in OECD (2016b, Chapter 2). The information about the imputation procedure is available in OECD (2009).

  7. The criterion used by PISA to select students is age: PISA samples 15-year-old students, not students in a particular grade.

  8. It is relevant to highlight that these statistics differ from the ones reported in OECD (2016a), as we have excluded from the sample those children who could potentially bias our results, as described in the current section.

  9. Pursuant to the education law in 2015 (Organic Law 8/2013, 9th December, LOMCE, article 6 bis d) 3°, BOE 2013), Spanish schools had autonomy to define instruction time for each subject taught in their centres (subject to specific limitations of the Spanish Government and the Education Administrations on the minimum number of instruction hours per week). Thus, together with the previously highlighted heterogeneity in the amount of instruction time in each Spanish AACC due to their autonomy in education, this law highlights the potential existence of heterogeneity in the amount of instruction time per week received for each subject by Spanish children in different schools.

  10. Cattaneo et al. (2017), Lavy (2015) and Rivkin and Schiman (2015) studied this topic, but the latter two did not use the recommended practices indicated by PISA to analyse their data.

  11. When using student fixed effects to estimate equation (1) to obtain the base model, characteristics that are constant within a student across subjects do not need to be controlled for.

  12. The average per school of the weekly instruction time self-reported by students has been employed by Lavy (2015) and Rivkin and Schiman (2015), who also aggregated students’ academic achievement to the school level, by grade and subject. Self-reported weekly instruction time may suffer from misreporting, because students may not correctly recall the instruction time received per lesson or the number of weekly lessons, rather than from other problems such as absenteeism. Unfortunately, we cannot control for absenteeism because the information provided by PISA 2015 on this matter is not very reliable (students are asked about it, but only in respect to the two previous weeks). However, as schooling is compulsory for the sampled students, it is not legal for them to skip classes; therefore, not controlling for absenteeism should not be a problem. The results of a robustness check on this issue reported in Appendix 2 corroborate that employing the school average of self-reported student answers does not bias the conclusions obtained. Moreover, since the 2006 education law (“Ley Orgánica de Educación” of 2006, i.e. LOE; BOE 2006, the one prior to LOMCE, BOE 2013), the Spanish government has not included the distribution of hours per subject in each Spanish AACC in its annual education reports (MECD 2009, p. 170), so we cannot compare our results with official information from the government. Hence, PISA 2015 information about weekly instruction time is the best and most recent source available to study the current issue for Spain and its AACC.

  13. Lavy (2015) and Rivkin and Schiman (2015) did not test this assumption.

  14. To check this hypothesis it is necessary to verify whether the φk of both specifications are equal, i.e., φ2 for specification s = 1 is equal to φ1 in specification s = 2.

  15. This hypothesis can be tested by obtaining the net effect of βs from βs + φs, for each sth subject under analysis. Defining ϑs = βs + φs, we have to subtract from each of the sth ϑs coefficients the effect of the corresponding φs from the kth specification (where k ≠ s), i.e., for specification s = 1 we obtain β1 by subtracting from ϑ1 in this specification the coefficient of φ1 in specification s = 2, while for specification s = 2 we obtain β2 by subtracting from ϑ2 in this specification the coefficient of φ2 in specification s = 1.

  16. PISA 2015 included a teacher questionnaire which gathered information about teachers who were eligible to teach the tenth grade in Spain; however, it was only available for the general sample of Spain, which is a restricted sample (6736 student observations) not representative at the Autonomous Community level.

  17. In PISA 2015, Spain presented a mean score in reading of 496 and SD 87, while in mathematics it presented a mean score of 486 and SD 85 (OECD 2016a).

  18. Checks were also performed using the school quality variables employed in the robustness check in Table 10, Appendix 2, to see whether the quality of education interacts with each one of the Spanish AACC and instruction time. The effect of all school quality variables was null. These estimations are available from the authors upon request.

  19. Students of model A are those who are taught in Spanish in all levels, cycles and modalities, having Basque as a subject; students of model G are taught only in Spanish; students of model B are those who are taught in Basque, having Spanish as a subject and as language in some subjects, depending on the level, cycle and modalities; students of model D are taught only in Basque, with the exception of the Spanish language subject.

  20. The group of specific subjects students have to choose from includes: “Performing arts and dance”, “Scientific culture”, “Classic culture”, “Plastic, visual and audiovisual education”, “Philosophy”, “Music”, “Second foreign language”, “Information and communication technologies”, “One optional core subject” and “Autonomic specific subject”. The “One optional core subject” means that students can choose from those which they did not choose as “Optional core subject” (they had to select two optional core subjects from a list which is different depending on their track). In this sense, for the Academic track, the optional core subjects are: “Biology and Geology”, “Physics and Chemistry”, “Economics” and “Latin”; in the case of the Applied Education track, the optional core subjects are: “Applied sciences to professional activities”, “Technology” and “Initiation to entrepreneurship and business activity”.

  21. This declaration about “asignaturas que distraen” (“subjects which distract”) was reported in the Spanish press, e.g. in the major Spanish newspaper “El País” (https://elpais.com/sociedad/2012/09/02/actualidad/1346620941_402605.html).

  22. The ESCS index was created using father’s education (questions ST007, ST008), father’s occupation (ST015), mother’s education (ST005, ST006), mother’s occupation (ST014) and home resources (ST011, ST012, ST013) (OECD 2016b, Chapter 16).

  23. The interpretation of this coefficient is that a one-minute increase in weekly instruction time produces a 0.001 SD increase in students’ academic achievement, so an increase of one weekly hour would entail a 0.06 SD improvement in students’ academic achievement. On the PISA scale, in which scores are measured with mean 500 and standard deviation 100, a one-hour increase in weekly instruction time would raise students’ academic achievement by 6 points for high-ESCS students, which is a very small effect in the context of the PISA scale.

  24. Although it could be interesting to decompose weekly instruction time into the number of lessons per week and the duration of those lessons, the latter variable does not vary by subject in PISA 2015, which does not allow us to use student fixed effects.

  25. The precise composition of these variables is published in the PISA 2015 technical report (OECD 2016b, Chapter 16).
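The within-student, between-subjects strategy described in notes 11 and 12 can be illustrated with a small simulation. Everything below (sample size, coefficient value, noise levels) is invented for illustration and is not the paper’s data or estimate; the sketch only shows why characteristics that are constant for a student drop out of the regression:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 2000                                  # hypothetical number of students
beta = 0.05                               # true effect, chosen for illustration

# Long format: one row per student-subject pair (e.g. reading, mathematics).
df = pd.DataFrame({
    "student": np.repeat(np.arange(n), 2),
    "subject": np.tile(["reading", "maths"], n),
    "time": rng.normal(4, 1, 2 * n),      # weekly instruction hours
})
ability = np.repeat(rng.normal(0, 1, n), 2)   # unobserved student effect
df["score"] = beta * df["time"] + ability + rng.normal(0, 0.5, 2 * n)

# Student fixed effects: demeaning within student across subjects wipes out
# anything constant for a student (ability, family, school environment),
# which is why such controls are unnecessary in the base model (note 11).
df["score_dm"] = df["score"] - df.groupby("student")["score"].transform("mean")
df["time_dm"] = df["time"] - df.groupby("student")["time"].transform("mean")

beta_hat = (df["score_dm"] @ df["time_dm"]) / (df["time_dm"] @ df["time_dm"])
print(round(float(beta_hat), 3))
```

With two subjects per student, this demeaned regression is numerically equivalent to first-differencing scores and instruction time across the two subjects.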

References

  • Andersen, S. C., Humlum, M. K., & Nandrup, A. B. (2016). Increasing instruction time in school does increase learning. Proceedings of the National Academy of Sciences of the United States of America, 113(27), 7481–7484. https://doi.org/10.1073/pnas.1516686113


  • Baker, D. P., Fabrega, R., Galindo, C., & Mishook, J. (2004). Instructional time and national achievement: Cross-national evidence. Prospects, 34(3), 311–334. https://doi.org/10.1007/s11125-004-5310-1


  • Belle, D. (1999). The After-school Lives of Children: Alone and With Others While Parents Work. Mahwah: Lawrence Erlbaum Associates.

  • BOE (2006). Organic Law 2/2006, 3rd May, of Education (LOE). N° 106, 4th May 2006, 17158–17207. Spain.

  • BOE (2013). Organic Law 8/2013, 9th December, for the improvement of the education quality (LOMCE). N° 295, 10th December 2013, 97858–97921. Spain.

  • BOJA (2016). Order 111/2016, 14th June. N° 122, 28th June 2016, 27–45. Andalusia.

  • BON (2015). Order 24/2015, 22nd April. N° 127, 2nd July 2015, 1–149. Navarre.

  • BOPV (2016). Order 236/2015, 22nd December. N° 9, 15th January 2016, 1–279. Basque Country.

  • Bunce, D. M., Flens, E. A., & Neiles, K. Y. (2010). How long can students pay attention in class? A study of student attention decline using clickers. Journal of Chemical Education, 87(12), 1438–1443. https://doi.org/10.1021/ed100409p


  • Cattaneo, M. A., Oggenfuss, C., & Wolter, S. C. (2017). The more, the better? The impact of instructional time on student performance. Education Economics, 25(5), 433–445. https://doi.org/10.1080/09645292.2017.1315055


  • DOGC (2015). Order 187/2015, 25th August. N° 6945, 28th August 2015. Catalonia.

  • Doncel, D. (2014). Curricular organization of collective identities in Spain. Revista de Educación, 366, 12–42. https://doi.org/10.4438/1988-592X-RE-2014-366-273


  • Durlak, J. A., Weissberg, R. P., & Pachan, M. (2010). A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. American Journal of Community Psychology, 45(3–4), 294–309. https://doi.org/10.1007/s10464-010-9300-6


  • García-Pérez, J. I., Hidalgo-Hidalgo, M., & Robles-Zurita, J. A. (2014). Does grade retention affect students’ achievement? Some evidence from PISA. Applied Economics, 46(12), 1373–1392. https://doi.org/10.1080/00036846.2013.872761


  • Gobierno de Navarra (2016). PISA 2015. Avance de resultados. Pamplona, Spain: Departamento de Educación. Sección de Evaluación y Calidad.

  • Gromada, A., & Shewbridge, C. (2016). Student Learning Time: A Literature Review. OECD Education Working Papers, No. 127. Paris: OECD Publishing. https://doi.org/10.1787/5jm409kqqkjh-en.

  • Hanushek, E. A. (2010). Education production functions: Developed countries evidence. In D. J. Brewer & P. J. McEwan (Eds.), International encyclopedia of education (pp. 132–136). Amsterdam: Elsevier.


  • Hille, A., & Schupp, J. (2015). How learning a musical instrument affects the development of skills. Economics of Education Review, 44, 56–82. https://doi.org/10.1016/j.econedurev.2014.10.007


  • INE (2017). Encuesta de condiciones de vida. http://www.ine.es/jaxiT3/Tabla.htm?t=9963&L=0 (last accessed: June 2017).

  • Lavy, V. (2015). Do differences in schools’ instruction time explain international achievement gaps? Evidence from developed and developing countries. The Economic Journal, 125, F397–F424. https://doi.org/10.1111/ecoj.12233


  • Lipscomb, S. (2007). Secondary school extracurricular involvement and academic achievement: A fixed effects approach. Economics of Education Review, 26, 463–472. https://doi.org/10.1016/j.econedurev.2006.02.006


  • McCammon, L. A., Saldaña, J., Hines, A., & Omasta, M. (2012). Lifelong impact: Adult perceptions of their high school speech and/or theatre participation. Youth Theatre Journal, 26(2), 2–25. https://doi.org/10.1080/08929092.2012.678223


  • MECD (2009). Sistema estatal de indicadores de la educación. Edición 2009. Madrid: Ministerio de Educación.

  • Metzler, J., & Woessmann, L. (2012). The impact of teacher subject knowledge on student achievement: Evidence from within-teacher within-student variation. Journal of Development Economics, 99(2), 486–496. https://doi.org/10.1016/j.jdeveco.2012.06.002


  • Mullis, I. V. S., Martin, M. O., Foy, P., & Drucker, K. T. (2012). PIRLS 2011 International Results in Reading. Chestnut Hill: TIMSS & PIRLS International Study Center.

  • OECD (2009). PISA Data Analysis Manual. SPSS Second Edition. OECD Publishing.

  • OECD (2011). Quality Time for Students: Learning In and Out of School. OECD Publishing. https://doi.org/10.1787/9789264087057-en.

  • OECD (2013). PISA 2012 results: What makes schools successful? Resources, policies and practices (volume IV). PISA, OECD Publishing. https://doi.org/10.1787/9789264201156-en.

  • OECD (2014). PISA 2012 Technical Report. PISA, OECD Publishing.

  • OECD (2015). Education at a Glance 2015: OECD Indicators. Paris: OECD Publishing. https://doi.org/10.1787/eag-2015-en.

  • OECD (2016a). PISA 2015 Results (Volume I): Excellence and Equity in Education. Paris: PISA, OECD Publishing. https://doi.org/10.1787/9789264266490-en.

  • OECD (2016b). PISA 2015 Technical Report. PISA, OECD Publishing.

  • OECD (2016c). PISA 2015 Technical Standards. PISA, OECD Publishing.

  • Patall, E. A., Cooper, H., & Allen, A. B. (2010). Extending the school day or school year. Review of Educational Research, 80(3), 401–436. https://doi.org/10.3102/0034654310377086


  • Rivkin, S. G., & Schiman, J. C. (2015). Instruction time, classroom quality, and academic achievement. The Economic Journal, 125, F425–F448. https://doi.org/10.1111/ecoj.12315


  • Walberg, H. J., Niemiec, R. P., & Frederick, W. C. (1994). Productive curriculum time. Peabody Journal of Education, 69(3), 86–100. https://doi.org/10.1080/01619569409538779


  • Woessmann, L. (2010). Institutional determinants of school efficiency and equity: German states as a microcosm for OECD countries. Journal of Economics and Statistics (Jahrbücher für Nationalökonomie und Statistik), 230(2), 234–270. https://doi.org/10.1515/jbnst-2010-0206



Acknowledgements

This work has been partly supported by the Consejería de Innovación, Ciencia y Empresa de la Junta de Andalucía (PAI group SEJ-532 and Excellence research group SEJ-2727); by the Ministerio de Economía y Competitividad (Research Project ECO2014-56397-P) and the FPU scholarship of the Ministerio de Educación, Cultura y Deporte (FPU2014 04518). Luis Alejandro Lopez-Agudo also acknowledges the training received from the University of Malaga PhD Programme in Economy and Business [Programa de Doctorado en Economía y Empresa de la Universidad de Malaga].

Author information


Corresponding author

Correspondence to Luis Alejandro Lopez–Agudo.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Appendices

Appendix 1

Table 4 Descriptive statistics of scores and weekly instruction time by Autonomous Community
Table 5 Estimations for checking the equality of students’ unobserved subject-specific effect of weekly instruction time on students’ academic achievement and the equality of the effect of weekly instruction time on students’ academic achievement of both subjects

Appendix 2. Robustness checks

Focusing on the robustness checks, in Table 6 the sample has been divided into tertiles of the economic, social and cultural status index provided by PISA 2015 (ESCS)Footnote 22: high, medium and low. As the results show, weekly instruction time does not seem to affect academic achievement at any ESCS level, with the exception of the high ESCS tertile; however, the weak significance of this coefficient (p-value of 0.085) and its small magnitude in the context of the PISA score scale do not indicate a meaningful effect of weekly instruction time on academic achievement.Footnote 23 Thus, the null effect obtained in the base model is not the consequence of offsetting effects across ESCS tertiles.
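The magnitude behind the high-ESCS coefficient can be checked arithmetically, assuming (as the text states) a PISA scale with mean 500 and standard deviation 100:

```python
# Convert the high-ESCS estimate (0.001 SD per minute, from the text) into
# the effect of one extra weekly hour, on the assumed PISA scale (SD 100).
effect_per_minute_sd = 0.001
minutes_per_hour = 60
pisa_sd = 100

effect_per_hour_sd = effect_per_minute_sd * minutes_per_hour   # in SD units
effect_per_hour_points = effect_per_hour_sd * pisa_sd          # in PISA points
print(effect_per_hour_sd, effect_per_hour_points)
```

Six PISA points per extra weekly hour is small relative to, for example, the roughly 30-point gap often equated with one year of schooling in PISA commentary.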

Table 6 Estimations of the effect of weekly instructional time on students’ academic achievement by ESCS tertile

The robustness check presented in Table 7 divides the Spanish AACC into three groups according to their relative at-risk-of-poverty rate in 2015 (compared with all AACC), using the at-risk-of-poverty data provided by INE (2017), in order to check whether the effect of weekly instruction time on children’s academic achievement differs depending on the relative poverty group to which an AACC belongs. The results corroborate the lack of effect of weekly instruction time in all three groups, which reinforces our main results and indicates that this lack of effect is not the result of offsetting effects being pooled when all three groups enter the same regression.
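A three-way split of this kind (and likewise the ESCS tertiles of Table 6) can be reproduced with a quantile cut. The data below are simulated stand-ins, not the INE figures:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Hypothetical at-risk-of-poverty rates for 17 AACC (simulated, not INE data).
aacc = pd.DataFrame({
    "aacc": [f"region_{i}" for i in range(17)],
    "poverty_rate": rng.uniform(10, 40, 17),
})

# Divide the AACC into three roughly equal-sized groups by relative rate,
# mirroring the construction behind Table 7.
aacc["group"] = pd.qcut(aacc["poverty_rate"], q=3,
                        labels=["low", "medium", "high"])
counts = aacc["group"].value_counts()
print(counts.sort_index())
```

The estimation would then run the fixed-effects regression separately within each group, comparing the instruction-time coefficients across groups.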

Table 7 Estimations of the effect of weekly instructional time on students’ academic achievement by AACC relative at risk of poverty group

The next robustness check analyses whether grouping students by ability within schools could influence the results obtained for weekly instruction time, in the sense that its effect may differ across ability groups. Not accounting for this grouping could then mean that differential effects of weekly instruction time on children’s academic achievement in each grouping option offset one another, producing an apparent lack of effect. Specifically, differences in instruction time between groups could arise because, for example, children in higher-ability groups receive more weekly instruction time to take advantage of their better skills or, on the contrary, lower-ability children receive more weekly instruction time to compensate for their lower ability. To perform this check we divided the sample according to the information provided by the head teacher on this grouping; specifically, whether the school applies any of these criteria: grouping students by ability into different classes, grouping students by ability within their classes, or not grouping students by ability. A descriptive analysis shows that the three grouping options present similar amounts of weekly instruction time, so a priori grouping would not seem to drive the lack of effect of weekly instruction time on academic achievement.

In addition, it is important to highlight that the estimations of this robustness check were performed using students’ self-reported weekly instruction time, instead of the school average of weekly instruction time. We can thus account for the possibility that differences in the amounts of instruction time students report are caused by their grouping. The results of this robustness check, presented in Table 8, show no effect of weekly instruction time for any of the three grouping options. This reinforces the argument in favour of averaging students’ self-reported weekly instruction time by school in order to reduce reporting errors arising from students’ difficulties in recalling it or from absenteeism. Although we cannot control for absenteeism, as previously argued, this should not affect our results, since attendance is compulsory in secondary education.

Table 8 Estimations of the effect of weekly instructional time on students’ academic achievement conditioned by school grouping by ability

The robustness check presented in Table 9 considers the possibility that weekly instruction time has a nonlinear effect on students’ academic achievement (due to children’s decreasing attention; Bunce et al. 2010),Footnote 24 in which case the lack of effect could be due to a specification error. We specified this nonlinear effect by means of two different strategies: adding a squared term for weekly instruction time, and dividing it into three categorical variables according to its distribution. In both cases the results hold, so the lack of effect obtained is not caused by a misspecification of the functional form of the model.
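The two nonlinearity strategies can be sketched as follows. The data and coefficients are simulated for illustration (a deliberately concave effect is built in); only the specifications, not the paper’s estimates, are shown:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
time = rng.uniform(2, 6, n)              # hypothetical weekly hours
# Simulated outcome with a known concave effect, for illustration only.
score = 0.30 * time - 0.02 * time**2 + rng.normal(0, 0.1, n)

# Strategy 1: add a squared term for weekly instruction time.
X = np.column_stack([np.ones(n), time, time**2])
coef_quad, *_ = np.linalg.lstsq(X, score, rcond=None)

# Strategy 2: replace the continuous regressor with tertile dummies
# (low tertile is the omitted reference category).
cuts = np.quantile(time, [1 / 3, 2 / 3])
tertile = np.digitize(time, cuts)        # 0 = low, 1 = middle, 2 = high
D = np.column_stack([np.ones(n), tertile == 1, tertile == 2]).astype(float)
coef_dum, *_ = np.linalg.lstsq(D, score, rcond=None)

print(np.round(coef_quad, 3), np.round(coef_dum, 3))
```

If the true relationship were nonlinear, the squared term (or differences between tertile dummies) would be significant; in the paper’s data neither is, supporting the linear specification.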

Table 9 Estimations of the nonlinear effect of weekly instructional time on students’ academic achievement

Finally, Table 10 presents the results related to the interaction of weekly instruction time with some measures of school quality provided by the head teacher in the school questionnaire. Specifically, the interacted school quality variables are all derived variables and indexes created by the OECD from other variables in the school questionnaire.Footnote 25 These are: shortage of educational material, shortage of educational staff, proportion of teachers with an ISCED level 5a bachelor qualification, proportion of teachers with an ISCED level 5a masters qualification, proportion of teachers with an ISCED level 6 qualification, student behaviour hindering learning and teacher behaviour hindering learning. The results indicate that none of these school quality variables conditions the lack of effect of weekly instruction time on children’s academic achievement. Although shortage of educational material is weakly significant, its effect is quantitatively almost zero; the same applies to the student behaviour hindering learning index.

Table 10 Estimations of the effect of weekly instructional time on students’ academic achievement in relation to school quality


Cite this article

Lopez–Agudo, L.A., Marcenaro–Gutierrez, O. Are Spanish Children Taking Advantage of their Weekly Classroom Time?. Child Ind Res 12, 187–211 (2019). https://doi.org/10.1007/s12187-018-9537-4


Keywords

  • Weekly instruction time
  • Indicators of academic achievement
  • Spanish regions
  • Secondary education
  • Student fixed effects