1 Introduction

At the peak of the educational disruption caused by COVID-19, over 1.5 billion students in roughly 190 countries missed out on learning at school. Given the scale and duration of school closures under COVID-19, the learning loss is likely to result in long-term negative consequences. During school closures, governments attempted to minimize the adverse effects. For instance, in at least 34 states in the US, local entities such as school districts and state departments of education partnered with public television stations to support online learning for students and teachers. The severity of the educational disruption, including whether government attempts to alleviate it were successful, has not yet been examined in detail. To help fill this gap in the literature, in this paper we document how an online learning service enabled students to study during the COVID-19 pandemic in Japan.

The Japanese government began a nationwide school closure on March 2, 2020, which continued for 3 months until the end of May 2020. There is evidence that the school closure was unexpected and that many schools were unprepared for online learning. In this paper, we estimate the effect of the COVID-19 school closure on the utilization of an online learning service by comparing user logs in 2020 with those from 2019. Our findings help in understanding how an online learning service could make up for study time lost during COVID-19.

For these analyses, we have access to data on user logs of the online education service Surala in Japan. Although the service offers a variety of materials for students in a wide range of grades, in this study we focus on user logs of lecture materials and management functions for junior high school and high school students. An important feature of our dataset is that it includes students and their teachers who subscribed to the service on a school basis. In other words, schools, rather than students and parents, made the final decision on introducing the service. This may imply that, conditional on being a student of a school in our sample, we do not expect household-level selection into adopting the service.

We find that the school closure during COVID-19 increased the study time of students using the online learning service, with the effect being strongest at the beginning of the school closure period. It decreased gradually and disappeared in June 2020, when most schools reopened. This result suggests that the online learning service enabled students to study online and compensated for the missed classes during the closure period. In addition, we find a positive effect on teachers’ effort levels, measured by messages sent from teachers to students via the online learning service. Interestingly, the effect on teachers’ effort is positively correlated with the effect on study time.

The contribution of this paper is to provide further evidence on the effects of online learning during the COVID-19 pandemic. A few studies have investigated education under COVID-19, mostly with a focus on online learning (Bacher-Hicks et al., 2020; Chetty et al., 2020). Our study is most closely related to Chetty et al. (2020), in that both describe the utilization of an online education platform. Given that Figlio et al. (2013) find that online learning is a reasonable substitute for face-to-face learning, our results suggest that online learning services have mitigated the negative consequences of COVID-19 on education.

The remainder of the paper is organized as follows: In Sect. 2, we discuss the related literature. Section 3 outlines the institutional background and the online learning service. In Sect. 4, we describe our data and Sect. 5 presents and discusses our results. Section 6 offers concluding remarks.

2 Related literature

Some studies based on observational data find negative effects of online lectures on learning (Bacolod et al., 2018; Bettinger et al., 2017; Xu & Jaggars, 2014). However, by comparing estimates from observational and experimental data, Joyce et al. (2015) argue that the former are likely to overestimate the effect of face-to-face lectures. Figlio et al. (2013) show that an online lecture in a university increased students’ grade points by up to 3 points out of 100 relative to face-to-face lectures, although it had adverse effects for males and academically weaker students. Overall, these findings suggest that an online lecture can be a reasonable substitute for a face-to-face lecture for many students. Note that these results may not apply directly to junior high school and high school students because the studies examine university students.

Several studies investigate the impacts of COVID-19 on education. One strand of the literature deals with real-time data under COVID-19. Chetty et al. (2020) examine data from an online learning service largely used as part of a math curriculum in the US. They observe a sharp drop in the average number of lessons completed in the service during COVID-19, with a particularly strong negative effect for students in low-income areas. Aucejo et al. (2020) conduct a survey of 1,500 students at Arizona State University, asking how COVID-19 affected their studies. They find a decrease in study time and a negative effect on educational outcomes, including delays in graduating and greater dropout from courses. Both Chetty et al. (2020) and Aucejo et al. (2020) describe how students’ study behavior was affected by COVID-19 and report that the total amount of study time decreased. Finally, Bacher-Hicks et al. (2020) use regional-level data on Google searches and find that the search intensity for online education material increased after the school closure and that the increase was more prominent in high-income regions.

We contribute to this strand of literature by documenting students’ and teachers’ activities in an online learning service during the COVID-19 pandemic. Our study is most closely related to Chetty et al. (2020). Similar to our study, they examine the study behavior of K-12 students in an online education service, Zearn, under the COVID-19 school closures. They also conduct their analyses by plotting the number of lessons completed in the service before and after the COVID-19 outbreak. The major difference between our settings is that, prior to COVID-19, Zearn was used as part of the official curriculum in the US, whereas Surala is primarily offered as supplementary educational material. We suspect that this difference may account for why study time increased during school closure in our study, whereas Chetty et al. (2020) find a decrease in study time.

A second strand of literature examines the long-term effect of COVID-19 based on estimates from past incidents, such as the 2001 foot and mouth disease epidemic in England (Cook, 2020), and a combination of summer vacation, weather-related school closure, and absenteeism (Kuhfeld et al., 2020). Both Cook (2020) and Kuhfeld et al. (2020) predict a negative long-term effect of COVID-19. However, these studies may overestimate the effect of COVID-19 because, as Kuhfeld et al. (2020) state, online education can mitigate the negative effects.

3 Background

In this section, we document the timeline of the school closure due to COVID-19 in Japan to help readers understand the context. It should be noted that, in contrast to the situation in many other countries, the Japanese academic year begins in April and ends in March of the following year.

On February 27, 2020, Prime Minister Shinzo Abe made a public statement that the Japanese government requested all elementary schools, junior high schools, and high schools to close from March 2, 2020 until the beginning of spring break. Spring break typically begins in the third or fourth week of March and lasts around 2 weeks, although the exact dates and duration vary by region, school, and year. Thus, the statement implied roughly 3 weeks of school closure.

The announcement came as a complete surprise, as indicated by two pieces of evidence concerning the decision-making process of the government and the government school closure policy prior to the announcement. First, the Minister of Education, Culture, Sports, Science and Technology conceded in the Diet that Prime Minister Abe had only informed him of the school closure on the morning of February 27, the day of the public announcement. The Minister’s comment reveals that only a few people close to Prime Minister Abe were involved in the decision and that the information was not shared with the education minister, let alone with other policymakers, or teachers, parents, and students.

Second, prior to the announcement, the government’s school closure policy was, as requested by the Ministry of Education, Culture, Sports, Science and Technology, that each municipality’s school board should keep schools open unless a positive case of COVID-19 was found in a school. Given that most schools had no positive cases at that time, the school boards were unlikely to expect imminent school closures. In addition, on February 25, the Ministry requested that school boards begin making contingency plans for possible school closures.

The nationwide school closure due to COVID-19 began on March 2 and lasted until the end of May in most provinces. Following the government’s announcement, nearly all schools managed to close on March 2, only 4 days after the announcement. At that time, the government stated that schools would be closed for about 3 weeks, until the beginning of spring break. On March 20, the government announced that the school closure would not be extended after spring break and released guidelines for reopening schools on March 24. Hence, schools were expected to reopen after spring break, which was the beginning of the new academic year.

However, schools did not reopen after spring break because on April 7 the government declared a state of emergency for seven provinces, which was later extended nationwide. Along with this declaration, the government asked municipalities to decide whether to reopen schools at their own discretion. This declaration resulted in a de facto continuation of the school closure. As shown in Table 1, most schools remained closed, or reclosed, after the declaration of the state of emergency.

3.1 Surala

Surala is an online learning service provider. Their service covers a variety of subjects, including math, Japanese, English, science, and social studies, and caters to all grades, from grade 1 to grade 12. Students can study materials above or below their own grade to review past materials or to prepare for university entrance exams. Subscription to the service can be on either an individual or a school basis. In this paper, we focus on users under school-wide contracts because this means that school and teacher characteristics are available. Students included in our data are enrolled in either junior high schools or high schools. Junior high school covers grades 7–9 and high school grades 10–12, but we omit grades 9 and 12 from our analysis, as discussed below.

The service provides three types of learning materials: lectures, drills, and tests. For the lecture material, students watch videos and answer quizzes during the lecture. As they can pause and rewind lectures, the time taken to complete a lecture varies between students. After each lecture, students are asked to solve drill questions. These differ from the quizzes during the lecture, which are primarily designed to draw students’ attention and ensure they understand the explanations provided in the lecture. By contrast, drills are problem sets designed to enhance students’ deeper understanding of the material.

A learning unit on Surala consists of a lecture and a drill. Several units relating to the same topic comprise one lesson. Finally, lessons are categorized into stages, depending on the level of advancement. Students may begin their study from any unit. Our data set includes user activity logs on units, but does not include any information on tests on Surala.

To manage students’ learning, Surala issues accounts for teachers, which enable them to observe how their students study online. For instance, a teacher can gauge students’ understanding based on their progress and test scores in the learning service. In addition, teachers can send messages through the service to an individual student, a group of selected students, or all students in the school.

4 Data

4.1 User activity log

We now describe our data set and define the variables used in the analysis. Our main data set is drawn from user activity logs on Surala. It includes information on when and what each student studied, and selected demographic characteristics, including grade and gender, as well as a school identifier. The data set also includes the teachers’ message log, which records the time and content of teachers’ messages to students.

4.1.1 Sample restrictions

We focus on students in junior high school (grades 7–9) and high school (grades 10–12), but we exclude the third-year students in both types of schools (i.e., those in grades 9 and 12) to avoid issues arising from attrition. Because the academic year ends in March and begins in April, most third-year students in both junior high school and high school move to a different school in April. Most junior high schools and high schools are separated, although some private schools provide a combined program.

We exclude observations for movers, students with missing information, and outliers. In the raw data, some students spent more than 10 h on one unit, whereas the average time to complete a unit is about 10 min. As we suspect that these students paid little attention to the material despite being logged on to Surala, we exclude their entire study records from the data. Specifically, students whose weekly study time exceeds the 99.9th percentile are dropped from the sample. Note that excluding outliers does not substantially change the mean study time, but slightly decreases the standard errors of our estimates. In addition, we exclude individuals with missing school identifiers from analyses that use school characteristics.
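As a concrete illustration, the outlier rule above can be sketched in pandas. The schema (`student_id`, `week`, `minutes`) is an assumption for illustration, not the actual structure of our data.

```python
import pandas as pd

def drop_outliers(logs: pd.DataFrame) -> pd.DataFrame:
    """Drop the entire study record of any student whose weekly study
    time ever exceeds the 99.9th percentile of weekly study time."""
    # Total minutes per student-week.
    weekly = logs.groupby(["student_id", "week"])["minutes"].sum()
    cutoff = weekly.quantile(0.999)
    # Students with at least one outlying week are removed entirely.
    outliers = weekly[weekly > cutoff].index.get_level_values("student_id").unique()
    return logs[~logs["student_id"].isin(outliers)]
```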

With regard to the message data, our sample includes messages sent from teachers to individual students. Although teachers can also send messages to a group of selected students or to all students in the school, 77% of messages in our data were sent to individual students. In addition, when we construct a school-level variable, such as the weekly average number of messages per school, we include in our sample the schools that used the service in January.

4.1.2 Summary statistics

Table 2 reports summary statistics for the key variables. The average time spent on completing a study unit is 8.98 min. During the period of analysis, 32% of students logged in to Surala in a given week. Among those who logged in, the average time spent studying on Surala was 73.84 min. When we include students who did not log in (i.e., the majority), the average study time is 23.29 min. In our sample, 68% of students are in high school and 55% are male. Grade and gender were reported by the school as of April 2020.

We determine whether a student ever accessed the online learning service from home before school closure using the user log. Specifically, we consider that a student did not access the service from home prior to the school closure if he/she did not use the online learning service either after 8:00 p.m. or on the weekend during the period from April to December 2019. According to this definition, 17% of students had no prior access to the online learning service from home.
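The definition above translates directly into code. A minimal pandas sketch, assuming an event-level log with columns `student_id` and `ts` (a timestamp), is:

```python
import pandas as pd

def had_home_access(logs: pd.DataFrame) -> pd.Series:
    """Flag students who accessed the service from home before the
    closure: any use after 8:00 p.m. or on a weekend, April-December
    2019. Students with no use in that window are absent from the
    result (the paper excludes them from this comparison)."""
    window = logs[(logs["ts"] >= "2019-04-01") & (logs["ts"] < "2020-01-01")]
    # After 8 p.m. on any day, or any time on Saturday/Sunday.
    at_home = (window["ts"].dt.hour >= 20) | (window["ts"].dt.dayofweek >= 5)
    return window.assign(home=at_home).groupby("student_id")["home"].any()
```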

Our data include 224 schools, of which 136 agreed to disclose their names. They consist of 41 junior high schools, 85 high schools, and 10 combined junior and senior high schools. For these schools, we used a measure of school quality published by a private firm. The measured quality was originally scaled to a mean of 50 and a standard deviation of 10 among the schools on their list. For junior high schools, the list includes only selective private schools, whereas for high schools, both public and private schools are included. The average quality index for junior high schools in our sample is 40.30, whereas that for the high schools is 49.81.

There were 4,596 messages sent from teachers to students and 95% of them were sent in 2020. Among the schools from which at least one message was sent from a teacher in a given week, the average weekly number of messages sent from a school is 10.66 and the average weekly number of teachers who sent a message is 1.41. Among teachers who sent at least one message in a given week, the average weekly number of messages is 7.56.

Finally, we collect data on the date of school closure. As described in Sect. 3, schools gradually reopened as the state of emergency was lifted in selected provinces on May 14 and in all provinces on May 25. Figure 1 shows the percentage of schools in our data that were closed. Most of them closed on March 2 and reopened from June 1, and this pattern is common across junior high schools and high schools.

Fig. 1

Fraction of closed schools in the sample. Jr. HS and HS denote the fractions of closed junior high schools and high schools, respectively

5 Results

5.1 Students’ study time

In this section, we show how students’ study time and teachers’ messaging changed in response to the COVID-19 school closure. Figure 2 presents the average weekly study time in 2019 and 2020. We use study time in 2019 as a comparison to indicate what would have happened in 2020 if the COVID-19 pandemic had not occurred. The vertical dashed line (in red) indicates the beginning of the nationwide school closure. From the first week of January 2020 to the last week of February, there is little difference in study time between 2019 and 2020. However, online study time surges in the first week of March 2020 relative to the same week in 2019. The average study time in 2020 continues to be longer than that in 2019 until the first week of June, when most schools reopened.

Fig. 2

Average weekly study time. The figure shows the average study time for students in grades 7, 8, 10, and 11

Figure 3 shows the growth of the average weekly study time from 2019 to 2020 with a 95% confidence interval. From January 1 until the end of February, there is no statistically significant difference. However, from the start of the school closure period, study time in 2020 is significantly longer than the corresponding time in 2019. In fact, based on Table 3, study time in 2020 is 22 min longer per week, which roughly amounts to two 45-min classes per month. The growth in study time remains statistically significant until the end of April; although study time remains elevated until the first week of June, the growth in May is statistically insignificant.
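The year-over-year comparison underlying these weekly estimates amounts to a difference in means between 2020 and 2019, with standard errors clustered at the school level. A minimal sketch using pandas and statsmodels, with assumed column names (`minutes`, `year`, `school_id`, `week`), is:

```python
import pandas as pd
import statsmodels.formula.api as smf

def weekly_growth(df: pd.DataFrame, week: int):
    """2020-minus-2019 difference in mean weekly study time for one
    calendar week, with school-clustered standard errors."""
    sub = df[df["week"] == week].copy()
    sub["y2020"] = (sub["year"] == 2020).astype(int)
    # OLS of minutes on a 2020 dummy: the coefficient is exactly the
    # difference in means; clustering adjusts the standard error.
    res = smf.ols("minutes ~ y2020", data=sub).fit(
        cov_type="cluster", cov_kwds={"groups": sub["school_id"]})
    return res.params["y2020"], res.bse["y2020"]
```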

Fig. 3

Change in weekly study time from 2019 to 2020. The shaded region is the 95% confidence interval computed with standard errors robust to clustering at school level. Study time is shown for students in grades 7, 8, 10, and 11

According to the weekly log-in rate in Table 2, only 32% of students study online each week. We expect that a large portion of students remain unaffected by school closure because they do not have access to the online learning service at home. Next, to determine the main driver of the overall effect, we examine the extensive and intensive margins of changes in study time. Figures 4 and 5 show changes in the extensive margin, defined by a weekly log-in indicator. They show that there is a marginally significant effect of school closure from March to the beginning of April. That is, the effect on the extensive margin disappears slightly earlier than the overall effect.
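This decomposition can be illustrated with a minimal pandas sketch: the unconditional mean study time is the product of the extensive margin (the log-in rate) and the intensive margin (study time conditional on log-in). The one-row-per-student-week schema and the column name `minutes` are assumptions for illustration.

```python
import pandas as pd

def weekly_margins(weekly: pd.DataFrame) -> pd.Series:
    """Decompose average weekly study time into the extensive margin
    (log-in rate) and the intensive margin (time conditional on
    log-in). `weekly` holds one row per student-week, with minutes
    equal to 0 if the student never logged in that week."""
    logged_in = weekly["minutes"] > 0
    extensive = logged_in.mean()                          # log-in rate
    intensive = weekly.loc[logged_in, "minutes"].mean()   # conditional time
    return pd.Series({
        "login_rate": extensive,
        "conditional_minutes": intensive,
        "unconditional_minutes": extensive * intensive,
    })
```

As a rough check against Table 2, 0.32 × 73.84 ≈ 23.6 min, close to the reported unconditional average of 23.29 min (the small gap reflects averaging over weeks).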

Fig. 4

Weekly log-in rate. The figure shows the log-in rate for students in grades 7, 8, 10, and 11. Log-in is defined as studying at least once in a given week

Fig. 5

Change of weekly log-in rate from 2019 to 2020. The shaded region is the 95% confidence interval computed with standard errors robust to clustering at school level. The figure shows the log-in rate for students in grades 7, 8, 10, and 11. Log-in is defined as studying at least once in a given week

Figures 6 and 7 show the changes in the intensive margin, defined by the average weekly study time conditional on log-in. In 2019, there are spikes at the end of both March and April. These periods correspond to the end of spring break and the 1-week-long national holiday (Golden Week), respectively. A Surala manager attributed the spikes to students studying intensively to finish homework due after the spring break and the holidays. Notwithstanding the spikes, a significant positive effect on the intensive margin persists until the end of May; that is, the effect on the intensive margin persists longer than the overall effect. In fact, the monthly analyses in Table 3 also show that in May the effect on the intensive margin remains almost as large as in the previous month, whereas the overall effect falls by half.

Fig. 6

Average weekly study time conditional on log-in. Average study time conditional on log-in for students in grades 7, 8, 10, and 11. Only students who study at least once in a given week are included

Fig. 7

Change of weekly study time conditional on log-in from 2019 to 2020. The shaded region is the 95% confidence interval computed with standard errors robust to clustering at school level. The figure shows the average study time conditional on log-in for students in grades 7, 8, 10, and 11. Only students who study at least once in a given week are included

Finally, Table 3 summarizes the results from analyses of the monthly log-in rate and study time, which show patterns consistent with the weekly analyses. Note that, in these analyses, study time is defined as the weekly average study time in a given month. In Table 3, study time, as well as the intensive and extensive margins, responds to the school closure in March, and the differences persist until schools reopen in June. For instance, the effect on unconditional study time in March is more than three times larger than the effect in May. One difference from the weekly analyses is that we detect a statistically significant effect on unconditional study time in May. Overall, the analyses in Table 3 yield results similar to the weekly analyses, with some improvement in precision.

5.2 Teachers’ messaging

Not only students but also teachers responded to the school closure. Figure 8 shows the total number of messages sent from schools each week. Before the school closure, teachers sent virtually no messages, presumably because they communicated with their students in person. The number of messages increased after the school closure, throughout March, April, and May. The largest increase occurred in the middle of April, whereas the effect on study time was largest in March. In fact, as shown in Tables 3 and 4, the effect on overall study time in March is roughly three times larger than that in May, whereas the effect on messages in March is roughly one-fifth of that in May. As March is the end of the school year, students may not have required much support from teachers, given that they mostly review materials taught in class at this time. By contrast, in April, at the beginning of the new school year, students studying new materials may require more assistance from teachers, leading teachers to send more messages in April.

Fig. 8

Total number of messages from teachers per week. The figure shows the weekly average number of messages from teachers to students per school

Figure 9 presents the number of teachers online and the number of messages sent per teacher. The upper panel of Fig. 9 shows the number of teachers who sent at least one message each week. While there were more teachers online in 2020 in every week, the movements in the number of teachers online are parallel between 2019 and 2020 for January and February. There was a rise in the number of teachers online in the second week of March 2020, with a further rise in the second week of April when the new academic year began. However, the number of teachers online fell to the pre-COVID-19 level in the second week of June 2020. The lower panel of Fig. 9 shows the average number of messages per active teacher online. This number moves in a similar fashion to the number of active teachers online, although the pattern is clearer in the upper panel. The magnitude of the change is largest in April, which is consistent with the effect on the total number of messages shown in Fig. 8. Overall, therefore, the changes in the aggregate numbers of messages were driven by both extensive and intensive margins.

Fig. 9

Numbers of teachers online (top) and messages sent per teacher. Teachers are defined as being online if they send at least one message in a given week. The number of messages per teacher is defined as the average number of messages per teacher conditional on being online

5.3 Correlation between study time and messages

Next, we examine the relationship between students’ study time and messages from teachers. One might expect that teachers paying attention to students by sending them messages would encourage students to study more. Although we cannot test this hypothesis, we can examine the association between students’ study time and teachers’ messaging. Note that we exclude outliers, defined as changes in study time exceeding 146 min (the 95th percentile) or changes in the number of messages exceeding 1,337 (the 99th percentile). The results including the outliers are reported in Appendix IV.

As Fig. 10 shows, we find a positive correlation between changes in school-level average study time and changes in the number of messages at the school level. The former is calculated as the difference between the school-level average study time from March to May in 2019 and that in 2020, where school level means the average study time per student in each school. Similarly, the change in the number of messages at the school level is measured as the difference between the total number of messages from March to May in 2019 and that in 2020 in each school.
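These school-level changes can be sketched in pandas; the input schemas (`stud` with one row per student and `msgs` with per-school message counts by year, both restricted to March–May) are illustrative assumptions:

```python
import pandas as pd

def school_changes(stud: pd.DataFrame, msgs: pd.DataFrame) -> pd.DataFrame:
    """Per-school 2020-minus-2019 changes over March-May in average
    study time per student and in the total number of messages.
    `stud` has columns school_id, year, minutes (one row per student);
    `msgs` has columns school_id, year, n_messages."""
    avg = stud.groupby(["school_id", "year"])["minutes"].mean().unstack("year")
    tot = msgs.groupby(["school_id", "year"])["n_messages"].sum().unstack("year")
    return pd.DataFrame({
        "d_study": avg[2020] - avg[2019],
        "d_msgs": tot[2020] - tot[2019],
    })
```

The correlation plotted would then be `out["d_study"].corr(out["d_msgs"])`.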

Fig. 10

Correlation between study time and messages. Observations are excluded if the change in study time is greater than 146 min (the 95th percentile) or if the change in the number of messages exceeds 1,337 (the 99th percentile). The graph with the full sample is provided in Appendix IV. The number of observations is 123 and the unit of observation is the school. Study time represents the average study time within each school in March, April, and May 2020. The number of messages represents the total number of messages sent from teachers to students in the same period

In Fig. 11, we observe a positive correlation between changes in study time and changes in the number of teachers online at the school level. We define a teacher as being online if he/she sent at least one message during the period from March to May in 2019 or in 2020. School level means that we count the number of teachers online in each school.

Fig. 11

Correlation between study time and the number of teachers online. Observations are excluded if they involve a change in study time that exceeds 1,891.319 min (the 99th percentile) or a change in the number of teachers online greater than 11 (the 99th percentile). A graph based on the full sample is provided in Appendix IV. The number of observations is 123 and the unit of observation is the school. Study time represents the average study time within each school in March, April, and May. The number of teachers online represents the number of teachers who sent a message at least once in March, April, and May

5.4 Heterogeneity

In this section, we examine the heterogeneous effects of COVID-19 on study time by prior access to the online services from home, school quality, and student gender. In Appendix V, we report the heterogeneous effects by grade, region, past utilization, and in-class utilization. The analysis compares study time in 2020 between two groups in each case. For instance, we describe heterogeneity by school quality by comparing the weekly average study time of a high-quality school in 2020 with that of a low-quality school in 2020.

5.4.1 By prior access from home and school quality

Chetty et al. (2020) find that the number of lessons completed by students from different income areas varies during school closure. We suspect that this difference arises from variations in internet access at home and study habits. In the following, we examine heterogeneity in prior access to the online study services from home and school quality.

In Fig. 12, we compare the study time of students with and without prior access to online services from home. We consider that a student has no prior access to online services from home if he/she never logged in after 8:00 p.m. on weekdays or at any time on the weekend from April 2019 to the end of December 2019. Although the study time of students with prior access to online services from home increased after the school closure, the study time of those with no prior access decreased. As shown in Fig. 13, the difference decreased over time and continued to be statistically significant until the beginning of April.

Fig. 12

Average weekly study time by prior online learning access from home in 2020. The figure shows the study time by prior access to online learning from home for students in grades 7, 8, 10, and 11. Studying at home is defined by accessing the online services after 8:00 p.m. or on the weekend. A student who never studied at home from April to December in 2019 is defined as one with no prior access from home. Note that students who never used the service from April to December in 2019 are excluded

Fig. 13

Average weekly study time by prior access from home in 2020. The shaded region is the 95% confidence interval computed with standard errors robust to clustering at school level. The figure shows the study time for students in grades 7, 8, 10, and 11 with no prior access to online learning from home minus the study time for students in grades 7, 8, 10, and 11 with access from home. Studying at home is defined by accessing the online services after 8:00 p.m. or on the weekend. A student who never studied at home from April to December in 2019 is defined as one with no prior access from home. Note that students who never used the service from April to December in 2019 are excluded

We do not consider the large difference between the two groups at the beginning of January to be problematic for our identification because it is likely to have arisen from two factors: a promotion by the company, SuRaLa Net Co., Ltd., and the New Year holiday. The promotion involved students participating in a tournament in which they were ranked by study time. Because students who could use the service from home would have found it easier to increase their study time in response to this promotion, they were likely to use the service more than students without access. In addition, as the New Year holiday falls in the first week of January, students could not access the online service through school facilities at that time. This may explain why students with no prior access from home tend to study less at the beginning of January. However, except for this period, the trend in study time before the school closure period is similar between the two groups.

In summary, we observe that students with prior access to online services from home utilized the service more under COVID-19 than did students with no prior access from home.

Second, we examine students by the quality of their schools. We consider a school to be high quality if its quality index is above the median. Figures 14 and 15 describe heterogeneity with respect to school quality. We find that students from higher-quality schools consistently studied more during the school closure period, although the difference is not statistically significant.

Fig. 14

Average weekly study time by school quality in 2020. The figure shows the study time by school quality for students in grades 7, 8, 10, and 11. School quality is defined by the level of the school, which is obtained from external sources. Low and high describe schools for which the quality level is below and above the median, respectively

Fig. 15

Heterogeneity by school quality. The shaded region is the 95% confidence interval computed with standard errors robust to clustering at the school level. The figure shows the study time for students in grades 7, 8, 10, and 11 in high-quality schools minus the study time for students in grades 7, 8, 10, and 11 in low-quality schools. School quality is defined by the level of the school, which is obtained from external sources. Low and high describe schools for which the quality level is below and above the median, respectively

5.4.2 By gender

Figlio et al. (2013) show that male students tend to struggle with online learning; therefore, we examine any differences in study time between male and female students. Figures 16 and 17 show that there is no statistically significant difference in the average weekly study time between male and female students in 2020.

Fig. 16

Average weekly study time by gender in 2020. The figure shows the study time by gender for students in grades 7, 8, 10, and 11

Fig. 17

Heterogeneity by gender. The shaded region is the 95% confidence interval computed with standard errors robust to clustering at the school level. The figure shows the study time for male students in grades 7, 8, 10, and 11 minus the study time for female students in grades 7, 8, 10, and 11

We also examine heterogeneity across several other variables, namely grade, region, previous usage, and in-class utilization. For most of these, we find no sizable difference between groups; the one exception is heterogeneity by grade, where we find a statistically significant difference between high school and junior high school students. In addition, the result on heterogeneity by past utilization of the online learning service suggests that schools with more experience using the service are more likely to use it both before and after the school closure period in 2020. More details of these results can be found in Appendix V.

6 Summary and concluding remarks

This paper documents the effect of the school closure under COVID-19 on students’ study time and teachers’ inputs in an online learning service. We find that online study time significantly increased during the school closure and returned to the pre-COVID-19 level when the closure was lifted at the end of May 2020. In addition, teachers sent more messages to students via the online service during the school closure than before or after it. We note that the effects of the school closure are heterogeneous: students with access to the online learning service from home and students at higher-quality schools increased their study time more than other students.

Our findings suggest that an online learning service may help students to continue studying during a school closure. Thus, governments may want to consider introducing online learning tools in preparation for possible future school closures. Further, policy makers should be aware that a lack of internet and/or personal computer access can widen inequality in learning during school closures.

A potential limitation of our study is that we do not observe students’ learning activities outside the service. For instance, students who studied intensively offline would not have suffered learning losses under the COVID-19 school closure, despite not studying via the online learning service. In addition, the home environment and home-study support from parents or siblings may affect students’ learning, but we cannot observe these in the current data. Future research on the impact of COVID-19 on education should complement our results by examining more comprehensive measures of the impact, such as long-term educational attainment.