In this section, we prove our main results. The plan of the proof is the following:
- we first study the limiting behaviour of the score functions \(\widehat{G}_{N,J}^\varepsilon \) and \(\widetilde{G}_{N,J}^\varepsilon \) defined in (2.14) and (2.17) as the number of observations N goes to infinity, i.e. as the final time T tends to infinity;
- we then show the continuity of the limit of the score functions obtained in the previous step and we compute their limits as the multiscale parameter \(\varepsilon \) vanishes (Sect. 5.1);
- we finally prove our main results, i.e. the asymptotic unbiasedness of the drift estimators (Sect. 5.2).
We first define the Jacobian matrix of the function \(g_j\) introduced in (2.13) with respect to a:
which will be employed in the following, and where \(\otimes \) denotes the outer product in \(\mathbb {R}^M\) and the dot denotes either the Jacobian matrix or the gradient with respect to a. Then note that, under Assumption 2.2, due to ergodicity and stationarity and by Bibby and Sørensen (1995, Lemma 3.1), we have
$$\begin{aligned}&\lim _{N\rightarrow \infty } \frac{1}{N} \widehat{G}_{N,J}^\varepsilon (a) = \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^\varepsilon } \left[ g_j \left( X_0^\varepsilon , X_\Delta ^\varepsilon , X_0^\varepsilon ; a \right) \right] \nonumber \\&\quad =:\widehat{\mathcal G}_J(\varepsilon ,a), \end{aligned}$$
and
$$\begin{aligned}&\lim _{N\rightarrow \infty } \frac{1}{N} \widetilde{G}_{N,J}^\varepsilon (a) = \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j \left( X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a \right) \right] \nonumber \\&\quad =:\widetilde{\mathcal G}_J(\varepsilon ,a), \end{aligned}$$
(5.2)
where \(\mathbb {E}^{\varphi ^\varepsilon }\) and \(\mathbb {E}^{\widetilde{\rho }^\varepsilon }\) denote, respectively, that \(X_0^\varepsilon \) and \((X_0^\varepsilon , \widetilde{Z}_0^\varepsilon )\) are distributed according to their invariant distribution. We remark that the invariant distribution \(\widetilde{\rho }^\varepsilon \) exists due to Lemma A.2. By equation (5.1), the Jacobian matrices of \(\widehat{\mathcal G}_J(\varepsilon ,a)\) and \(\widetilde{\mathcal G}_J(\varepsilon ,a)\) with respect to a are given by
$$\begin{aligned} \widehat{\mathcal H}_J(\varepsilon ,a)&:=\frac{\partial }{\partial a}\widehat{\mathcal G}_J(\varepsilon ,a)\nonumber \\&\quad = \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^\varepsilon } \left[ h_j \left( X_0^\varepsilon , X_\Delta ^\varepsilon , X_0^\varepsilon ; a \right) \right] , \end{aligned}$$
and
$$\begin{aligned} \widetilde{\mathcal H}_J(\varepsilon ,a)&:=\frac{\partial }{\partial a}\widetilde{\mathcal G}_J(\varepsilon ,a)\nonumber \\&\quad = \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ h_j \left( X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a \right) \right] . \end{aligned}$$
(5.3)
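As an illustration of the limits (5.2) and (5.3) (not part of the original argument), the following minimal Python sketch approximates \((1/N)\,\widehat{G}_{N,1}^\varepsilon (a)\) for a hypothetical toy choice: one-dimensional Ornstein–Uhlenbeck data without a fast scale, \(V(x)=x^2/2\), \(J=M=1\), with the assumed eigenpair \(\phi _1(x;a)=x\), \(\lambda _1(a)=a\) and \(\beta _1(z;a)=z\), so that \(g_1(x,y,z;a)=z\,(y-e^{-a\Delta }x)\). All of these choices are assumptions made for illustration only.

```python
import numpy as np

# Hypothetical toy setting (illustration only): Ornstein-Uhlenbeck data without a
# fast scale, dX = -a_true * X dt + sqrt(2*sigma) dW, i.e. V(x) = x^2/2 and M = 1.
# Assumed eigenpair phi_1(x; a) = x, lambda_1(a) = a, and beta_1(z; a) = z, so that
# g_1(x, y, z; a) = z * (y - exp(-a*Delta) * x).
rng = np.random.default_rng(0)
a_true, sigma, Delta, N = 1.0, 0.5, 0.1, 200_000

# Exact OU transition: X_{n+1} = e^{-a Delta} X_n + sqrt(sigma/a*(1 - e^{-2 a Delta})) xi_n.
decay = np.exp(-a_true * Delta)
noise_std = np.sqrt(sigma / a_true * (1.0 - decay**2))
x = np.empty(N + 1)
x[0] = rng.normal(0.0, np.sqrt(sigma / a_true))   # start in stationarity
for n in range(N):
    x[n + 1] = decay * x[n] + noise_std * rng.normal()

def normalized_score(a):
    """(1/N) * hat{G}_{N,1}(a): empirical average of g_1(X_n, X_{n+1}, X_n; a) / Delta."""
    return np.mean(x[:-1] * (x[1:] - np.exp(-a * Delta) * x[:-1])) / Delta

# The empirical score stabilizes around its stationary expectation: approximately
# zero at a_true and bounded away from zero elsewhere.
for a in (0.5, a_true, 2.0):
    print(f"a = {a:4.2f}   (1/N) G_N(a) = {normalized_score(a):+.4f}")
```

In this toy case the empirical score is close to zero at the true parameter and bounded away from zero otherwise, which is the behaviour exploited later when solving the estimating equation.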
Continuity of the limit of the score function
In this section, we first prove the continuity of the functions \(\widehat{\mathcal G}_J, \widetilde{\mathcal G}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^M\) and \(\widehat{\mathcal H}_J, \widetilde{\mathcal H}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^{M \times M}\). We then study the limit of these functions as \(\varepsilon \rightarrow 0\). As the proofs for the filtered and the non-filtered cases are similar, we concentrate on the filtered case and comment on the non-filtered one. Before entering into the proof, we give two preliminary technical lemmas which will be used repeatedly and whose proofs can be found, respectively, in Appendix A.1 and Appendix A.3.
Lemma 5.1
Let \(\widetilde{Z}^\varepsilon \) be defined in (2.16) and distributed according to the invariant measure \(\widetilde{\rho }^\varepsilon \) of the process \((\widetilde{X}_n, \widetilde{Z}_n)\). Then, for any \(p\ge 1\) there exists a constant \(C>0\) uniform in \(\varepsilon \) such that
$$\begin{aligned} \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left|\widetilde{Z}^\varepsilon \right|^p \le C. \end{aligned}$$
Lemma 5.2
Let \(f :\mathbb {R}\rightarrow \mathbb {R}\) be a \(\mathcal C^\infty (\mathbb {R})\) function which is polynomially bounded along with all its derivatives. Then,
$$\begin{aligned}&f(X_\Delta ^\varepsilon ) = f(X_0^\varepsilon ) - A \cdot V'(X_0^\varepsilon ) f'(X_0^\varepsilon ) \Delta + \Sigma f''(X_0^\varepsilon ) \Delta \nonumber \\&\quad + \sqrt{2\sigma } \int _0^\Delta f'(X_t^\varepsilon ) (1+\Phi '(Y_t^\varepsilon )) \,\mathrm {d}W_t + R(\varepsilon ,\Delta ), \end{aligned}$$
where \(R(\varepsilon ,\Delta )\) satisfies for all \(p\ge 1\) and for a constant \(C>0\) independent of \(\Delta \) and \(\varepsilon \)
$$\begin{aligned} \left( \mathbb {E}^{\varphi ^\varepsilon } \left|R(\varepsilon ,\Delta )\right|^p \right) ^{1/p} \le C(\varepsilon + \Delta ^{3/2}). \end{aligned}$$
We start here with a continuity result for the score function and its Jacobian matrix with respect to the unknown parameter.
Proposition 5.3
Under Assumption 2.5, the functions \(\widetilde{\mathcal G}_J : (0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^M\) and \(\widetilde{\mathcal H}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^{M \times M}\) defined in (5.2) and (5.3), where \(\Delta \) can be either independent of \(\varepsilon \) or \(\Delta =\varepsilon ^\zeta \) with \(\zeta >0\), are continuous.
Proof
We only prove the statement for \(\widetilde{\mathcal G}_J\); the argument for \(\widetilde{\mathcal H}_J\) is similar. Letting \(\varepsilon ^*\in (0,\infty )\) and \(a^*\in \mathcal A\), we want to show that
$$\begin{aligned} \lim _{(\varepsilon ,a) \rightarrow (\varepsilon ^*,a^*)} \left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \widetilde{\mathcal G}_J(\varepsilon ^*,a^*)\right\| = 0. \end{aligned}$$
By the triangle inequality, we have
$$\begin{aligned}&\left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \widetilde{\mathcal G}_J(\varepsilon ^*,a^*)\right\| \le \left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \widetilde{\mathcal G}_J(\varepsilon ,a^*)\right\| \nonumber \\&\quad + \left\| \widetilde{\mathcal G}_J(\varepsilon ,a^*) - \widetilde{\mathcal G}_J(\varepsilon ^*,a^*)\right\| =:Q_1(\varepsilon ,a) + Q_2(\varepsilon ),\nonumber \\ \end{aligned}$$
and we divide the proof into two steps, showing that both terms vanish.
Step 1: \(Q_1(\varepsilon ,a) \rightarrow 0\) as \((\varepsilon ,a) \rightarrow (\varepsilon ^*,a^*)\).
Since \(\beta _j\) and \(\phi _j\) are continuously differentiable with respect to a for all \(j=1,\dots ,J\), due to Assumption 2.5 and Lemma A.4, respectively, \(g_j\) is also continuously differentiable with respect to a. Therefore, by the mean value theorem for vector-valued functions, we have
$$\begin{aligned} \begin{aligned} Q_1(\varepsilon ,a)&\le \frac{1}{\Delta }\sum _{j=1}^J \left\| \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a) \right] \right. \\&\left. - \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^*) \right] \right\| \\&= \frac{1}{\Delta }\sum _{j=1}^J \left\| \int _0^1 \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ h_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^* \right. \right. \\&\left. \left. + t(a-a^*)) \right] \,\mathrm {d}t \; (a-a^*)\right\| . \end{aligned}\nonumber \\ \end{aligned}$$
Then, letting \(C>0\) be a constant independent of \(\varepsilon \), since \(\beta _j\) and \(\phi _j\) are polynomially bounded, again by Assumption 2.5, and \(X_0^\varepsilon \), \(X_\Delta ^\varepsilon \) and \(\widetilde{Z}_0^\varepsilon \) have bounded moments of any order by Pavliotis and Stuart (2007, Corollary 5.4) and Lemma 5.1, we obtain
$$\begin{aligned} Q_1(\varepsilon ,a) \le \frac{C}{\Delta }\left\| a-a^*\right\| , \end{aligned}$$
which implies that \(Q_1(\varepsilon ,a)\) vanishes as \((\varepsilon ,a)\) goes to \((\varepsilon ^*,a^*)\), both if \(\Delta \) is independent of \(\varepsilon \) and if \(\Delta =\varepsilon ^\zeta \).
Step 2: \(Q_2(\varepsilon ) \rightarrow 0\) as \(\varepsilon \rightarrow \varepsilon ^*\).
If \(\Delta \) is independent of \(\varepsilon \), then we have
$$\begin{aligned} \begin{aligned} \lim _{\varepsilon \rightarrow \varepsilon ^*} Q_2(\varepsilon )&= \lim _{\varepsilon \rightarrow \varepsilon ^*} \left\| \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^*) \right] \right. \\&\left. - \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^{\varepsilon ^*}} \left[ g_j(X_0^{\varepsilon ^*}, X_\Delta ^{\varepsilon ^*}, \widetilde{Z}_0^{\varepsilon ^*}; a^*) \right] \right\| \\&\le \lim _{\varepsilon \rightarrow \varepsilon ^*} \frac{1}{\Delta }\sum _{j=1}^J \left\| \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^*) \right] \right. \\&\left. - \mathbb {E}^{\widetilde{\rho }^{\varepsilon ^*}} \left[ g_j(X_0^{\varepsilon ^*}, X_\Delta ^{\varepsilon ^*}, \widetilde{Z}_0^{\varepsilon ^*}; a^*) \right] \right\| , \end{aligned}\nonumber \\ \end{aligned}$$
and the right-hand side vanishes due to the continuity of \(g_j\) for all \(j=1,\dots ,J\) and the continuity of the solution of a stochastic differential equation with respect to a parameter (see Krylov 2009, Theorem 2.8.1). Let us now consider the case \(\Delta = \varepsilon ^\zeta \) with \(\zeta >0\) and let us assume, without loss of generality, that \(\varepsilon >\varepsilon ^*\). Denoting \(\Delta ^* = (\varepsilon ^*)^\zeta \) and applying Itô’s lemma we have for all \(j=1,\dots ,J\)
$$\begin{aligned} \begin{aligned} \phi _j(X_\Delta ^\varepsilon ;a^*)&= \phi _j(X_{\Delta ^*}^{\varepsilon };a^*)\\&- \alpha \cdot \int _{\Delta ^*}^\Delta V'(X_t^\varepsilon ) \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}t\\&- \frac{1}{\varepsilon }\int _{\Delta ^*}^\Delta \phi _j'(X_t^\varepsilon ;a^*) p' \left( \frac{X_t^\varepsilon }{\varepsilon } \right) \,\mathrm {d}t\\&+ \sigma \int _{\Delta ^*}^\Delta \phi _j''(X_t^\varepsilon ;a^*) \,\mathrm {d}t\\&+ \sqrt{2\sigma } \int _{\Delta ^*}^\Delta \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}W_t, \end{aligned} \end{aligned}$$
then we can write
$$\begin{aligned}&\widetilde{\mathcal G}_J(\varepsilon ,a^*) = \frac{1}{\Delta }\sum _{j=1}^J \left( \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j(X_{\Delta ^*}^\varepsilon ;a^*) \right] \right. \nonumber \\&\quad \left. - e^{-\lambda _j(a^*)\Delta } \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \right) + R(\varepsilon ),\nonumber \\ \end{aligned}$$
where \(R(\varepsilon )\) is given by
$$\begin{aligned} \begin{aligned} R(\varepsilon )&= -\frac{1}{\Delta }\sum _{j=1}^J \int _{\Delta ^*}^\Delta \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j'(X_t^\varepsilon ;a^*) \alpha \cdot V'(X_t^\varepsilon ) \right] \,\mathrm {d}t \\&\quad -\frac{1}{\varepsilon \Delta } \sum _{j=1}^J \int _{\Delta ^*}^\Delta \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j'(X_t^\varepsilon ;a^*) p' \left( \frac{X_t^\varepsilon }{\varepsilon } \right) \right] \,\mathrm {d}t \\&\quad + \frac{\sigma }{\Delta }\sum _{j=1}^J \int _{\Delta ^*}^\Delta \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j''(X_t^\varepsilon ;a^*) \right] \,\mathrm {d}t\\&\quad + \frac{\sqrt{2\sigma }}{\Delta } \sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \int _{\Delta ^*}^\Delta \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}W_t \right] . \end{aligned}\nonumber \\ \end{aligned}$$
Let \(C>0\) be independent of \(\varepsilon \) and notice that since \(p'\) is bounded, \(\beta _j,\phi _j',\phi _j'',V'\) are polynomially bounded and \(X_t^\varepsilon \) and \(\widetilde{Z}_0^\varepsilon \) have bounded moments of any order by Pavliotis and Stuart (2007, Corollary 5.4) and Lemma 5.1, applying Hölder’s inequality we obtain
$$\begin{aligned} \left|R(\varepsilon )\right|&\le \frac{C}{\Delta }\left( \left\| \alpha \right\| + \sigma + \frac{1}{\varepsilon }\right) (\Delta - \Delta ^*)\nonumber \\&\quad + \frac{C}{\Delta }\sqrt{2\sigma } (\Delta - \Delta ^*)^{1/2}. \end{aligned}$$
(5.4)
Therefore, by the continuity of the solution of a stochastic differential equation with respect to a parameter (see Mishura et al. 2010) and due to the bound (5.4), we deduce that
$$\begin{aligned} \lim _{\varepsilon \rightarrow \varepsilon ^*} \widetilde{\mathcal G}_J(\varepsilon ,a^*)= & {} \frac{1}{\Delta ^*} \sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^{\varepsilon ^*}} \left[ \beta _j(\widetilde{Z}_0^{\varepsilon ^*};a^*) \left( \phi _j(X_{\Delta ^*}^{\varepsilon ^*};a^*) \right. \right. \nonumber \\&\left. \left. \quad - e^{-\lambda _j(a^*)\Delta ^*} \phi _j(X_0^{\varepsilon ^*};a^*) \right) \right] \nonumber \\&= \widetilde{\mathcal G}_J(\varepsilon ^*,a^*), \end{aligned}$$
which implies that \(Q_2(\varepsilon )\) vanishes as \(\varepsilon \) goes to \(\varepsilon ^*\) and concludes the proof.
Remark 5.4
Notice that the proof of Proposition 5.3 can be repeated analogously for the functions \(\widehat{\mathcal G}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^M\) and \(\widehat{\mathcal H}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^{M \times M}\) without filtered data in order to prove their continuity.
Next we study the limit as \(\varepsilon \) vanishes and we divide the analysis into two cases: \(\Delta \) independent of \(\varepsilon \) and \(\Delta =\varepsilon ^\zeta \) with \(\zeta >0\). In the first case (Proposition 5.5), data are sampled in the homogenized regime, ignoring the fact that they are generated by a multiscale model, while in the second case (Proposition 5.7) the distance between two consecutive observations is given by a power of the multiscale parameter, and thus data are sampled in the multiscale regime, preserving the multiscale structure of the full path.
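A minimal sketch of the two sampling regimes (illustrative only: the finely sampled path below is a generic placeholder signal, not a trajectory of the multiscale model, and the choice \(\zeta =0.5\) is hypothetical) shows how, for a fixed observation horizon, the choices \(\Delta \) fixed and \(\Delta =\varepsilon ^\zeta \) produce different observation grids.

```python
import numpy as np

def subsample(path, dt, Delta):
    """Keep every k-th point of a finely sampled path so that consecutive
    observations are Delta apart (Delta is assumed to be a multiple of dt)."""
    k = int(round(Delta / dt))
    return path[::k]

# Fine grid on [0, T]; the path is only a placeholder signal for illustration.
T, dt, eps = 100.0, 1e-3, 0.04
t = np.arange(0.0, T + dt, dt)
path = np.sin(t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)

# Homogenized regime: Delta chosen independently of eps.
Delta_hom = 0.5
obs_hom = subsample(path, dt, Delta_hom)

# Multiscale regime: Delta = eps**zeta, here with the hypothetical choice zeta = 0.5.
zeta = 0.5
Delta_ms = eps**zeta
obs_ms = subsample(path, dt, Delta_ms)

print(f"homogenized regime: Delta = {Delta_hom:.3f}, N = {obs_hom.size - 1}")
print(f"multiscale  regime: Delta = {Delta_ms:.3f}, N = {obs_ms.size - 1}")
```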
Proposition 5.5
Let the functions \(\widetilde{\mathcal G}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^M\) and \(\widetilde{\mathcal H}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^{M \times M}\) be defined in (5.2) and (5.3) and let \(\Delta \) be independent of \(\varepsilon \). Under Assumption 2.5 and for any \(a^* \in \mathcal A\), we have
$$\begin{aligned} \begin{aligned} (i)&\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widetilde{\mathcal G}_J(\varepsilon ,a)\\&= \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ g_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; a^* \right) \right] , \\ (ii)&\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widetilde{\mathcal H}_J(\varepsilon ,a)\\&= \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ h_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; a^* \right) \right] . \end{aligned} \end{aligned}$$
Proof
We only prove the statement for \(\widetilde{\mathcal G}_J\); the argument for \(\widetilde{\mathcal H}_J\) is similar. By the triangle inequality, we have
$$\begin{aligned}&\left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ g_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; a^* \right) \right] \right\| \nonumber \\&\quad \le Q_1(\varepsilon ,a) + Q_2(\varepsilon ), \end{aligned}$$
where
$$\begin{aligned} Q_1(\varepsilon ,a) = \left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \widetilde{\mathcal G}_J(\varepsilon ,a^*)\right\| , \end{aligned}$$
which vanishes due to the first step of the proof of Proposition 5.3 and
$$\begin{aligned}&Q_2(\varepsilon ) =\left\| \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j \left( X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^* \right) \right] \right. \nonumber \\&\left. \quad - \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ g_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; a^* \right) \right] \right\| . \end{aligned}$$
Let us remark that the convergence in law of the joint process \(\{(\widetilde{X}^\varepsilon _n, \widetilde{Z}^\varepsilon _n)\}_{n=0}^N\) to the joint process \(\{(\widetilde{X}^0_n, \widetilde{Z}^0_n)\}_{n=0}^N\) by Lemma A.2 implies the convergence in law of the triple \((X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon )\) to the triple \((X_0^0, X^0_\Delta , \widetilde{Z}^0_0)\) since \(\widetilde{X}_0^\varepsilon = X_0^\varepsilon \), \(\widetilde{X}_1^\varepsilon = X_\Delta ^\varepsilon \) and \(\widetilde{X}_0^0 = X_0^0\), \(\widetilde{X}_1^0 = X_\Delta ^0\). Therefore, we have
$$\begin{aligned}&\lim _{\varepsilon \rightarrow 0} Q_2(\varepsilon ) \le \lim _{\varepsilon \rightarrow 0} \frac{1}{\Delta }\sum _{j=1}^J \left\| \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j \left( X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^* \right) \right] \right. \nonumber \\&\left. \quad - \mathbb {E}^{\widetilde{\rho }^0} \left[ g_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; a^* \right) \right] \right\| = 0, \end{aligned}$$
which implies the desired result.
Remark 5.6
Similar results to Proposition 5.3 and Proposition 5.5 can be shown for the estimator without filtered data. In particular we have that \(\widehat{\mathcal G}_J(\varepsilon ,a)\) and \(\widehat{\mathcal H}_J(\varepsilon ,a)\) are continuous in \((0,\infty ) \times \mathcal A\) and
$$\begin{aligned} \begin{aligned} (i)&\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widehat{\mathcal G}_J(\varepsilon ,a)\\&= \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ g_j \left( X_0^0, X_\Delta ^0, X_0^0; a^* \right) \right] , \\ (ii)&\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widehat{\mathcal H}_J(\varepsilon ,a)\\&= \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ h_j \left( X_0^0, X_\Delta ^0, X_0^0; a^* \right) \right] . \end{aligned} \end{aligned}$$
Since the proof is analogous, we do not report here the details.
Proposition 5.7
Let the functions \(\widetilde{\mathcal G}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^M\) and \(\widetilde{\mathcal H}_J :(0,\infty ) \times \mathcal A \rightarrow \mathbb {R}^{M \times M}\) be defined in (5.2) and (5.3) and let \(\Delta =\varepsilon ^\zeta \) with \(\zeta >0\) and \(\zeta \ne 1\), \(\zeta \ne 2\). Under Assumption 2.5 and for any \(a^* \in \mathcal A\), we have
(i) \(\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widetilde{\mathcal G}_J(\varepsilon ,a) = \widetilde{\mathfrak g}_J^0(a^*)\), where
$$\begin{aligned} \widetilde{\mathfrak g}_J^0(a) :=\sum _{j=1}^J \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a) \left( \mathcal {L}_A \phi _j(X_0^0;a) + \lambda _j(a) \phi _j(X_0^0;a) \right) \right] ,\nonumber \\ \end{aligned}$$
(ii) \(\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widetilde{\mathcal H}_J(\varepsilon ,a) = \widetilde{\mathfrak h}_J^0(a^*)\), where
where the generator \(\mathcal {L}_A\) is defined in (2.9).
Proof
We only prove the statement for \(\widetilde{\mathcal G}_J\); the argument for \(\widetilde{\mathcal H}_J\) is similar. By the triangle inequality, we have
$$\begin{aligned}&\left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \widetilde{\mathfrak g}_J^0(a^*)\right\| \nonumber \\&\quad \le \left\| \widetilde{\mathcal G}_J(\varepsilon ,a) - \widetilde{\mathcal G}_J(\varepsilon ,a^*)\right\| \nonumber \\&\quad + \left\| \widetilde{\mathcal G}_J(\varepsilon ,a^*) - \widetilde{\mathfrak g}_J^0(a^*)\right\| =:Q_1(\varepsilon ,a) + Q_2(\varepsilon ), \end{aligned}$$
and we need to show that the two terms vanish, distinguishing two cases.
Case 1: \(\zeta \in (0,1)\).
Applying Lemma 5.2 to the functions \(\phi _j(\cdot ;a^*)\) for all \(j=1,\dots ,J\) and noting that
$$\begin{aligned} \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \int _0^\Delta \phi _j'(X_t^\varepsilon ;a^*) (1+\Phi '(Y_t^\varepsilon )) \,\mathrm {d}W_t \right] = 0,\nonumber \\ \end{aligned}$$
since
$$\begin{aligned} M_s :=\int _0^s \phi _j'(X_t^\varepsilon ;a^*) (1+\Phi '(Y_t^\varepsilon )) \,\mathrm {d}W_t \end{aligned}$$
is a martingale with \(M_0=0\), we have
$$\begin{aligned} \begin{aligned} \widetilde{\mathcal G}_J(\varepsilon ,a^*)&= \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \left( \phi _j(X_\Delta ^\varepsilon ;a^*) - e^{-\lambda _j(a^*)\Delta } \phi _j(X_0^\varepsilon ;a^*) \right) \right] \\&= \sum _{j=1}^J \frac{1 - e^{-\lambda _j(a^*)\Delta }}{\Delta } \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \\&+ \sum _{j=1}^J \frac{1}{\Delta }\mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) R(\varepsilon ,\Delta ) \right] + \sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) \left( \Sigma \phi _j''(X_0^\varepsilon ;a^*) - A \cdot V'(X_0^\varepsilon ) \phi _j'(X_0^\varepsilon ;a^*) \right) \right] \\&=:I_1^\varepsilon + I_2^\varepsilon + I_3^\varepsilon , \end{aligned} \end{aligned}$$
where \(R(\varepsilon ,\Delta )\) satisfies for a constant \(C>0\) independent of \(\varepsilon \) and \(\Delta \) and for all \(p\ge 1\)
$$\begin{aligned} \left( \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left|R(\varepsilon ,\Delta )\right|^p \right) ^{1/p} \le C(\varepsilon + \Delta ^{3/2}). \end{aligned}$$
(5.5)
We now study the three terms separately. First, by Cauchy–Schwarz inequality, since \(\beta _j(\cdot ;a^*)\) is polynomially bounded, \(\widetilde{Z}_0^\varepsilon \) has bounded moments of any order by Lemma 5.1 and due to (5.5) we obtain
$$\begin{aligned} \left\| I_2^\varepsilon \right\| \le C \left( \varepsilon \Delta ^{-1} + \Delta ^{1/2} \right) . \end{aligned}$$
(5.6)
Let us now focus on \(I_1^\varepsilon \) for which we have
$$\begin{aligned} I_1^\varepsilon= & {} \sum _{j=1}^J \frac{1 - e^{-\lambda _j(a^*)\Delta }}{\Delta } \left( \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \right. \nonumber \\&\left. + \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) \phi _j(X_0^\varepsilon ;a^*) \right] \right) ,\nonumber \\ \end{aligned}$$
where \(Z_0^\varepsilon \) is distributed according to the invariant measure \(\rho ^\varepsilon \) of the continuous process \((X_t^\varepsilon ,Z_t^\varepsilon )\) and
$$\begin{aligned} \lim _{\varepsilon \rightarrow 0} \frac{1 - e^{-\lambda _j(a^*)\Delta }}{\Delta } = \lambda _j(a^*). \end{aligned}$$
(5.7)
By the mean value theorem for vector-valued functions, we have
$$\begin{aligned}&\mathbb {E}\left[ ( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) ) \phi _j(X_0^\varepsilon ;a^*) \right] \nonumber \\&\quad = \mathbb {E}\left[ \int _0^1 \beta _j'(Z_0^\varepsilon + t(\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon );a^*) \,\mathrm {d}t \; (\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon ) \phi _j(X_0^\varepsilon ;a^*) \right] ,\nonumber \\ \end{aligned}$$
and since \(\beta _j'(\cdot ;a^*),\phi _j(\cdot ;a^*)\) are polynomially bounded, \(X_0^\varepsilon \), \(Z_0^\varepsilon \), \(\widetilde{Z}_0^\varepsilon \) have bounded moments of any order, respectively, by Pavliotis and Stuart (2007, Corollary 5.4), Abdulle et al. (2021, Lemma C.1) and Lemma 5.1 and applying Hölder’s inequality and Corollary A.3 we obtain
$$\begin{aligned}&\left\| \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) \phi _j(X_0^\varepsilon ;a^*) \right] \right\| \nonumber \\&\quad \le C \left( \Delta ^{1/2} + \varepsilon \right) . \end{aligned}$$
(5.8)
Moreover, notice that by homogenization theory (see Abdulle et al. 2021, Sect. 3.2) the joint process \((X_0^\varepsilon , Z_0^\varepsilon )\) converges in law to the joint process \((X_0^0, Z_0^0)\) and therefore
$$\begin{aligned}&\lim _{\varepsilon \rightarrow 0} \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \nonumber \\&\quad = \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a^*) \phi _j(X_0^0;a^*) \right] , \end{aligned}$$
which together with (5.7) and (5.8) yields
$$\begin{aligned} \lim _{\varepsilon \rightarrow 0} I_1^\varepsilon = \sum _{j=1}^J \lambda _j(a^*) \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a^*) \phi _j(X_0^0;a^*) \right] . \end{aligned}$$
(5.9)
We now consider \(I_3^\varepsilon \), for which we follow an argument similar to the one used for \(I_1^\varepsilon \). We first have
$$\begin{aligned} \begin{aligned} I_3^\varepsilon&= \sum _{j=1}^J \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \left( \Sigma \phi _j''(X_0^\varepsilon ;a^*)\right. \right. \nonumber \\&\quad \left. \left. - A \cdot V'(X_0^\varepsilon ) \phi _j'(X_0^\varepsilon ;a^*) \right) \right] \\&\quad + \sum _{j=1}^J \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) \right. \\&\quad \left. \left( \Sigma \phi _j''(X_0^\varepsilon ;a^*) - A \cdot V'(X_0^\varepsilon ) \phi _j'(X_0^\varepsilon ;a^*) \right) \right] \\&=:I_{3,1}^\varepsilon + I_{3,2}^\varepsilon , \end{aligned} \end{aligned}$$
where the first term in the right-hand side converges due to homogenization theory and the second one is bounded by
$$\begin{aligned} \left\| I_{3,2}^\varepsilon \right\| \le C \left( \Delta ^{1/2} + \varepsilon \right) . \end{aligned}$$
Therefore, we obtain
$$\begin{aligned}&\!\!\lim _{\varepsilon \rightarrow 0} I_3^\varepsilon =\nonumber \\&\!\!\sum _{j=1}^J \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a^*) \left( \Sigma \phi _j''(X_0^0;a^*) \!-\! A \cdot V'(X_0^0) \phi _j'(X_0^0;a^*) \right) \right] , \end{aligned}$$
which together with (5.6) and (5.9) implies
$$\begin{aligned}&\lim _{\varepsilon \rightarrow 0} \widetilde{\mathcal G}_J(\varepsilon ,a^*) = \nonumber \\&\sum _{j=1}^J \mathbb {E}^{\rho ^0}\left[ \beta _j(Z_0^0;a^*) \left( \Sigma \phi _j''(X_0^0;a^*) \right. \right. \nonumber \\&\quad - A \cdot V'(X_0^0) \phi _j'(X_0^0;a^*)\left. \left. + \lambda _j(a^*) \phi _j(X_0^0;a^*) \right) \right] , \end{aligned}$$
(5.10)
which shows that \(Q_2(\varepsilon )\) vanishes as \(\varepsilon \) goes to zero. Let us now consider \(Q_1(\varepsilon ,a)\). Following the first step of the proof of Proposition 5.3, we have
$$\begin{aligned} \begin{aligned} Q_1(\varepsilon ,a)&\le \frac{1}{\Delta }\sum _{j=1}^J \left\| \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a) \right] \right. \\&\quad \left. - \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^*) \right] \right\| \\&\le \sum _{j=1}^J \left\| \frac{1}{\Delta }\mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ h_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; \widetilde{a}) \right] \right\| \\&\quad \left\| (a-a^*)\right\| , \end{aligned} \end{aligned}$$
where \(\widetilde{a}\) takes values on the segment connecting \(a\) and \(a^*\), and repeating the same computation as above we obtain
$$\begin{aligned} Q_1(\varepsilon ,a) \le C \left\| a-a^*\right\| , \end{aligned}$$
which together with (5.10) gives the desired result.
Case 2: \(\zeta \in (1,2) \cup (2,\infty )\).
Let \(Z_0^\varepsilon \) be distributed according to the invariant measure \(\rho ^\varepsilon \) of the continuous process \((X_t^\varepsilon ,Z_t^\varepsilon )\) and define
$$\begin{aligned} \begin{aligned} \widetilde{R}(\varepsilon ,\Delta )&:=\frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , \widetilde{Z}_0^\varepsilon ; a^*) \right] \\&- \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\rho ^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , Z_0^\varepsilon ; a^*) \right] \\&= \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) \right. \\&\left. \left( \phi _j(X_\Delta ^\varepsilon ;a^*) - e^{-\lambda _j(a^*)\Delta } \phi _j(X_0^\varepsilon ;a^*) \right) \right] . \end{aligned} \end{aligned}$$
Then, we have
$$\begin{aligned}&\widetilde{\mathcal G}_J(\varepsilon ,a^*) = \sum _{j=1}^J \frac{1}{\Delta }\mathbb {E}^{\rho ^\varepsilon } \left[ g_j(X_0^\varepsilon , X_\Delta ^\varepsilon , Z_0^\varepsilon ; a^*) \right] \nonumber \\&\quad + \widetilde{R}(\varepsilon ,\Delta ) =:\sum _{j=1}^J Q_j^\varepsilon + \widetilde{R}(\varepsilon ,\Delta ), \end{aligned}$$
(5.11)
and we first bound the remainder \(\widetilde{R}(\varepsilon ,\Delta )\). Applying Itô’s lemma to the process \(X_t^\varepsilon \) with the functions \(\phi _j(\cdot ;a^*)\) for each \(j=1,\dots ,J\) we have
$$\begin{aligned} \begin{aligned} \phi _j(X_\Delta ^\varepsilon ;a^*)&= \phi _j(X_0^\varepsilon ;a^*) - \int _0^\Delta \alpha \cdot V'(X_t^\varepsilon ) \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}t \\&- \int _0^\Delta \frac{1}{\varepsilon }p' \left( \frac{X_t^\varepsilon }{\varepsilon } \right) \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}t \\&+ \sigma \int _0^\Delta \phi _j''(X_t^\varepsilon ;a^*) \,\mathrm {d}t \\&+ \sqrt{2\sigma } \int _0^\Delta \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}W_t, \end{aligned} \end{aligned}$$
(5.12)
and observing that
$$\begin{aligned} \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) {-} \beta _j(Z_0^\varepsilon ;a^*) \right) \int _0^\Delta \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}W_t \right] {=} \,\, 0,\nonumber \\ \end{aligned}$$
(5.13)
since
$$\begin{aligned} M_s = \int _0^s \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}W_t \end{aligned}$$
is a martingale with \(M_0=0\), we obtain
$$\begin{aligned} \begin{aligned} \widetilde{R}(\varepsilon ,\Delta )&= \sum _{j=1}^J \frac{1-e^{-\lambda _j(a^*)\Delta }}{\Delta } \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) \phi _j(X_0^\varepsilon ;a^*) \right] \\&\quad + \sum _{j=1}^J \frac{1}{\Delta }\int _0^\Delta \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) \left( \sigma \phi _j''(X_t^\varepsilon ;a^*) - \alpha \cdot V'(X_t^\varepsilon ) \phi _j'(X_t^\varepsilon ;a^*) \right) \right] \,\mathrm {d}t \\&\quad - \sum _{j=1}^J \frac{1}{\varepsilon \Delta } \int _0^\Delta \mathbb {E}\left[ \left( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) \right) p' \left( \frac{X_t^\varepsilon }{\varepsilon } \right) \phi _j'(X_t^\varepsilon ;a^*) \right] \,\mathrm {d}t \\&=:\widetilde{R}_1(\varepsilon ,\Delta ) + \widetilde{R}_2(\varepsilon ,\Delta ) + \widetilde{R}_3(\varepsilon ,\Delta ). \end{aligned}\nonumber \\ \end{aligned}$$
By the mean value theorem for vector-valued functions, we have
$$\begin{aligned}&\mathbb {E}\left[ ( \beta _j(\widetilde{Z}_0^\varepsilon ;a^*) - \beta _j(Z_0^\varepsilon ;a^*) ) \phi _j(X_0^\varepsilon ;a^*) \right] \nonumber \\&= \mathbb {E}\left[ \int _0^1 \beta _j'(Z_0^\varepsilon + t(\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon );a^*) \,\mathrm {d}t \; ( \widetilde{Z}_0^\varepsilon - Z_0^\varepsilon ) \phi _j (X_0^\varepsilon ;a^*) \right] , \end{aligned}$$
and since \(\beta _j'(\cdot ;a^*),\phi _j(\cdot ;a^*)\) are polynomially bounded, \(X_0^\varepsilon \), \(Z_0^\varepsilon \), \(\widetilde{Z}_0^\varepsilon \) have bounded moments of any order, respectively, by Pavliotis and Stuart (2007, Corollary 5.4), Abdulle et al. (2021, Lemma C.1) and Lemma 5.1, and applying Hölder’s inequality, we obtain
$$\begin{aligned} \left\| \widetilde{R}_1(\varepsilon ,\Delta )\right\| \le C \left( \mathbb {E}\left|\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon \right|^{2} \right) ^{1/2}, \end{aligned}$$
(5.14)
for a constant \(C>0\) independent of \(\varepsilon \) and \(\Delta \). We repeat a similar argument for \(\widetilde{R}_2(\varepsilon ,\Delta )\) and \(\widetilde{R}_3(\varepsilon ,\Delta )\) to get
$$\begin{aligned}&\left\| \widetilde{R}_2(\varepsilon ,\Delta )\right\| \le C \left( \mathbb {E}\left|\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon \right|^{2} \right) ^{1/2} \quad \text {and} \nonumber \\&\quad \left\| \widetilde{R}_3(\varepsilon ,\Delta )\right\| \le C \varepsilon ^{-1} \left( \mathbb {E}\left|\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon \right|^{2} \right) ^{1/2}, \end{aligned}$$
which together with (5.14) yield
$$\begin{aligned} \left\| \widetilde{R}(\varepsilon ,\Delta )\right\| \le C \left( \mathbb {E}\left|\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon \right|^{2} \right) ^{1/2} \left( 1 + \varepsilon ^{-1} \right) . \end{aligned}$$
(5.15)
Moreover, applying Lemma 5.2 and proceeding similarly to the first part of the first case of the proof, we have
$$\begin{aligned} \left\| \widetilde{R}(\varepsilon ,\Delta )\right\| \le C \left( \mathbb {E}\left|\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon \right|^{2} \right) ^{1/2} \left( 1 + \varepsilon \Delta ^{-1} + \Delta ^{1/2} \right) ,\nonumber \\ \end{aligned}$$
which together with (5.15) and Corollary A.3 implies
$$\begin{aligned} \begin{aligned} \left\| \widetilde{R}(\varepsilon ,\Delta )\right\|&\le C \left( \mathbb {E}\left|\widetilde{Z}_0^\varepsilon - Z_0^\varepsilon \right|^{2} \right) ^{1/2} \left( 1 + \min \{ \varepsilon ^{-1}, \varepsilon \Delta ^{-1} + \Delta ^{1/2} \} \right) \\&\le C \left( \Delta ^{1/2} + \min \{ \varepsilon , \varepsilon ^{-1}\Delta \} \right) \\&\quad \left( 1 + \min \{ \varepsilon ^{-1}, \varepsilon \Delta ^{-1} + \Delta ^{1/2} \} \right) . \end{aligned}\nonumber \\ \end{aligned}$$
(5.16)
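To see why this bound vanishes in the regime considered here (a step left implicit in the argument), one can substitute \(\Delta =\varepsilon ^\zeta \) into (5.16) and bound each minimum by one of its arguments, which gives, up to a multiplicative constant,
$$\begin{aligned} \left\| \widetilde{R}(\varepsilon ,\Delta )\right\| \le {\left\{ \begin{array}{ll} C \left( \varepsilon ^{\zeta /2} + \varepsilon \right) \left( 1 + \varepsilon ^{1-\zeta } + \varepsilon ^{\zeta /2} \right) \le C \left( \varepsilon ^{\zeta /2} + \varepsilon ^{1-\zeta /2} + \varepsilon ^{2-\zeta } \right) , &{} \text {if } \zeta \in (1,2), \\ C \left( \varepsilon ^{\zeta /2} + \varepsilon ^{\zeta -1} \right) \left( 1 + \varepsilon ^{-1} \right) \le C \left( \varepsilon ^{\zeta /2-1} + \varepsilon ^{\zeta -2} \right) , &{} \text {if } \zeta \in (2,\infty ), \end{array}\right. } \end{aligned}$$
so that all exponents of \(\varepsilon \) are strictly positive precisely because \(\zeta \ne 2\), and \(\widetilde{R}(\varepsilon ,\Delta )\) indeed vanishes as \(\varepsilon \rightarrow 0\).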
Let us now consider \(Q_j^\varepsilon \). Substituting equation (5.12) into the definition of \(Q_j^\varepsilon \) in (5.11) and observing that, similarly to (5.13), it holds
$$\begin{aligned} \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \int _0^\Delta \phi _j'(X_t^\varepsilon ;a^*) \,\mathrm {d}W_t \right] = 0, \end{aligned}$$
we obtain
$$\begin{aligned} \begin{aligned} Q_j^\varepsilon&= \frac{1-e^{-\lambda _j(a^*)\Delta }}{\Delta } \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \\&\quad - \frac{1}{\Delta }\left( \int _0^\Delta \mathbb {E}^{\rho ^\varepsilon } \left[ \left( \beta _j(Z_0^\varepsilon ;a^*) \otimes V'(X_t^\varepsilon ) \right) \phi _j'(X_t^\varepsilon ;a^*) \right] \,\mathrm {d}t \right) \alpha \\&\quad - \frac{1}{\Delta }\int _0^\Delta \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \frac{1}{\varepsilon }p' \left( \frac{X_t^\varepsilon }{\varepsilon } \right) \phi _j'(X_t^\varepsilon ;a^*) \right] \,\mathrm {d}t \\&\quad + \frac{\sigma }{\Delta }\int _0^\Delta \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j''(X_t^\varepsilon ;a^*) \right] \,\mathrm {d}t. \end{aligned}\nonumber \\ \end{aligned}$$
We rewrite \(\beta _j(Z_0^\varepsilon ;a^*)\) inside the integrals employing equation (2.21) and Itô’s lemma
$$\begin{aligned} \beta _j(Z_0^\varepsilon ;a^*) = \beta _j(Z_t^\varepsilon ;a^*) - \int _0^t \beta _j'(Z_s^\varepsilon ;a^*) \left( X_s^\varepsilon - Z_s^\varepsilon \right) \,\mathrm {d}s,\nonumber \\ \end{aligned}$$
hence due to stationarity we have
$$\begin{aligned} Q_j^\varepsilon = Q_{j,1}^\varepsilon + Q_{j,2}^{\varepsilon }, \end{aligned}$$
(5.17)
where
$$\begin{aligned} \begin{aligned} Q_{j,1}^\varepsilon&= \frac{1-e^{-\lambda _j(a^*)\Delta }}{\Delta } \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \\&\quad - \mathbb {E}^{\rho ^\varepsilon } \left[ \left( \beta _j(Z_0^\varepsilon ;a^*) \otimes V'(X_0^\varepsilon ) \right) \phi _j'(X_0^\varepsilon ;a^*) \right] \alpha \\&\quad - \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \frac{1}{\varepsilon }p' \left( \frac{X_0^\varepsilon }{\varepsilon } \right) \phi _j'(X_0^\varepsilon ;a^*) \right] \\&\quad + \sigma \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j''(X_0^\varepsilon ;a^*) \right] \end{aligned}\nonumber \\ \end{aligned}$$
and
$$\begin{aligned} \begin{aligned}&Q_{j,2}^\varepsilon = \frac{1}{\Delta }\left( \int _0^\Delta \int _0^t \mathbb {E}^{\rho ^\varepsilon } \left[ (\beta _j'(Z_s^\varepsilon ;a^*) \otimes V'(X_t^\varepsilon )) \phi _j'(X_t^\varepsilon ;a^*) (X_s^\varepsilon - Z_s^\varepsilon ) \right] \,\mathrm {d}s \,\mathrm {d}t \right) \alpha \\&\quad + \frac{1}{\Delta }\int _0^\Delta \int _0^t \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j'(Z_s^\varepsilon ;a^*) \frac{1}{\varepsilon }p' \left( \frac{X_t^\varepsilon }{\varepsilon } \right) \phi _j'(X_t^\varepsilon ;a^*) (X_s^\varepsilon - Z_s^\varepsilon ) \right] \,\mathrm {d}s \,\mathrm {d}t \\&\quad - \frac{\sigma }{\Delta }\int _0^\Delta \int _0^t \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j'(Z_s^\varepsilon ;a^*) \phi _j''(X_t^\varepsilon ;a^*) (X_s^\varepsilon - Z_s^\varepsilon ) \right] \,\mathrm {d}s \,\mathrm {d}t. \end{aligned} \end{aligned}$$
Since \(\phi _j'(\cdot ;a^*), \phi _j''(\cdot ;a^*)\) and \(\beta _j'(\cdot ;a^*)\) are polynomially bounded, \(p'\) is bounded and \(X_t^\varepsilon \) and \(Z_t^\varepsilon \) have bounded moments of any order, respectively, by Pavliotis and Stuart (2007, Corollary 5.4) and Abdulle et al. (2021, Lemma C.1), \(Q_{j,2}^\varepsilon \) is bounded by
$$\begin{aligned} \left\| Q_{j,2}^\varepsilon \right\| \le C \left( \Delta + \varepsilon ^{-1}\Delta \right) . \end{aligned}$$
(5.18)
Let us now move to \(Q_{j,1}^\varepsilon \) and let us define the functions
$$\begin{aligned} \eta ^\varepsilon (x,z) :=\frac{\rho ^\varepsilon (x,z)}{\varphi ^\varepsilon (x)} \qquad \text {and} \qquad \eta ^0(x,z) :=\frac{\rho ^0(x,z)}{\varphi ^0(x)}, \end{aligned}$$
where \(\rho ^\varepsilon \) and \(\rho ^0\) are, respectively, the densities with respect to the Lebesgue measure of the invariant distributions of the joint processes \((X_t^\varepsilon ,Z_t^\varepsilon )\) and \((X_t^0,Z_t^0)\) and \(\varphi ^\varepsilon \) and \(\varphi ^0\) are their marginals with respect to the first component. Integrating by parts we have
$$\begin{aligned} \begin{aligned}&\mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \frac{1}{\varepsilon }p' \left( \frac{X_0^\varepsilon }{\varepsilon } \right) \phi _j'(X_0^\varepsilon ;a^*) \right] = \int _\mathbb {R}\int _\mathbb {R}\beta _j(z;a^*) \frac{1}{\varepsilon }p' \left( \frac{x}{\varepsilon }\right) \phi _j'(x;a^*) \rho ^\varepsilon (x,z) \,\mathrm {d}x \,\mathrm {d}z \\&\quad = -\sigma \int _\mathbb {R}\int _\mathbb {R}\frac{1}{C_{\varphi ^\varepsilon }} \beta _j(z;a^*) \frac{\mathrm {d}}{\,\mathrm {d}x} \left( e^{-\frac{1}{\sigma }p \left( \frac{x}{\varepsilon }\right) } \right) \phi _j'(x;a^*) e^{-\frac{1}{\sigma }\alpha \cdot V(x)} \eta ^\varepsilon (x,z) \,\mathrm {d}x \,\mathrm {d}z \\&\quad = \sigma \int _\mathbb {R}\int _\mathbb {R}\frac{1}{C_{\varphi ^\varepsilon }} \beta _j(z;a^*) \frac{\partial }{\partial x} \left( \phi _j'(x;a^*) e^{-\frac{1}{\sigma }\alpha \cdot V(x)} \eta ^\varepsilon (x,z) \right) e^{-\frac{1}{\sigma }p \left( \frac{x}{\varepsilon }\right) } \,\mathrm {d}x \,\mathrm {d}z, \end{aligned}\nonumber \\ \end{aligned}$$
which implies
$$\begin{aligned} \begin{aligned}&\mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \frac{1}{\varepsilon }p' \left( \frac{X_0^\varepsilon }{\varepsilon } \right) \phi _j'(X_0^\varepsilon ;a^*) \right] \\&\quad = \sigma \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j''(X_0^\varepsilon ;a^*) \right] \\&\quad - \mathbb {E}^{\rho ^\varepsilon } \left[ (\beta _j(Z_0^\varepsilon ;a^*) \otimes V'(X_0^\varepsilon )) \phi _j'(X_0^\varepsilon ;a^*) \right] \alpha \\&\quad + \sigma \int _\mathbb {R}\int _\mathbb {R}\beta _j(z;a^*) \phi _j'(x;a^*) \varphi ^\varepsilon (x) \frac{\partial }{\partial x} \eta ^\varepsilon (x,z) \,\mathrm {d}x \,\mathrm {d}z. \end{aligned} \end{aligned}$$
Employing the last equation in the proof of Lemma 3.5 in Abdulle et al. (2021) with \(\delta =1\) and \(f(x,z) = \beta _j(z;a^*) \phi _j'(x;a^*)\), we have
$$\begin{aligned}&\sigma \int _\mathbb {R}\int _\mathbb {R}\beta _j(z;a^*) \phi _j'(x;a^*) \varphi ^\varepsilon (x) \frac{\partial }{\partial x} \eta ^\varepsilon (x,z) \,\mathrm {d}x \,\mathrm {d}z\nonumber \\&\quad = \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j'(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) (X_0^\varepsilon - Z_0^\varepsilon ) \right] , \end{aligned}$$
(5.19)
and thus we obtain
$$\begin{aligned}&Q_{j,1}^\varepsilon = \frac{1-e^{-\lambda _j(a^*)\Delta }}{\Delta } \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) \right] \nonumber \\&\quad - \mathbb {E}^{\rho ^\varepsilon } \left[ \beta _j'(Z_0^\varepsilon ;a^*) \phi _j(X_0^\varepsilon ;a^*) (X_0^\varepsilon - Z_0^\varepsilon ) \right] . \end{aligned}$$
Letting \(\varepsilon \) go to zero and due to homogenization theory, it follows
$$\begin{aligned}&\lim _{\varepsilon \rightarrow 0} Q_{j,1}^\varepsilon = \lambda _j(a^*) \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a^*) \phi _j(X_0^0;a^*) \right] \nonumber \\&\quad - \mathbb {E}^{\rho ^0} \left[ \beta _j'(Z_0^0;a^*) \phi _j(X_0^0;a^*) (X_0^0 - Z_0^0) \right] , \end{aligned}$$
then applying formula (5.19) for the homogenized equation, i.e. with \(p(y)=0\) and \(\alpha \) and \(\sigma \) replaced by A and \(\Sigma \), and integrating by parts we have
$$\begin{aligned} \begin{aligned}&\mathbb {E}^{\rho ^0} \left[ \beta _j'(Z_0^0;a^*) \phi _j(X_0^0;a^*) (X_0^0 - Z_0^0) \right] \\&\quad = \Sigma \int _\mathbb {R}\int _\mathbb {R}\beta _j(z;a^*) \phi _j'(x;a^*) \varphi ^0(x) \frac{\partial }{\partial x} \eta ^0(x,z) \,\mathrm {d}x \,\mathrm {d}z \\&\quad = - \Sigma \int _\mathbb {R}\int _\mathbb {R}\beta _j(z;a^*) \frac{\mathrm {d}}{\,\mathrm {d}x} \left( \phi _j'(x;a^*) \varphi ^0(x) \right) \eta ^0(x,z) \,\mathrm {d}x \,\mathrm {d}z \\&\quad = \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a^*) \left( \Sigma \phi _j''(X_0^0;a^*) - A \cdot V'(X_0^0) \phi _j'(X_0^0;a^*) \right) \right] . \end{aligned} \end{aligned}$$
Therefore, we obtain
$$\begin{aligned}&\lim _{\varepsilon \rightarrow 0} Q_{j,1}^\varepsilon = \mathbb {E}^{\rho ^0} \left[ \beta _j(Z_0^0;a^*) \left( \Sigma \phi _j''(X_0^0;a^*) \right. \right. \nonumber \\&\quad \quad \left. \left. - A \cdot V'(X_0^0) \phi _j'(X_0^0;a^*) + \lambda _j(a^*) \phi _j(X_0^0;a^*) \right) \right] , \end{aligned}$$
which together with (5.11), (5.17) and bounds (5.16) and (5.18) implies that \(Q_2(\varepsilon )\) vanishes as \(\varepsilon \) goes to zero. Finally, analogously to the first case, we can show that \(Q_1(\varepsilon ,a)\) also vanishes, which concludes the proof.
Remark 5.8
A similar result to Proposition 5.7 can be shown for the estimator without filtered data only if \(\zeta \in (0,1)\), i.e. the first case in the proof. In particular, we have
(i) \(\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widehat{\mathcal G}_J(\varepsilon ,a) = \widehat{\mathfrak g}_J^0(a^*)\), where
$$\begin{aligned}&\widehat{\mathfrak g}_J^0(a) :=\sum _{j=1}^J \mathbb {E}^{\varphi ^0}\left[ \beta _j(X_0^0;a) \left( \mathcal {L}_A \phi _j(X_0^0;a) + \lambda _j(a) \phi _j(X_0^0;a) \right) \right] ,\nonumber \\ \end{aligned}$$
(ii) \(\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widehat{\mathcal H}_J(\varepsilon ,a) = \widehat{\mathfrak h}_J^0(a^*)\), where
where the generator \(\mathcal {L}_A\) is defined in (2.9). Since the proof is analogous, we do not report here the details. On the other hand, if \(\zeta > 2\), we can show that
(i) \(\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widehat{\mathcal G}_J(\varepsilon ,a) = \mathfrak g_J^0(a^*)\), where
$$\begin{aligned}&\mathfrak g_J^0(a) :=\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ \beta _j(X_0^0;a) \left( \sigma \phi _j''(X_0^0;a) - \alpha \cdot V'(X_0^0) \phi _j'(X_0^0;a) \right. \right. \nonumber \\&\quad \left. \left. + \lambda _j(a) \phi _j(X_0^0;a) \right) \right] , \end{aligned}$$
(5.20)
(ii) \(\lim _{(\varepsilon ,a) \rightarrow (0,a^*)} \widehat{\mathcal H}_J(\varepsilon ,a) = \mathfrak h_J^0(a^*)\), where
The proof is omitted since it is similar to the second case of the proof of Proposition 5.7.
Proof of the main results
Let us remark that we aim to prove the asymptotic unbiasedness of the proposed estimators, i.e. their convergence to the homogenized drift coefficient A as the number of observations N tends to infinity and the multiscale parameter \(\varepsilon \) vanishes. Therefore, we study the limits of the score functions and their Jacobian matrices as \(N\rightarrow \infty \) and \(\varepsilon \rightarrow 0\), evaluated at the desired limit point A.
We first analyse the case where \(\Delta \) is independent of \(\varepsilon \) and consider the limit of Proposition 5.5 and Remark 5.6 evaluated at \(a^* = A\). Then, due to equation (2.12), we get
$$\begin{aligned}&\frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ g_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; A \right) \right] \nonumber \\&\quad = \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ \beta _j(\widetilde{Z}_0^0;A) \left( \phi _j(X_\Delta ^0;A)\right. \right. \nonumber \\&\quad \quad \quad \quad \quad \left. \left. - e^{-\lambda _j(A)\Delta } \phi _j(X_0^0;A) \right) \right] \nonumber \\&\quad = \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ \beta _j(\widetilde{Z}_0^0;A) \left( \mathbb {E}\left[ \left. \phi _j(X_\Delta ^0;A) \right| (X_0^0, \widetilde{Z}_0^0) \right] \right. \right. \nonumber \\&\left. \left. \quad \quad \quad \quad \quad - e^{-\lambda _j(A)\Delta } \phi _j(X_0^0;A) \right) \right] \nonumber \\&\quad = 0, \end{aligned}$$
(5.21)
and similarly we obtain
$$\begin{aligned} \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ g_j \left( X_0^0, X_\Delta ^0, X_0^0; A \right) \right] = 0. \end{aligned}$$
On the other hand, if \(\Delta \) is a power of \(\varepsilon \), we study the limit of Proposition 5.7 and Remark 5.8 evaluated at \(a^* = A\) and by (2.10) we have
$$\begin{aligned} \widetilde{\mathfrak g}_J^0(A) = 0 \qquad \text {and} \qquad \widehat{\mathfrak g}_J^0(A) = 0. \end{aligned}$$
(5.22)
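For instance, assuming (for illustration only) the Ornstein–Uhlenbeck setting \(M=1\), \(V(x)=x^2/2\) and \(J=1\) with \(\phi _1(x;a)=x\) and \(\lambda _1(a)=a\), one has
$$\begin{aligned} \mathcal {L}_A \phi _1(X_0^0;A) + \lambda _1(A) \phi _1(X_0^0;A) = -A X_0^0 + A X_0^0 = 0, \end{aligned}$$
so the expectation defining \(\widetilde{\mathfrak g}_1^0(A)\) vanishes identically, in agreement with (5.22).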
Moreover, differentiating equation (2.12) with respect to a, we get
where the process \(\nabla _a X_t(a)\) satisfies
$$\begin{aligned} \mathrm {d}\left( \nabla _a X_t(a) \right) = - V'(X_t) \,\mathrm {d}t - a \cdot V''(X_t) \nabla _a X_t(a) \,\mathrm {d}t.\nonumber \\ \end{aligned}$$
Therefore, due to (2.12) and (5.23), we have
$$\begin{aligned}&\frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ h_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; A \right) \right] \nonumber \\&\quad = - \sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ \left( \beta _j(\widetilde{Z}_0^0;A) \otimes \nabla _a X_\Delta (A) \right) \phi _j'(X_\Delta ^0;A) \right] ,\nonumber \\ \end{aligned}$$
(5.24)
and
$$\begin{aligned}&\frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ h_j \left( X_0^0, X_\Delta ^0, X_0^0; A \right) \right] \nonumber \\&\quad = - \sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ \left( \beta _j(X_0^0;A) \otimes \nabla _a X_\Delta (A) \right) \phi _j'(X_\Delta ^0;A) \right] .\nonumber \\ \end{aligned}$$
Then, due to Lemma A.4, we can differentiate the eigenvalue problem (2.11) with respect to a and deduce that
where the dot denotes the gradient with respect to a, which together with (2.11) implies
$$\begin{aligned} \widetilde{\mathfrak h}_J^0(A) = \sum _{j=1}^J \mathbb {E}^{\rho ^0} \left[ (\beta _j(Z_0^0;A) \otimes V'(X_0^0)) \phi _j'(X_0^0;A) \right] ,\nonumber \\ \end{aligned}$$
(5.25)
and
$$\begin{aligned} \widehat{\mathfrak h}_J^0(A) = \sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ (\beta _j(X_0^0;A) \otimes V'(X_0^0)) \phi _j'(X_0^0;A) \right] .\nonumber \\ \end{aligned}$$
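To make the role of the sensitivity process \(\nabla _a X_t(a)\) above concrete, the following is a minimal sketch (an illustration under assumed simplifications, not part of the original argument) for a scalar hypothetical example with \(V(x)=x^2/2\), so that \(V'(x)=x\) and \(V''(x)=1\): the state and its sensitivity are co-simulated by an Euler scheme driven by the same Brownian increments, and the result is compared with a finite-difference derivative of the path.

```python
import numpy as np

# Hypothetical scalar illustration: V(x) = x^2/2, so V'(x) = x and V''(x) = 1.
# State:        dX        = -a*X dt + sqrt(2*Sigma) dW
# Sensitivity:  d(dX/da)  = -(X + a*dX/da) dt        (same Brownian increments)
rng = np.random.default_rng(2)
a, Sigma, dt, n_steps = 1.0, 0.5, 1e-3, 5000
dW = np.sqrt(dt) * rng.standard_normal(n_steps)

def terminal_state_and_sensitivity(a, dW):
    x, s = 1.0, 0.0                      # X_0 = 1, (dX/da)_0 = 0
    for dw in dW:
        x_new = x - a * x * dt + np.sqrt(2.0 * Sigma) * dw
        s = s - (x + a * s) * dt         # Euler step of the sensitivity equation
        x = x_new
    return x, s

x_T, sens_T = terminal_state_and_sensitivity(a, dW)

# Pathwise finite-difference check, reusing the same noise increments.
h = 1e-4
x_plus, _ = terminal_state_and_sensitivity(a + h, dW)
x_minus, _ = terminal_state_and_sensitivity(a - h, dW)
print(f"co-simulated   dX_T/da = {sens_T:+.5f}")
print(f"finite diff.   dX_T/da = {(x_plus - x_minus) / (2.0 * h):+.5f}")
```

The two printed values agree because the Euler step of the sensitivity equation is exactly the derivative of the Euler step of the state equation with the noise held fixed.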
Before showing the main results, we need two auxiliary lemmas, which in turn rely on the technical Assumption 3.1, which can now be rewritten as:
(i) \(\det \left( \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\widetilde{\rho }^0} \left[ h_j \left( X_0^0, X_\Delta ^0, \widetilde{Z}_0^0; A \right) \right] \right) \ne 0\),
(ii) \(\det \left( \frac{1}{\Delta }\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ h_j \left( X_0^0, X_\Delta ^0, X_0^0; A \right) \right] \right) \ne 0\),
(iii) \(\det \left( \widetilde{\mathfrak h}_J^0(A) \right) \ne 0\),
(iv) \(\det \left( \widehat{\mathfrak h}_J^0(A) \right) \ne 0\).
Since the proofs of the two lemmas are similar, we only write the details of the first one.
Lemma 5.9
Under Assumption 2.5 and Assumption 3.1, there exists \(\varepsilon _0>0\) such that for all \(0<\varepsilon <\varepsilon _0\) there exists \(\widetilde{\gamma }= \widetilde{\gamma }(\varepsilon )\) such that if \(\Delta \) is independent of \(\varepsilon \) or \(\Delta =\varepsilon ^\zeta \) with \(\zeta >0\) and \(\zeta \ne 1\), \(\zeta \ne 2\)
$$\begin{aligned} \widetilde{\mathcal G}_J(\varepsilon , A {+} \widetilde{\gamma }(\varepsilon )) {=} 0 \qquad \text {and} \qquad \det \left( \widetilde{\mathcal H}_J(\varepsilon , A {+} \widetilde{\gamma }(\varepsilon )) \right) \ne 0.\nonumber \\ \end{aligned}$$
Moreover
$$\begin{aligned} \lim _{\varepsilon \rightarrow 0} \widetilde{\gamma }(\varepsilon ) = 0. \end{aligned}$$
Proof
Let us first extend the functions \(\widetilde{\mathcal G}_J\) and \(\widetilde{\mathcal H}_J\) by continuity to \(\varepsilon =0\), with their limits given by Proposition 5.5 or Proposition 5.7 depending on \(\Delta \), and note that, due to (5.21) if \(\Delta \) is independent of \(\varepsilon \) and to (5.22) otherwise, we have
$$\begin{aligned} \widetilde{\mathcal G}_J(0,A) = 0. \end{aligned}$$
Moreover, by (5.24), (5.25) and Assumption 3.1, we know that
$$\begin{aligned} \det \left( \widetilde{\mathcal H}_J(0,A) \right) \ne 0. \end{aligned}$$
Therefore, since the functions \(\widetilde{\mathcal G}_J\) and \(\widetilde{\mathcal H}_J\) are continuous by Proposition 5.3, the implicit function theorem (see Hurwicz and Richter 2003, Theorem 2) gives the desired result.
Lemma 5.10
Under Assumption 2.5 and Assumption 3.1, there exists \(\varepsilon _0>0\) such that for all \(0<\varepsilon <\varepsilon _0\) there exists \(\widehat{\gamma }= \widehat{\gamma }(\varepsilon )\) such that if \(\Delta \) is independent of \(\varepsilon \) or \(\Delta = \varepsilon ^\zeta \) with \(\zeta \in (0,1)\)
$$\begin{aligned} \widehat{\mathcal G}_J(\varepsilon , A + \widehat{\gamma }(\varepsilon )) = 0 \qquad \text {and}\,\,\det \left( \widehat{\mathcal H}_J(\varepsilon , A + \widehat{\gamma }(\varepsilon )) \right) \ne 0.\nonumber \\ \end{aligned}$$
Moreover,
$$\begin{aligned} \lim _{\varepsilon \rightarrow 0} \widehat{\gamma }(\varepsilon ) = 0. \end{aligned}$$
We are now ready to prove the asymptotic unbiasedness of the estimators, i.e. Theorems 3.3 and 3.4. We only prove Theorem 3.4 for the estimator \(\widetilde{A}^\varepsilon _{N,J}\) with filtered data. The proof of Theorem 3.3 for the estimator \(\widehat{A}^\varepsilon _{N,J}\) without filtered data is analogous and is omitted here.
Proof of Theorem 3.4
We need to show for a fixed \(0< \varepsilon < \varepsilon _0\):
(i) the existence of the solution \(\widetilde{A}^\varepsilon _{N,J}\) of the system \(\widetilde{G}^\varepsilon _{N,J}(a) = 0\) with probability tending to one as \(N \rightarrow \infty \);
(ii) \(\lim _{N \rightarrow \infty } \widetilde{A}^\varepsilon _{N,J} = A + \widetilde{\gamma }(\varepsilon )\) in probability with \(\lim _{\varepsilon \rightarrow 0} \widetilde{\gamma }(\varepsilon ) = 0\).
We first note that by Lemma 5.9 we have
$$\begin{aligned} \lim _{\varepsilon \rightarrow 0} \widetilde{\gamma }(\varepsilon ) = 0. \end{aligned}$$
We then follow the steps of the proof of Bibby and Sørensen (1995, Theorem 3.2). Due to Barndorff-Nielsen and Sørensen (1994, Theorem A.1), claims (i) and (ii) hold true if we verify that
and as \(N \rightarrow \infty \)
$$\begin{aligned} \frac{1}{\sqrt{N}} \widetilde{G}^\varepsilon _{N,J}(A + \widetilde{\gamma }(\varepsilon )) \rightarrow \mathcal N \left( 0, \Lambda ^\varepsilon \right) , \qquad \text {in law}, \end{aligned}$$
(5.27)
where \(\Lambda ^\varepsilon \) is a positive definite covariance matrix and
$$\begin{aligned} B_{C,N}^\varepsilon = \left\{ a \in \mathcal A :\left\| a - (A + \widetilde{\gamma }(\varepsilon ))\right\| \le \frac{C}{\sqrt{N}} \right\} , \end{aligned}$$
for \(C>0\) small enough such that \(B_{C,1} \subset \mathcal A\). Result (5.27) is a consequence of Florens-Zmirou (1989, Theorem 1). We then have
where the right-hand side vanishes by Bibby and Sørensen (1995, Lemma 3.3) and the continuity of \(\widetilde{\mathcal H}_J\) (Proposition 5.3), implying result (5.26). Hence, we proved (i) and (ii), which conclude the proof of the theorem.
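In practice, claims (i) and (ii) suggest computing \(\widetilde{A}^\varepsilon _{N,J}\) by a Newton-type iteration on the estimating equation, with the empirical Jacobian playing the role of \(\widetilde{\mathcal H}_J\). The sketch below does this for the same hypothetical scalar toy choice used earlier (Ornstein–Uhlenbeck data without a fast scale, \(J=1\), \(\beta _1(z;a)=z\), \(\phi _1(x;a)=x\), \(\lambda _1(a)=a\)); in this special case the root is even available in closed form. This illustrates only the root-finding step, not the multiscale asymptotics.

```python
import numpy as np

# Hypothetical scalar toy choice: J = 1, beta_1(z;a) = z, phi_1(x;a) = x,
# lambda_1(a) = a, data from an OU model without fast scale.  Then
# G_N(a) = sum_n x_n * (x_{n+1} - exp(-a*Delta) * x_n)  and its derivative
# (the empirical analogue of the Jacobian) is  H_N(a) = Delta*exp(-a*Delta)*sum_n x_n^2.
rng = np.random.default_rng(3)
a_true, sigma, Delta, N = 1.0, 0.5, 0.1, 100_000
decay = np.exp(-a_true * Delta)
std = np.sqrt(sigma / a_true * (1.0 - decay**2))
x = np.empty(N + 1)
x[0] = rng.normal(0.0, np.sqrt(sigma / a_true))
for n in range(N):
    x[n + 1] = decay * x[n] + std * rng.normal()

def G(a):
    return np.sum(x[:-1] * (x[1:] - np.exp(-a * Delta) * x[:-1]))

def H(a):
    return Delta * np.exp(-a * Delta) * np.sum(x[:-1] ** 2)

# Newton iteration a_{k+1} = a_k - G(a_k) / H(a_k).
a = 2.0
for _ in range(20):
    a -= G(a) / H(a)

# In this scalar toy case the root is also available in closed form.
a_closed = -np.log(np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)) / Delta
print(f"Newton root : {a:.4f}")
print(f"closed form : {a_closed:.4f}")
print(f"true value  : {a_true}")
```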
Remark 5.11
Notice that if \(\Delta =\varepsilon ^\zeta \) with \(\zeta >2\) and we do not employ the filter, in view of (5.20) and following the same proof as for Theorem 3.4, we could compute the asymptotic limit of \(\widehat{A}_{N,J}^\varepsilon \) as N goes to infinity and \(\varepsilon \) vanishes if we knew \(a^*\) such that
$$\begin{aligned}&\sum _{j=1}^J \mathbb {E}^{\varphi ^0} \left[ \beta _j(X_0^0;a^*) \left( \sigma \phi _j''(X_0^0;a^*) \right. \right. \nonumber \\&\quad \left. \left. - \alpha \cdot V'(X_0^0) \phi _j'(X_0^0;a^*) + \lambda _j(a^*) \phi _j(X_0^0;a^*) \right) \right] = 0.\nonumber \\ \end{aligned}$$
The value of \(a^*\) cannot be found analytically since it is, in general, different from the drift coefficients \(\alpha \) and A of the multiscale and homogenized equations (2.1) and (2.2). Nevertheless, we observe that in the simple case of the multiscale Ornstein–Uhlenbeck process we have \(a^* = \alpha \).
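To illustrate this last observation, assume (for this example only) \(M=1\), \(V(x)=x^2/2\) and \(J=1\) with \(\beta _1(x;a)=x\), \(\phi _1(x;a)=x\) and \(\lambda _1(a)=a\), the standard eigenpair in the Ornstein–Uhlenbeck case. Then the integrand in the condition above becomes
$$\begin{aligned} \beta _1(X_0^0;a^*) \left( \sigma \phi _1''(X_0^0;a^*) - \alpha V'(X_0^0) \phi _1'(X_0^0;a^*) + \lambda _1(a^*) \phi _1(X_0^0;a^*) \right) = X_0^0 \left( -\alpha X_0^0 + a^* X_0^0 \right) = (a^* - \alpha ) (X_0^0)^2, \end{aligned}$$
so the condition reads \((a^* - \alpha )\, \mathbb {E}^{\varphi ^0} \big [ (X_0^0)^2 \big ] = 0\) and, since the second moment is positive, it forces \(a^* = \alpha \), consistently with the observation above.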