1 Introduction

We propose an averaging procedure based on Banach’s concept of the Lebesgue integral in abstract spaces [1]. To be specific, we use a particular variant of Banach’s theory, connected with integration in \(l^{2}\). We denote by

$$\begin{aligned} S_{n}=\left\{ x\in {{\mathbb {R}}}^{n}:\sum _{k=1}^{n}x_{k}^{2}\le 1\right\} \quad \text {and}\quad S=\left\{ x\in {{\mathbb {R}}}^{{\mathbb {N}} }:\sum _{k=1}^{\infty }x_{k}^{2}\le 1\right\} \end{aligned}$$

the unit balls in \(l_{n}^{2}\) and \(l^{2}\), respectively. According to Banach’s result, the most general nonnegative linear functional defined on S and satisfying certain conditions listed in [1] (which we do not need to repeat here) has the form

$$\begin{aligned} F(\varPhi ) =\lim _{n\rightarrow \infty }F_{n}(\varPhi ) , \end{aligned}$$

where

$$\begin{aligned} F_{n}\left( \varPhi \right)&=\int _{S_{n}}\varPhi \left( x_{1},\ldots ,x_{n},0,\ldots \right) \rho _{n}\left( x_{1},\ldots ,x_{n}\right) \hbox {d}x_{1}\ldots \hbox {d}x_{n}, \\ \rho _{n}\left( x\right)&=\chi _{S_{n}}\left( x\right) \frac{g\left( x_{1}\right) g\left( x_{2}/\sqrt{1-x_{1}^{2}}\right) \ldots g\left( x_{n}/ \sqrt{1-x_{1}^{2}-\cdots -x_{n-1}^{2}}\right) }{\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }}, \end{aligned}$$

and \(g:\left[ -1,1\right] \rightarrow \left[ 0,\infty \right) \), \( \int _{-1}^{1}g\left( t\right) \hbox {d}t=1\), \(\varPhi :{{\mathbb {R}}}^{{\mathbb {N}}}\rightarrow {{\mathbb {R}}}\) is a bounded Borel measurable function, and \(\chi _{A}\) is the indicator of A.

Although Banach’s considerations and constructions are purely deterministic and based on ideas coming from functional analysis, his expression for \(\rho _{n}\) can easily be reinterpreted in probabilistic terms, giving a probabilistic interpretation of his extension of the Lebesgue integral. The first step in this direction is to find a stochastic sequence having probability density function \(\rho _{n}\). Such a sequence will be called a Banach random walk (BRW), or a standard Banach random walk (SBRW) if \( g\equiv 1\). The expression of \(F_{n}\left( \varPhi \right) \) in terms of the BRW is immediate. In Sects. 3 and 4, an orthogonal expansion of square integrable functionals of the BRW [elements of \(l^{2}\left( S_{n}\right) \)] in terms of Legendre polynomials is obtained, and a chaotic decomposition of \(l^{2}\left( S\right) \) is presented. These are the main results of this paper.

2 Banach Random Walk on \(S_{n}\)

Choose a point \(x_{1}\) in \(S_{1}=\left[ -1,1\right] \) randomly with density g. Then, the point \(\left( x_{1},0\right) \) is in \(S_{2}\). Choose \(x_{2}\) randomly in \(\left[ -\sqrt{1-x_{1}^{2}},\sqrt{1-x_{1}^{2}}\right] \) with density \(g\left( x_{2}/\sqrt{1-x_{1}^{2}}\right) /\sqrt{1-x_{1}^{2}}\). Then, \( \left( x_{1},x_{2},0\right) \) is in \(S_{3}\). Choose \(x_{3}\) randomly in \( \left[ -\sqrt{1-x_{1}^{2}-x_{2}^{2}},\sqrt{1-x_{1}^{2}-x_{2}^{2}}\right] \) with density \(g\left( x_{3}/\sqrt{1{-}x_{1}^{2}{-}x_{2}^{2}}\right) /\sqrt{ 1{-}x_{1}^{2}-x_{2}^{2}}\), etc. The sequence \(x_{1},\ldots ,x_{n}\) is random, and the probability density function corresponding to this sample is \(\rho _{n}\left( x_{1},\ldots ,x_{n}\right) \), as in the Banach integral. To check that it is a density, it is enough to show that

$$\begin{aligned} I_{n}\triangleq \int _{S_{n}}\rho _{n}\left( x\right) \hbox {d}x=1 \end{aligned}$$

for any \(n\ge 1\). Indeed, this holds for \(n=1\), and for \(n\ge 2\), we have

$$\begin{aligned} I_{n}&=\int _{S_{n-1}}\left[ \int _{-\sqrt{1-x_{1}^{2}-\cdots -x_{n-1}^{2}}}^{ \sqrt{1-x_{1}^{2}-\cdots -x_{n-1}^{2}}}\frac{g\left( x_{n}/\sqrt{ 1-x_{1}^{2}-\cdots -x_{n-1}^{2}}\right) }{\sqrt{1-x_{1}^{2}-\cdots -x_{n-1}^{2}}}\hbox {d}x_{n}\right] \\&\quad \times \rho _{n-1}\left( x_{1},\ldots ,x_{n-1}\right) \hbox {d}x_{1}\ldots \hbox {d}x_{n-1} \\&=I_{n-1}=I_{1}=\int _{-1}^{1}g\left( x_{1}\right) \hbox {d}x_{1}=1. \end{aligned}$$
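The construction and the normalization \(I_{n}=1\) are easy to check numerically. Below is a minimal NumPy sketch: the sampler covers the uniform (SBRW) case, and the quadrature check uses a nontrivial density \(g(t)=3(1-t^{2})/4\) with unit integral; both choices are ours, purely for illustration.

```python
import numpy as np

def banach_random_walk(n, rng=None):
    """Sample (x_1, ..., x_n) step by step: x_k is drawn uniformly from
    the interval of half-width sqrt(1 - x_1^2 - ... - x_{k-1}^2)."""
    rng = np.random.default_rng(rng)
    x = np.zeros(n)
    radius2 = 1.0                                # squared radius still available
    for k in range(n):
        s = np.sqrt(radius2)
        x[k] = rng.uniform(-s, s)
        radius2 = max(radius2 - x[k] ** 2, 0.0)  # guard against rounding
    return x

assert (banach_random_walk(10, rng=0) ** 2).sum() <= 1.0  # the path stays in S_10

# quadrature check of I_2 = 1 for a nontrivial g with integral 1,
# here g(t) = 3(1 - t^2)/4 (our choice, purely for illustration)
g = lambda t: 0.75 * (1.0 - t * t)
nodes, weights = np.polynomial.legendre.leggauss(40)
I2 = 0.0
for x1, w1 in zip(nodes, weights):
    s = np.sqrt(1.0 - x1 * x1)                   # half-width of the x2-slice
    x2 = s * nodes                               # nodes mapped to [-s, s]
    # Jacobian s from the affine map, 1/s from the conditional density
    I2 += w1 * g(x1) * np.sum((s * weights) * g(x2 / s) / s)
assert abs(I2 - 1.0) < 1e-10
```

The inner integral equals \(\int _{-1}^{1}g\left( u\right) \hbox {d}u=1\) after the substitution \(x_{2}=su\), which is exactly the telescoping step in the proof above.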

3 Legendre Polynomials

Legendre polynomials in one variable are defined by the formulae

$$\begin{aligned} L_{p}\left( t\right) = {\left\{ \begin{array}{ll} 1 &{} \text { for }p=0, \\ \frac{1}{2^{p}p!}\frac{\hbox {d}^{p}}{\hbox {d}t^{p}}\left( t^{2}-1\right) ^{p} &{} \text { for }p=1,2,\ldots . \end{array}\right. } \end{aligned}$$
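The Rodrigues formula can be evaluated directly with NumPy’s polynomial arithmetic. The sketch below checks the classical closed forms \(L_{2}(t)=(3t^{2}-1)/2\) and \(L_{3}(t)=(5t^{3}-3t)/2\), together with the standard normalization \(2^{-1}\int _{-1}^{1}L_{p}^{2}\,\hbox {d}t=1/(1+2p)\):

```python
import numpy as np
from math import factorial

def legendre_rodrigues(p):
    """L_p via the Rodrigues formula: (1 / (2^p p!)) d^p/dt^p (t^2 - 1)^p."""
    if p == 0:
        return np.polynomial.Polynomial([1.0])
    base = np.polynomial.Polynomial([-1.0, 0.0, 1.0]) ** p  # (t^2 - 1)^p
    return base.deriv(p) / (2 ** p * factorial(p))

L2, L3 = legendre_rodrigues(2), legendre_rodrigues(3)
assert np.allclose(L2.coef, [-0.5, 0.0, 1.5])       # (3t^2 - 1)/2
assert np.allclose(L3.coef, [0.0, -1.5, 0.0, 2.5])  # (5t^3 - 3t)/2

# orthogonality and normalization via Gauss-Legendre quadrature
nodes, weights = np.polynomial.legendre.leggauss(10)
assert abs(0.5 * np.sum(weights * L2(nodes) * L3(nodes))) < 1e-12
assert abs(0.5 * np.sum(weights * L2(nodes) ** 2) - 1.0 / 5.0) < 1e-12
```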

The polynomials are orthogonal:

$$\begin{aligned} 2^{-1}\int _{-1}^{1}L_{p}\left( t\right) L_{q}\left( t\right) \hbox {d}t= {\left\{ \begin{array}{ll} 0 &{} \text {if }p\ne q, \\ 1/(1+2p) &{} \text {if }p=q, \end{array}\right. } \end{aligned}$$

and \(\left\{ L_{p}\left( \cdot \right) :p=0,1,\ldots \right\} \) is a complete set in \(L^{2}\left[ -1,1\right] \). To extend these to the multivariate case, we introduce a mapping \(\varTheta :S_{n}\rightarrow \left[ -1,1\right] ^{n}\) by

$$\begin{aligned} \begin{aligned} y_{1}&=\varTheta _{1}\left( x\right) =x_{1}, \\ y_{2}&=\varTheta _{2}\left( x\right) =x_{2}/\sqrt{1-x_{1}^{2}}, \\ &\;\;\vdots \\ y_{n}&=\varTheta _{n}\left( x\right) =x_{n}/\sqrt{1-x_{1}^{2}-\cdots -x_{n-1}^{2}}, \end{aligned}\quad \begin{aligned} x_{1}&=\varTheta _{1}^{-1}\left( y\right) =y_{1}, \\ x_{2}&=\varTheta _{2}^{-1}\left( y\right) =y_{2}\sqrt{1-y_{1}^{2}}, \\ &\;\;\vdots \\ x_{n}&=\varTheta _{n}^{-1}\left( y\right) =y_{n}\sqrt{\left( 1-y_{1}^{2}\right) \ldots \left( 1-y_{n-1}^{2}\right) }, \end{aligned} \end{aligned}$$

and note that changing variables by means of \(y=\varTheta \left( x\right) \), we get

$$\begin{aligned}&\int _{S_{n}}\frac{\varPhi _{n}\left( x\right) \hbox {d}x}{2^{n}\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }} \\&\quad =2^{-n}\int _{\left[ -1,1\right] ^{n}}\varPsi _{n}\left( y\right) \hbox {d}y\quad (\text {where }\varPsi _{n}=\varPhi _{n}\circ \varTheta ^{-1}). \end{aligned}$$
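The maps \(\varTheta \) and \(\varTheta ^{-1}\) are straightforward to vectorize; the telescoping identity \(1-x_{1}^{2}-\cdots -x_{k-1}^{2}=\left( 1-y_{1}^{2}\right) \ldots \left( 1-y_{k-1}^{2}\right) \) makes the round trip exact. A NumPy sketch (function names are ours):

```python
import numpy as np

def theta(x):
    """y = Theta(x): y_k = x_k / sqrt(1 - x_1^2 - ... - x_{k-1}^2)."""
    x = np.asarray(x, dtype=float)
    # tails[k] = 1 - x_1^2 - ... - x_k^2, shifted so tails[0] = 1
    tails = 1.0 - np.concatenate(([0.0], np.cumsum(x ** 2)[:-1]))
    return x / np.sqrt(tails)

def theta_inv(y):
    """x = Theta^{-1}(y): x_k = y_k * sqrt((1 - y_1^2)...(1 - y_{k-1}^2))."""
    y = np.asarray(y, dtype=float)
    prods = np.concatenate(([1.0], np.cumprod(1.0 - y ** 2)[:-1]))
    return y * np.sqrt(prods)

y = np.random.default_rng(0).uniform(-1, 1, size=8)
assert np.allclose(theta(theta_inv(y)), y)   # round trip through S_8
```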

For a multi-index \(p=\left( p_{1},\ldots ,p_{n}\right) \), define

$$\begin{aligned} L_{n,p}\left( y\right) =\prod \limits _{i=1}^{n}L_{p_{i}}\left( y_{i}\right) , \end{aligned}$$

and note

$$\begin{aligned} 2^{-n}\int _{\left[ -1,1\right] ^{n}}L_{n,p}\left( y\right) L_{n,q}\left( y\right) \hbox {d}y= {\left\{ \begin{array}{ll} 0 &{} \text {if }p\ne q, \\ 1/\prod \nolimits _{i=1}^{n}\left( 1+2p_{i}\right) &{} \text {if }p=q. \end{array}\right. } \end{aligned}$$

This implies that the set \(\left\{ l_{n,p}\left( y\right) :p\in {{\mathbb {N}}}_{0}^{n}\right\} \), where \({{\mathbb {N}}}_{0}=\left\{ 0\right\} \cup {{\mathbb {N}}}\), \({{\mathbb {N}}}=\left\{ 1,2,\ldots \right\} \), and

$$\begin{aligned} l_{n,p}\left( y\right) =\sqrt{\prod \limits _{i=1}^{n}\left( 1+2p_{i}\right) } \,L_{n,p}\left( y\right) , \end{aligned}$$

is an orthonormal basis for \(L^{2}\left( \left[ -1,1\right] ^{n},2^{-n}\hbox {d}y\right) \), and any element \(\varPsi _{n}\) of this space has a unique orthogonal expansion

$$\begin{aligned} \begin{aligned} \varPsi _{n}\left( y\right) =\sum _{p\in {{\mathbb {N}}} _{0}^{n}}\psi _{p}l_{n,p}\left( y\right) , \end{aligned} \end{aligned}$$

where

$$\begin{aligned} \psi _{p}=2^{-n}\int _{\left[ -1,1\right] ^{n}}l_{n,p}\left( y\right) \varPsi _{n}\left( y\right) \hbox {d}y. \end{aligned}$$
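The expansion and its coefficients can be checked numerically on a small example. The sketch below uses NumPy’s built-in Legendre basis and our own test function \(\varPsi _{2}\left( y\right) =y_{1}^{2}y_{2}\) (a polynomial, so the expansion is finite and the quadrature is exact):

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

def l(p, t):
    """Normalized Legendre polynomial sqrt(1 + 2p) L_p(t)."""
    return np.sqrt(1 + 2 * p) * Legendre.basis(p)(t)

Psi = lambda y1, y2: y1 ** 2 * y2        # our test function (n = 2)
nodes, weights = leggauss(8)
Y1, Y2 = np.meshgrid(nodes, nodes, indexing="ij")
W = np.outer(weights, weights)

deg = 4
psi = np.zeros((deg, deg))
for p1 in range(deg):
    for p2 in range(deg):
        # psi_p = 2^{-2} * integral of l_{2,p}(y) Psi_2(y) over [-1,1]^2
        psi[p1, p2] = 0.25 * np.sum(W * l(p1, Y1) * l(p2, Y2) * Psi(Y1, Y2))

# reconstruct Psi_2 at a point from its (finitely many nonzero) coefficients
recon = sum(psi[p1, p2] * l(p1, 0.3) * l(p2, -0.7)
            for p1 in range(deg) for p2 in range(deg))
assert abs(recon - Psi(0.3, -0.7)) < 1e-12
```

Only \(\psi _{\left( 0,1\right) }\) and \(\psi _{\left( 2,1\right) }\) are nonzero here, since \(y^{2}=\left( 2L_{2}\left( y\right) +1\right) /3\) and \(y=L_{1}\left( y\right) \).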

4 Orthogonal Decomposition of \(l^{2}\left( S,{{\mathbb {P}}}\right) \)

The orthogonal decomposition of spaces of square integrable random variables dates back to Wiener [2] and was continued by Itô [3] for the continuous-time counterpart of the SBRW, namely the standard Wiener process. These results were applied to diffusion processes in [4, 5] and were recently extended to Lévy processes (see [6, 7] for instance). This line of research has several motivations, beginning with the usefulness of orthogonal representations for approximation and ending with applications in Malliavin calculus (see [8, 9] for instance) and stochastic analysis in general (see [10]). Our situation is different, since the BRW is neither Gaussian nor Markovian. Nevertheless, it arises naturally in Banach’s extension of the Lebesgue integral to abstract spaces. It is worth mentioning that Banach’s method uses functional analytic tools and “is not based on the notion of measure” (according to [1]).

Definition 1

We say that \(\varPhi \in l^{2}\left( S_{n}\right) \) if \(\varPhi :S_{n}\rightarrow {{\mathbb {R}}}\), and

$$\begin{aligned} \left\| \varPhi \right\| _{l^{2}\left( S_{n}\right) }^{2}=\int _{S_{n}} \frac{\varPhi ^{2}\left( x_{1},\ldots ,x_{n}\right) \hbox {d}x_{1}\ldots \hbox {d}x_{n}}{2^{n} \sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }}<\infty , \end{aligned}$$

and we say that \(\varPhi \in l^{2}\left( S\right) \) if \(\varPhi :S\rightarrow {{\mathbb {R}}}\), and

$$\begin{aligned} \left\| \varPhi \right\| _{l^{2}\left( S\right) }^{2}=\lim _{n\rightarrow \infty }\int _{S_{n}}\frac{\varPhi _{n}^{2}\left( x\right) \hbox {d}x_{1}\ldots \hbox {d}x_{n}}{ 2^{n}\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }}<\infty , \end{aligned}$$

where

$$\begin{aligned} \varPhi _{n}\left( x\right) =\varPhi \left( x_{1},\ldots ,x_{n},0,\ldots \right) =\left( \varPhi \circ \pi _{n}\right) \left( x\right) , \end{aligned}$$

and where \(\pi _{n}:{{\mathbb {R}}}^{{\mathbb {N}}}\rightarrow {{\mathbb {R}}}^{n}\) is the projection onto the first n coordinates.

Definition 2

Let \(\varOmega =\left[ -1,1\right] ^{{\mathbb {N}}}\), \({{\mathcal {F}}}=\bigotimes _{n=1}^{\infty }{{\mathcal {B}}}\left( \left[ -1,1\right] \right) \), where \( {{\mathcal {B}}}\left( \left[ -1,1\right] \right) \) is the Borel sigma field on \( \left[ -1,1\right] \), and \({{\mathbb {P}}}=\bigotimes _{n=1}^{\infty }\frac{1}{2} \lambda _{\left[ -1,1\right] }\), where \(\lambda _{\left[ -1,1\right] }\) is the one-dimensional Lebesgue measure restricted to \(\left[ -1,1\right] \). On \(\left( \varOmega ,{{\mathcal {F}}},{{\mathbb {P}}}\right) \) define \({{\mathbb {Y}}}=\left( Y_{1},Y_{2},\ldots \right) \), where \(Y_{i}\left( \omega \right) =\omega _{i}\), \(\omega =\left( \omega _{1},\omega _{2},\ldots \right) \in \varOmega \), i.e., the \( Y_{i}\) form a sequence of i.i.d. random variables uniformly distributed on \( \left[ -1,1\right] \), and \({{\mathbb {X}}}=\left( X_{1},X_{2},\ldots \right) \), where

$$\begin{aligned} X_{n}\left( \omega \right) =\omega _{n}\sqrt{\left( 1-\omega _{1}^{2}\right) \ldots \left( 1-\omega _{n-1}^{2}\right) }. \end{aligned}$$

Proposition 1

\(X^{n}=\left( X_{1},\ldots ,X_{n}\right) =\pi _{n}\left( {{\mathbb {X}}}\right) \), \(n=1,2,\ldots ,\) is an SBRW.

Proof

Since \(X^{n}=\varTheta ^{-1}\left( Y^{n}\right) \), where \(Y^{n}=\pi _{n}\left( {{\mathbb {Y}}}\right) \), for any bounded measurable \(f:{{\mathbb {R}}}^{n}\rightarrow {{\mathbb {R}}} \), we have

$$\begin{aligned} {{\mathbb {E}}}\left[ f\left( X^{n}\right) \right]= & {} {{\mathbb {E}}}\left[ f\circ \varTheta ^{-1}\left( Y^{n}\right) \right] \\= & {} 2^{-n}\int _{\left[ -1,1\right] ^{n}}f\circ \varTheta ^{-1}\left( y\right) \hbox {d}y \\= & {} \int _{S_{n}}\frac{f\left( x\right) \hbox {d}x}{2^{n}\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }}. \end{aligned}$$

\(\square \)
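Proposition 1 can be illustrated numerically: via the representation \(X_{2}=Y_{2}\sqrt{1-Y_{1}^{2}}\) with \(Y_{1},Y_{2}\) i.i.d. uniform, moments of the SBRW reduce to products of uniform moments, e.g., \({{\mathbb {E}}}\left[ X_{2}^{2}\right] ={{\mathbb {E}}}\left[ Y_{2}^{2}\right] {{\mathbb {E}}}\left[ 1-Y_{1}^{2}\right] =\frac{1}{3}\cdot \frac{2}{3}=\frac{2}{9}\). A Monte Carlo sketch (seed and sample size are our choices; the tolerance is statistical):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 400_000
y1 = rng.uniform(-1, 1, N)
y2 = rng.uniform(-1, 1, N)
x2 = y2 * np.sqrt(1.0 - y1 ** 2)   # second SBRW coordinate, X_2 = Y_2 sqrt(1 - Y_1^2)

# E[X_2^2] = E[Y_2^2] E[1 - Y_1^2] = (1/3)(2/3) = 2/9
assert abs((x2 ** 2).mean() - 2.0 / 9.0) < 0.005
```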

Definition 3

We say that a random variable \(F:\varOmega \rightarrow {{\mathbb {R}}}\) belongs to the space \(l^{2}\left( S_{n},{{\mathbb {P}}}_{n}\right) \) (respectively, \( l^{2}\left( S,{{\mathbb {P}}}\right) \)) if it is of the form

$$\begin{aligned} F=\varPhi _{n}( X^{n}) \quad (\text {resp. } F=\varPhi ( {{\mathbb {X}}})) \end{aligned}$$

and

$$\begin{aligned} \left\| F\right\| _{l^{2}\left( S_{n},{{\mathbb {P}}}_{n}\right) }^{2}= & {} {{\mathbb {E}}}_{{{\mathbb {P}}}_{n}}\left[ \varPhi _{n}\left( X^{n}\right) \right] ^{2}<\infty , \\ \text {(respectively, }\left\| F\right\| _{l^{2}\left( S,{{\mathbb {P}}} \right) }^{2}= & {} {{\mathbb {E}}}_{{\mathbb {P}}}\left[ \varPhi \left( {{\mathbb {X}}}\right) \right] ^{2}<\infty \text {).} \end{aligned}$$

There is an obvious correspondence: \(F\in l^{2}\left( S_{n},{{\mathbb {P}}} _{n}\right) \) iff \(\varPhi \in l^{2}\left( S_{n}\right) \), and \(F\in l^{2}\left( S,{{\mathbb {P}}}\right) \) iff \(\varPhi \in l^{2}\left( S\right) \).

Theorem 1

If \(\varPhi \in l^{2}\left( S\right) \), then

$$\begin{aligned} \varPhi \left( {{\mathbb {X}}}\right)= & {} \lim _{n\rightarrow \infty }\varPhi _{n}\left( X^{n}\right) \quad \text {in the norm of }l^{2}\left( S,{{\mathbb {P}}}\right) , \\ l^{2}\left( S,{{\mathbb {P}}}\right)= & {} \bigoplus _{i=0}^{\infty }{{\mathfrak {B}}} _{i}, \end{aligned}$$

where

$$\begin{aligned} \varPhi _{n}\left( X^{n}\right)&=\sum _{p\in {{\mathbb {N}}}_{0}^{n}}\psi _{p}l_{p_{1}}\left( X_{1}\right) \prod \limits _{i=2}^{n}l_{p_{i}}\left( X_{i}/\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{i-1}\right) ^{2}}\right) , \\ {{\mathfrak {B}}}_{i}&= {\left\{ \begin{array}{ll} {{\mathbb {R}}} &{} \text {for }i=0, \\ \overline{\mathbf {span}}\left\{ l_{p_{1}}\left( X_{1}\right) :p_{1}\in {{\mathbb {N}}}_{0}\right\} &{} \text {for }i=1, \\ \overline{\mathbf {span}}\left\{ l_{p_{i}}\left( X_{i}/\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{i-1}\right) ^{2}}\right) :p_{i}\in {{\mathbb {N}}}_{0}\right\} &{} \text {for }i\ge 2, \end{array}\right. } \end{aligned}$$

where \(\overline{\mathbf {span}}\left\{ \cdot \right\} \) stands for the closed linear span in \(l^{2}\left( S,{{\mathbb {P}}}\right) \).

Proof

Since \(l^{2}\left( S,{{\mathbb {P}}}\right) \) is a Hilbert space, to prove the first equality, it is enough to show that \(\varPhi _{n}\left( X^{n}\right) \), \( n=1,2,\ldots ,\) is a Cauchy sequence. Indeed, if \(n\le m\), we have \(\varPhi _{n}\left( X^{n}\right) =\varPhi _{m}\left( X^{n},0,\ldots ,0\right) \), hence

$$\begin{aligned}&\varPhi _{m}\left( X^{m}\right) -\varPhi _{m}\left( X^{n},0,\ldots ,0\right) \\&\quad =\sum _{p\in {{\mathbb {N}}_0^m}\backslash {{\mathbb {N}}_0^n}\times {\{0\}}^{m-n}}\psi _{p}l_{p_{1}}\left( X_{1}\right) \prod \limits _{i=2}^{m}l_{p_{i}}\left( \frac{X_{i}}{\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{i-1}\right) ^{2}}}\right) , \end{aligned}$$

where \(\psi _{p}=0\) for all multi-indices \(p=\left( p_{1},\ldots ,p_{n},0,\ldots \right) \); hence, by orthonormality of \(l_{p_{i}}\), we get

$$\begin{aligned} {{\mathbb {E}}}\left[ \varPhi _{n}\left( X^{n}\right) -\varPhi _{m}\left( X^{m}\right) \right] ^{2}=\sum _{p\in {{\mathbb {N}}_0^m}\backslash {{\mathbb {N}}_0^n}\times {\{0\}}^{m-n}}\psi _{p}^{2}\rightarrow 0\quad \text { as }n,m\rightarrow \infty . \end{aligned}$$

For the second equality, note that \(i\ne j\) implies

$$\begin{aligned}&{{\mathbb {E}}}\left[ l_{p_{i}}\left( \frac{X_{i}}{\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{i-1}\right) ^{2}}}\right) l_{p_{j}}\left( \frac{X_{j} }{\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{j-1}\right) ^{2}}} \right) \right] \\&\quad =\int _{S_{j}}\frac{l_{p_{i}}\left( \varTheta _{i}\left( x\right) \right) l_{p_{j}}\left( \varTheta _{j}\left( x\right) \right) \hbox {d}x}{2^{j}\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{j-1}^{2}\right) }}\quad \left( \text {if }i<j\right) \\&\quad =2^{-j}\int _{\left[ -1,1\right] ^{j}}l_{p_{i}}\left( y_{i}\right) l_{p_{j}}\left( y_{j}\right) \hbox {d}y \\&\quad =2^{-2}\int _{\left[ -1,1\right] }l_{p_{i}}\left( y_{i}\right) \hbox {d}y_{i}\int _{ \left[ -1,1\right] }l_{p_{j}}\left( y_{j}\right) \hbox {d}y_{j}=0, \end{aligned}$$

hence

$$\begin{aligned} 0&=\sum _{p_{i},q_{j}\in {{\mathbb {N}}}_{0}}\psi _{p_{i}}\psi _{q_{j}}{{\mathbb {E}}} \left[ l_{p_{i}}\left( \frac{X_{i}}{\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{i-1}\right) ^{2}}}\right) l_{q_{j}}\right. \\&\quad \times \left. \left( \frac{X_{j}}{\sqrt{ 1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{j-1}\right) ^{2}}}\right) \right] \\&={{\mathbb {E}}}\left[ {{\mathfrak {B}}}_{i}{{\mathfrak {B}}}_{j}\right] \end{aligned}$$

for \(i\ne j\). \(\square \)

The crucial argument used in the proof above will be repeated below to show stochastic independence of the renormalized walk.

Proposition 2

If \({{\mathbb {X}}}=\left( X_{1},X_{2},\ldots \right) \) is an SBRW on some probability space \(\left( \varXi ,\digamma ,{{\mathbb {Q}}}\right) \), then the random variables

$$\begin{aligned} Y_{n}=\frac{X_{n}}{\sqrt{1-\left( X_{1}\right) ^{2}-\cdots -\left( X_{n-1}\right) ^{2}}},\quad n=1,2,\ldots , \end{aligned}$$

are stochastically independent with uniform distribution on \(\left[ -1,1 \right] \). Consequently, all the Banach chaoses \({{\mathfrak {B}}}_{i}\), \( i=0,1,\ldots ,\) are stochastically independent.

Proof

Indeed, for all bounded Borel measurable \(f:{{\mathbb {R}}} \rightarrow {{\mathbb {R}}}\) and \(g:{{\mathbb {R}}}\rightarrow {{\mathbb {R}}}\), we have

$$\begin{aligned}&{{\mathbb {E}}}_{{\mathbb {Q}}}\left[ f\left( Y_{n}\right) g\left( Y_{m}\right) \right] \\&\quad =\int _{S_{n}}\frac{f\left( \varTheta _{n}\left( x\right) \right) g\left( \varTheta _{m}\left( x\right) \right) \hbox {d}x}{2^{n}\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }}\quad \left( n>m\right) \\&\quad =2^{-n}\int _{\left[ -1,1\right] ^{n}}f\left( y_{n}\right) g\left( y_{m}\right) \hbox {d}y \\&\quad =\int _{\left[ -1,1\right] }2^{-1}f\left( y_{n}\right) \hbox {d}y_{n}\int _{\left[ -1,1\right] }2^{-1}g\left( y_{m}\right) \hbox {d}y_{m} \\&\quad =\int _{S_{n}}\frac{f\left( \varTheta _{n}\left( x\right) \right) \hbox {d}x}{2^{n} \sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{n-1}^{2}\right) }}\int _{S_{m}}\\&\qquad \times \frac{g\left( \varTheta _{m}\left( x\right) \right) \hbox {d}x}{2^{m}\sqrt{\left( 1-x_{1}^{2}\right) \ldots \left( 1-x_{1}^{2}-\cdots -x_{m-1}^{2}\right) }} \\&\quad ={{\mathbb {E}}}_{{\mathbb {Q}}}\left[ f\left( Y_{n}\right) \right] {{\mathbb {E}}}_{ {\mathbb {Q}}}\left[ g\left( Y_{m}\right) \right] . \end{aligned}$$

\(\square \)
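Proposition 2 can be probed empirically: sample the SBRW slice by slice (without drawing the \(Y_{k}\) directly), recover the renormalized increments, and test uniform moments and pairwise decorrelation. This is a necessary, not sufficient, check for independence; the tolerances are statistical and the seed is fixed for reproducibility.

```python
import numpy as np

rng = np.random.default_rng(42)
N, n = 200_000, 5
Y = np.empty((N, n))
radius2 = np.ones(N)                       # squared radius left, per path
for k in range(n):
    s = np.sqrt(radius2)
    x = rng.uniform(-s, s)                 # X_k: uniform on the allowed slice
    Y[:, k] = x / s                        # renormalized increment Y_k
    radius2 = np.maximum(radius2 - x ** 2, 0.0)

# uniform-on-[-1,1] moments and pairwise decorrelation, up to MC error
assert abs(Y[:, 2].mean()) < 0.01
assert abs((Y[:, 2] ** 2).mean() - 1.0 / 3.0) < 0.01
assert abs((Y[:, 1] * Y[:, 4]).mean()) < 0.01
```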

Remark 1

For a purely deterministic mathematical object, namely a linear, nonnegative functional on \(l^{2}\left( S\right) \) expressed in the form of Banach’s extension of the Lebesgue integral, we found a deeply hidden random object, namely an SBRW, which is closely connected with it and can be used for its equivalent representation. This suggests a natural question: is it true that with nonnegative linear functionals defined on Banach spaces more general than \(l^{2}\left( S\right) \) and satisfying conditions (A)–(R) on the first page of [1], one can associate a stochastic process such that the functional is the expectation with respect to the probability measure induced by this process?