New estimation method for periodic autoregressive time series of order 1 with additive noise

The periodic behavior of real data can be manifested either in the time series itself or in its characteristics. One characteristic that often exhibits periodic behavior is the sample autocovariance function; in this case, one speaks of periodically correlated (PC) behavior. One of the main models exhibiting the PC property is the periodic autoregressive moving average (PARMA) model, a generalization of the classical autoregressive moving average (ARMA) process. However, for real data, the observed trajectory practically never corresponds to the ''pure'' model; it contains additional noise resulting from the measurement device or other external forces. Thus, in this paper we consider a model that is a sum of a periodic autoregressive (PAR) time series and additive noise with a finite-variance distribution. We present the main properties of the considered model, indicating its PC property. One of the main goals of this paper is to introduce a new estimation method for the considered model's parameters. The novel algorithm takes the additive noise in the model into account and can be viewed as a modification of the classical Yule–Walker algorithm, which utilizes the autocovariance function. We propose two versions of the new method, namely a classical and a robust one. The effectiveness of the proposed methodology is verified by Monte Carlo simulations, including a comparison with the classical Yule–Walker method. The approach proposed in this paper is universal and can be applied to any finite-variance model with additive noise.


Introduction
Many real data sets exhibit periodic behavior. The periodicity can be manifested in the time series itself or in its characteristics. One characteristic that often manifests periodic behavior is the sample autocovariance function; in this case, we speak of periodically correlated behavior, and the corresponding theoretical process is called periodically correlated (PC) or second-order cyclostationary. The idea of periodically correlated processes was initiated in [1,2] and then extensively developed by many authors [3][4][5]. One of the best-known PC models is the periodic autoregressive moving average (PARMA) time series [6,7]. In recent years, the PARMA models have been discussed in various directions [8][9][10][11][12][13][14][15]. They are treated as a generalization of the classical autoregressive moving average (ARMA) time series [16]; however, in the PARMA models the corresponding parameters are periodic, in contrast to the ARMA time series, where they are constant. The PARMA models can also be viewed as a special case of ARMA time series with time-dependent coefficients [17][18][19]. The classical definition of the PARMA models assumes that the innovations have a finite-variance distribution, e.g., the Gaussian distribution. The PC models (and PARMA time series) have been applied to various real problems, including mechanical systems [20,21], hydrology [6,22], climatology and meteorology [23,24], economics [25,26], medicine and biology [27,28] and many others.
In the literature, there are many known modifications of the classical PARMA models that are more adequate for specific behavior of the data. One of the best known is the replacement of the finite-variance innovations (mostly Gaussian distributed) by innovations with a non-Gaussian distribution. In general, one can consider PARMA models with an infinite-variance (e.g., heavy-tailed) distribution. Such models have been considered in various applications, e.g., finance [29], physics [30], the electricity market [31], technical diagnostics [32][33][34], geophysical science [35,36] and many others. See also the references where PARMA models with heavy-tailed behavior were considered from the theoretical point of view [37][38][39].
When one analyzes real measurement data with some specific behavior (like PC behavior) adequate for a given theoretical model, one may assume that the data are always disturbed by measurement noise. Thus, in practice we do not observe the trajectory of the ''pure'' model. Therefore, one may assume that the model under consideration is disturbed by additional noise related to the measurement device or to other sources influencing the observations. This problem was discussed, for instance, in [40], where the authors considered the ''pure'' fractional Brownian motion with additive noise. See also the references where a similar problem was considered for various models [41][42][43][44][45].
In this paper, we consider the model described by Eq. (1), that is, the classical periodic autoregressive (PAR) time series disturbed by additive noise with a finite-variance distribution. If the real data correspond to this model, the classical estimation methods for the PARMA models' parameters may not be effective. The situation is simpler when the additive noise has a small variance; in that case, the known algorithms are acceptable. However, when the level of the additive noise is noticeable, the classical algorithms need to be modified in order to account for the additional disturbances. Moreover, as shown in our simulation study, the distribution of the additive noise also influences the final results. We observe one type of behavior in the case of Gaussian additive noise and a completely different one when the additional noise consists of additive outliers, i.e., observations large in absolute value appearing in the data with a given probability. This problem in the context of the PARMA models was discussed in [46], where the authors proposed to use robust algorithms for the estimation of the PARMA models' parameters without changing the estimation procedures. See also [47][48][49][50][51][52][53][54].
In this paper, we go a step further and propose a modification of the classical algorithm for the estimation of the PARMA models' parameters that takes the additive noise into account. The classical estimation method for the PARMA models is based on the so-called Yule-Walker approach [4]. It utilizes the autocovariance function of the time series and, at the final step, replaces this classical measure of dependence by its empirical counterpart. The classical Yule-Walker algorithm is efficient for ''pure'' PARMA models. Moreover, it does not assume any specific distribution of the data; in this sense it is universal. However, when the model under consideration is described by the process given in Eq. (1), the classical Yule-Walker algorithm is not effective, especially for additive noise with a large variance.
The main goal of this paper is to introduce a general model that is a sum of a ''pure'' PAR time series and additive noise, and to demonstrate its main properties. We show that the considered model is still PC; however, it does not satisfy the PARMA equation. The second goal is to propose a simple modification of the classical estimation algorithm and to demonstrate its effectiveness for the considered model. Moreover, we introduce classical and robust versions of the modified Yule-Walker method and show their efficiency with respect to the distribution of the additive noise. The estimation results for the new algorithms are compared with the classical Yule-Walker approach, with the classical and robust estimators of the autocovariance function, for various distributions of the additive noise.
The rest of the paper is organized as follows. In Sect. 2, we introduce the considered model and present its main properties. Next, in Sect. 3, we propose a novel estimation algorithm for the considered model's parameters and present its classical and robust versions. In Sect. 4, using Monte Carlo simulations, we demonstrate the effectiveness of the proposed methodology and present a comparative study, where the results for the modified and classical Yule-Walker algorithms are discussed. The last section concludes the paper and outlines future work.

Model description
The model under consideration is defined as follows:

$$Y_t = X_t + Z_t, \quad t \in \mathbb{Z}, \quad (1)$$

where $\{X_t,\ t \in \mathbb{Z}\}$ is the periodic autoregressive time series of order 1 (PAR(1)) with period $T$ and finite-variance innovations (called later the finite-variance PAR(1) model), while $\{Z_t,\ t \in \mathbb{Z}\}$ is a sequence of independent random variables (called later the additive noise) with zero mean and constant variance $\sigma_Z^2$. We assume that the time series $\{X_t\}$ and $\{Z_t\}$ are independent. In the remainder of the paper, the model defined by Eq. (1) is called the finite-variance PAR(1) time series with additive noise.
In the following part, we recall the definition of the finite-variance PAR(1) model and the properties that guarantee the existence of its unique bounded solution. The PAR(1) time series is a special case of the periodic autoregressive moving average (PARMA) model, which is a periodic extension of the well-known ARMA (autoregressive moving average) system [16]. The finite-variance (also called second-order) PARMA($p, q$) time series is defined as follows.
Definition 1 [55] The sequence $\{X_t,\ t \in \mathbb{Z}\}$ is a second-order PARMA($p, q$) ($p, q \in \mathbb{N}$) model with period $T \in \mathbb{N}$ when it satisfies the following equation:

$$X_t - \sum_{i=1}^{p} \phi_i(t) X_{t-i} = \xi_t + \sum_{j=1}^{q} \theta_j(t) \xi_{t-j}. \quad (2)$$

In Eq. (2), the sequence $\{\xi_t,\ t \in \mathbb{Z}\}$ constitutes a sample of uncorrelated random variables with mean equal to zero. We assume that the variance of each $\xi_t$ is $\sigma_\xi^2(t)$. Moreover, the scalar sequences $\{\phi_i(t),\ i = 1, \ldots, p\}$, $\{\theta_j(t),\ j = 1, \ldots, q\}$ and $\sigma_\xi^2(t)$ are periodic in $t$ with the same period $T$.
Usually, it is assumed that $\{\xi_t\}$ is a sequence of Gaussian distributed random variables, and this is the case considered in this paper. However, all presented properties also hold for any finite-variance distribution. For $T = 1$, the finite-variance PARMA time series reduces to the classical finite-variance ARMA model [16].
When $p = 1$ and $q = 0$, the PARMA model given in Definition 1 is called the PAR(1) time series and is given by the following equation:

$$X_t - \phi(t) X_{t-1} = \xi_t. \quad (3)$$

In that case, the unique bounded (in the sense of the $l_2$ norm) solution of Eq. (3) exists if and only if:

$$\left| \prod_{t=1}^{T} \phi(t) \right| < 1, \quad (4)$$

and is given by [17]:

$$X_t = \sum_{k=-\infty}^{t} \Phi_{k+1}^{t} \, \xi_k, \quad (5)$$

where $\Phi_k^n = \prod_{r=k}^{n} \phi(r)$ with the convention $\Phi_k^n = 1$ when $k > n$, see also [39]. One of the main features of the finite-variance PARMA time series (and thus of its special case, the finite-variance PAR(1) model) is that it exhibits second-order cyclostationarity, called the periodically correlated (PC) property. We recall that a finite-variance time series $\{X_t,\ t \in \mathbb{Z}\}$ is periodically correlated with period $T \in \mathbb{N}$ if its mean and autocovariance functions are periodic in $t$ with period $T$, i.e., when the following conditions hold for any $s, t \in \mathbb{Z}$ [4]:

$$E(X_s) = E(X_{s+T}), \quad (6)$$

$$\mathrm{cov}(X_s, X_t) = \mathrm{cov}(X_{s+T}, X_{t+T}), \quad (7)$$

where $\mathrm{cov}(X_s, X_t) = E(X_s X_t) - E(X_s)E(X_t)$ is the autocovariance function of the time series $\{X_t\}$. Indeed, under the condition given in Eq. (4), using Eq. (5), the fact that the sequence $\{\xi_t\}$ constitutes a sample of uncorrelated random variables, and the fact that, for each $t \in \mathbb{Z}$, $E\xi_t = 0$ and $\mathrm{Var}(\xi_t) = \sigma_\xi^2(t)$, we obtain the following for $s \le t$:

$$E(X_t) = 0, \qquad \mathrm{cov}(X_s, X_t) = \sum_{k=-\infty}^{s} \Phi_{k+1}^{s} \, \Phi_{k+1}^{t} \, \sigma_\xi^2(k),$$

which is periodic in $(s, t)$ with period $T$. Interesting properties of the finite-variance PAR(1) time series in the time and frequency domains can be found, for instance, in [56].
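As an illustration, the recursion in Eq. (3) is straightforward to simulate. The sketch below is our own helper (not part of the paper): it approximates the bounded solution of Eq. (5) by discarding a burn-in period, and then inspects the PC property through the season-wise sample variances.

```python
import numpy as np

def simulate_par1(phi, n, sigma_xi=1.0, burn_in=500, rng=None):
    """Simulate X_t = phi(t) X_{t-1} + xi_t with period T = len(phi) and
    xi_t ~ N(0, sigma_xi^2). |phi(1)*...*phi(T)| < 1 (Eq. (4)) guarantees
    a unique bounded solution; the burn-in stands in for the infinite past."""
    rng = np.random.default_rng(rng)
    T = len(phi)
    assert abs(np.prod(phi)) < 1, "boundedness condition (4) violated"
    x = np.zeros(n + burn_in)
    xi = rng.normal(0.0, sigma_xi, n + burn_in)
    for t in range(1, n + burn_in):
        x[t] = phi[t % T] * x[t - 1] + xi[t]
    return x[burn_in:]  # drop the transient part

# Season-wise sample variances: for a PC series they repeat with period T
x = simulate_par1([0.6, 0.8], n=200_000, rng=0)
v = [np.var(x[s::2]) for s in range(2)]
```

For $\phi = (0.6, 0.8)$ and unit-variance Gaussian innovations, the two seasonal variances solve $c_X(v, 0) = \phi^2(v)\, c_X(v-1, 0) + 1$, giving roughly 1.77 and 2.13 (with the season labeling following the code's 0-based convention), and the sample values settle near these levels.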
Using the properties of the finite-variance PAR(1) model presented above, one can discuss similar properties of the model defined in Eq. (1). One can easily show that the time series $\{Y_t\}$ has mean equal to zero and autocovariance function given by:

$$\mathrm{cov}(Y_s, Y_t) = \mathrm{cov}(X_s, X_t) + \sigma_Z^2 \, I_{\{s = t\}}, \quad (8)$$

where $I_A$ is the indicator of the set $A$. Thus, from Eq. (8), one can conclude that the sequence $\{Y_t\}$ also exhibits the PC property. However, it does not satisfy the PAR(1) model given in Eq. (3). In the following remark, we make this statement precise.
Remark 1 Let the time series $\{Y_t,\ t \in \mathbb{Z}\}$ be defined as in Eq. (1), where the time series $\{X_t\}$ is the finite-variance PAR(1) model with period $T$ defined in Eq. (3) and $\{Z_t\}$ is a sequence of independent random variables with zero mean and finite variance $\sigma_Z^2$. Moreover, the time series $\{X_t\}$ and $\{Z_t\}$ are independent. Then, the time series $\{Y_t\}$ satisfies the following equation:

$$Y_t - \phi(t) Y_{t-1} = \xi_t + Z_t - \phi(t) Z_{t-1}. \quad (9)$$

Proof The proof of Eq. (9) follows directly from the definition of the time series $\{Y_t\}$ given in Eq. (1) and the definition of the finite-variance PAR(1) time series $\{X_t\}$ given in Eq. (3). Indeed, we have the following:

$$Y_t - \phi(t) Y_{t-1} = X_t + Z_t - \phi(t)\left( X_{t-1} + Z_{t-1} \right) = \left( X_t - \phi(t) X_{t-1} \right) + Z_t - \phi(t) Z_{t-1}.$$

And thus, using Eq. (3), we obtain:

$$Y_t - \phi(t) Y_{t-1} = \xi_t + Z_t - \phi(t) Z_{t-1},$$

which corresponds to the thesis. □

Using the form of the bounded solution for the PAR(1) time series given in Eq. (5), one can show that when the sequences $\{\xi_t\}$ and $\{Z_t\}$ are Gaussian distributed, the time series $\{Y_t\}$ is also Gaussian distributed.
In Fig. 1a, we present a sample trajectory of the PAR(1) time series with period $T = 10$ and exemplary values of the parameters (i.e., $\phi(1), \ldots, \phi(10)$), with a Gaussian distributed innovation sequence $\{\xi_t\}$ with zero mean and unit variance for each $t$. One can clearly see that the periodic behavior is observable at the level of the time series. In Fig. 1b-d, we present sample trajectories of the model defined in Eq. (1) with three different $\{Z_t\}$ sequences added to the trajectory presented in panel (a). In panel (b), we assume that $\{Z_t\}$ constitutes a sequence of independent random variables from the Gaussian distribution with zero mean and $\sigma_Z = 0.8$; in panel (c), we assume that for each $t$, $Z_t$ has Student's t distribution with $\nu = 3$ degrees of freedom ($t(\nu)$). The definition of Student's t distribution is presented in ''Appendix.'' In panel (d), we demonstrate a sample trajectory of the model (1) when $\{Z_t\}$ is a sequence of additive outliers (AO) with parameters $a = 10$ and $p = 0.02$ (AO($a, p$)). The definition of AO is also presented in ''Appendix.'' One can see that the main difference between the trajectories presented in panels (a)-(d) in Fig. 1 is related to the scale (amplitude) of the data.
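The three disturbance types can be mimicked as in the sketch below. This is our own illustration, not the paper's code: the period-10 coefficients are arbitrary (the paper's exact $\phi(1), \ldots, \phi(10)$ are not reproduced), and the AO($a, p$) mechanism — value $\pm a$ with probability $p$, zero otherwise — is our reading, since the paper's precise definition is in its Appendix.

```python
import numpy as np

# Exemplary PAR(1) trajectory with T = 10, in the spirit of Fig. 1a
rng = np.random.default_rng(1)
n, T = 1000, 10
phi = 0.5 + 0.4 * rng.random(T)   # |prod(phi)| < 1, so a bounded solution exists
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t % T] * x[t - 1] + rng.normal()

# Three additive-noise variants from the text
z_gauss = rng.normal(0.0, 0.8, n)              # Gaussian, sigma_Z = 0.8
z_student = rng.standard_t(df=3, size=n)       # Student's t, nu = 3
# AO(a, p), hypothetical reading: value +/- a with probability p, else zero
z_ao = 10.0 * rng.choice([-1, 1], n) * (rng.random(n) < 0.02)

y_gauss, y_student, y_ao = x + z_gauss, x + z_student, x + z_ao
```

The heavy-tailed and AO variants mainly change the scale (amplitude) of the trajectory, matching the qualitative difference between panels (a)-(d).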
For comparison, we present the sample variances (i.e., $\mathrm{cov}(Y_t, Y_t)$) calculated for $M = 10{,}000$ simulated trajectories of the model (1), see Fig. 2. As previously, we assume that $\{X_t\}$ is a PAR(1) time series with $T = 10$ and a Gaussian distributed innovation sequence with zero mean and unit variance. We took the same three distributions (with the same parameters) of the sequence $\{Z_t\}$ as discussed above, i.e., in panel (b) we present the Gaussian distribution case, in panel (c) the Student's t distribution case, and in panel (d) the AO case. One can see that, for the pure PAR(1) time series, the periodic behavior is clearly visible. The same holds for the Gaussian distributed case (panel (b)). For the Student's t distribution and AO cases, the periodic behavior is disturbed by the variance of the $\{Z_t\}$ sequence. Moreover, the scales in panels (c) and (d) differ from those presented in panels (a) and (b).

Estimation algorithms

In this section, we present the algorithms that can be used for the estimation of the analyzed model's parameters. First, we recall the classical method of estimation of the finite-variance PAR(1) model's parameters, namely the classical Yule-Walker algorithm [4]. This approach can be acceptable when the $\sigma_Z^2$ parameter in model (1) is relatively small. Then, we propose a modification of the classical Yule-Walker algorithm that takes the additive noise $\{Z_t\}$ in model (1) into account. Finally, we discuss the robust versions of the classical Yule-Walker algorithm and of the modified Yule-Walker method introduced in this paper, which utilize the robust estimator of the autocovariance function [57]. It is worth mentioning that none of the considered algorithms assumes a specific distribution of the innovations of the PAR(1) model or of the $\{Z_t\}$ sequence in model (1). The only assumption needed here is the finite variance of both sequences.
In the classical Yule-Walker approach dedicated to the finite-variance PAR(1) time series with period $T$, we proceed as follows. Let us note that each $t \in \mathbb{Z}$ can be represented as $t = nT + v$ with $n \in \mathbb{Z}$ and $v = 1, 2, \ldots, T$, so that Eq. (3) can be written in the following form:

$$X_{nT+v} = \phi(v) X_{nT+v-1} + \xi_{nT+v}. \quad (10)$$

Because $\{X_t\}$ is a PC sequence, we adopt the following notation for its autocovariance function:

$$c_X(v, k) = \mathrm{cov}\left( X_{nT+v}, X_{nT+v-k} \right). \quad (11)$$

Now, taking into consideration that the sequence $\{\xi_t\}$ has zero mean and assuming that the condition (4) holds, we multiply Eq. (10) by $X_{nT+v}$ and take the expected value of both sides. Thus, we obtain the following:

$$c_X(v, 0) - \phi(v) c_X(v, 1) = \sigma_\xi^2(v). \quad (12)$$

We repeat the above step by multiplying Eq. (10) by $X_{nT+v-1}$ and taking the expected value of both sides:

$$c_X(v, 1) = \phi(v) c_X(v-1, 0). \quad (13)$$

Thus, we obtain the following formulas for the $\phi(v)$ and $\sigma_\xi^2(v)$ parameters.
Remark 2 If $\{X_t,\ t \in \mathbb{Z}\}$ is the finite-variance PAR(1) time series with period $T$ given in Eq. (3), then the parameters of the model satisfy the following system of equations:

$$\phi(v) = \frac{c_X(v, 1)}{c_X(v-1, 0)}, \qquad \sigma_\xi^2(v) = c_X(v, 0) - \frac{c_X^2(v, 1)}{c_X(v-1, 0)}. \quad (14)$$

In the classical Yule-Walker approach, the theoretical autocovariance functions $c_X(\cdot, \cdot)$ in Eq. (14) are replaced by their empirical counterparts. In this paper, we consider two estimators of the autocovariance function.
If $X_1, X_2, \ldots, X_N$ is a sample realization of the finite-variance PAR(1) model with period $T$ given in Eq. (3), then the classical estimator of $c_X(v, k)$ for $v = 1, 2, \ldots, T$ and $k \in \mathbb{Z}$ is given by:

$$\hat{c}_X(v, k) = \frac{1}{r - l + 1} \sum_{n=l}^{r} X_{nT+v} X_{nT+v-k}, \quad (15)$$

where $l$ and $r$ are, respectively, the smallest and the largest integer $n$ (16) ensuring that both indices $nT + v$ and $nT + v - k$ are in the range $1, 2, \ldots, N$ for each $n$.
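A minimal sketch of the classical seasonal autocovariance estimator and of the resulting Yule-Walker estimate of $\phi(v)$ from Eq. (14) might look as follows; the helpers are our own (1-based season index $v$ as in the text, with $v - 1$ understood modulo $T$, and $k \ge 0$ assumed).

```python
import numpy as np

def c_hat(x, T, v, k):
    """Classical estimator of c_X(v, k) = cov(X_{nT+v}, X_{nT+v-k}) for a
    zero-mean PC series; v is the 1-based season, k >= 0 is assumed."""
    i = np.arange(v - 1, len(x), T)   # 0-based positions of season v
    i = i[i - k >= 0]                 # keep n with both indices inside the sample
    return float(np.mean(x[i] * x[i - k]))

def yule_walker_par1(x, T):
    """phi(v) = c_X(v, 1) / c_X(v-1, 0), i.e., Eq. (14); v - 1 taken modulo T."""
    phi = np.empty(T)
    for v in range(1, T + 1):
        prev = v - 1 if v > 1 else T
        phi[v - 1] = c_hat(x, T, v, 1) / c_hat(x, T, prev, 0)
    return phi

# Quick check on a pure PAR(1) path with T = 2 and phi = (0.6, 0.8) (Model 1)
rng = np.random.default_rng(0)
T, N = 2, 100_000
phi_true = [0.6, 0.8]
x = np.zeros(N)
for t in range(1, N):
    x[t] = phi_true[t % T] * x[t - 1] + rng.normal()
phi_est = yule_walker_par1(x, T)
```

On a long noise-free trajectory the estimates come close to the true $\phi(v)$, which is the regime where the classical algorithm is adequate.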
Besides the classical estimator of the sample autocovariance function, one can also consider its robust version. One example is the estimator presented in [57]. It is based on the following scale-based formula for the autocovariance:

$$\mathrm{cov}(X, Y) = \frac{1}{4}\left( \mathrm{Var}(X + Y) - \mathrm{Var}(X - Y) \right). \quad (17)$$

The robustness of the described algorithm comes from the application of the robust scale estimator proposed in [58]. For a sample $V = (V_1, \ldots, V_N)$, it is the following order statistic:

$$Q(V) = c \left\{ |V_i - V_j| : i < j \right\}_{(k)}, \qquad k = \binom{h}{2}, \quad h = \lfloor N/2 \rfloor + 1, \quad (18)$$

where $c$ is a constant chosen for consistency. For the Gaussian distribution, we set $c = 2.2191$. From Eqs. (17) and (18), we can obtain the following robust autocovariance estimator for a sample $X_1, \ldots, X_N$:

$$\tilde{c}_X(v, k) = \frac{1}{4}\left( Q^2(u + w) - Q^2(u - w) \right), \quad (19)$$

where

$$u = \left( X_{lT+v}, \ldots, X_{rT+v} \right), \qquad w = \left( X_{lT+v-k}, \ldots, X_{rT+v-k} \right), \quad (20)$$

with $l$ and $r$ defined as in Eq. (16). Let us note that the time complexity of computing $Q(V)$ from Eq. (18) is $O(N^2)$. However, one can compute it in $O(N \log N)$ time using the algorithm introduced in [59]. In the simulation study, its implementation in the MATLAB library LIBRA [60] was used.
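A naive $O(N^2)$ version of the $Q$ scale estimator and of the scale-based autocovariance can be sketched as below (our own code; for longer samples the $O(N \log N)$ algorithm of [59], e.g., as implemented in LIBRA, is preferable).

```python
import numpy as np

def qn_scale(v, c=2.2191):
    """Robust Q scale estimator, naive O(N^2) version of Eq. (18).
    c = 2.2191 makes it consistent for the standard deviation under Gaussianity."""
    v = np.asarray(v, dtype=float)
    N = len(v)
    h = N // 2 + 1
    k = h * (h - 1) // 2                  # order of the statistic: k = C(h, 2)
    i, j = np.triu_indices(N, k=1)        # all pairs i < j
    d = np.sort(np.abs(v[i] - v[j]))
    return c * d[k - 1]                   # k-th smallest pairwise distance

def robust_cov(u, w):
    """Scale-based autocovariance, Eqs. (17) and (19):
    cov(U, W) = (Q(u + w)^2 - Q(u - w)^2) / 4."""
    return (qn_scale(u + w) ** 2 - qn_scale(u - w) ** 2) / 4.0

# Sanity check on correlated Gaussian samples with cov(U, W) = 0.5 by construction
rng = np.random.default_rng(0)
u = rng.normal(size=2000)
w = 0.5 * u + np.sqrt(0.75) * rng.normal(size=2000)
rc = robust_cov(u, w)
```

The identity behind Eq. (17) is elementary, $\mathrm{Var}(X+Y) - \mathrm{Var}(X-Y) = 4\,\mathrm{cov}(X, Y)$; robustness comes entirely from replacing the variance by the squared $Q$ scale.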
In this paper, we consider two classical Yule-Walker algorithms: the classical Yule-Walker method (YW) and the robust classical Yule-Walker method (robust YW). The latter was introduced in [46].
In the second approach proposed in this paper, we introduce the modified Yule-Walker algorithm, which is strictly dedicated to the model defined in Eq. (1) and takes the additive noise into consideration. Similarly as for the pure PAR(1) time series, Eq. (9) can be written in the equivalent form:

$$Y_{nT+v} = \phi(v) Y_{nT+v-1} + \xi_{nT+v} + Z_{nT+v} - \phi(v) Z_{nT+v-1}, \quad (21)$$

for $n \in \mathbb{Z}$, $v = 1, 2, \ldots, T$. Because the time series $\{Y_t\}$ is PC, we adopt the notation:

$$c_Y(v, k) = \mathrm{cov}\left( Y_{nT+v}, Y_{nT+v-k} \right). \quad (22)$$

In the next step, we proceed analogously as in the classical Yule-Walker algorithm. First, we multiply Eq. (21) by $Y_{nT+v}$ and take the expected value of both sides of the obtained equation:

$$c_Y(v, 0) = \phi(v) c_Y(v, 1) + \sigma_\xi^2(v) + \sigma_Z^2. \quad (23)$$

In the second step, we multiply Eq. (21) by $Y_{nT+v-1}$ and take the expected value of both sides of the obtained equation:

$$c_Y(v, 1) = \phi(v)\left( c_Y(v-1, 0) - \sigma_Z^2 \right). \quad (24)$$

In the final step, we multiply Eq. (21) by $Y_{nT+v-2}$ and take the expected value of both sides of the obtained equation:

$$c_Y(v, 2) = \phi(v) c_Y(v-1, 1). \quad (25)$$

Taking into consideration Eqs. (23), (24) and (25), we obtain the following remark.
Remark 3 Let the time series $\{Y_t,\ t \in \mathbb{Z}\}$ be defined as in Eq. (1), where the time series $\{X_t\}$ is the finite-variance PAR(1) model with period $T$ defined in Eq. (3) and $\{Z_t\}$ is the additive noise, i.e., a sequence of uncorrelated random variables with zero mean and finite variance $\sigma_Z^2$. Moreover, the time series $\{X_t\}$ and $\{Z_t\}$ are independent. Then, the corresponding parameters satisfy the following system of equations:

$$\phi(v) = \frac{c_Y(v, 2)}{c_Y(v-1, 1)}, \qquad \sigma_Z^2 = c_Y(v-1, 0) - \frac{c_Y(v, 1)}{\phi(v)}, \qquad \sigma_\xi^2(v) = c_Y(v, 0) - \phi(v) c_Y(v, 1) - \sigma_Z^2. \quad (26)$$

Replacing the theoretical autocovariances in Eq. (26) by their classical or robust empirical counterparts yields, respectively, the modified Yule-Walker (MYW) and the robust modified Yule-Walker (robust MYW) estimators used in the sequel.
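The $\phi(v)$ relation of this system can be turned into an estimator by plugging in empirical autocovariances of the observed $\{Y_t\}$. The sketch below is our own (1-based seasons, $v - 1$ modulo $T$); it relies on the fact that $\sigma_Z^2$ enters $c_Y(v, k)$ only at lag $k = 0$, so the ratio of lag-2 to lag-1 autocovariances, $\phi(v) = c_Y(v, 2) / c_Y(v-1, 1)$ (the form also recalled in the simulation section), is free of the additive-noise variance.

```python
import numpy as np

def c_hat(y, T, v, k):
    """Classical estimator of c_Y(v, k) for a zero-mean PC series; v is 1-based."""
    i = np.arange(v - 1, len(y), T)
    i = i[i - k >= 0]
    return float(np.mean(y[i] * y[i - k]))

def modified_yule_walker(y, T):
    """phi(v) = c_Y(v, 2) / c_Y(v-1, 1): lags 1 and 2 are used instead of 0 and 1,
    so the sigma_Z^2 contribution (present only at lag 0) cancels out."""
    phi = np.empty(T)
    for v in range(1, T + 1):
        prev = v - 1 if v > 1 else T
        phi[v - 1] = c_hat(y, T, v, 2) / c_hat(y, T, prev, 1)
    return phi

# Usage: Model 1 (phi = (0.6, 0.8), T = 2) contaminated with Gaussian noise
rng = np.random.default_rng(0)
T, N = 2, 200_000
phi_true = [0.6, 0.8]
x = np.zeros(N)
for t in range(1, N):
    x[t] = phi_true[t % T] * x[t - 1] + rng.normal()
y = x + rng.normal(0.0, 0.8, N)       # additive noise, sigma_Z = 0.8
phi_est = modified_yule_walker(y, T)
```

Despite the noise level $\sigma_Z = 0.8$, the modified ratio recovers the true coefficients on a long trajectory, whereas the classical ratio $c_Y(v, 1) / c_Y(v-1, 0)$ would be shrunk by the $\sigma_Z^2$ term in its denominator.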

Simulation study
In this section, the methods presented in the previous section are compared using Monte Carlo simulations of the PAR(1) model with additive noise (see Eq. (1)). We focus on the estimation of the $\phi(t)$ coefficients. First, we analyze the case when $\{Z_t\}$ is Gaussian distributed. After that, the other mentioned types of additive noise (i.e., Student's t and AO) are considered. The simulations are performed for the two following set-ups, assuming $T = 2$:
• Model 1: $\phi(1) = 0.6$, $\phi(2) = 0.8$,
• Model 2: $\phi(1) = 0.1$, $\phi(2) = 0.8$.
Throughout the whole study, we set the constant variance of the innovations $\sigma_\xi^2(t) = 1$.

Gaussian additive noise
In our study, we simulate $M = 1000$ trajectories of length $N$ of Model 1 with additive noise from the Gaussian distribution with $\sigma_Z = 0.8$. The boxplots of the estimated values for all considered methods for the case of $N = 1000$ are presented in Fig. 3. One can see a clear difference between the two classical and the two modified versions of the Yule-Walker estimator. The results show that the former are significantly biased, unlike the latter, which take the presence of the additive noise into account. The boxplots for longer samples with $N = 10{,}000$, presented in Fig. 4, confirm these observations: although the variance of all methods is lower than in the previous case, the estimated values for both classical estimators (i.e., YW and robust YW) are still significantly different from the true ones.
To quantify the effectiveness of the analyzed estimators, let us calculate the parameter-wise mean squared errors:

$$\mathrm{MSE}\left( \hat{\phi}(t) \right) = \frac{1}{M} \sum_{i=1}^{M} \left( \hat{\phi}_i(t) - \phi(t) \right)^2, \quad (27)$$

where $\hat{\phi}_i(t)$ denotes the estimate obtained from the $i$-th simulated trajectory. Their values for both $\phi(t)$ parameters and both considered lengths $N$ are presented in Table 1. The results confirm the advantage of the introduced modified approaches. In particular, the MYW estimator seems to be the most effective. As mentioned before, this is caused by the significant bias of the classical Yule-Walker estimators and its absence in the modified algorithms, which can be seen in the bias results presented in Table 2. The bias is calculated as the difference between the median of the estimated values and the corresponding true value:

$$\mathrm{Bias}\left( \hat{\phi}(t) \right) = \mathrm{med}\left( \hat{\phi}_1(t), \ldots, \hat{\phi}_M(t) \right) - \phi(t). \quad (28)$$

Let us note that the performance of the classical estimators is visibly worse for the larger value $\phi(2) = 0.8$ than for $\phi(1) = 0.6$. Now, let us consider Model 2. Here, the only change in comparison with the previous case is that we set $\phi(1) = 0.1$, which is a value closer to zero than in Model 1. We perform the same experiment as before. The boxplots of the estimated values for $N = 1000$ are presented in Fig. 5. For the case of $\phi(1)$, the observed behavior is similar to the one seen for Model 1; in particular, the YW and robust YW methods are visibly biased. However, in this case the bias is not as significant as before, since the true value is close to zero. On the other hand, both the MYW and robust MYW methods once again seem to be unbiased, although with a larger variance than the classical estimators.
However, the results for $\phi(2) = 0.8$ show that the introduced methods may significantly fail, regardless of the autocovariance estimator used. To explain this behavior, let us recall the form of the proposed estimator that accounts for the additive noise: $\phi(2) = c_Y(2, 2) / c_Y(1, 1)$. It turns out that when $\phi(1)$ is close to zero, so is the denominator of the expression for $\phi(2)$. Hence, even small errors in the estimation of the denominator may result in large errors in the estimation of $\phi(2)$. The results for $N = 10{,}000$, illustrated by the boxplots in Fig. 6, show that for longer samples this drawback of the modified Yule-Walker methods is mitigated and, because of their unbiasedness, they prevail also in this case. The parameter-wise mean squared errors and biases for this set of simulations are presented in Tables 3 and 4, respectively. In particular, let us note that even for $\phi(2)$ and $N = 1000$, where the introduced methods are prone to large errors, they have a much less significant bias than the classical Yule-Walker algorithms. In Tables 1-4, the best results are marked in bold.

Next, we examine how the performance depends on the scale of the additive noise. For a range of $\sigma_Z$ values, we calculate the MSE aggregated over both estimated parameters (see Eq. (27)). Moreover, we calculate the mean absolute bias (MAB), which is the average of the absolute values of the biases for both estimated parameters (see Eq. (28)). The results for Model 1 are presented in Fig. 7. In the left panel, one can see that for low values of $\sigma_Z$, when the additive noise can be considered as not significantly relevant, the YW and robust YW algorithms yield slightly better results than the modified versions.
However, for larger $\sigma_Z$, the advantage of the latter increases rapidly and is clearly visible for $\sigma_Z > 0.5$. As before, this is caused by the bias of the classical methods. Their MAB values, as one can see in the right panel of Fig. 7, grow with the scale of the additive noise, whereas for the modified estimators they seem to stay at a constant, low level. This observation is consistent with the fact that the proposed methods take the additive noise in the model into account. Among these two estimators, the MYW performs better in this case. In Fig. 8, the results for Model 2 are illustrated. As shown in the boxplots described before, for this set of $\phi(t)$ parameters, the introduced modified methods may yield large errors, which is confirmed by the plot of the MSE values in the left panel of Fig. 8. Nevertheless, for larger $\sigma_Z$, their MAB values are still much closer to zero than those for the classical estimators, as one can see in the right panel of Fig. 8.
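A compact Monte Carlo experiment in the spirit of this comparison can be sketched as below. It is our own illustration, with $M$ and $N$ reduced relative to the paper's study, covering only Model 1 with Gaussian noise and only the non-robust YW and MYW variants.

```python
import numpy as np

def c_hat(y, T, v, k):
    """Classical estimator of c_Y(v, k); v is the 1-based season."""
    i = np.arange(v - 1, len(y), T)
    i = i[i - k >= 0]
    return float(np.mean(y[i] * y[i - k]))

def estimate(y, T, modified):
    phi = np.empty(T)
    for v in range(1, T + 1):
        p = v - 1 if v > 1 else T
        if modified:   # phi(v) = c_Y(v, 2) / c_Y(v-1, 1), noise-aware
            phi[v - 1] = c_hat(y, T, v, 2) / c_hat(y, T, p, 1)
        else:          # phi(v) = c_Y(v, 1) / c_Y(v-1, 0), classical YW
            phi[v - 1] = c_hat(y, T, v, 1) / c_hat(y, T, p, 0)
    return phi

rng = np.random.default_rng(0)
T, N, M = 2, 2000, 200
phi_true = np.array([0.6, 0.8])           # Model 1
est = {"yw": [], "myw": []}
for _ in range(M):
    x = np.zeros(N)
    xi = rng.normal(size=N)
    for t in range(1, N):
        x[t] = phi_true[t % T] * x[t - 1] + xi[t]
    y = x + rng.normal(0.0, 0.8, N)       # Gaussian additive noise, sigma_Z = 0.8
    est["yw"].append(estimate(y, T, False))
    est["myw"].append(estimate(y, T, True))

# Parameter-wise MSE (Eq. (27)) and median-based bias (Eq. (28))
mse = {m: np.mean((np.array(e) - phi_true) ** 2, axis=0) for m, e in est.items()}
bias = {m: np.median(np.array(e), axis=0) - phi_true for m, e in est.items()}
```

Under the model's theoretical autocovariances, the classical YW estimate of $\phi(2)$ concentrates around $\phi(2)\, c_X(1, 0) / (c_X(1, 0) + \sigma_Z^2) \approx 0.59$ rather than $0.8$, while the modified estimator stays approximately unbiased at the price of a larger variance, mirroring the boxplots discussed above.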

Other types of additive noise
In this part, we discuss the effectiveness of the considered algorithms for additive noise that is not Gaussian distributed. Here, we perform simulations only for Model 1. First, let us consider the case when $\{Z_t\}$ has Student's t distribution with $\nu = 3$ degrees of freedom. As before, we simulate $M = 1000$ trajectories of length $N = 1000$ of Model 1 with such additive noise and perform the estimation. The boxplots of the estimated values are presented in Fig. 9. Once again, one can see the significant advantage of the modified Yule-Walker methods. Even though the variance of the YW and robust YW estimators is relatively low, they are still biased, even more strongly than in the corresponding Gaussian additive noise case. Let us note that in this case the robust MYW method is better than its counterpart based on the classical autocovariance estimator, as it copes better with the outliers resulting from the heavy-tailed Student's t distribution of the additive noise. The MSE and MAB results for this case are presented in Table 5.
Let us now turn to another type of additive noise. We perform the same set of simulations as before, but with $\{Z_t\}$ being additive outliers with $a = 10$ and $p = 0.02$. The boxplots of the estimated values in this case are presented in Fig. 10. The MSE and MAB results can be found in Table 5. First of all, one can see that now the robust YW method (which was introduced specifically for the presence of additive outliers, see [46]) performs very well. Although it is still slightly biased, it achieves a lower MSE than the MYW method because of its much lower variance. However, once again, the best results are obtained for the robust MYW method. Let us mention that this estimator combines the traits of both previously mentioned methods: it takes the presence of the additive noise into account in its derivation (hence, it has a low bias) and is able to handle the outliers (thus, it has a lower variance than the plain MYW).

Conclusions
In this paper, the PAR(1) time series with additive noise was considered. The analyzed model is more practical than the ''pure'' model, as it takes into consideration the possible noise of the measurement device that influences the real observations. The considered model shares important properties with the ''pure'' PAR(1) time series; in particular, it is still periodically correlated, although it does not satisfy the PAR(1) equation. The future study will be related to a more extensive discussion of the new estimator's properties, like its limiting distribution. Moreover, the obtained theoretical results will be extended to the general PARMA model. A comparison with the other existing methods will be performed, taking into account the robust versions of the known algorithms. Moreover, other distributions need to be discussed. A new area of interest is also the analysis of the model with additive noise that is itself described by some time series (e.g., an ARMA model). This aspect has great practical potential, as such behavior is observed in many real trajectories. The last point that needs to be mentioned is the consideration of models with infinite-variance distributions.
Funding The work of A.W. was supported by the National Science Centre, Poland, under Opus Grant 2020/37/B/HS4/00120 ''Market risk model identification and validation using novel statistical, probabilistic, and machine learning tools.'' Code availability Code is available upon request.

Declarations
Conflict of interest The authors declare no conflict of interest.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.