Background

Probability theory, founded by Kolmogorov in 1933, has been a crucial tool for modeling indeterminate phenomena when the probability distributions of the possible events are available. However, for economic or technological reasons, we often cannot obtain the sample data from which a probability distribution could be estimated statistically. In this case, we have to invite domain experts to evaluate their belief degrees. Since human beings tend to overweight unlikely events, a belief degree usually has a much wider range than the real frequency (Kahneman and Tversky [1]). As a result, it cannot be treated as a probability; otherwise counterintuitive results may follow. An extreme counterexample was given by Liu [2].

In order to model the belief degree, uncertainty theory was founded by Liu [3] and refined by Liu [4] based on the normality, duality, subadditivity, and product axioms. So far, it has been applied to many areas and has produced many branches, such as uncertain programming (Liu [5]), uncertain risk analysis (Liu [6]), uncertain inference (Liu [7]), uncertain logic (Liu [8]), and uncertain statistics (Liu [4]).

In order to describe the evolution of an uncertain phenomenon, Liu [9] proposed the concept of an uncertain process. Then Liu [10] designed the Liu process, an uncertain process with stationary and independent normal uncertain increments, and founded uncertain calculus to deal with the integral and differential of an uncertain process with respect to a Liu process. Liu and Yao [11] extended the uncertain integral from a single Liu process to multiple ones. Besides, Chen and Ralescu [12] founded uncertain calculus with respect to a general Liu process. As a complement, Yao [13] founded uncertain calculus with respect to an uncertain renewal process.

Uncertain differential equations were first proposed by Liu [9] as a type of differential equation driven by a Liu process. Chen and Liu [14] gave an analytic solution for linear uncertain differential equations. Following that, Liu [15] and Yao [16] gave methods for solving two types of nonlinear uncertain differential equations. Then Yao and Chen [17] proposed a numerical method for solving uncertain differential equations. As extensions, uncertain differential equations with jumps were proposed by Yao [13], and uncertain delayed differential equations were studied by Barbacioru [18], Liu and Fei [19], and Ge and Zhu [20], among others. In addition, backward uncertain differential equations were proposed by Ge and Zhu [21].

Due to the paradox of stochastic finance theory (Liu [22]), Liu [10] presented an uncertain stock model via an uncertain differential equation and gave its European option pricing formulas, opening the door to uncertain finance theory. Then Chen [23] derived the American option pricing formulas for the stock model. After that, Peng and Yao [24] presented a mean-reverting uncertain stock model, and Chen et al. [25] proposed an uncertain stock model with periodic dividends. Besides, Chen and Gao [26] proposed an uncertain interest rate model, and Liu et al. [27] proposed an uncertain currency model. In addition, Zhu [28] applied uncertain differential equations to optimal control problems.

Alongside the many applications of uncertain differential equations, the study of the properties of their solutions has also developed well. Chen and Liu [14] gave a sufficient condition for an uncertain differential equation to have a unique solution. Then Gao [29] weakened the condition. After that, Yao et al. [30] gave a sufficient condition for an uncertain differential equation to be stable.

In this paper, we consider the extreme values, first hitting time, and time integral of the solution of an uncertain differential equation. The rest of this paper is organized as follows. In the section of Preliminary, we review some basic concepts of uncertain variables, uncertain processes and uncertain differential equations. After that, we study the extreme values of the solution of an uncertain differential equation and give their uncertainty distributions in the section of Extreme values. Then, by the relationship between the first hitting time and the extreme values, we give the uncertainty distribution of the first hitting time of the solution in the section of First hitting time. Following that, we consider the time integral of the solution and give its inverse uncertainty distribution in the section of Integral. At last, some remarks are given in the section of Conclusions.

Preliminary

In this section, we first review some basic concepts and results of uncertainty theory. Then we introduce the concepts of uncertain process, uncertain calculus, and uncertain differential equation.

Uncertainty theory

Definition 1.

(Liu [3]) Let $ℒ$ be a σ-algebra on a nonempty set Γ. A set function $ℳ:ℒ\to \left[0,1\right]$ is called an uncertain measure if it satisfies the following axioms:

Axiom 1: (Normality Axiom) $ℳ\left\{\mathrm{\Gamma }\right\}=1$ for the universal set Γ.

Axiom 2: (Duality Axiom) $ℳ\left\{\mathrm{\Lambda }\right\}+ℳ\left\{{\mathrm{\Lambda }}^{c}\right\}=1$ for any event Λ.

Axiom 3: (Subadditivity Axiom) For every countable sequence of events Λ 1,Λ 2,⋯, we have

$ℳ\left\{\bigcup _{i=1}^{\infty }{\mathrm{\Lambda }}_{i}\right\}\le \sum _{i=1}^{\infty }ℳ\left\{{\mathrm{\Lambda }}_{i}\right\}.$

Besides, in order to provide the operational law, Liu [10] defined the product uncertain measure on the product σ-algebra $\mathcal{L}$ as follows.

Axiom 4: (Product Axiom) Let $\left({\mathrm{\Gamma }}_{k},{\mathcal{L}}_{k},{ℳ}_{k}\right)$ be uncertainty spaces for k = 1,2,⋯. Then the product uncertain measure $ℳ$ is an uncertain measure satisfying

$ℳ\left\{\prod _{k=1}^{\infty }{\mathrm{\Lambda }}_{k}\right\}=\underset{k=1}{\overset{\infty }{\wedge }}{ℳ}_{k}\left\{{\mathrm{\Lambda }}_{k}\right\}$

where Λ k  are arbitrarily chosen events from ${\mathcal{L}}_{k}$ for k = 1,2,⋯, respectively.

An uncertain variable is essentially a measurable function from an uncertainty space to the set of real numbers. In order to describe an uncertain variable, a concept of uncertainty distribution is defined as follows.

Definition 2.

(Liu [3]) The uncertainty distribution of an uncertain variable ξ is defined by

$\mathrm{\Phi }\left(x\right)=ℳ\left\{\xi \le x\right\}$

for any $x\mathrm{\in }\mathfrak{R}.$

Expected value is regarded as the average value of an uncertain variable in the sense of uncertain measure.

Definition 3.

(Liu [3]) The expected value of an uncertain variable ξ is defined by

$E\left[\xi \right]={\int }_{0}^{+\infty }ℳ\left\{\xi \ge x\right\}\mathrm{d}x-{\int }_{-\infty }^{0}ℳ\left\{\xi \le x\right\}\mathrm{d}x$

provided that at least one of the two integrals exists.

Assuming ξ has an uncertainty distribution Φ, Liu [3] proved that the expected value of ξ is

$E\left[\xi \right]={\int }_{0}^{+\infty }\left(1-\mathrm{\Phi }\left(x\right)\right)\mathrm{d}x-{\int }_{-\infty }^{0}\mathrm{\Phi }\left(x\right)\mathrm{d}x.$

The inverse function Φ−1 of the uncertainty distribution Φ of uncertain variable ξ is called the inverse uncertainty distribution of ξ if it exists and is unique for each α ∈ (0,1). Inverse uncertainty distribution plays a crucial role in operations of independent uncertain variables.

Definition 4.

(Liu [10]) The uncertain variables ξ 1,ξ 2,⋯,ξ n  are said to be independent if

$ℳ\left\{\bigcap _{i=1}^{n}\left({\xi }_{i}\in {B}_{i}\right)\right\}=\underset{i=1}{\overset{n}{\wedge }}ℳ\left\{{\xi }_{i}\in {B}_{i}\right\}$

for any Borel sets B 1,B 2,⋯,B n  of real numbers.

Theorem 1.

(Liu [4]) Let ξ 1,ξ 2,⋯,ξ n  be independent uncertain variables with uncertainty distributions Φ 1,Φ 2,⋯,Φ n , respectively. If f(x 1,x 2,⋯,x n ) is strictly increasing with respect to x 1,x 2,⋯,x m  and strictly decreasing with respect to x m+1,x m+2,⋯,x n , then ξ = f(ξ 1,ξ 2,⋯,ξ n ) is an uncertain variable with an inverse uncertainty distribution

${\mathrm{\Phi }}^{-1}\left(\alpha \right)=f\left({\mathrm{\Phi }}_{1}^{-1}\left(\alpha \right),\cdots \phantom{\rule{0.3em}{0ex}},{\mathrm{\Phi }}_{m}^{-1}\left(\alpha \right),{\mathrm{\Phi }}_{m+1}^{-1}\left(1-\alpha \right),\cdots \phantom{\rule{0.3em}{0ex}},{\mathrm{\Phi }}_{n}^{-1}\left(1-\alpha \right)\right).$

Theorem 2.

(Liu and Ha [31]) Let ξ 1,ξ 2,⋯,ξ n  be independent uncertain variables with uncertainty distributions Φ 1,Φ 2,⋯,Φ n , respectively. If f(x 1,x 2,⋯,x n ) is strictly increasing with respect to x 1,x 2,⋯,x m  and strictly decreasing with respect to x m+1,x m+2,⋯,x n , then the expected value of the uncertain variable ξ = f(ξ 1,ξ 2,⋯,ξ n ) is

$E\left[\xi \right]={\int }_{0}^{1}f\left({\mathrm{\Phi }}_{1}^{-1}\left(\alpha \right),\cdots \phantom{\rule{0.3em}{0ex}},{\mathrm{\Phi }}_{m}^{-1}\left(\alpha \right),{\mathrm{\Phi }}_{m+1}^{-1}\left(1-\alpha \right),\cdots \phantom{\rule{0.3em}{0ex}},{\mathrm{\Phi }}_{n}^{-1}\left(1-\alpha \right)\right)\mathrm{d}\alpha .$
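As a quick numerical illustration, the α-integral in Theorem 2 can be evaluated by the midpoint rule. The sketch below is a minimal illustration, not part of the original paper; it assumes the standard closed form for the inverse uncertainty distribution of a normal uncertain variable N(e,σ), namely e + (σ√3/π)ln(α/(1 − α)), and the helper names are hypothetical.

```python
import math

def normal_inv(e, sigma):
    # inverse uncertainty distribution of a normal uncertain variable N(e, sigma)
    return lambda a: e + (sigma * math.sqrt(3) / math.pi) * math.log(a / (1 - a))

def expected_value(f, inc_invs, dec_invs, n=20000):
    # midpoint rule for Theorem 2: increasing arguments take alpha,
    # decreasing arguments take 1 - alpha
    total = 0.0
    for k in range(n):
        a = (k + 0.5) / n
        args = [F(a) for F in inc_invs] + [F(1 - a) for F in dec_invs]
        total += f(*args)
    return total / n

# E[xi1 - xi2] with xi1 ~ N(3, 1) (increasing) and xi2 ~ N(1, 2) (decreasing);
# the exact expected value is 3 - 1 = 2
ev = expected_value(lambda x1, x2: x1 - x2, [normal_inv(3, 1)], [normal_inv(1, 2)])
```

The midpoints are symmetric about α = 1/2, so the logarithmic terms cancel and the rule recovers the exact value up to rounding error.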

Uncertain process

In order to model the evolution of uncertain phenomena, the uncertain process was proposed by Liu [9] as a sequence of uncertain variables indexed by time or space.

Definition 5.

(Liu [9]) Let T be an index set, and let $\left(\mathrm{\Gamma },ℒ,ℳ\right)$ be an uncertainty space. An uncertain process is a measurable function from $T×\left(\mathrm{\Gamma },ℒ,ℳ\right)$ to the set of real numbers, i.e., for each t ∈ T and any Borel set B of real numbers, the set

$\left\{{X}_{t}\in B\right\}=\left\{\gamma |{X}_{t}\left(\gamma \right)\in B\right\}$

is an event.

Definition 6.

Let X t  be an uncertain process and let z be a given level. Then the uncertain variable

${\tau }_{z}=\text{inf}\left\{t\ge 0|{X}_{t}=z\right\}$

is called the first hitting time that X t  reaches the level z.

The independent increment uncertain process is an important type of uncertain process. Its formal definition is given below.

Definition 7.

(Liu [9]) An uncertain process is said to have independent increments if

${X}_{{t}_{0}},{X}_{{t}_{1}}-{X}_{{t}_{0}},{X}_{{t}_{2}}-{X}_{{t}_{1}},\cdots \phantom{\rule{0.3em}{0ex}},{X}_{{t}_{k}}-{X}_{{t}_{k-1}}$

are independent uncertain variables where t 0 is the initial time and t 1,t 2,⋯,t k  are any times with t 0 < t 1 < ⋯ < t k .

For a sample-continuous independent increment process X t , Liu [32] proved the following extreme value theorems:

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}{X}_{t}\le x\right\}=\underset{0\le t\le s}{\text{inf}}ℳ\left\{{X}_{t}\le x\right\},$
$ℳ\left\{\underset{0\le t\le s}{\text{inf}}{X}_{t}\le x\right\}=\underset{0\le t\le s}{\text{sup}}ℳ\left\{{X}_{t}\le x\right\}.$

Definition 8.

(Liu [9]) An uncertain process is said to have stationary increments if, for any given t > 0, the increments X t+s  − X s  are identically distributed uncertain variables for all s > 0.

Uncertain calculus

Definition 9.

(Liu [10]) An uncertain process C t  is said to be a canonical Liu process if

(i) C 0 = 0 and almost all sample paths are Lipschitz continuous,

(ii) C t  has stationary and independent increments,

(iii) every increment C s+t  − C s  is a normal uncertain variable with expected value 0 and variance $t^{2}$, whose uncertainty distribution is

${\mathrm{\Phi }}_{t}\left(x\right)={\left(1+\text{exp}\left(-\frac{\pi x}{\sqrt{3}t}\right)\right)}^{-1},\phantom{\rule{1em}{0ex}}x\in \mathfrak{R}.$

Definition 10.

(Liu [10]) Let X t  be an uncertain process and C t  be a canonical Liu process. For any partition of closed interval [a,b] with a = t 1 < t 2 < ⋯ < t k+1 = b, the mesh is written as

$\mathrm{\Delta }=\underset{1\le i\le k}{\text{max}}|{t}_{i+1}-{t}_{i}|.$

Then the Liu integral of X t  is defined by

$\underset{a}{\overset{b}{\int }}{X}_{t}\mathrm{d}{C}_{t}=\underset{\mathrm{\Delta }\to 0}{\text{lim}}\sum _{i=1}^{k}{X}_{{t}_{i}}·\left({C}_{{t}_{i+1}}-{C}_{{t}_{i}}\right)$

provided that the limit exists almost surely and is finite. In this case, the uncertain process X t  is said to be Liu integrable.

Definition 11.

(Liu [10]) Let C t  be a canonical Liu process, and μ s  and σ s  be two uncertain processes. Then the uncertain process

${Z}_{t}={Z}_{0}+{\int }_{0}^{t}{\mu }_{s}\mathrm{d}s+{\int }_{0}^{t}{\sigma }_{s}\mathrm{d}{C}_{s}$

is called a Liu process with drift μ t  and diffusion σ t . The differential form of Liu process is written as

$\mathrm{d}{Z}_{t}={\mu }_{t}\mathrm{d}t+{\sigma }_{t}\mathrm{d}{C}_{t}.$

Theorem 3.

(Liu [10]) (Fundamental Theorem of Uncertain Calculus) Let C t  be a canonical Liu process, and h(t,c) be a continuously differentiable function. Then Z t  = h(t,C t ) is a Liu process with

$\mathrm{d}{Z}_{t}=\frac{\partial h}{\partial t}\left(t,{C}_{t}\right)\mathrm{d}t+\frac{\partial h}{\mathrm{\partial c}}\left(t,{C}_{t}\right)\mathrm{d}{C}_{t}.$

Uncertain differential equation

An uncertain differential equation is essentially a type of differential equation driven by a Liu process.

Definition 12.

(Liu [9]) Suppose C t  is a canonical Liu process, and f and g are two given functions. Then

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t}$

is called an uncertain differential equation.

Yao and Chen [17] proposed the concept of the α-path, and established a connection between an uncertain differential equation and a spectrum of ordinary differential equations.

Definition 13.

(Yao and Chen [17]) Let α be a number with 0 < α < 1. An uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t}$

is said to have an α-path ${X}_{t}^{\alpha }$ if it solves the corresponding ordinary differential equation

$\mathrm{d}{X}_{t}^{\alpha }=f\left(t,{X}_{t}^{\alpha }\right)\mathrm{d}t+|g\left(t,{X}_{t}^{\alpha }\right)|{\mathrm{\Phi }}^{-1}\left(\alpha \right)\mathrm{d}t$

where Φ−1(α) is the inverse uncertainty distribution of a standard normal uncertain variable.
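Inverting the uncertainty distribution in Definition 9 at t = 1 gives the closed form Φ−1(α) = (√3/π) ln(α/(1 − α)), which is used throughout the numerical methods below. A one-line sketch (the function name is a choice of this illustration):

```python
import math

def std_normal_inv(alpha):
    # invert Phi(x) = (1 + exp(-pi * x / sqrt(3)))**(-1) from Definition 9 at t = 1
    return (math.sqrt(3) / math.pi) * math.log(alpha / (1 - alpha))
```

For example, std_normal_inv(0.5) is 0, and substituting the result back into Φ recovers α.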

Theorem 4.

(Yao and Chen [17]) Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Then

$ℳ\left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}=\alpha ,$
$ℳ\left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}=1-\alpha .$

As a corollary, the solution X t  has an inverse uncertainty distribution ${\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right)={X}_{t}^{\alpha }.$

Extreme values

In this section, we study the extreme values of the solution of an uncertain differential equation, and give their uncertainty distributions. In addition, we design some numerical methods to obtain the uncertainty distributions.

Supremum

Theorem 5.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Then for a strictly increasing function J(x), the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right).$

Proof.

Since J(x) is a strictly increasing function, we have

$\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\right\}\supset \left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}$

and

$\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\right\}\supset \left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}=\alpha$

and

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}=1-\alpha .$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\right\}=\alpha ,$

i.e., the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right).$

In order to calculate the inverse uncertainty distribution of the supremum, we design a numerical method as below. □

Step 1: Fix α in (0,1), and fix h as the step length. Set i = 0, N = s/h, ${X}_{0}^{\alpha }={X}_{0}$, and H = J(X 0).

Step 2: Employ the recursion formula

${X}_{i+1}^{\alpha }={X}_{i}^{\alpha }+f\left({t}_{i},{X}_{i}^{\alpha }\right)h+g\left({t}_{i},{X}_{i}^{\alpha }\right){\mathrm{\Phi }}^{-1}\left(\alpha \right)h,$

and calculate ${X}_{i+1}^{\alpha }.$

Step 3: Set $H←\text{max}\phantom{\rule{.5em}{0ex}}\left(H,J\left({X}_{i+1}^{\alpha }\right)\right),i←i+1.$

Step 4: Repeat Step 2 and Step 3 for N times.

Step 5: The inverse uncertainty distribution of

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

is determined by

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)=H.$
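Steps 1–5 can be sketched in Python as follows. This is a minimal illustration: the example equation dX t  = 0.5X t dt + 0.3X t dC t , the helper names, and the use of |g| (as in Definition 13's α-path equation) are choices of this sketch.

```python
import math

def std_normal_inv(alpha):
    # inverse uncertainty distribution of a standard normal uncertain variable
    return (math.sqrt(3) / math.pi) * math.log(alpha / (1 - alpha))

def sup_inverse_distribution(f, g, J, x0, s, alpha, h=1e-4):
    # Steps 1-5: Euler recursion along the alpha-path, tracking the
    # running maximum of J (|g| follows Definition 13)
    n = round(s / h)
    x, t, H = x0, 0.0, J(x0)
    c = std_normal_inv(alpha)
    for _ in range(n):
        x = x + f(t, x) * h + abs(g(t, x)) * c * h
        t += h
        H = max(H, J(x))
    return H

# hypothetical example: dX_t = 0.5 X_t dt + 0.3 X_t dC_t, X_0 = 1, J(x) = x, s = 1
psi_inv = sup_inverse_distribution(lambda t, x: 0.5 * x,
                                   lambda t, x: 0.3 * x,
                                   lambda x: x, 1.0, 1.0, 0.8)
```

For this equation the α-path is X t α = exp((0.5 + 0.3Φ−1(α))t), which is increasing in t when the exponent is positive, so for α = 0.8 the returned value approximates exp(0.5 + 0.3Φ−1(0.8)) ≈ 2.07.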

Theorem 6.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t . Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then for a strictly increasing function J(x), the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\mathrm{\Psi }}_{s}\left(x\right)=\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Proof.

Since ${X}_{t}^{\alpha }={\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right),$ we have

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right)\right)\right\}=\alpha$

by Theorem 5. Write

$x=\underset{0\le t\le s}{\text{sup}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right)\right),$

i.e.,

$\alpha =\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Then we have

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le x\right\}=\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

In other words, the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\mathrm{\Psi }}_{s}\left(x\right)=\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Theorem 7.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Then for a strictly decreasing function J(x), the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right).$

Proof.

Since J(x) is a strictly decreasing function, we have

$\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right)\right\}\supset \left\{{X}_{t}\ge {X}_{t}^{1-\alpha },\forall t\right\}$

and

$\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right)\right\}\supset \left\{{X}_{t}<{X}_{t}^{1-\alpha },\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}\ge {X}_{t}^{1-\alpha },\forall t\right\}=\alpha$

and

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}<{X}_{t}^{1-\alpha },\forall t\right\}=1-\alpha .$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right)\right\}=\alpha ,$

i.e., the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{1-\alpha }\right).$

Theorem 8.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t . Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then for a strictly decreasing function J(x), the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\mathrm{\Psi }}_{s}\left(x\right)=1-\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Proof.

Since ${X}_{t}^{1-\alpha }={\mathrm{\Phi }}_{t}^{-1}\left(1-\alpha \right),$ we have

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{sup}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(1-\alpha \right)\right)\right\}=\alpha$

by Theorem 7. Write

$x=\underset{0\le t\le s}{\text{sup}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(1-\alpha \right)\right),$

i.e.,

$\alpha =1-\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Then we have

$ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\le x\right\}=1-\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

In other words, the supremum

$\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\mathrm{\Psi }}_{s}\left(x\right)=1-\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Infimum

Theorem 9.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Then for a strictly increasing function J(x), the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\Upsilon }_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right).$

Proof.

Since J(x) is a strictly increasing function, we have

$\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\right\}\supset \left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}$

and

$\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\right\}\supset \left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}=\alpha$

and

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}=1-\alpha .$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\right\}=\alpha ,$

i.e., the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\Upsilon }_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right).$

In order to calculate the inverse uncertainty distribution of the infimum, we design a numerical method as below. □

Step 1: Fix α in (0,1), and fix h as the step length. Set i = 0, N = s/h, ${X}_{0}^{\alpha }={X}_{0}$, and H = J(X 0).

Step 2: Employ the recursion formula

${X}_{i+1}^{\alpha }={X}_{i}^{\alpha }+f\left({t}_{i},{X}_{i}^{\alpha }\right)h+g\left({t}_{i},{X}_{i}^{\alpha }\right){\mathrm{\Phi }}^{-1}\left(\alpha \right)h,$

and calculate ${X}_{i+1}^{\alpha }$ and $J\left({X}_{i+1}^{\alpha }\right).$

Step 3: Set $H←\text{min}\left(H,J\left({X}_{i+1}^{\alpha }\right)\right),i←i+1.$

Step 4: Repeat Step 2 and Step 3 for N times.

Step 5: The inverse uncertainty distribution of $\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$ is determined by

${\Upsilon }_{s}^{-1}\left(\alpha \right)=H.$

Theorem 10.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t . Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then for a strictly increasing function J(x), the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\Upsilon }_{s}\left(x\right)=\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Proof.

Since ${X}_{t}^{\alpha }={\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right),$ we have

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right)\right)\right\}=\alpha$

by Theorem 9. Write

$x=\underset{0\le t\le s}{\text{inf}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(\alpha \right)\right),$

i.e.,

$\alpha =\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Then we have

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le x\right\}=\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

In other words, the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\Upsilon }_{s}\left(x\right)=\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Theorem 11.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Then for a strictly decreasing function J(x), the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\Upsilon }_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right).$

Proof.

Since J(x) is a strictly decreasing function, we have

$\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right)\right\}\supset \left\{{X}_{t}\ge {X}_{t}^{1-\alpha },\forall t\right\}$

and

$\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right)\right\}\supset \left\{{X}_{t}<{X}_{t}^{1-\alpha },\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}\ge {X}_{t}^{1-\alpha },\forall t\right\}=\alpha$

and

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)>\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right)\right\}\ge ℳ\left\{{X}_{t}<{X}_{t}^{1-\alpha },\forall t\right\}=1-\alpha .$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right)\right\}=\alpha ,$

i.e., the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an inverse uncertainty distribution

${\Upsilon }_{s}^{-1}\left(\alpha \right)=\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{1-\alpha }\right).$

Theorem 12.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t . Assume X t has an uncertainty distribution Φ t (x) at each time t. Then for a strictly decreasing function J(x), the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\Upsilon }_{s}\left(x\right)=1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Proof.

Since ${X}_{t}^{1-\alpha }={\mathrm{\Phi }}_{t}^{-1}\left(1-\alpha \right),$ we have

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le \underset{0\le t\le s}{\text{inf}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(1-\alpha \right)\right)\right\}=\alpha$

by Theorem 11. Write

$x=\underset{0\le t\le s}{\text{inf}}J\left({\mathrm{\Phi }}_{t}^{-1}\left(1-\alpha \right)\right),$

i.e.,

$\alpha =1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

Then we have

$ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le x\right\}=1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

In other words, the infimum

$\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)$

has an uncertainty distribution

${\Upsilon }_{s}\left(x\right)=1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(x\right)\right).$

First hitting time

In this section, we study the first hitting time of the solution of an uncertain differential equation, and give the uncertainty distributions in different cases.

First hitting time of strictly increasing function of the solution

Theorem 13.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t}$

with an initial value X 0, respectively. Given a strictly increasing function J(x), and a level z > J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\mathrm{\Psi }\left(s\right)=1-\text{inf}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\ge z\right\right\}.$

Proof.

Write

$\begin{array}{ll}{\alpha }_{0}& =\text{inf}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\ge z\right\right\}.\phantom{\rule{2em}{0ex}}\end{array}$

Since J(x) is a strictly increasing function, we have

$\left\{{\tau }_{z}\le s\right\}\supset \left\{J\left({X}_{t}\right)\ge J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}\ge {X}_{t}^{{\alpha }_{0}},\forall t\right\},$
$\left\{{\tau }_{z}>s\right\}\supset \left\{J\left({X}_{t}\right)<J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}<{X}_{t}^{{\alpha }_{0}},\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{{\tau }_{z}\le s\right\}\ge ℳ\left\{{X}_{t}\ge {X}_{t}^{{\alpha }_{0}},\forall t\right\}=1-{\alpha }_{0},$
$ℳ\left\{{\tau }_{z}>s\right\}\ge ℳ\left\{{X}_{t}<{X}_{t}^{{\alpha }_{0}},\forall t\right\}={\alpha }_{0}.$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{{\tau }_{z}\le s\right\}=1-{\alpha }_{0}.$

This completes the proof. □

For a strictly increasing function J(x), in order to calculate the uncertainty distribution Ψ(s) of the first hitting time τ z  that J(X t ) reaches z when J(X 0) < z, we design a numerical method as below.

Step 1: Fix ε as the accuracy, and fix h as the step length. Set N = s/h.

Step 2: Employ the recursion formula

${X}_{i+1}^{\epsilon }={X}_{i}^{\epsilon }+f\left({t}_{i},{X}_{i}^{\epsilon }\right)h+g\left({t}_{i},{X}_{i}^{\epsilon }\right){\mathrm{\Phi }}^{-1}\left(\epsilon \right)h$

for N times, and calculate ${X}_{i}^{\epsilon },i=1,2,\cdots \phantom{\rule{0.3em}{0ex}},N.$ If

$\underset{1\le i\le N}{\text{max}}J\left({X}_{i}^{\epsilon }\right)\ge z,$

then return 1 − ε and stop.

Step 3: Employ the recursion formula

${X}_{i+1}^{1-\epsilon }={X}_{i}^{1-\epsilon }+f\left({t}_{i},{X}_{i}^{1-\epsilon }\right)h+g\left({t}_{i},{X}_{i}^{1-\epsilon }\right){\mathrm{\Phi }}^{-1}\left(1-\epsilon \right)h$

for N times, and calculate ${X}_{i}^{1-\epsilon },i=1,2,\cdots \phantom{\rule{0.3em}{0ex}},N.$ If

$\underset{1\le i\le N}{\text{max}}J\left({X}_{i}^{1-\epsilon }\right)<z,$

then return ε and stop.

Step 4: Set α 1 = ε, α 2 = 1 − ε.

Step 5: Set α = (α 1 + α 2)/2.

Step 6: Employ the recursion formula

${X}_{i+1}^{\alpha }={X}_{i}^{\alpha }+f\left({t}_{i},{X}_{i}^{\alpha }\right)h+g\left({t}_{i},{X}_{i}^{\alpha }\right){\mathrm{\Phi }}^{-1}\left(\alpha \right)h$

for N times, and calculate ${X}_{i}^{\alpha },i=1,2,\cdots \phantom{\rule{0.3em}{0ex}},N.$ If

$\underset{1\le i\le N}{\text{max}}J\left({X}_{i}^{\alpha }\right)<z,$

then set α 1 = α. Otherwise, set α 2 = α.

Step 7: If |α 2 − α 1| ≤ ε, then return 1 − α and stop. Otherwise, go to Step 5.
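The bisection scheme above can be sketched in Python. This is a minimal illustration rather than an official implementation: it assumes the driving Liu process is standard, so that the inverse uncertainty distribution of its increments is Φ −1(α) = (√3/π)ln(α/(1 − α)), and it is run on a hypothetical linear equation dX t  = 0.5X t dt + 0.3X t dC t  with J(x) = x; all names and parameter values are illustrative.

```python
import math

def phi_inv(a):
    # Inverse uncertainty distribution of a standard normal uncertain
    # variable N(0, 1): (sqrt(3)/pi) * ln(a / (1 - a)).
    return math.sqrt(3) / math.pi * math.log(a / (1 - a))

def alpha_path_max(f, g, J, x0, s, alpha, h):
    # Euler recursion X_{i+1} = X_i + f(t_i, X_i) h + g(t_i, X_i) Phi^{-1}(alpha) h,
    # returning max_{1<=i<=N} J(X_i) on the grid over [0, s].
    n, c = round(s / h), phi_inv(alpha)
    x, best = x0, -math.inf
    for i in range(n):
        x = x + f(i * h, x) * h + g(i * h, x) * c * h
        best = max(best, J(x))
    return best

def hitting_distribution(f, g, J, x0, z, s, eps=1e-4, h=1e-3):
    # Psi(s) = M{tau_z <= s} for strictly increasing J with z > J(x0).
    if alpha_path_max(f, g, J, x0, s, eps, h) >= z:
        return 1 - eps                 # Step 2: even the eps-path hits z
    if alpha_path_max(f, g, J, x0, s, 1 - eps, h) < z:
        return eps                     # Step 3: even the (1-eps)-path misses z
    a1, a2 = eps, 1 - eps              # Step 4
    while abs(a2 - a1) > eps:          # Steps 5-7: bisection on alpha
        a = (a1 + a2) / 2
        if alpha_path_max(f, g, J, x0, s, a, h) < z:
            a1 = a
        else:
            a2 = a
    return 1 - (a1 + a2) / 2

# Hypothetical example: dX_t = 0.5 X_t dt + 0.3 X_t dC_t, X_0 = 1, z = 1.2.
psi = hitting_distribution(lambda t, x: 0.5 * x, lambda t, x: 0.3 * x,
                           lambda x: x, 1.0, 1.2, s=1.0)
```

The returned value approximates Ψ(s) = 1 − α 0, where α 0 is the smallest α whose α-path reaches z before time s.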

Theorem 14.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t  with an initial value X 0. Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then given a strictly increasing function J(x) and a level z > J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\mathrm{\Psi }\left(s\right)=1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).$

Proof.

Since the event {τ z  ≤ s} is equivalent to the event

$\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\ge z\right\}$

provided z > J(X 0), it follows from Theorem 6 that

$\begin{array}{ll}\phantom{\rule{1em}{0ex}}\mathrm{\Psi }\left(s\right)& =ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\ge z\right\}\phantom{\rule{2em}{0ex}}\\ =ℳ\left\{\underset{0\le t\le s}{\text{sup}}{X}_{t}\ge {J}^{-1}\left(z\right)\right\}\phantom{\rule{2em}{0ex}}\\ =1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).\phantom{\rule{2em}{0ex}}\end{array}$

This completes the proof. □
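As a numerical check of Theorem 14, consider the linear equation dX t  = μdt + σdC t , whose solution X t  = X 0 + μt + σC t  is a normal uncertain variable N(X 0 + μt, σt); using the standard form of the normal uncertainty distribution Φ(x) = (1 + exp(π(e − x)/(√3σ)))−1, Ψ(s) can be evaluated directly. The parameter values below are hypothetical.

```python
import math

def normal_cdf(x, e, sd):
    # Uncertainty distribution of a normal uncertain variable N(e, sd):
    # Phi(x) = (1 + exp(pi (e - x) / (sqrt(3) sd)))^(-1).
    return 1.0 / (1.0 + math.exp(math.pi * (e - x) / (math.sqrt(3) * sd)))

def hitting_dist(s, x0, mu, sigma, J_inv, z, n=1000):
    # Psi(s) = 1 - inf_{0 < t <= s} Phi_t(J^{-1}(z)), with the infimum
    # approximated over a uniform grid on (0, s].
    c = J_inv(z)
    return 1.0 - min(normal_cdf(c, x0 + mu * t, sigma * t)
                     for t in (s * (i + 1) / n for i in range(n)))

# Hypothetical example: x0 = 1, mu = 1, sigma = 0.5, J(x) = x, z = 2, s = 2.
p = hitting_dist(2.0, 1.0, 1.0, 0.5, lambda x: x, 2.0)
```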

Theorem 15.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t}$

with an initial value X 0, respectively. Given a strictly increasing function J(x) and a level z < J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\Upsilon \left(s\right)=\text{sup}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\le z\right.\right\}.$

Proof.

Write

$\begin{array}{ll}\phantom{\rule{1em}{0ex}}{\alpha }_{0}& =\text{sup}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\le z\right.\right\}.\phantom{\rule{2em}{0ex}}\end{array}$

Then

$\left\{{\tau }_{z}\le s\right\}\supset \left\{J\left({X}_{t}\right)\le J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}\le {X}_{t}^{{\alpha }_{0}},\forall t\right\},$
$\left\{{\tau }_{z}>s\right\}\supset \left\{J\left({X}_{t}\right)>J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}>{X}_{t}^{{\alpha }_{0}},\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{{\tau }_{z}\le s\right\}\ge ℳ\left\{{X}_{t}\le {X}_{t}^{{\alpha }_{0}},\forall t\right\}={\alpha }_{0},$
$ℳ\left\{{\tau }_{z}>s\right\}\ge ℳ\left\{{X}_{t}>{X}_{t}^{{\alpha }_{0}},\forall t\right\}=1-{\alpha }_{0}.$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{{\tau }_{z}\le s\right\}={\alpha }_{0}.$

This completes the proof. □

For a strictly increasing function J(x), in order to calculate the uncertainty distribution ϒ(s) of the first hitting time τ z  that J(X t ) reaches z when J(X 0) > z, we design a numerical method as follows.

Step 1: Fix ε as the accuracy, and fix h as the step length. Set N = s/h.

Step 2: Employ the recursion formula

${X}_{i+1}^{\epsilon }={X}_{i}^{\epsilon }+f\left({t}_{i},{X}_{i}^{\epsilon }\right)h+g\left({t}_{i},{X}_{i}^{\epsilon }\right){\mathrm{\Phi }}^{-1}\left(\epsilon \right)h$

for N times, and calculate ${X}_{i}^{\epsilon },i=1,2,\cdots \phantom{\rule{0.3em}{0ex}},N.$ If

$\underset{1\le i\le N}{\text{min}}J\left({X}_{i}^{\epsilon }\right)>z,$

then return ε and stop.

Step 3: Employ the recursion formula

${X}_{i+1}^{1-\epsilon }={X}_{i}^{1-\epsilon }+f\left({t}_{i},{X}_{i}^{1-\epsilon }\right)h+g\left({t}_{i},{X}_{i}^{1-\epsilon }\right){\mathrm{\Phi }}^{-1}\left(1-\epsilon \right)h$

for N times, and calculate ${X}_{i}^{1-\epsilon },i=1,2,\cdots \phantom{\rule{0.3em}{0ex}},N.$ If

$\underset{1\le i\le N}{\text{min}}J\left({X}_{i}^{1-\epsilon }\right)\le z,$

then return 1 − ε and stop.

Step 4: Set α 1 = ε, α 2 = 1 − ε.

Step 5: Set α = (α 1 + α 2)/2.

Step 6: Employ the recursion formula

${X}_{i+1}^{\alpha }={X}_{i}^{\alpha }+f\left({t}_{i},{X}_{i}^{\alpha }\right)h+g\left({t}_{i},{X}_{i}^{\alpha }\right){\mathrm{\Phi }}^{-1}\left(\alpha \right)h$

for N times, and calculate ${X}_{i}^{\alpha },i=1,2,\cdots \phantom{\rule{0.3em}{0ex}},N.$ If

$\underset{1\le i\le N}{\text{min}}J\left({X}_{i}^{\alpha }\right)\le z,$

then set α 1 = α. Otherwise, set α 2 = α.

Step 7: If |α 2 − α 1| ≤ ε, then return α and stop. Otherwise, go to Step 5.
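This scheme can be sketched in Python as well. It is a minimal illustration under the assumption of a standard Liu process, so Φ −1(α) = (√3/π)ln(α/(1 − α)), applied to a hypothetical linear equation with J(x) = x; the early-return values follow ϒ(s) = α 0 from Theorem 15.

```python
import math

def phi_inv(a):
    # Inverse distribution of a standard normal uncertain variable N(0, 1).
    return math.sqrt(3) / math.pi * math.log(a / (1 - a))

def alpha_path_min(f, g, J, x0, s, alpha, h):
    # Euler recursion along the alpha-path, returning min_{1<=i<=N} J(X_i).
    n, c = round(s / h), phi_inv(alpha)
    x, worst = x0, math.inf
    for i in range(n):
        x = x + f(i * h, x) * h + g(i * h, x) * c * h
        worst = min(worst, J(x))
    return worst

def downcrossing_distribution(f, g, J, x0, z, s, eps=1e-4, h=1e-3):
    # Upsilon(s) = M{tau_z <= s} for strictly increasing J with z < J(x0);
    # by Theorem 15 this is alpha_0 = sup{alpha : min J(X^alpha) <= z}.
    if alpha_path_min(f, g, J, x0, s, eps, h) > z:
        return eps            # Step 2: even the lowest path stays above z
    if alpha_path_min(f, g, J, x0, s, 1 - eps, h) <= z:
        return 1 - eps        # Step 3: even the highest path reaches down to z
    a1, a2 = eps, 1 - eps     # Step 4
    while abs(a2 - a1) > eps: # Steps 5-7: bisection on alpha
        a = (a1 + a2) / 2
        if alpha_path_min(f, g, J, x0, s, a, h) <= z:
            a1 = a
        else:
            a2 = a
    return (a1 + a2) / 2

# Hypothetical example: dX_t = -0.5 X_t dt + 0.3 X_t dC_t, X_0 = 1, z = 0.8.
ups = downcrossing_distribution(lambda t, x: -0.5 * x, lambda t, x: 0.3 * x,
                                lambda x: x, 1.0, 0.8, s=1.0)
```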

Theorem 16.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t  with an initial value X 0. Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then given a strictly increasing function J(x) and a level z < J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\mathrm{\Psi }\left(s\right)=\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).$

Proof.

Since the event {τ z  ≤ s} is equivalent to the event

$\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le z\right\}$

provided z < J(X 0), it follows from Theorem 10 that

$\begin{array}{ll}\phantom{\rule{1em}{0ex}}\mathrm{\Psi }\left(s\right)& =ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le z\right\}\phantom{\rule{2em}{0ex}}\\ =ℳ\left\{\underset{0\le t\le s}{\text{inf}}{X}_{t}\le {J}^{-1}\left(z\right)\right\}\phantom{\rule{2em}{0ex}}\\ =\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).\phantom{\rule{2em}{0ex}}\end{array}$

This completes the proof. □

First hitting time of strictly decreasing function of the solution

Theorem 17.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t}$

with an initial value X 0, respectively. Given a strictly decreasing function J(x), and a level z > J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\mathrm{\Psi }\left(s\right)=\text{sup}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\ge z\right.\right\}.$

Proof.

Write

$\begin{array}{ll}{\alpha }_{0}& =\text{sup}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}^{\alpha }\right)\ge z\right.\right\}.\phantom{\rule{2em}{0ex}}\end{array}$

Since J(x) is a strictly decreasing function, we have

$\left\{{\tau }_{z}\le s\right\}\supset \left\{J\left({X}_{t}\right)\ge J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}\le {X}_{t}^{{\alpha }_{0}},\forall t\right\},$
$\left\{{\tau }_{z}>s\right\}\supset \left\{J\left({X}_{t}\right)<J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}>{X}_{t}^{{\alpha }_{0}},\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{{\tau }_{z}\le s\right\}\ge ℳ\left\{{X}_{t}\le {X}_{t}^{{\alpha }_{0}},\forall t\right\}={\alpha }_{0},$
$ℳ\left\{{\tau }_{z}>s\right\}\ge ℳ\left\{{X}_{t}>{X}_{t}^{{\alpha }_{0}},\forall t\right\}=1-{\alpha }_{0}.$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{{\tau }_{z}\le s\right\}={\alpha }_{0}.$

This completes the proof. □

Theorem 18.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t  with an initial value X 0. Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then given a strictly decreasing function J(x) and a level z > J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\mathrm{\Psi }\left(s\right)=\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).$

Proof.

Since the event {τ z  ≤ s} is equivalent to the event

$\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\ge z\right\}$

provided z > J(X 0), it follows from Theorem 10 that

$\begin{array}{ll}\phantom{\rule{1em}{0ex}}\mathrm{\Psi }\left(s\right)& =ℳ\left\{\underset{0\le t\le s}{\text{sup}}J\left({X}_{t}\right)\ge z\right\}\phantom{\rule{2em}{0ex}}\\ =ℳ\left\{\underset{0\le t\le s}{\text{inf}}{X}_{t}\le {J}^{-1}\left(z\right)\right\}\phantom{\rule{2em}{0ex}}\\ =\underset{0\le t\le s}{\text{sup}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).\phantom{\rule{2em}{0ex}}\end{array}$

This completes the proof. □

Theorem 19.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t}$

with an initial value X 0, respectively. Given a strictly decreasing function J(x) and a level z < J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\Upsilon \left(s\right)=1-\text{inf}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\le z\right.\right\}.$

Proof.

Write

$\begin{array}{ll}\phantom{\rule{1em}{0ex}}{\alpha }_{0}& =\text{inf}\left\{\alpha \in \left(0,1\right)\left|\phantom{\rule{0.3em}{0ex}}\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}^{\alpha }\right)\le z\right.\right\}.\phantom{\rule{2em}{0ex}}\end{array}$

Then

$\left\{{\tau }_{z}\le s\right\}\supset \left\{J\left({X}_{t}\right)\le J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}\ge {X}_{t}^{{\alpha }_{0}},\forall t\right\},$
$\left\{{\tau }_{z}>s\right\}\supset \left\{J\left({X}_{t}\right)>J\left({X}_{t}^{{\alpha }_{0}}\right),\forall t\right\}=\left\{{X}_{t}<{X}_{t}^{{\alpha }_{0}},\forall t\right\}.$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{{\tau }_{z}\le s\right\}\ge ℳ\left\{{X}_{t}\ge {X}_{t}^{{\alpha }_{0}},\forall t\right\}=1-{\alpha }_{0},$
$ℳ\left\{{\tau }_{z}>s\right\}\ge ℳ\left\{{X}_{t}<{X}_{t}^{{\alpha }_{0}},\forall t\right\}={\alpha }_{0}.$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{{\tau }_{z}\le s\right\}=1-{\alpha }_{0}.$

This completes the proof. □

Theorem 20.

Let X t  be the solution of an uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t  with an initial value X 0. Assume X t  has an uncertainty distribution Φ t (x) at each time t. Then given a strictly decreasing function J(x) and a level z < J(X 0), the first hitting time τ z  that J(X t ) reaches z has an uncertainty distribution

$\mathrm{\Psi }\left(s\right)=1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).$

Proof.

Since the event {τ z  ≤ s} is equivalent to the event

$\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le z\right\}$

provided z < J(X 0), it follows from Theorem 6 that

$\begin{array}{ll}\phantom{\rule{1em}{0ex}}\mathrm{\Psi }\left(s\right)& =ℳ\left\{\underset{0\le t\le s}{\text{inf}}J\left({X}_{t}\right)\le z\right\}\phantom{\rule{2em}{0ex}}\\ =ℳ\left\{\underset{0\le t\le s}{\text{sup}}{X}_{t}\ge {J}^{-1}\left(z\right)\right\}\phantom{\rule{2em}{0ex}}\\ =1-\underset{0\le t\le s}{\text{inf}}{\mathrm{\Phi }}_{t}\left({J}^{-1}\left(z\right)\right).\phantom{\rule{2em}{0ex}}\end{array}$

This completes the proof. □

Integral

In this section, we study the integral of the solution of an uncertain differential equation, and give its uncertainty distribution. Besides, we design a numerical method to obtain the uncertainty distribution.

Theorem 21.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Assume J(x) is a strictly increasing function. Then the integral

${\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)={\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t.$

Proof.

Since J(x) is a strictly increasing function, we have

$\begin{array}{l}\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t\le {\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t\right\}\supset \left\{J\left({X}_{t}\right)\le J\left({X}_{t}^{\alpha }\right),\forall t\right\}=\left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}\end{array}$

and

$\begin{array}{l}\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t>{\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t\right\}\supset \left\{J\left({X}_{t}\right)>J\left({X}_{t}^{\alpha }\right),\forall t\right\}=\left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}.\end{array}$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t\le {\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t\right\}\ge ℳ\left\{{X}_{t}\le {X}_{t}^{\alpha },\forall t\right\}=\alpha$

and

$ℳ\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t>{\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t\right\}\ge ℳ\left\{{X}_{t}>{X}_{t}^{\alpha },\forall t\right\}=1-\alpha .$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t\le {\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t\right\}=\alpha .$

In other words, the integral

${\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)={\int }_{0}^{s}J\left({X}_{t}^{\alpha }\right)\mathrm{d}t.$

Example 1.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation dX t  = f(t,X t )dt + g(t,X t )dC t , respectively. Consider a function h(t,x) = exp(−r t) x. Since h(t,x) is strictly increasing with respect to x, the integral

${\int }_{0}^{s}h\left(t,{X}_{t}\right)\mathrm{d}t={\int }_{0}^{s}\text{exp}\left(-\mathit{\text{rt}}\right){X}_{t}\mathrm{d}t$

has an inverse uncertainty distribution

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)={\int }_{0}^{s}h\left(t,{X}_{t}^{\alpha }\right)\mathrm{d}t={\int }_{0}^{s}\text{exp}\left(-\mathit{\text{rt}}\right){X}_{t}^{\alpha }\mathrm{d}t.$

When J(x) is a strictly increasing function, in order to calculate the uncertainty distribution of the integral of J(X t ), we design a numerical method as follows.

Step 1: Fix α in (0,1), and fix h as the step length. Set i = 0, N = s/h, and ${X}_{0}^{\alpha }={X}_{0}.$

Step 2: Employ the recursion formula

${X}_{i+1}^{\alpha }={X}_{i}^{\alpha }+f\left({t}_{i},{X}_{i}^{\alpha }\right)h+g\left({t}_{i},{X}_{i}^{\alpha }\right){\mathrm{\Phi }}^{-1}\left(\alpha \right)h,$

and calculate ${X}_{i+1}^{\alpha }$ and $J\left({X}_{i+1}^{\alpha }\right).$

Step 3: Set i ← i + 1.

Step 4: Repeat Step 2 and Step 3 for N times.

Step 5: The inverse uncertainty distribution of

${\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t$

is determined by

${\mathrm{\Psi }}_{s}^{-1}\left(\alpha \right)=\sum _{i=1}^{N}J\left({X}_{i}^{\alpha }\right)h.$
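Steps 1 to 5 can be sketched in Python. This is a minimal illustration assuming a standard Liu process, so Φ −1(α) = (√3/π)ln(α/(1 − α)), applied to a hypothetical linear equation with J(x) = x; the integral is approximated by a Riemann sum over the grid points t 1 ,…,t N , matching Step 5.

```python
import math

def phi_inv(a):
    # Inverse distribution of a standard normal uncertain variable N(0, 1).
    return math.sqrt(3) / math.pi * math.log(a / (1 - a))

def integral_inverse_dist(f, g, J, x0, s, alpha, h=1e-3):
    # Psi_s^{-1}(alpha) = integral_0^s J(X_t^alpha) dt, approximated by the
    # Euler alpha-path recursion and the sum over J(X_i^alpha) h, i = 1..N.
    n, c = round(s / h), phi_inv(alpha)
    x, total = x0, 0.0
    for i in range(n):
        x = x + f(i * h, x) * h + g(i * h, x) * c * h
        total += J(x) * h
    return total

# Hypothetical example: dX_t = 0.5 X_t dt + 0.3 X_t dC_t, X_0 = 1, J(x) = x.
q = integral_inverse_dist(lambda t, x: 0.5 * x, lambda t, x: 0.3 * x,
                          lambda x: x, 1.0, 1.0, alpha=0.5)
```

For α = 0.5 the diffusion term vanishes, so the sum approximates ∫ 0 1 e 0.5t dt = 2(e 0.5 − 1) ≈ 1.297.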

Theorem 22.

Let X t  and ${X}_{t}^{\alpha }$ be the solution and α-path of the uncertain differential equation

$\mathrm{d}{X}_{t}=f\left(t,{X}_{t}\right)\mathrm{d}t+g\left(t,{X}_{t}\right)\mathrm{d}{C}_{t},$

respectively. Assume J(x) is a strictly decreasing function. Then the integral

${\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t$

has an inverse uncertainty distribution

${\Upsilon }_{s}^{-1}\left(\alpha \right)={\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t.$

Proof.

Since J(x) is a strictly decreasing function, we have

$\begin{array}{ll}\phantom{\rule{6pt}{0ex}}\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t\le {\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t\right\}& \supset \left\{J\left({X}_{t}\right)\le J\left({X}_{t}^{1-\alpha }\right),\forall t\right\}\phantom{\rule{2em}{0ex}}\\ =\left\{{X}_{t}\ge {X}_{t}^{1-\alpha },\forall t\right\}\phantom{\rule{2em}{0ex}}\end{array}$

and

$\begin{array}{ll}\phantom{\rule{6pt}{0ex}}\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t>{\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t\right\}& \supset \left\{J\left({X}_{t}\right)>J\left({X}_{t}^{1-\alpha }\right),\forall t\right\}\phantom{\rule{2em}{0ex}}\\ =\left\{{X}_{t}<{X}_{t}^{1-\alpha },\forall t\right\}.\phantom{\rule{2em}{0ex}}\end{array}$

By Theorem 4 and the monotonicity of uncertain measure, we have

$ℳ\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t\le {\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t\right\}\ge ℳ\left\{{X}_{t}\ge {X}_{t}^{1-\alpha },\forall t\right\}=\alpha$

and

$ℳ\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t>{\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t\right\}\ge ℳ\left\{{X}_{t}<{X}_{t}^{1-\alpha },\forall t\right\}=1-\alpha .$

It follows from the duality axiom of uncertain measure that

$ℳ\left\{{\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t\le {\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t\right\}=\alpha .$

In other words, the integral

${\int }_{0}^{s}J\left({X}_{t}\right)\mathrm{d}t$

has an inverse uncertainty distribution

${\Upsilon }_{s}^{-1}\left(\alpha \right)={\int }_{0}^{s}J\left({X}_{t}^{1-\alpha }\right)\mathrm{d}t.$

Conclusions

This paper considered the solution of an uncertain differential equation, and gave the uncertainty distributions of its extreme values, first hitting time, and integral. In addition, we designed some numerical methods to obtain the uncertainty distributions.