
Stochastic Processes


Abstract

This chapter is an extension of the previous chapter. In the previous chapter, we focused essentially on random variables. In this chapter, we introduce the concept of a random (or stochastic) process as a generalization of a random variable to include another dimension—time. While a random variable depends only on the outcome of a random experiment, a random process depends on both the outcome of a random experiment and time. In other words, if a random variable X is time-dependent, X(t) is known as a random process. Thus, a random process may be regarded as any process that changes with time and is controlled by some probabilistic law. For example, the number of customers N in a queueing system varies with time; hence N(t) is a random process.
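To make the idea concrete, here is a small simulation sketch (in Python with NumPy; our own illustration, not part of the chapter) using the process X(t) = At + 2 of Problem 3.2, where A is uniform on (0, 1). Fixing one outcome of A yields a single deterministic sample function, while averaging over many outcomes approximates the ensemble mean E[X(t)] = t/2 + 2.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 101)           # time axis

# Each outcome of the experiment fixes A = a, yielding one deterministic
# sample function x(t) = a*t + 2; the random process is the whole ensemble.
sample_paths = [a * t + 2 for a in rng.uniform(0, 1, size=3)]

# Averaging over many outcomes recovers E[X(t)] = E[A]*t + 2 = t/2 + 2.
A = rng.uniform(0, 1, size=100_000)
ensemble_mean = A.mean() * t + 2
```

Each element of `sample_paths` is one realization; the process itself is the ensemble of all such functions together with the probability law on A.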

For me problem-solving is the most interesting thing in life. To be handed something that’s a complete mess and straighten it out. To organize where there is no organization. To give form to a medium that has no form.

—Sylvester Weaver




Problems

  1. 3.1

If \( X(t) = A \sin 4t \), where A is a random variable uniformly distributed between 0 and 2, find \( E[X(t)] \) and \( E[X^2(t)] \).

  2. 3.2

    Given a random process X(t) = At + 2, where A is a random variable uniformly distributed over the range (0,1),

    1. (a)

      sketch three sample functions of X(t),

    2. (b)

      find \( \overline{X(t)}\kern0.75em \mathrm{and}\kern0.5em \overline{X^2(t)} \),

    3. (c)

determine \( R_X(t_1, t_2) \),

    4. (d)

      Is X(t) WSS?

  3. 3.3

    If a random process is given by

    $$ X(t)=A \cos \omega t-B \sin \omega t, $$

where ω is a constant and A and B are independent Gaussian random variables with zero mean and variance σ², determine: (a) \( E[X] \), \( E[X^2] \), and \( \mathrm{Var}(X) \); (b) the autocorrelation function \( R_X(t_1, t_2) \).

  4. 3.4

Let Y(t) = X(t − 1) + cos 3t, where X(t) is a stationary random process. Determine the autocorrelation function of Y(t) in terms of \( R_X(\tau) \).

  5. 3.5

Let Y(t) = X(t) − X(t − α), where α is a constant and X(t) is a random process. Show that

$$ {R}_Y\left({t}_1,{t}_2\right)={R}_X\left({t}_1,{t}_2\right)-{R}_X\left({t}_1,{t}_2-\alpha \right)-{R}_X\left({t}_1-\alpha, {t}_2\right)+{R}_X\left({t}_1-\alpha, {t}_2-\alpha \right) $$
  6. 3.6

A stationary random process X(t) has mean 4 and autocorrelation function

    $$ {R}_X\left(\tau \right)=5{e}^{-2\left|\tau \right|} $$
    1. (a)

      If Y(t) = X(t − 1), find the mean and autocorrelation function of Y(t).

    2. (b)

      Repeat part (a) if Y(t) = tX(t).

  7. 3.7

Let Z(t) = X(t) + Y(t), where X(t) and Y(t) are two independent stationary random processes. Find \( R_Z(\tau) \) in terms of \( R_X(\tau) \) and \( R_Y(\tau) \).

  8. 3.8

    Repeat the previous problem if Z(t) = 3X(t) + 4Y(t).

  9. 3.9

If \( X(t) = A \cos \omega t \), where ω is a constant and A is a random variable with mean μ and variance σ², (a) find the time average \( \langle X(t) \rangle \) and the ensemble mean \( m_X(t) \). (b) Is X(t) ergodic?

  10. 3.10

    A random process is defined by

    $$ X(t)=A \cos \omega t-B \sin \omega t, $$

where ω is a constant and A and B are independent random variables with zero mean. Show that X(t) is stationary and also ergodic.

  11. 3.11

    N(t) is a stationary noise process with zero mean and autocorrelation function

$$ {R}_N\left(\tau \right)=\frac{N_0}{2}\delta \left(\tau \right) $$

    where \( N_0 \) is a constant. Is N(t) ergodic?

  12. 3.12

    X(t) is a stationary Gaussian process with zero mean and autocorrelation function

    $$ {R}_X\left(\tau \right)={\sigma}^2{e}^{-\alpha \left|\tau \right|} \cos \omega \tau $$

    where σ, ω, and α are constants. Show that X(t) is ergodic.

  13. 3.13

If X(t) and Y(t) are two random processes that are jointly stationary so that \( R_{XY}(t_1, t_2) = R_{XY}(\tau) \), prove that

    $$ {R}_{XY}\left(\tau \right)={R}_{YX}\left(-\tau \right) $$

where \( \tau = |t_2 - t_1| \).

  14. 3.14

    For two stationary processes X(t) and Y(t), show that

    1. (a)
      $$ \left|{R}_{XY}\left(\tau \right)\right|\le \frac{1}{2}\left[{R}_X(0)+{R}_Y(0)\right] $$
    2. (b)
      $$ \left|{R}_{XY}\left(\tau \right)\right|\le \sqrt{R_X(0){R}_Y(0)} $$
  15. 3.15

    Let X(t) and Y(t) be two random processes given by

    X(t) = cos (ωt+Θ)

    Y(t) = sin (ωt+Θ)

    where ω is a constant and Θ is a random variable uniformly distributed over (0,2π). Find

    $$ {R}_{XY}\left(t,t+\tau \right)\kern0.5em \mathrm{and}\kern0.75em {R}_{YX}\left(t,t+\tau \right). $$
  16. 3.16

    X(t) and Y(t) are two random processes described as

    X(t) = A cos ωt + B sin ωt

    Y(t) = B cos ωt − A sin ωt

where ω is a constant, \( A \sim N(0,\sigma^2) \), and \( B \sim N(0,\sigma^2) \). Find \( R_{XY}(\tau) \).

  17. 3.17

Let X(t) be a stationary random process and Y(t) = X(t) − X(t − a), where a is a constant. Find \( R_{XY}(\tau) \).

  18. 3.18

Let \( \{N(t), t \ge 0\} \) be a Poisson process with rate λ. Find \( E[N(t)\,N(t+s)] \).

  19. 3.19

    For a Poisson process, show that if s < t,

$$ \mathrm{Prob}\left[N(s)=k \mid N(t)=n\right]=\binom{n}{k}{\left(\frac{s}{t}\right)}^{k}{\left(1-\frac{s}{t}\right)}^{n-k},\quad k=0,1,\dots,n $$
  20. 3.20

    Let N(t) be a renewal process where renewal epochs are Erlang with parameters (m,λ). Show that

$$ \mathrm{Prob}\left[N(t)=n\right]=\sum_{k=nm}^{nm+m-1}\frac{{\left(\lambda t\right)}^{k}}{k!}\,{e}^{-\lambda t} $$
  21. 3.21

    Use MATLAB to generate a random process X(t) = A cos(2πt), where A is a Gaussian random variable with mean zero and variance one. Take 0 < t < 4 s.

  22. 3.22

Repeat the previous problem if A is a random variable uniformly distributed over (−2, 2).

  23. 3.23

    Given that the autocorrelation function \( {R}_X\left(\tau \right)=2+3{e}^{-{\tau}^2} \), use MATLAB to plot the function for −2 < τ < 2.

  24. 3.24

    Use MATLAB to generate a random process

    $$ X(t)=2 \cos \left(2\pi t+B\left[n\right]\frac{\pi }{4}\right) $$

    where B[n] is a Bernoulli random sequence taking the values of +1 and −1. Take 0 < t < 3 s.
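Problems 3.21 and 3.22 ask for MATLAB; one possible solution, sketched here in Python/NumPy for concreteness (the variable names are ours), is:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 4, 401)            # time axis, 0 <= t <= 4 s

# Problem 3.21: X(t) = A*cos(2*pi*t) with A ~ N(0, 1).
A = rng.standard_normal()             # one realization of the Gaussian amplitude
X = A * np.cos(2 * np.pi * t)         # the corresponding sample function

# Problem 3.22: the same process with A uniform over (-2, 2).
A_u = rng.uniform(-2, 2)
X_u = A_u * np.cos(2 * np.pi * t)

# The MATLAB analogue would be A = randn; X = A*cos(2*pi*t); plot(t, X)
```

Because A is drawn once per realization, each run produces a cosine of fixed (random) amplitude; the randomness lives entirely in the amplitude, not in the time evolution.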
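For Problem 3.23, the autocorrelation function is a deterministic curve, so generating and plotting it is direct; a Python/NumPy sketch (our naming) is:

```python
import numpy as np

# Problem 3.23: R_X(tau) = 2 + 3*exp(-tau^2) over -2 < tau < 2.
tau = np.linspace(-2, 2, 401)
R_X = 2 + 3 * np.exp(-tau**2)

# The function peaks at R_X(0) = 5 and decays toward 2 in the tails;
# plot(tau, R_X) in MATLAB (or matplotlib's plt.plot) renders the curve.
```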
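Problem 3.24 can be approached the same way: draw one Bernoulli phase value per time sample and evaluate the cosine. A Python/NumPy sketch (our naming) follows; the MATLAB version would use `sign(rand(size(t)) - 0.5)` for the Bernoulli sequence.

```python
import numpy as np

# Problem 3.24: X(t) = 2*cos(2*pi*t + B[n]*pi/4), B[n] in {-1, +1}, 0 <= t <= 3 s.
rng = np.random.default_rng(2)
t = np.linspace(0, 3, 301)
B = rng.choice([-1, 1], size=t.size)       # equiprobable +1/-1, one draw per sample
X = 2 * np.cos(2 * np.pi * t + B * np.pi / 4)
```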


Copyright information

© 2013 Springer International Publishing Switzerland

Cite this chapter

Sadiku, M.N.O., Musa, S.M. (2013). Stochastic Processes. In: Performance Analysis of Computer Networks. Springer, Cham. https://doi.org/10.1007/978-3-319-01646-7_3

  • Print ISBN: 978-3-319-01645-0

  • Online ISBN: 978-3-319-01646-7
