Theoretical expression of link performance in OFDM cellular networks with MIMO compared to simulation and measurements

Abstract

The objective of this paper is to establish a theoretical expression of the link performance in the downlink of a multiple input multiple output (MIMO) cellular network and compare it to the real Long-Term Evolution (LTE) performance. In order to account for the interference, we prove that the worst additive noise process in the MIMO context is the white Gaussian one. Based on this theoretical result, we build an analytic expression of the link performance in LTE cellular networks with MIMO. We also study the minimum mean square error (MMSE) scheme currently implemented in the field, as well as its improvement MMSE-SIC (successive interference cancellation), known to achieve the MIMO capacity. Comparison to simulation results as well as to measurements in the field shows that the theoretical expression predicts well the practical link performance of LTE cellular networks. This theoretical expression of link performance is the basis of a global analytic approach to the evaluation of the quality of service perceived by users in the long run of their arrivals and departures.

References

  1. Bonald T, Borst SC, Hegde N, Jonckheere M, Proutière A (2009) Flow-level performance and capacity of wireless networks with user mobility. Queueing Syst 63(1–4):131–164

  2. Rong L, Elayoubi SE, Haddada OB (2011) Performance evaluation of cellular networks offering TV services. IEEE Trans Veh Technol 60(2):644–655

  3. Karray MK, Jovanovic M (2013) A queueing theoretic approach to the dimensioning of wireless cellular networks serving variable bit-rate calls. IEEE Trans Veh Technol 62:6

  4. 3GPP (2010) TR 36.814-V900 further advancements for E-UTRA - physical layer aspects. In: 3GPP Ftp Server

  5. Telatar IE (1995) Capacity of multi-antenna Gaussian channels. AT&T Technical Memorandum

  6. Foschini GJ, Gans MJ (1998) On limits of wireless communications in a fading environment when using multiple antennas. Wirel Pers Commun 6(3):311–335

  7. Song Y, Blostein SD (2002) MIMO channel capacity in co-channel interference. In: Proc. of 21st biennial symposium on communications

  8. Blum RS, Winters JH, Sollenberger NR (2002) On the capacity of cellular systems with MIMO. IEEE Commun Lett 6(6):242–244

  9. Clark A, Smith PJ, Taylor DP (2007) Instantaneous capacity of OFDM on Rayleigh-fading channels. IEEE Trans Inf Theory 53(1):355–361

  10. Tulino AM, Verdú S (2004) Random matrix theory and wireless communications. Found Trends Commun Inf Theory 1(1)

  11. Tse DNC, Hanly SV (1999) Linear multiuser receivers: effective interference effective bandwidth and user capacity. IEEE Trans Inf Theory 45(2)

  12. Verdu S, Shamai S (1999) Spectral efficiency of CDMA with random spreading. IEEE Trans Inf Theory 45(2)

  13. Evans J, Tse DNC (2000) Large system performance of linear multiuser receivers in multipath fading channels. IEEE Trans Inf Theory 46(6)

  14. Goldsmith A, Jafar SA, Jindal N, Vishwanath S (2003) Capacity limits of MIMO channels. IEEE J Select Areas Commun 21(5):684–702

  15. Goldsmith AJ, Chua S-G (1997) Variable-rate variable-power MQAM for fading channels. IEEE Trans Commun 45:1218–1230

  16. Mogensen PE, Na W, Kovács IZ, Frederiksen F, Pokhariyal A, Pedersen KI, Kolding TE, Hugl K, Kuusela M (2007) LTE capacity compared to the Shannon bound. In: Proc. of VTC spring, pp 1234–1238

  17. Shannon C (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423, 623–656

  18. Shomorony I, Avestimehr AS (2012) Worst-case additive noise in wireless networks. CoRR abs/1202.2687

  19. Diggavi S, Cover T (2001) The worst additive noise under a covariance constraint. IEEE Trans Inf Theory 47(7):3072–3081

  20. Girnyk M, Vehkaperä M, Rasmussen L (2012) On the asymptotic sum-rate of uplink MIMO cellular systems in the presence of non-Gaussian inter-cell interference. In: Proc. of Globecom

  21. Karray MK, Jovanovic M (2012) Theoretically feasible QoS in a MIMO cellular network compared to the practical LTE performance. In: Proc. of ICWMC

  22. Lozano A, Tulino AM (2002) Capacity of multiple-transmit multiple-receive antenna architectures. IEEE Trans Inf Theory 48(12)

  23. Bölcskei H, Gesbert D, Paulraj AJ (2002) On the capacity of OFDM-based spatial multiplexing systems. IEEE Trans Commun 50(2)

  24. Tse D, Viswanath P (2005) Fundamentals of wireless communication. Cambridge University Press, Cambridge

  25. Caire G, Taricco G, Biglieri E (1999) Optimum power control over fading channels. IEEE Trans Inf Theory 45(5):1468–1489

  26. Brémaud P (2009) Initiation aux probabilités et aux chaînes de Markov. Springer, France

  27. 3GPP (2010) TR 36.211-V910 Evolved Universal Terrestrial Radio Access (E-UTRA) - Physical Channels and Modulation. In: 3GPP Ftp Server

  28. Dahlman E, Parkvall S, Skold J (2011) 4G: LTE/LTE-Advanced for mobile broadband. Academic, New York

  29. 3GPP (2010) TR 36.942-V830 Evolved Universal Terrestrial Radio Access (E-UTRA) - Radio Frequency (RF) system scenarios. In: 3GPP Ftp Server

  30. 3GPP (2012) TS 36.101-V8.16.0 Evolved Universal Terrestrial Radio Access (E-UTRA) - User Equipment (UE) radio transmission and reception. In: 3GPP Ftp Server

  31. Ihara S (1993) Information theory for continuous systems. World Scientific, Singapore

  32. Médard M (1997) Capacity of correlated jamming channels. In: Allerton conference on communications, computing and control

  33. Kashyap A, Basar T, Srikant R (2004) Correlated jamming on MIMO Gaussian fading channels. IEEE Trans Inf Theory 50(9):2119–2123

  34. Gallager R (1968) Information theory and reliable communication. Wiley, New York

  35. Rapajic PB, Popescu D (2000) Information capacity of a random signature multiple-input multiple-output channel. IEEE Trans Commun 48(8)

Acknowledgments

We thank M. Debbah (Supelec) as well as A. Saadani, S. Jeux, A. Jassal, J.-J. Bourhis and F. Huet (Orange Labs) for useful exchanges related to the present paper.

Corresponding author

Correspondence to Mohamed Kadhem Karray.

Appendix A: MIMO flat fading channel with additive noise

In this appendix, we establish a useful lower bound on the capacity of a general additive noise channel, where the noise is not necessarily Gaussian or white. Our motivation is that interference in wireless networks does not necessarily have these properties.

A.1 Model

Consider a discrete-time model of a multiple input and multiple output (MIMO) channel with t transmitting and r receiving antennas such that, at each time n = 1,2,…, the channel output \(Y_{n}\in \mathbb {C}^{r}\) is related to the channel input \(X_{n}\in \mathbb {C}^{t}\) by

$$ Y_{n}=HX_{n}+B_{n},\quad n=1,2,\ldots $$
(A.1)

where H is a complex matrix of dimension r×t modelling the fading, and \(B_{1},B_{2},\ldots \in \mathbb {C}^{r}\) is the noise process. We assume that the fading matrix H is deterministic. The channel input is subject to a power constraint of the form

$$\frac{1}{n}\sum\limits_{k=1}^{n}X_{k}^{\ast}X_{k}\leq\mathcal{P},\quad n=1,2,\ldots $$

where \(\mathcal {P}\) is a given positive constant and \(X_{k}^{\ast }\) designates the transpose complex conjugate of \(X_{k}\). Note that the above constraint concerns the total power aggregated over all t transmitters and averaged over n channel uses. The channel (A.1) is called a MIMO additive noise channel with deterministic fading.
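As an illustration only, the following Python sketch simulates the channel model (A.1); the antenna counts, power budget and the correlated non-Gaussian noise process are our own illustrative choices, not values from the paper.

```python
# Minimal sketch (assumed parameters) of the channel model (A.1):
# Y_n = H X_n + B_n with a deterministic fading matrix H and a noise
# process that need be neither white nor Gaussian.
import numpy as np

rng = np.random.default_rng(0)
t, r, n = 2, 2, 10_000            # transmit/receive antennas and channel uses
power = 1.0                       # total power budget (the constant P)

# Deterministic (here randomly drawn once) r x t fading matrix
H = (rng.standard_normal((r, t)) + 1j * rng.standard_normal((r, t))) / np.sqrt(2)

# Inputs: i.i.d. circularly symmetric Gaussian, total power split equally over t antennas
X = (rng.standard_normal((t, n)) + 1j * rng.standard_normal((t, n))) * np.sqrt(power / (2 * t))

# Noise: an example of a correlated, non-Gaussian process (AR(1) driven by
# uniform innovations), just to illustrate that the model allows it.
innov = rng.uniform(-1, 1, (r, n)) + 1j * rng.uniform(-1, 1, (r, n))
B = np.empty((r, n), dtype=complex)
B[:, 0] = innov[:, 0]
for k in range(1, n):
    B[:, k] = 0.5 * B[:, k - 1] + innov[:, k]

Y = H @ X + B                     # channel output, eq. (A.1)

# Empirical check of the power constraint (1/n) sum_k X_k^* X_k <= P,
# satisfied here on average (and asymptotically by the law of large numbers).
print("average input power:", np.mean(np.sum(np.abs(X) ** 2, axis=0)))
```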

A.2 Capacity lower bound

We are interested in the capacity of the channel (A.1) when the noise samples \(B_{1},B_{2},\ldots\) are neither necessarily Gaussian nor independent. We shall in fact establish an explicit lower bound on this capacity.

We begin with some definitions and notation. The identity matrix of dimension r×r is denoted by \(I_{r}\). The covariance matrix of a centred random vector \(X\in \mathbb {C}^{t}\) is denoted by

$$\Gamma_{X}=E\left[ XX^{\ast}\right] $$

The covariance matrix of two centred random vectors \(X\in \mathbb {C}^{t}\) and \(Y\in \mathbb {C}^{r}\) is denoted by

$$\Gamma_{XY}=E\left[ XY^{\ast}\right] $$

A random vector \(X\in \mathbb {C}^{n}\) is called circularly symmetric if \(e^{i\phi}X\) has the same distribution as X for all \(\phi \in \mathbb {R}\), which implies that X is centred.

From now on, all the considered random vectors are assumed to have well-defined entropies [31, §1.3]. For example, if the random vector \(X\in \mathbb {C}^{n}\) has a density p X with respect to the Lebesgue measure on \(\mathbb {C}^{n}\), then its entropy is defined by \(h(X)=-\int _{\mathbb {C}^{n}}p_{X}(x)\log p_{X}(x)dx\) provided the Lebesgue integral is well defined. We denote by \(I\left (X;Y\right ) \) the mutual information between two random vectors X and Y which is related to the entropy by [31, Theorem 1.6.2]

$$ I\left( X;Y\right) =h\left( X\right) -h\left( X|Y\right) $$
(A.2)

where \(h\left (X|Y\right ) \) is the conditional entropy of X given Y.
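For later use (in particular for (A.4) below and the equality case in the proof of Lemma A.2), we recall the standard maximum-entropy property of Gaussian vectors: for any centred random vector \(Z\in \mathbb {C}^{n}\) with covariance matrix \(\Gamma _{Z}\),

$$h\left( Z\right) \leq\log\det\left( \pi e\Gamma_{Z}\right) $$

with equality if and only if Z is circularly symmetric Gaussian (see, e.g., [5]).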

We now give two preliminary lemmas.

Lemma A.1

Let \(X_{1},X_{2},\ldots,X_{n}\) be random vectors in \(\mathbb {C}^{t}\) and \(Y_{1},Y_{2},\ldots,Y_{n}\) be random vectors in \(\mathbb {C}^{r}\). Denote \(X^{\left (n\right )} =\left (X_{1},X_{2} ,\ldots ,X_{n}\right ) \) and \(Y^{\left (n\right )} =\left (Y_{1},Y_{2} ,\ldots ,Y_{n}\right ) \). If \(X_{1},X_{2},\ldots,X_{n}\) are independent, then

$$I\left( X^{\left( n\right)} ;Y^{\left( n\right)} \right) \geq\sum\limits _{k=1}^{n}I\left( X_{k};Y_{k}\right) $$

Proof

The mutual information may be expressed in terms of the entropy as follows [31, Theorem 1.6.2]

$$I\left( X^{\left( n\right)} ;Y^{\left( n\right)} \right) =h\left( X^{\left( n\right)} \right) -h\left( X^{\left( n\right)} |Y^{\left( n\right)} \right) $$

Since \(X_{1},X_{2},\ldots,X_{n}\) are independent, the entropy \(h\left (X^{\left (n\right )} \right ) \) may be decomposed as the sum of the individual entropies [31, Theorem 1.3.2 (h.6)]: \(h\left (X^{\left (n\right )} \right ) =\sum \limits _{k=1}^{n}h\left (X_{k}\right ) \). On the other hand, the conditional entropy \(h\left (X^{\left (n\right )} |Y^{\left (n\right )} \right ) \) may be bounded as follows

$$\begin{array}{@{}rcl@{}} h\left( X^{\left( n\right)} |Y^{\left( n\right)} \right) & =&\sum\limits _{k=1}^{n}h\left( X_{k}|Y^{\left( n\right)} ,X_{1},\ldots,X_{k-1}\right) \\ & \leq&\sum\limits_{k=1}^{n}h\left( X_{k}|Y^{\left( n\right)} \right) \leq \sum\limits_{k=1}^{n}h\left( X_{k}|Y_{k}\right) \end{array} $$

where the first equality uses [31, Theorem 1.3.2 (h.7)], and the two inequalities use [31, Theorem 1.3.2 (h.7)] and [31, Theorem 1.3.2 (h.5)], respectively. Combining the three relations above, we get the desired result.

The following lemma may be seen as an extension of [31, Theorem 1.8.6] or [19, Lemma II.2] to the complex case. Our proof is inspired by [32] and [33].

Lemma A.2

Consider three random vectors \(X\in \mathbb {C} ^{t}\), \(Y,\tilde {Y}\in \mathbb {C}^{r}\). Assume that the random vector \(\left (X,\tilde {Y}\right ) \) is circularly symmetric Gaussian with the same covariance matrix as \(\left (X,Y\right ) \) and that \(\Gamma _{Y}\) is invertible. Then,

$$I(X;Y)\geq I(X;\tilde{Y}) $$

Proof

For any deterministic matrix \(A\in \mathbb {C}^{t\times r}\),

$$ h\left( X|Y\right) =h\left( X-AY|Y\right) \leq h\left( X-AY\right) $$
(A.3)

where for the above inequality, we use [31, Theorem 1.3.2]. In particular, taking \(A=\Gamma _{XY}\Gamma _{Y}^{-1}\), in which case \(AY\) is the best approximation of X by a linear function of Y in the quadratic (mean-square) sense, and letting \(U=X-AY\), we get

$$ h\left( X-AY\right) =h(U)\leq\log\left[ \det\left( \pi e\Gamma_{U}\right) \right] $$
(A.4)

Combining the above two inequalities, we get \(h\left (X|Y\right ) \leq \log \left [ \det \left (\pi e\Gamma _{U}\right ) \right ] \). Then, Eq. (A.2) implies

$$ I\left( X;Y\right) \geq h\left( X\right) -\log\det\left( \pi e\Gamma _{U}\right) $$
(A.5)

Apply now the above arguments with \(\tilde {Y}\) in the role of Y. Observe that \(\tilde {U}=X-A\tilde {Y}\) is circularly symmetric Gaussian; thus, equality holds in (A.4). Moreover, \(\tilde {U}\) is independent of \(\tilde {Y}\) since \(\Gamma _{\tilde {U}\tilde {Y}}=E\left[ \tilde {U}\tilde {Y}^{\ast }\right] =\Gamma _{X\tilde{Y}}-A\Gamma _{\tilde{Y}}=\Gamma _{XY}-A\Gamma _{Y}=0\) (using that \(\left(X,\tilde{Y}\right)\) has the same covariance matrix as \(\left(X,Y\right)\)), and decorrelation implies independence for circularly symmetric Gaussian random vectors. Thus, equality also holds in (A.3), which shows that

$$I\left( X;\tilde{Y}\right) =h\left( X\right) -\log\det\left( \pi e\Gamma_{\tilde{U}}\right) $$

which, combined with the observation that \(\Gamma _{U}=\Gamma _{X}-\Gamma _{XY}\Gamma _{Y}^{-1}\Gamma _{YX}=\Gamma _{\tilde {U}}\) and with (A.5), finishes the proof of the desired inequality.

We now show that the above lemmas allow us to deduce a lower bound on the capacity of the channel (A.1). The considered channel has memory, i.e. different channel uses are not independent because the noise samples may be correlated; thus, its information capacity C is defined as follows

$$C=\liminf_{n\rightarrow\infty}\frac{1}{n}C^{\left( n\right)} $$

where

$$C^{\left( n\right)} =\sup_{X^{\left( n\right)} }\left\{ I\left( X^{\left( n\right)} ;Y^{\left( n\right)} \right) ;\frac{1}{n}\sum\limits _{k=1}^{n}E\left[ X_{k}^{\ast}X_{k}\right] \leq\mathcal{P}\right\} $$

where \(X^{\left (n\right )} =\left (X_{1},\ldots ,X_{n}\right ) \) is a random object with values in \(\left (\mathbb {C}^{t}\right ) ^{n}\), and \(Y^{\left (n\right )} \) is the output of the channel associated with the input \(X^{\left (n\right )} \).

Proposition A.1

Assume that the covariance matrix \(E[B_{k}B_{k}^{\ast }]\) of the noise \(B_{k}\) is finite for all \(k\in \mathbb {N}\) and denote

$$\mathcal{N}_{n}=\frac{1}{n}\sum\limits_{k=1}^{n}E[B_{k}B_{k}^{\ast}] $$

Then, the information capacity of the channel (A.1), given the fading matrix H, is lower bounded by

$$ C\geq\liminf_{n\rightarrow\infty}\left[ \log_{2}\det\left( I_{r} +\frac{\mathcal{P}}{t}HH^{\ast}\mathcal{N}_{n}^{-1}\right) \right] $$
(A.6)

The above inequality remains true under the additional constraint that the signals emitted by the transmitting antennas are independent and have equal powers.

Proof

Consider independent inputs \(X_{1},X_{2},\ldots\); then, by Lemma A.1,

$$I\left( X^{\left( n\right)} ;Y^{\left( n\right)} \right) \geq \sum\limits_{k=1}^{n} I\left( X_{k};Y_{k}\right) $$

Assume now that each \(X_{k}\) is circularly symmetric Gaussian, independent of \(B_{k}\), and with covariance matrix \(E[X_{k}X_{k}^{\ast }]=\frac {\mathcal {P}}{t}I_{t}\). Note that

$$E\left[ X_{k}Y_{k}^{\ast}\right] =E\left[ X_{k}X_{k}^{\ast}H^{\ast}\right] +E[X_{k}B_{k}^{\ast}]=\frac{\mathcal{P}}{t}H^{\ast} $$

and

$$\begin{array}{@{}rcl@{}} E\left[ Y_{k}Y_{k}^{\ast}\right] &=&E\left[ HX_{k}X_{k}^{\ast}H^{\ast} \right] +E[B_{k}B_{k}^{\ast}]\\ &=&\frac{\mathcal{P}}{t}HH^{\ast}+E[B_{k} B_{k}^{\ast}] \end{array} $$

Consider \(\tilde {B}_{k}\) circularly symmetric Gaussian, independent of \(X_{k}\) and with the same covariance matrix as \(B_{k}\). Setting \(\tilde {Y}_{k}=HX_{k}+\tilde {B}_{k}\), the vector \(\left (X_{k},\tilde {Y}_{k}\right ) \) is circularly symmetric Gaussian with the same covariance matrix as \(\left (X_{k},Y_{k}\right ) \). We deduce from Lemma A.2 that

$$\begin{array}{@{}rcl@{}} I(X_{k};Y_{k}) & \geq& I(X_{k};\tilde{Y}_{k})\\ & =&h(\tilde{Y}_{k})-h(\tilde{Y}_{k}|X_{k})\\ & =&h(\tilde{Y}_{k})-h(\tilde{B}_{k})\\ & =&\log_{2}\left[ \det\left( \pi e\left( \frac{\mathcal{P}}{t}HH^{\ast} +E[B_{k}B_{k}^{\ast}]\right) \right) \right] \\ && -\log_{2}\left[ \det\left( \pi eE[B_{k}B_{k}^{\ast}]\right) \right] \\ & =&\log_{2}\det\left( I_{r}+\frac{\mathcal{P}}{t}HH^{\ast}E[B_{k}B_{k} ^{\ast}]^{-1}\right) \end{array} $$

Using Jensen’s inequality and convexity of the function \(A\mapsto \log _{2} \det \left (I_{r}+\frac {\mathcal {P}}{t}HH^{\ast }A^{-1}\right ) \) on the set of positive definite matrices of \(\mathbb {C}^{r\times r}\), cf. [19, Lemma II.3], we obtain

$$\frac{1}{n} \sum\limits_{k=1}^{n} I\left( X_{k};Y_{k}\right) \geq\log_{2}\det\left( I_{r}+\frac{\mathcal{P}} {t}HH^{\ast}\mathcal{N}_{n}^{-1}\right) $$

which concludes the proof of (A.6). The last statement of the proposition follows from the fact that the above inequality is proved for inputs \(X_{1},X_{2},\ldots\) such that each \(X_{k}\) is circularly symmetric Gaussian with covariance matrix \(E[X_{k}X_{k}^{\ast }]=\frac {\mathcal {P}}{t}I_{t}\), which implies that, for each \(k\in \mathbb {N}^{\ast }\), the components of the vector \(X_{k}\) are independent of each other.

We make an observation and give a corollary.

Remark A.1

Assume that \(B_{1},B_{2},\ldots\) have the same covariance matrix \(E[B_{k}B_{k}^{\ast }]=\mathcal {N}\); then the right-hand side of (A.6) equals

$$\log_{2}\det\left( I_{r}+\frac{\mathcal{P}}{t}HH^{\ast}\mathcal{N} ^{-1}\right) $$

Note that the above formula also gives the capacity of a MIMO channel with an additive circularly symmetric Gaussian noise process with independent samples and an equal partition of power between the transmitting antennas.
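As a complement, here is a minimal numerical sketch (in Python, with illustrative values of our own choosing) of the bound of Proposition A.1 in the form given by Remark A.1, i.e. \(\log _{2}\det \left (I_{r}+\frac {\mathcal {P}}{t}HH^{\ast }\mathcal {N}^{-1}\right ) \), for an arbitrary, not necessarily white, noise covariance matrix.

```python
# Hedged sketch: numerical evaluation of the capacity lower bound of
# Proposition A.1 / Remark A.1: log2 det(I_r + (P/t) H H^* N^{-1}).
import numpy as np

def mimo_capacity_lower_bound(H: np.ndarray, total_power: float, noise_cov: np.ndarray) -> float:
    """Lower bound (A.6) in bit per channel use for a deterministic fading matrix H,
    total transmit power split equally over the t antennas, and average noise
    covariance matrix `noise_cov` (the matrix N)."""
    r, t = H.shape
    M = np.eye(r) + (total_power / t) * H @ H.conj().T @ np.linalg.inv(noise_cov)
    # slogdet for numerical stability; the determinant here is real and positive
    _, logabsdet = np.linalg.slogdet(M)
    return logabsdet / np.log(2)

# Example with made-up numbers: 2x2 Rayleigh-like H and a correlated (non-white) noise
rng = np.random.default_rng(1)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
N = np.array([[1.0, 0.3], [0.3, 1.0]], dtype=complex)
print(mimo_capacity_lower_bound(H, total_power=10.0, noise_cov=N))
```

Setting noise_cov to a multiple of the identity recovers the white-noise expression (A.8) below.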

The following corollary of Proposition A.1 states that, for a single input and single output (SISO) channel (t = r = 1), the worst additive noise process distribution (not necessarily white or Gaussian) for capacity under a given second moment constraint is the additive white Gaussian noise (AWGN). This result may be seen as an extension of Gallager's result [34, Theorem 7.4.3] for memoryless channels to channels with memory. It may also be deduced from Shannon's result [17, Theorem 18] (proved there by the entropy power inequality) and from the fact that the entropy power is not larger than the average power.

Corollary A.1

Consider a SISO channel whose input and output at time n, denoted by \(X_{n}\in \mathbb {C}\) and \(Y_{n}\in \mathbb {C}\) respectively, are related by

$$Y_{n}=X_{n}+B_{n},\quad n=1,2,\ldots $$

where the noise process \(B_{1},B_{2},\ldots \in \mathbb {C}\) is assumed stationary and satisfies \(E\left [ \left \vert B_{n}\right \vert ^{2}\right ] =N\). Assume that the channel input is subject to a power constraint of the form \(\frac {1}{n}\sum \limits _{k=1}^{n}\left \vert X_{k}\right \vert ^{2}\leq \mathcal {P}\). Then, the information capacity C of the channel is lower bounded by

$$ C\geq\log_{2}\left( 1+\frac{\mathcal{P}}{N}\right) $$
(A.7)
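As a quick numerical illustration (the numbers are chosen here for concreteness), if \(\mathcal {P}/N=10\) (an SNR of 10 dB), the bound (A.7) guarantees

$$C\geq\log_{2}\left( 1+10\right) \approx3.46\text{ bit per channel use} $$

whatever the stationary noise distribution with second moment N.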

A.3 Asymptotic analysis

Consider the particular case where the noise samples \(B_{1},B_{2},\ldots\) are i.i.d., each being circularly symmetric Gaussian with covariance matrix \(E[B_{n}B_{n}^{\ast }]=NI_{r}\), where N is a given positive constant. In this case, the right-hand side of (A.6) equals

$$ \log_{2}\det\left( I_{r}+\frac{\mathcal{P}}{tN}HH^{\ast}\right) =\log _{2}\det\left( I_{r}+\frac{P}{t}HH^{\ast}\right) $$
(A.8)

where \(P=\frac {\mathcal {P}}{N}\) is the signal-to-noise power ratio (SNR). For given t and r, the capacity (A.8) depends on H.

Frequently, one is interested in the ergodic capacity, that is, the expectation of the capacity with respect to the fading matrix H, assumed random with a given distribution. Assume, for example, that H has i.i.d. components, each being circularly symmetric Gaussian with variance 1. In this case, the ergodic capacity may be calculated with the help of the analytical result given by Telatar [5, Theorem 2].
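For readers who prefer a numerical check, the following sketch (our own illustrative code, not taken from the paper) estimates this ergodic capacity by Monte Carlo simulation rather than through Telatar's closed form.

```python
# Rough sketch (assumptions: i.i.d. Rayleigh fading, equal power split over t antennas):
# Monte Carlo estimate of the ergodic capacity E[ log2 det(I_r + (P/t) H H^*) ].
import numpy as np

def ergodic_capacity_mc(t: int, r: int, snr: float, n_draws: int = 20_000, seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    acc = 0.0
    for _ in range(n_draws):
        # H with i.i.d. circularly symmetric Gaussian entries of variance 1
        H = (rng.standard_normal((r, t)) + 1j * rng.standard_normal((r, t))) / np.sqrt(2)
        _, logabsdet = np.linalg.slogdet(np.eye(r) + (snr / t) * H @ H.conj().T)
        acc += logabsdet / np.log(2)
    return acc / n_draws

print(ergodic_capacity_mc(t=2, r=2, snr=10.0))  # 2x2 MIMO at a linear SNR of 10 (10 dB)
```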

Alternatively, the capacity (A.8) may be approximated with the help of the following asymptotic result, which says that when the numbers of transmitting and receiving antennas go to infinity, the capacity per receiving antenna converges to a deterministic limit.

Lemma A.3

[12, Equations (9), (38)] , [35, Appendix] Assume that the fading matrix \(H\in \mathbb {C}^{r\times t}\) has i.i.d. components, centred and with variance 1. Assume that \(t,r\rightarrow \infty \) such that \(\frac {t}{r}\rightarrow \beta \in \mathbb {R}_{+}^{\ast }\) , then

$$ \frac{1}{r}\log\det\left( I_{r}+\frac{P}{t}HH^{\ast}\right) \rightarrow \mathcal{C}\left( P,\beta\right) $$
(A.9)

almost surely, where

$$\begin{array}{@{}rcl@{}} \mathcal{C}\left( P,\beta\right) & :=&\beta\log\left( 1+\frac{P}{\beta} -\frac{1}{4}\mathcal{F}\left( \frac{P}{\beta},\beta\right) \right) \\ && +\log\left( 1+P-\frac{1}{4}\mathcal{F}\left( \frac{P}{\beta} ,\beta\right) \right)\\ &&-\frac{\beta}{4P}\mathcal{F}\left( \frac{P}{\beta} ,\beta\right)\notag\\ \end{array} $$
(A.10)

where

$$\mathcal{F}\left( \xi,\beta\right) =\left( \sqrt{\xi\left( 1+\sqrt{\beta} \right) ^{2}+1}-\sqrt{\xi\left( 1-\sqrt{\beta}\right) ^{2}+1}\right) ^{2} $$

The above asymptotic result gives a good approximation of the expectation \(E\left [ \frac {1}{r}\log \det \left (I_{r}+\frac {P}{t}HH^{\ast }\right ) \right ] \) even for a small number of antennas, as already observed in [22, Table 1] for SNR = 10 dB and confirmed in Fig. 6 over the whole range of SNR values typical of wireless cellular networks, surprisingly even for a single antenna at the transmitter and at the receiver.

Fig. 6 Comparison of the analytic capacity and the asymptotic formula for MIMO t×r (t transmitting antennas and r receiving antennas)
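The sketch below (again our own illustration, with a Monte Carlo estimate used in place of the analytic ergodic capacity) implements \(\mathcal{F}\) and \(\mathcal{C}\left(P,\beta\right)\) of Lemma A.3 and compares them with an empirical estimate of \(E\left[\frac{1}{r}\log\det\left(I_{r}+\frac{P}{t}HH^{\ast}\right)\right]\), in the spirit of Fig. 6.

```python
# Sketch implementing the deterministic limit (A.10) of Lemma A.3 and comparing it
# with a Monte Carlo estimate of (1/r) log det(I_r + (P/t) H H^*). Parameter values
# are illustrative choices.
import numpy as np

def F(xi: float, beta: float) -> float:
    """The function F(xi, beta) appearing after (A.10)."""
    return (np.sqrt(xi * (1 + np.sqrt(beta)) ** 2 + 1)
            - np.sqrt(xi * (1 - np.sqrt(beta)) ** 2 + 1)) ** 2

def asymptotic_capacity(P: float, beta: float) -> float:
    """C(P, beta) of (A.10), in nats per receiving antenna (natural log, as in (A.9))."""
    f = F(P / beta, beta)
    return (beta * np.log(1 + P / beta - f / 4)
            + np.log(1 + P - f / 4)
            - beta / (4 * P) * f)

def empirical_capacity(t: int, r: int, P: float, n_draws: int = 5_000, seed: int = 0) -> float:
    """Monte Carlo estimate of E[(1/r) log det(I_r + (P/t) H H^*)], natural log."""
    rng = np.random.default_rng(seed)
    acc = 0.0
    for _ in range(n_draws):
        H = (rng.standard_normal((r, t)) + 1j * rng.standard_normal((r, t))) / np.sqrt(2)
        acc += np.linalg.slogdet(np.eye(r) + (P / t) * H @ H.conj().T)[1]
    return acc / (n_draws * r)

P = 10.0  # linear SNR (10 dB)
for t, r in [(1, 1), (2, 2), (4, 4)]:
    print(t, r, empirical_capacity(t, r, P), asymptotic_capacity(P, beta=t / r))
```

Already for small arrays the two columns should be close, which is the point made in the paragraph above.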

Cite this article

Karray, M.K., Jovanovic, M. & Błaszczyszyn, B. Theoretical expression of link performance in OFDM cellular networks with MIMO compared to simulation and measurements. Ann. Telecommun. 70, 479–490 (2015). https://doi.org/10.1007/s12243-015-0469-4
