Abstract
For coded transmission over a memoryless channel, two kinds of mutual information are considered: the mutual information between a code symbol and its noisy observation and the overall mutual information between encoder input and decoder output. The overall mutual information is interpreted as a combination of the mutual informations associated with the individual code symbols. Thus, exploiting code constraints in the decoding procedure is interpreted as combining mutual informations. For single parity check codes and repetition codes, we present bounds on the overall mutual information, which are based only on the mutual informations associated with the individual code symbols. Using these mutual information bounds, we compute bounds on extrinsic information transfer (exit) functions and bounds on information processing characteristics (ipc) for these codes.
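The combining idea in the abstract can be illustrated numerically with known extremal results from the information-combining literature: for two independent noisy observations with given symbol-wise mutual informations, the binary erasure channel (BEC) and binary symmetric channel (BSC) yield the extreme values of the combined mutual information, with their roles swapped between a repeated bit and a parity check. The following Python sketch is not part of the paper; the function names are ours, and the BSC crossover probabilities are matched to the given mutual informations via the inverse binary entropy function.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h2_inv(y):
    """Inverse of h2 on [0, 1/2], computed by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def combine_repetition(i1, i2):
    """Extremes of I(X; Y1, Y2) for one bit X sent over two channels
    with mutual informations i1, i2 (a length-2 repetition code).
    Returns (lower, upper); BSC attains the lower bound, BEC the upper."""
    # BEC: X remains unknown only if both observations are erased.
    i_bec = 1 - (1 - i1) * (1 - i2)
    # BSC: crossover probabilities matched to the given mutual informations.
    p1, p2 = h2_inv(1 - i1), h2_inv(1 - i2)
    p_agree = p1 * p2 + (1 - p1) * (1 - p2)
    e_agree = p1 * p2 / p_agree              # error prob. given agreement
    e_dis = p1 * (1 - p2) / (1 - p_agree)    # error prob. given disagreement
    i_bsc = 1 - (p_agree * h2(e_agree) + (1 - p_agree) * h2(e_dis))
    return i_bsc, i_bec

def combine_spc(i1, i2):
    """Extremes of I(X1 + X2; Y1, Y2) (mod-2 sum) for a single parity
    check on two bits. The roles are swapped: BEC attains the lower bound."""
    i_bec = i1 * i2
    p1, p2 = h2_inv(1 - i1), h2_inv(1 - i2)
    p_xor = p1 * (1 - p2) + p2 * (1 - p1)    # crossover of the serial BSCs
    i_bsc = 1 - h2(p_xor)
    return i_bec, i_bsc
```

For example, with two observations of 0.5 bit each, a repetition of the bit combines to between roughly 0.71 (BSC) and exactly 0.75 (BEC) bit, while a parity check combines to between exactly 0.25 (BEC) and roughly 0.29 (BSC) bit.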
Résumé
For coded transmission over a memoryless channel, two types of mutual information are considered: on the one hand, the mutual information between a code symbol and its noisy observation and, on the other hand, the overall mutual information between the encoder input and the decoder output. The latter can be expressed as a combination of the mutual informations associated with the individual code symbols. Consequently, exploiting the code constraints in the decoding process can be interpreted as combining mutual informations. For single parity check codes and repetition codes, we present bounds on the overall mutual information based solely on the mutual informations associated with the individual code symbols. Using these bounds, we compute bounds on extrinsic information transfer (exit) functions, as well as bounds on information processing characteristics (ipc) for these codes.
Additional information
Ingmar Land studied electrical engineering at the Universities of Ulm and Erlangen-Nürnberg, Germany. After receiving his Dipl.-Ing. degree in 1999, he joined the University of Kiel, Germany, as a research and teaching assistant. Since 2004, he has been an assistant professor at the University of Aalborg. His research topics are information and coding theory, with a focus on iterative decoding.
Simon Huettinger received the Dr.-Ing. degree from the University of Erlangen-Nürnberg in June 2003. From 2000 to 2003, he was with the Institute for Information Transmission of the University of Erlangen-Nürnberg. Presently, he is with Siemens AG Corporate Technology.
Peter A. Hoeher received the Dipl.-Ing. and Dr.-Ing. degrees from the Technical University of Aachen, Germany, and the University of Kaiserslautern, Germany, in 1986 and 1990, respectively. From 1986 to 1998, he was with the German Aerospace Research Establishment (DLR) in Oberpfaffenhofen. In 1992, he was on leave at AT&T Bell Laboratories in Murray Hill, NJ. Since 1998, he has been a Professor at the University of Kiel, Germany.
Johannes Huber has been Professor for communication engineering at the Universität Erlangen-Nürnberg, Germany, since 1991. His research interests are information and coding theory, modulation schemes, algorithms for signal detection and adaptive equalization, multiple-input multiple-output (MIMO) channels, and concatenated coding with iterative decoding.
About this article
Cite this article
Land, I., Huettinger, S., Hoeher, P.A. et al. Bounds on mutual information for simple codes using information combining. Ann. Télécommun. 60, 184–214 (2005). https://doi.org/10.1007/BF03219813
Key words
- Information theory
- Information measure
- Error correcting code
- Concatenation
- Iteration
- Memoryless channel
- Binary channel