Suboptimum decoding using Kullback principle

  • Gérard Battail
  • Rémi Sfez
Information Theory
Part of the Lecture Notes in Computer Science book series (LNCS, volume 313)


Several successive decodings of cascaded codes become possible, in principle without information loss, if the decoding task is extended to determining a posterior probability distribution on the codewords. Kullback's principle of cross-entropy minimization is considered as a means of implementing this extension. Its practical use, however, demands some simplification. We propose to look for the posterior distribution in a form that is separable with respect to the information symbols, which makes the decoder's output of the same form as its input. As an illustration of these ideas, we consider the decoding of an iterated product of parity-check codes, which results in a vanishingly small error probability provided the channel signal-to-noise ratio exceeds some threshold. Interpreting a single linear code as a kind of product of its parity checks, the same ideas lead to a simple and efficient decoding algorithm.
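The core step behind a separable (symbol-by-symbol) posterior for a parity-check code can be sketched as follows. The sketch below is illustrative, not the authors' algorithm: assuming a single even-parity constraint and independent prior bit probabilities p[i] = P(x_i = 1), it computes the marginal posterior of each bit given the constraint. Among all separable (product-form) distributions, the product of these marginals is the one closest in cross-entropy to the true constrained posterior, so the output has the same form as the input and can feed a subsequent decoding stage. The function name is hypothetical.

```python
def parity_check_posteriors(p):
    """Given prior bit probabilities p[i] = P(x_i = 1), return the
    marginal posterior P(x_i = 1 | even overall parity) for each bit.

    Illustrative sketch: assumes independent priors, a single even-parity
    constraint, and priors that do not contradict the constraint with
    certainty (which would make the denominator zero).
    """
    n = len(p)
    posteriors = []
    for i in range(n):
        # Probability that the XOR of all OTHER bits equals 1:
        # classic identity  P(xor = 1) = (1 - prod_j (1 - 2 p_j)) / 2.
        prod = 1.0
        for j in range(n):
            if j != i:
                prod *= 1.0 - 2.0 * p[j]
        q = (1.0 - prod) / 2.0
        # For even parity, bit i must equal the XOR of the others.
        num = p[i] * q
        den = num + (1.0 - p[i]) * (1.0 - q)
        posteriors.append(num / den)
    return posteriors
```

With uniform priors (all 0.5) the constraint conveys no information and the posteriors remain 0.5; with confident priors on all other bits, the remaining bit is driven toward the value that satisfies the parity check. Iterating such updates across the row and column checks of a product code is the kind of successive re-estimation the abstract describes.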


Keywords: Posterior Distribution · Error Probability · Linear Code · Parity Check · Information Symbol





Copyright information

© Springer-Verlag Berlin Heidelberg 1988

Authors and Affiliations

  • Gérard Battail (1)
  • Rémi Sfez (1)

  1. Ecole Nationale Supérieure des Télécommunications, Paris Cedex 13, France
