A new approach to information theory
The fundamental contribution of this work is to show that it is possible to change the way we view the basic question of sending information over a channel. In particular, we show that codes that are only good on "average" suffice for worst-case performance, provided the channel is restricted to be a feasible one. These methods have recently been used to give a "constructive" proof of Shannon's Theorem for any feasible channel.
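One way to see how an "average-case" code can cope with a worst-case (but feasible) channel is through scrambling with shared randomness: sender and receiver derive a secret permutation and mask from a shared seed, so that any fixed error pattern the channel chooses lands at pseudorandom positions of the codeword. The sketch below is only illustrative, not the paper's actual construction; the function names and the use of Python's `random` module as the randomness source are assumptions for the demo.

```python
import random


def scramble(bits, seed):
    # Derive a permutation and a one-time mask from the shared seed,
    # then transmit the permuted, masked codeword.
    rng = random.Random(seed)
    n = len(bits)
    perm = list(range(n))
    rng.shuffle(perm)
    mask = [rng.randint(0, 1) for _ in range(n)]
    return [bits[perm[i]] ^ mask[i] for i in range(n)]


def unscramble(bits, seed):
    # Re-derive the same permutation and mask (same seed, same call
    # order) and invert both operations.
    rng = random.Random(seed)
    n = len(bits)
    perm = list(range(n))
    rng.shuffle(perm)
    mask = [rng.randint(0, 1) for _ in range(n)]
    out = [0] * n
    for i in range(n):
        out[perm[i]] = bits[i] ^ mask[i]
    return out


if __name__ == "__main__":
    seed = 2024  # shared secret between sender and receiver
    codeword = [1, 0, 1, 1, 0, 0, 1, 0] * 4

    sent = scramble(codeword, seed)

    # Worst case for many codes: a burst of errors on consecutive
    # positions of the transmitted word.
    received = list(sent)
    for i in range(4):
        received[i] ^= 1

    recovered = unscramble(received, seed)

    # After unscrambling, the four flipped bits sit at pseudorandom
    # positions of the original codeword, so a code that corrects
    # random errors can handle them.
    error_positions = [i for i in range(len(codeword))
                       if recovered[i] != codeword[i]]
    print(error_positions)
```

Note that the channel never sees the seed, so even an adversarially chosen burst cannot target specific codeword positions; this is the sense in which average-case decoding guarantees carry over to worst-case feasible channels.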
- Babai, L., Fortnow, L., Levin, L.A., and Szegedy, M. Checking computations in polylogarithmic time. In Proc. of the 23rd Annual Symp. on Theory of Computing, 21–31, 1991.
- Berlekamp, E.R. Algebraic Coding Theory. New York: McGraw-Hill, 1968.
- Blum, M. and Micali, S. How to generate cryptographically strong sequences of pseudorandom bits. IEEE FOCS 23, 112–117, 1982.
- Diffie, W. and Hellman, M.E. New directions in cryptography. IEEE Trans. Info. Theory 22, 644–654, 1976.
- Lipton, R. On Coding for Noisy Feasible Channels. Unpublished manuscript, 1993.
- Lagarias, J.C. Pseudorandom Numbers in Probability and Algorithms. National Research Council, 1992.
- MacWilliams, F.J., and Sloane, N.J.A. The Theory of Error Correcting Codes. Amsterdam: North-Holland, 1977.
- McEliece, R.J. The Theory of Information and Coding. Reading, Mass.: Addison-Wesley, 1977.
- Shannon, C.E. A mathematical theory of communication. Bell System Tech. J. 27, 379–423 and 623–656, 1948.
- Welsh, D. Codes and Cryptography. Oxford: Oxford University Press, 1988.