BSC: Testing of Hypotheses with Information Constraints

  • Marat V. Burnashev
  • Shun-ichi Amari
  • Te Sun Han


We consider the problem of testing hypotheses about the crossover probability of a binary symmetric channel (BSC). We observe only the channel output, while a helper observes only the channel input and can send us a limited amount of information about the input block. What kind of information allows us to make the best statistical inferences? In particular, what is the minimal amount of information sufficient to obtain the same results as if we could observe all the data directly? Upper bounds on this minimal amount of information and some related results are obtained.
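As a rough illustration of this setup, the sketch below simulates testing between two crossover probabilities p0 and p1 when the helper reveals only the first k bits of the input block. The block length, the two hypotheses, and the "first k bits" summary are illustrative assumptions for this toy example, not the scheme analyzed in the paper; k = n corresponds to observing all the data directly.

```python
# Illustrative Monte Carlo sketch (an assumption-laden toy, not the paper's
# scheme): test the BSC crossover probability p0 against p1 from the channel
# output when the helper forwards only the first k bits of the input block.
import numpy as np

rng = np.random.default_rng(0)

def error_prob(n, k, p_true, p0, p1, trials=10000):
    """Empirical error probability of the likelihood-ratio test that uses
    only the k input bits revealed by the helper (k = n means full access)."""
    errors = 0
    for _ in range(trials):
        x = rng.integers(0, 2, size=n)          # helper's input block
        y = x ^ (rng.random(n) < p_true)        # BSC output, flips at rate p_true
        d = int(np.sum(x[:k] != y[:k]))         # disagreements on the revealed bits
        # d ~ Binomial(k, p); compare its log-likelihood under both hypotheses
        ll0 = d * np.log(p0) + (k - d) * np.log(1.0 - p0)
        ll1 = d * np.log(p1) + (k - d) * np.log(1.0 - p1)
        decide_p1 = ll1 > ll0
        if decide_p1 != (p_true == p1):
            errors += 1
    return errors / trials

n, p0, p1 = 200, 0.1, 0.2
for k in (n, n // 2, n // 8):
    err = 0.5 * (error_prob(n, k, p0, p0, p1) + error_prob(n, k, p1, p0, p1))
    print(f"revealed bits k = {k:3d}: average error = {err:.3f}")
```

Shrinking k weakens the test, which gives a concrete feel for the tradeoff between the amount of side information and the attainable error probability that the minimal-information question above concerns.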


Keywords: Error Probability, Dual Problem, Channel Output, Reliability Function, Unique Root


Copyright information

© Springer Science+Business Media New York 2000

Authors and Affiliations

  • Marat V. Burnashev, Institute for Problems of Information Transmission, RAS, Moscow, Russia
  • Shun-ichi Amari, RIKEN Brain Science Institute, Saitama, Japan
  • Te Sun Han, Graduate School of Information Systems, University of Electro-Communications, Chofu, Tokyo 182, Japan
