Machine Learning, Volume 40, Issue 1, pp 5–33

Implementation Issues in the Fourier Transform Algorithm

  • Yishay Mansour
  • Sigal Sahar


The Fourier transform of Boolean functions has received considerable attention in the computational learning theory community in recent years, and has come to play an important role in proving many learnability results. The aim of this work is to demonstrate that Fourier transform techniques also yield a useful and practical algorithm, in addition to having many interesting theoretical properties. In fact, this work was prompted by a genuine problem that was brought to our attention: researchers at a company were seeking a method to reverse-engineer a state-free controller. They had the ability to query the controller on any input, which placed them in the membership query model, the model in which the Fourier transform algorithm operates.
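As background for the technique the abstract refers to: a Fourier coefficient of a Boolean function can be estimated from membership queries alone by sampling uniformly random inputs. The sketch below illustrates the idea only; the function names and the toy target are ours, not the paper's implementation.

```python
import random

def chi(S, x):
    # Parity character chi_S(x) = (-1)^(sum of x[i] for i in S);
    # x is a tuple of 0/1 bits, S a collection of indices.
    return -1 if sum(x[i] for i in S) % 2 else 1

def estimate_coefficient(f, S, n, samples=2000, rng=random):
    # Estimate the Fourier coefficient f_hat(S) = E_x[f(x) * chi_S(x)]
    # under the uniform distribution on {0,1}^n, where each evaluation
    # of f(x) is one membership query to the black box.
    total = 0
    for _ in range(samples):
        x = tuple(rng.randint(0, 1) for _ in range(n))
        total += f(x) * chi(S, x)
    return total / samples

# Toy target: the parity of bits 0 and 2, as a +/-1-valued function.
# Its only nonzero Fourier coefficient is f_hat({0, 2}) = 1.
f = lambda x: chi({0, 2}, x)
```

For the matching index set every sampled term equals 1, so the estimate is exact; for any other set the terms average out near zero, with error shrinking as the sample count grows.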

To keep the algorithm's running time reasonable while still producing accurate hypotheses, we had to perform many optimizations. In the paper we discuss the more prominent ones: optimizations that were crucial, and without which the algorithm's performance would severely deteriorate. One of the benefits we present is a confidence level that the algorithm produces alongside each prediction; the confidence level measures the likelihood that the prediction is correct.
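The abstract does not say how the confidence level is computed, so the following is an illustration rather than the authors' method: a hypothesis given by a sparse set of estimated Fourier coefficients naturally yields both a prediction (the sign of the weighted sum) and a confidence score (its magnitude).

```python
def chi(S, x):
    # Parity character chi_S(x) = (-1)^(sum of x[i] for i in S).
    return -1 if sum(x[i] for i in S) % 2 else 1

def predict_with_confidence(coeffs, x):
    # coeffs maps frozenset(S) -> estimated Fourier coefficient.
    # The sparse hypothesis g(x) = sum_S coeffs[S] * chi_S(x)
    # approximates a +/-1-valued target; sign(g(x)) is the prediction,
    # and |g(x)| serves as a confidence score: the larger the
    # magnitude, the less likely sampling noise in the estimated
    # coefficients flips the sign.
    g = sum(c * chi(S, x) for S, c in coeffs.items())
    return (1 if g >= 0 else -1), abs(g)

# A hypothetical sparse hypothesis with one dominant coefficient.
coeffs = {frozenset({0}): 0.9, frozenset({1, 2}): 0.1}
```

For example, on input (1, 0, 0) the dominant term contributes -0.9 and the minor term +0.1, so the prediction is -1 with confidence 0.8.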

Keywords: computational learning theory · Fourier transform · learning algorithms · implementation issues


References

  1. Angluin, D. (1987). Learning regular sets from queries and counterexamples. Information and Computation, 75, 87–106.
  2. Bellare, M. (1992). A technique for upper bounding the spectral norm with applications to learning. In Proceedings of the 5th Annual Workshop on Computational Learning Theory (COLT) (pp. 62–70).
  3. Blum, A., Furst, M., Jackson, J., Kearns, M., Mansour, Y., & Rudich, S. (1994). Weakly learning DNF and characterizing statistical query learning using Fourier analysis. In Proceedings of the 26th Annual ACM Symposium on Theory of Computing (pp. 253–262).
  4. Furst, M. L., Jackson, J. C., & Smith, S. W. (1991). Improved learning of AC0 functions. In Proceedings of the 4th Annual Workshop on Computational Learning Theory (COLT) (pp. 317–325).
  5. Golden, R. M. (1996). Mathematical Methods for Neural Network Analysis and Design. MIT Press.
  6. Jackson, J. (1994). An efficient membership-query algorithm for learning DNF with respect to the uniform distribution. In Proceedings of the 35th Symposium on Foundations of Computer Science (pp. 42–53).
  7. Kushilevitz, E. & Mansour, Y. (1993). Learning decision trees using the Fourier spectrum. SIAM Journal on Computing, 22(6), 1331–1348. An earlier version appeared in Proceedings of the 23rd Annual ACM Symposium on Theory of Computing, 1991.
  8. Linial, N., Mansour, Y., & Nisan, N. (1993). Constant depth circuits, Fourier transform, and learnability. Journal of the ACM, 40(3), 607–620. An earlier version appeared in FOCS 1989.
  9. Mansour, Y. (1994). Learning Boolean functions via the Fourier transform. In Advances in Neural Computation. Kluwer Academic Publishers.
  10. Mansour, Y. (1995). An O(n^{log log n}) learning algorithm for DNF under the uniform distribution. Journal of Computer and System Sciences, 50(3), 543–550.
  11. Nix, D. A. & Weigend, A. S. (1995). Learning local error bars for nonlinear regression. In G. Tesauro, D. S. Touretzky, & T. K. Leen (Eds.), Advances in Neural Information Processing Systems 7 (NIPS*94). Cambridge, MA: MIT Press.
  12. Quinlan, J. R. (1993). C4.5: Programs for Machine Learning. Morgan Kaufmann.

Received July 2, 1996

Copyright information

© Kluwer Academic Publishers 2000

Authors and Affiliations

  • Yishay Mansour, Computer Science Department, Tel-Aviv University, Tel-Aviv, Israel
  • Sigal Sahar, Computer Science Department, Tel-Aviv University, Tel-Aviv, Israel
