From Computational Learning Theory to Discovery Science

  • Osamu Watanabe
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1644)

Abstract

Machine learning has long been one of the important subjects of AI, motivated by many real-world applications. In theoretical computer science, researchers have also introduced mathematical frameworks for investigating machine learning, and many interesting results have been obtained within these frameworks. We are now proceeding to a new stage: studying how to apply these fruitful theoretical results to real problems. In this paper we point out that “adaptivity” is one of the important issues when considering applications of learning techniques, and we propose a learning algorithm with this feature.
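The keywords below hint at the kind of algorithm meant by “adaptivity”: one whose sample size is driven by the data actually seen, rather than fixed in advance from worst-case bounds. The following is a minimal sketch of that idea; the stopping rule and its constant are illustrative assumptions for exposition, not the algorithm or bound from the paper itself.

```python
import math

def adaptive_estimate(draw, eps=0.1, delta=0.05, max_n=10**6):
    """Sequential (adaptive) sampling sketch: instead of fixing the sample
    size a priori, keep drawing 0/1 examples from `draw` until the number
    of observed successes reaches a target k.  The total number of draws n
    then adapts to the unknown success rate p (roughly n ~ k/p).  The
    constant in k is illustrative, not the paper's actual bound."""
    k = math.ceil(3 * (1 + eps) * math.log(2 / delta) / (eps * eps))
    successes = n = 0
    while successes < k and n < max_n:
        successes += draw()  # draw() returns 1/True on success, 0/False otherwise
        n += 1
    return successes / n, n
```

For instance, when estimating the accuracy of a weak hypothesis that is correct about 60% of the time, the loop stops after roughly k/0.6 draws; a hypothesis closer to random guessing would automatically be sampled longer. This data-dependent stopping is the “adaptive sampling” feature the abstract emphasizes.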

Keywords

Learning algorithm · Adaptive sampling · Hypothesis selection · Weak hypothesis · Good hypothesis


Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Osamu Watanabe, Dept. of Mathematical and Computing Sciences, Tokyo Institute of Technology, Tokyo, Japan
