
The Entropy of Relations and a New Approach for Decision Tree Learning

  • Dan Hu
  • HongXing Li
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3614)

Abstract

A formula for measuring how much information is carried by a relation on a finite universe is proposed; it is called the entropy of the relation R and is denoted H(R). Based on H(R), the entropy of predicates and the information content of propositions are measured. These measures can be used to evaluate predicates and to choose the most appropriate predicate for a given Cartesian set. Finally, H(R) is used to induce decision trees. Experiments show that the new induction algorithm, denoted IDIR, outperforms ID3 in terms of node count and test time.
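The abstract does not reproduce the paper's formula for H(R). As a rough illustration only, the sketch below uses one common neighborhood-based definition of relation entropy from the rough-set literature, H(R) = -(1/|U|) Σ_x log2(|R(x)|/|U|), where R(x) = {y : (x, y) ∈ R}; the paper's actual definition may differ. The function name and the example universe are purely illustrative.

```python
from math import log2

def relation_entropy(universe, relation):
    """Entropy of a binary relation R on a finite universe U.

    Sketch of a neighborhood-based definition (an assumption, not
    necessarily the paper's H(R)): for each x in U, R(x) is its
    successor neighborhood, and
        H(R) = -(1/|U|) * sum over x of log2(|R(x)| / |U|).
    Assumes every neighborhood R(x) is non-empty.
    """
    n = len(universe)
    total = 0.0
    for x in universe:
        nbhd = sum(1 for y in universe if (x, y) in relation)
        total += log2(nbhd / n)
    return -total / n

U = [1, 2, 3, 4]
identity = {(x, x) for x in U}          # finest relation: singleton neighborhoods
full = {(x, y) for x in U for y in U}   # coarsest relation: R(x) = U for all x

print(relation_entropy(U, identity))    # log2(4) = 2.0, maximal information
print(relation_entropy(U, full))        # 0.0, no discriminating information
```

Under this definition the identity relation (which distinguishes every element) attains maximal entropy, while the full relation (which distinguishes nothing) has entropy zero, which matches the intuition of using H(R) to rank predicates by how finely they discriminate a Cartesian set.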

Keywords

Binary Relation, Information Complexity, Bijective Function, Fuzzy Measure, Fuzzy Partition



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Dan Hu (1)
  • HongXing Li (1)
  1. Department of Mathematics, Beijing Normal University, Beijing, China
