The information content of rules and rule sets and its application

Abstract

The information content of rules is categorized into inner mutual information content and outer impartation information content. The conventional objective interestingness measures based on information theory are in fact all inner mutual information measures, which reflect the confidence of a rule and the mutual information between its antecedent and consequent. Moreover, almost all of these measures lose sight of the outer impartation information, which is what is conveyed to the user and helps the user make decisions. We put forward the viewpoint that the outer impartation information content of rules and rule sets can be represented by relations from the input universe to the output universe. With binary relations, the interaction of rules within a rule set can be easily represented by the union and intersection operators. Based on the entropy of relations, the outer impartation information content of rules and rule sets is then measured. On this basis, the conditional information content of rules and rule sets, the independence of rules and rule sets, and the inconsistent knowledge of rule sets are defined and measured. The properties of these new measures are discussed and some interesting results are proven, for example that the information content of a rule set may be larger than the sum of the information content of the rules it contains, and that the conditional information content of a rule may be negative. Finally, the applications of these new measures are discussed. A new method for appraising rule mining algorithms and two rule pruning algorithms, λ-choice and RPCIC, are put forward. These new methods and algorithms are advantageous in meeting the need for more efficient decision information.
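
To make the relation-based view above concrete, the sketch below encodes each rule as a binary relation from the input universe to the output universe, forms the rule-set relation by union, and scores relations with a Shannon-entropy-style measure of how far they narrow the output universe. The universes, the rules, the A x B encoding and the particular entropy formula (functions rule_relation, union and information_content) are illustrative assumptions made only for exposition; they are not the paper's exact entropy-of-relations definitions.

    from math import log2

    # Minimal sketch, under assumed definitions: a rule "IF x in A THEN y in B"
    # is encoded as the binary relation A x B, a rule set as the union of its
    # rules' relations, and information content as the total reduction in
    # Shannon uncertainty about the output over the inputs the relation covers.

    U_OUT = {"y1", "y2", "y3", "y4"}          # hypothetical output universe

    def rule_relation(antecedent, consequent):
        # Relation from input universe to output universe induced by one rule.
        return {(x, y) for x in antecedent for y in consequent}

    def union(*relations):
        # Rule-set relation: all pairs sanctioned by at least one rule.
        combined = set()
        for r in relations:
            combined |= r
        return combined

    def information_content(relation):
        # Assumed entropy-style measure: each covered input x retains
        # log2(#outputs still possible for x) bits of uncertainty; the content
        # imparted is the total reduction from the maximum log2(|U_OUT|).
        h_max = log2(len(U_OUT))
        covered = {x for (x, _) in relation}
        content = 0.0
        for x in covered:
            possible = {y for (xx, y) in relation if xx == x}
            content += h_max - log2(len(possible))
        return content

    r1 = rule_relation({"x1", "x2"}, {"y1"})   # rule 1: x1, x2 -> y1
    r2 = rule_relation({"x2", "x3"}, {"y2"})   # rule 2: x2, x3 -> y2 (overlaps r1 on x2)
    print(information_content(r1))             # content of rule 1 alone
    print(information_content(r2))             # content of rule 2 alone
    print(information_content(union(r1, r2)))  # content of the rule set

In this toy run the rule set imparts less than the sum of its rules' contents because the two rules conflict on x2; with the paper's actual definitions the combined content can also exceed the sum, as stated in the abstract.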

Author information

Corresponding author

Correspondence to Dan Hu.

Additional information

Supported in part by the National Natural Science Foundation of China (Grant Nos. 60774049 and 40672195), the Natural Science Foundation of Beijing (Grant No. 4062020), the National 973 Fundamental Research Project of China (Grant No. 2002CB312200), and the Youth Foundation of Beijing Normal University.

About this article

Cite this article

Hu, D., Li, H. & Yu, X. The information content of rules and rule sets and its application. Sci. China Ser. F-Inf. Sci. 51, 1958–1979 (2008). https://doi.org/10.1007/s11432-008-0130-1
