Implementing a Rule Generation Method Based on Secondary Differences of Two Criteria

  • Hidenao Abe
  • Shusaku Tsumoto
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5306)

Abstract

Rule mining is one useful method for obtaining valuable knowledge from data stored in database systems. However, most current rule mining algorithms use only the primary difference of a single criterion to select attribute-value pairs when constructing a rule set for a given dataset. In this paper, we implement a rule generation method based on secondary differences of two criteria. We then perform a case study on common datasets from the UCI repository, comparing the accuracies of rule sets learned by our algorithm with those of three representative rule learning algorithms.
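To make the selection mechanism concrete, the sketch below scores a candidate attribute-value pair by the secondary difference of two rule evaluation criteria (accuracy and coverage are used here as the two criteria): whereas a primary-difference learner looks only at how the criteria change when the candidate is appended, a secondary-difference learner compares that change against the change from the previous refinement step. This is a hypothetical illustration, not the authors' implementation; all function names, the toy dataset, and the choice of criteria are assumptions.

```python
def accuracy(rows, conds, target):
    """Fraction of rows covered by the condition list that have the target class."""
    covered = [r for r in rows if all(r[a] == v for a, v in conds)]
    if not covered:
        return 0.0
    return sum(1 for r in covered if r["class"] == target) / len(covered)

def coverage(rows, conds, target):
    """Fraction of target-class rows that the condition list covers."""
    positives = [r for r in rows if r["class"] == target]
    if not positives:
        return 0.0
    covered = [r for r in positives if all(r[a] == v for a, v in conds)]
    return len(covered) / len(positives)

def secondary_difference(rows, history, cand, target):
    """Second-order change of (accuracy, coverage) when appending cand.

    history holds the condition lists of previous refinement steps.
    The primary difference compares only the last point with the new one;
    the secondary difference also subtracts the previous step's change:
    (new - current) - (current - previous), per criterion.
    """
    def point(conds):
        return (accuracy(rows, conds, target), coverage(rows, conds, target))

    prev = point(history[-2]) if len(history) >= 2 else point(history[-1])
    cur = point(history[-1])
    new = point(history[-1] + [cand])
    return tuple((n - c) - (c - p) for n, c, p in zip(new, cur, prev))

# Illustrative toy dataset (assumed, not from the paper's case study).
rows = [
    {"outlook": "sunny", "windy": "no",  "class": "play"},
    {"outlook": "sunny", "windy": "yes", "class": "stay"},
    {"outlook": "rain",  "windy": "no",  "class": "play"},
    {"outlook": "rain",  "windy": "yes", "class": "stay"},
]
# Starting from the empty rule, score the candidate pair (windy = no):
delta2 = secondary_difference(rows, [[]], ("windy", "no"), "play")
```

A covering-style learner would compute this score for every remaining attribute-value pair at each refinement step and append the best-scoring pair, repeating until the two criteria stop improving.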

Keywords

Rule Mining · Secondary Difference · Practical Machine Learning Tool · Rule Learning Algorithm · Rule Generation Algorithm



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Hidenao Abe¹
  • Shusaku Tsumoto¹

  1. Shimane University, Japan
