Dynamic reducts as a tool for extracting laws from decision tables

  • Jan G. Bazan
  • Andrzej Skowron
  • Piotr Synak
Communications Learning and Adaptive Systems
Part of the Lecture Notes in Computer Science book series (LNCS, volume 869)

Abstract

We apply rough set methods and Boolean reasoning to knowledge discovery from decision tables. It is not always possible to extract general laws from experimental data by first computing all reducts [12] of a decision table and then deriving decision rules from these reducts. We investigate how information about changes of the reduct set under random sampling of a given decision table can be used to generate such laws. Reducts that remain stable in the process of decision table sampling are called dynamic reducts. Dynamic reducts define a set of attributes called the dynamic core: the set of attributes included in all dynamic reducts. The set of decision rules can be computed from the dynamic core or from the best dynamic reducts. We report the results of experiments with different data sets, e.g. market data, medical data, textures and handwritten digits. The results show that dynamic reducts can help to extract laws from decision tables.
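The procedure sketched above — computing reducts of a decision table, keeping those that remain reducts across random subtables, and intersecting them to obtain the dynamic core — can be illustrated with a small brute-force sketch. This is not the authors' implementation: the function names and the sampling parameters (`samples`, `frac`, `stability`) are hypothetical, and the exhaustive reduct search is only feasible for tiny tables.

```python
from itertools import combinations
import random

def is_reduct(table, decision, attrs):
    """True iff `attrs` discerns every pair of rows with different
    decisions (consistency) and no proper subset does (minimality)."""
    def consistent(sub):
        seen = {}
        for row, d in zip(table, decision):
            key = tuple(row[a] for a in sub)
            if key in seen and seen[key] != d:
                return False
            seen.setdefault(key, d)
        return True
    if not consistent(attrs):
        return False
    return all(not consistent(s) for s in combinations(attrs, len(attrs) - 1))

def reducts(table, decision):
    """All reducts of the table, found by exhaustive search."""
    n = len(table[0])
    return [attrs
            for k in range(1, n + 1)
            for attrs in combinations(range(n), k)
            if is_reduct(table, decision, attrs)]

def dynamic_reducts(table, decision, samples=20, frac=0.8,
                    stability=0.5, seed=1):
    """Reducts of the full table that are also reducts of at least a
    `stability` fraction of random subtables, plus their intersection
    (the dynamic core)."""
    rng = random.Random(seed)
    base = reducts(table, decision)
    counts = {r: 0 for r in base}
    m = max(1, int(frac * len(table)))
    for _ in range(samples):
        idx = rng.sample(range(len(table)), m)
        sub = [table[i] for i in idx]
        subd = [decision[i] for i in idx]
        sub_reducts = set(reducts(sub, subd))
        for r in base:
            if r in sub_reducts:
                counts[r] += 1
    stable = [r for r, c in counts.items() if c / samples >= stability]
    core = set.intersection(*(set(r) for r in stable)) if stable else set()
    return stable, core

# Toy decision table: the decision equals attribute 0; attributes 1 and 2
# are uninformative, so (0,) is the only reduct and survives all sampling.
table = [(0, 1, 7), (0, 0, 7), (1, 1, 7), (1, 0, 7)]
decision = [0, 0, 1, 1]
stable, core = dynamic_reducts(table, decision)
```

On this toy table the stable reduct set is `[(0,)]` and the dynamic core is `{0}`; decision rules would then be generated over attribute 0 only.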

Key words

evolutionary computation, knowledge discovery, rough sets, decision algorithms, machine learning

References

  1. Bazan J., Skowron A., Synak P.: Market data analysis: A rough set approach. ICS Research Report 6/94, Warsaw University of Technology, 1994.
  2. Brown F.M.: Boolean reasoning. Dordrecht: Kluwer, 1990.
  3. Market data, Hughes Research Laboratories, manuscript.
  4. International Workshop Rough Sets: State of the Art and Perspectives, Poznan–Kiekrz (Poland), September 2–4, 1992. Extended Abstracts, Poznan, 1992. Full papers in Foundations of Computing and Decision Sciences, Vol. 18, No. 3–4 (1993).
  5. Kodratoff Y., Michalski R. (eds.): Machine Learning, vol. III. Morgan Kaufmann, San Mateo, CA, 1990.
  6. Michalski R., Carbonell J.G., Mitchell T.M. (eds.): Machine Learning, vol. I. Tioga/Morgan Kaufmann, Los Altos, CA, 1983.
  7. Michalski R., Carbonell J.G., Mitchell T.M. (eds.): Machine Learning, vol. II. Morgan Kaufmann, Los Altos, CA, 1986.
  8. Michalski R., Tecuci G. (eds.): Machine Learning: A Multistrategy Approach, vol. IV. Morgan Kaufmann, 1994.
  9. Nguyen T., Swiniarski R., Skowron A., Bazan J., Thyagarajan K.: Application of rough sets, neural networks and maximum likelihood for texture classification based on singular value decomposition. Submitted to the Workshop RSSC'94, San Jose, California, 1994.
  10. Proceedings of the International Workshop on Rough Sets and Knowledge Discovery, RSKD'93, Banff, Canada, October 11–15, 1993, 101–104.
  11. Pawlak Z., Skowron A.: A rough set approach for decision rules generation. ICS Research Report 23/93, Warsaw University of Technology, 1993. Proc. of the IJCAI'93 Workshop W12: The Management of Uncertainty in AI, France, 1993.
  12. Pawlak Z.: Rough sets: Theoretical aspects of reasoning about data. Dordrecht: Kluwer, 1991.
  13. Quinlan J.R.: The effect of noise on concept learning. In: Michalski R.S., Carbonell J.G., Mitchell T.M. (eds.): Machine Learning: An Artificial Intelligence Approach, vol. II. Morgan Kaufmann, San Mateo, California, 1986, 149–166.
  14. Quinlan J.R.: Probabilistic decision trees. In: Kodratoff Y., Michalski R. (eds.): Machine Learning: An Artificial Intelligence Approach, vol. III. Morgan Kaufmann, San Mateo, California, 1990, 140–152.
  15. Quinlan J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, California, 1993.
  16. De Raedt L.: Interactive Theory Revision: An Inductive Logic Programming Approach. Academic Press, 1992.
  17. Shavlik J.W., Dietterich T.: Readings in Machine Learning. Morgan Kaufmann, 1990.
  18. Shrager J., Langley P.: Computational Models of Scientific Discovery and Theory Formation. Morgan Kaufmann, San Mateo, 1990.
  19. Skowron A., Rauszer C.: The discernibility matrices and functions in information systems. In: Slowinski R. (ed.): Intelligent Decision Support: Handbook of Applications and Advances of the Rough Sets Theory. Dordrecht: Kluwer, 1992, 331–362.
  20. Skowron A.: Boolean reasoning for decision rules generation. In: Komorowski J., Ras Z. (eds.): Proceedings of the 7th International Symposium ISMIS'93, Trondheim, Norway, 1993. Lecture Notes in Artificial Intelligence, vol. 689. Springer-Verlag, 1993, 295–305.
  21. Slowinski R. (ed.): Intelligent Decision Support: Handbook of Applications and Advances of the Rough Sets Theory. Dordrecht: Kluwer, 1992.
  22. Swiniarski R., Nguyen T.: Application of rough sets, neural networks and maximum likelihood for texture classification based on singular value decomposition. Research Report, San Diego State University, 1994.
  23. Tsumoto S.: Medical data, personal communication.
  24. Nguyen T.T., Nguyen H.S.: An approach to the handwritten digit recognition problem based on modal logic. ICS Research Report 44/93, Warsaw University of Technology, 1993; M.Sc. Thesis, Warsaw University, 1993.

Copyright information

© Springer-Verlag Berlin Heidelberg 1994

Authors and Affiliations

  • Jan G. Bazan (1)
  • Andrzej Skowron (2)
  • Piotr Synak (2)

  1. Institute of Mathematics, Pedagogical University, Rzeszów, Poland
  2. Institute of Mathematics, Warsaw University, Warsaw, Poland