Optimal Sub-Reducts with Test Cost Constraint

  • Fan Min
  • William Zhu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6954)


Abstract

Cost-sensitive learning extends classical machine learning by taking into account various types of costs associated with the data, such as test costs and misclassification costs. In many applications there is a test cost constraint due to limited money, time, or other resources, so a set of tests must be chosen deliberately to preserve as much useful information as possible for classification. To address this issue, we define optimal sub-reducts with test cost constraint and the corresponding problem of finding them. The new problem generalizes two existing problems, namely the minimal test cost reduct problem and the 0-1 knapsack problem, and is therefore more challenging than either of them. We propose two exhaustive algorithms to deal with it: one is straightforward, and the other takes advantage of some properties of the problem. The efficiencies of the two algorithms are compared through experiments on the mushroom dataset. Some potential enhancements are also pointed out.
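The straightforward exhaustive algorithm mentioned in the abstract can be sketched as follows: enumerate every attribute subset whose total test cost fits the budget and keep the most informative one. This is only an illustrative sketch, not the paper's implementation; the function names are hypothetical, and the positive-region size is used here as a simple stand-in for the information measure.

```python
from itertools import combinations

def positive_region_size(data, labels, attrs):
    """Count instances whose values on `attrs` determine the class
    label unambiguously (a simple rough-set consistency measure)."""
    groups = {}
    for row, label in zip(data, labels):
        key = tuple(row[a] for a in attrs)
        groups.setdefault(key, set()).add(label)
    return sum(1 for row in data
               if len(groups[tuple(row[a] for a in attrs)]) == 1)

def best_subset_under_budget(data, labels, costs, budget):
    """Straightforward exhaustive search: try every attribute subset
    whose total test cost is within the budget, keep the best one."""
    n = len(costs)
    best, best_score = (), -1
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            if sum(costs[a] for a in subset) > budget:
                continue  # violates the test cost constraint
            score = positive_region_size(data, labels, subset)
            if score > best_score:
                best, best_score = subset, score
    return best, best_score
```

The second algorithm of the paper prunes this search using properties of the problem; the sketch above corresponds only to the unpruned baseline, whose running time grows exponentially with the number of attributes.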


Keywords: Cost-sensitive learning, attribute reduction, test cost constraint, exhaustive algorithm




References

  1. Hunt, E.B., Marin, J., Stone, P.J. (eds.): Experiments in Induction. Academic Press, New York (1966)
  2. Turney, P.D.: Cost-sensitive classification: Empirical evaluation of a hybrid genetic decision tree induction algorithm. Journal of Artificial Intelligence Research 2, 369–409 (1995)
  3. Min, F., He, H., Qian, Y., Zhu, W.: Test-cost-sensitive attribute reduction. Information Sciences (2011), to appear
  4. Min, F., Liu, Q.: A hierarchical model for test-cost-sensitive decision systems. Information Sciences 179(14), 2442–2452 (2009)
  5. Min, F., Zhu, W.: Coser: Cost-sensitive rough sets (2011)
  6. Pawlak, Z.: Rough sets and intelligent data analysis. Information Sciences 147(12), 1–12 (2002)
  7. Zhu, W.: Topological approaches to covering rough sets. Information Sciences 177(6), 1499–1508 (2007)
  8. Zhu, W., Wang, F.: Reduction and axiomization of covering generalized rough sets. Information Sciences 152(1), 217–230 (2003)
  9. Yao, Y., Zhao, Y.: Attribute reduction in decision-theoretic rough set models. Information Sciences 178(17), 3356–3373 (2008)
  10. Greco, S., Matarazzo, B., Słowiński, R., Stefanowski, J.: Variable consistency model of dominance-based rough sets approach. In: Ziarko, W.P., Yao, Y. (eds.) RSCTC 2000. LNCS (LNAI), vol. 2005, pp. 170–181. Springer, Heidelberg (2001)
  11. Hu, Q., Yu, D., Liu, J., Wu, C.: Neighborhood rough set based heterogeneous feature subset selection. Information Sciences 178(18), 3577–3594 (2008)
  12. Qian, Y., Liang, J., Pedrycz, W., Dang, C.: Positive approximation: An accelerator for attribute reduction in rough set theory. Artificial Intelligence 174(9-10), 597–618 (2010)
  13. Ślęzak, D.: Approximate entropy reducts. Fundamenta Informaticae 53(3-4), 365–390 (2002)
  14. Min, F., Zhu, W.: Attribute reduction with test cost constraint. Journal of Electronic Science and Technology of China 9(2) (June 2011)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Fan Min (1)
  • William Zhu (1)

  1. Lab of Granular Computing, Zhangzhou Normal University, Zhangzhou, China
