Attribute reduction via local conditional entropy

  • Yibo Wang
  • Xiangjian Chen
  • Kai Dong
Original Article


In rough set theory, conditional entropy has been widely used to study the problem of attribute reduction. Given a search strategy for finding a reduct, the value of conditional entropy can also evaluate the significance of candidate attributes during the search. However, traditional conditional entropy characterizes the relationship between the conditional attributes and the decision attribute over all samples in the data; it does not consider that relationship with respect to specific samples (samples with the same label). To fill this gap, a new form of conditional entropy, termed local conditional entropy, is proposed. Based on several important properties of local conditional entropy, local conditional entropy based attribute reduction is then defined. Furthermore, an ensemble strategy, realized through the significance measure derived from local conditional entropy, is introduced into the heuristic process of searching for a reduct. Finally, experimental results over 18 UCI data sets show that local conditional entropy based attribute reduction is superior to traditional conditional entropy based attribute reduction: the former can provide attributes with higher classification accuracies. In addition, when local conditional entropy is used as the measurement in online feature selection, it not only offers better classification performance but also requires less elapsed time to complete the online feature selection process. This study suggests new directions for attribute reduction and provides guidelines for designing new measurements and related algorithms.
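To make the contrast concrete, the following is a minimal sketch of the traditional conditional entropy H(D | B) over a rough-set decision table, together with a hypothetical "local" variant that weights each equivalence class only by the samples carrying one target label. The local formula here is an illustrative assumption, not the authors' exact definition; the function names and the toy decision table are likewise invented for this sketch.

```python
# Sketch: traditional conditional entropy H(D | B) in rough set terms, plus a
# hypothetical "local" variant restricted to one decision label. The local
# formula is an assumption for illustration, not the paper's definition.
from collections import defaultdict
from math import log2

def partition(samples, attrs):
    """Group sample indices into B-equivalence classes (same values on attrs)."""
    blocks = defaultdict(list)
    for i, row in enumerate(samples):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def conditional_entropy(samples, labels, attrs):
    """H(D | B): decision entropy within each B-equivalence class,
    weighted by the class's share of all samples."""
    n = len(samples)
    h = 0.0
    for block in partition(samples, attrs):
        counts = defaultdict(int)
        for i in block:
            counts[labels[i]] += 1
        for c in counts.values():
            p = c / len(block)
            h -= (len(block) / n) * p * log2(p)
    return h

def local_conditional_entropy(samples, labels, attrs, target):
    """Hypothetical local variant: the same inner sum, but each block is
    weighted only by the samples that carry the target label."""
    n = sum(1 for y in labels if y == target)
    h = 0.0
    for block in partition(samples, attrs):
        hits = sum(1 for i in block if labels[i] == target)
        if hits == 0:
            continue  # block contains no target-label samples
        counts = defaultdict(int)
        for i in block:
            counts[labels[i]] += 1
        for c in counts.values():
            p = c / len(block)
            h -= (hits / n) * p * log2(p)
    return h

# Toy decision table: two conditional attributes, binary decision.
U = [(0, 0), (0, 1), (1, 0), (1, 1)]
D = ["yes", "yes", "no", "yes"]
```

With this table, `conditional_entropy(U, D, [0])` averages the decision entropy over the two blocks induced by the first attribute, while `local_conditional_entropy(U, D, [0], "no")` ignores the block containing only "yes" samples entirely, which is the kind of label-specific focus the abstract describes. In a heuristic search, the attribute whose addition most reduces the chosen entropy would be selected next.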


Keywords: Attribute reduction · Conditional entropy · Local conditional entropy · Neighborhood rough set · Rough set



This work is supported by the National Natural Science Foundation of China (Nos. 61502211, 61572242, 61503160). We would like to thank Eric Appiah Mantey and Selase Tawiah Kwawu for their help in improving the language quality of this paper.



Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. School of Computer Science and Engineering, Southeast University, Nanjing, People’s Republic of China
