Rough–Fuzzy Entropy in Neighbourhood Characterization
Entropy has been used to characterize the neighbourhood of a sample on the basis of its k nearest neighbours when data are imbalanced, and many entropy measures have been proposed in the literature to better cope with vagueness by exploiting fuzzy logic, rough set theory, and their derivatives. In this paper, a rough extension of entropy is proposed to measure uncertainty and ambiguity in the neighbourhood of a sample: the lower and upper approximations from rough–fuzzy set theory are used to compute the entropy of the set of the k nearest neighbours of a sample. The proposed measure is more robust to noise and allows more flexible modelling of vagueness than Fuzzy Entropy.
Keywords: Rough–fuzzy Entropy · Fuzzy classification · Imbalanced classification
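The idea of combining a fuzzy entropy with rough lower and upper approximations of a neighbourhood can be sketched as follows. This is only an illustrative reconstruction, not the paper's actual formulation: the De Luca–Termini-style entropy term, the min/max operators for the approximations, and the averaging of the two approximation entropies are all assumptions made for the example.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini-style fuzzy entropy of a vector of memberships in [0, 1]
    (an assumed choice; the paper may use a different entropy)."""
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if 0.0 < p < 1.0:
                h -= p * math.log2(p)
    return h

def rough_fuzzy_entropy(neighbour_memberships):
    """Illustrative rough-fuzzy entropy of a kNN neighbourhood.

    neighbour_memberships: fuzzy class memberships of the k nearest
    neighbours of a sample. The lower approximation is taken as the
    minimum membership (necessity) and the upper approximation as the
    maximum (possibility) -- standard rough-set operators, assumed here.
    The two approximation entropies are averaged (an assumed combination).
    """
    lower = min(neighbour_memberships)
    upper = max(neighbour_memberships)
    return 0.5 * (fuzzy_entropy([lower]) + fuzzy_entropy([upper]))
```

Under this sketch, a crisp neighbourhood (all memberships 0 or 1) yields zero entropy, while a neighbourhood whose lower and upper approximations disagree (e.g. memberships spread between 0.2 and 0.9) yields a higher value, reflecting the ambiguity in the boundary region.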