Knowledge and Information Systems, Volume 38, Issue 1, pp 179–206

Analyzing the presence of noise in multi-class problems: alleviating its influence with the One-vs-One decomposition

  • José A. Sáez
  • Mikel Galar
  • Julián Luengo
  • Francisco Herrera
Regular Paper

Abstract

The presence of noise in data is a common problem with several negative consequences for classification. In multi-class problems, these consequences are aggravated in terms of the accuracy, building time, and complexity of the classifiers. In such cases, an interesting approach to reducing the effect of noise is to decompose the problem into several binary subproblems, lowering the complexity and, consequently, spreading the effects of noise across these subproblems. This paper analyzes the use of decomposition strategies, and more specifically the One-vs-One scheme, to deal with noisy multi-class datasets. To investigate whether decomposition is able to reduce the effect of noise, a large number of datasets are created by introducing different levels and types of noise, as suggested in the literature. Several well-known classification algorithms, with and without decomposition, are trained on them to check when decomposition is advantageous. The results show that methods using the One-vs-One strategy lead to better performance and more robust classifiers when dealing with noisy data, especially under the most disruptive noise schemes.
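
As a rough illustration of the idea summarized above (not the authors' experimental setup), the Python sketch below injects uniform class noise into the training labels of a small multi-class dataset and compares a decision tree trained directly on the multi-class problem against the same tree wrapped in a One-vs-One decomposition. It assumes scikit-learn and NumPy are available; the iris dataset, the 20% noise level, and the add_uniform_class_noise helper are illustrative choices, not part of the paper.

# Minimal sketch: class-noise injection plus a direct vs. One-vs-One comparison.
# Assumes scikit-learn and NumPy; dataset and noise level are illustrative only.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score


def add_uniform_class_noise(y, noise_level, rng):
    """Relabel a fraction of the examples with a different, randomly chosen class."""
    y_noisy = y.copy()
    n_noisy = int(noise_level * len(y))
    idx = rng.choice(len(y), size=n_noisy, replace=False)
    classes = np.unique(y)
    for i in idx:
        # Pick any class other than the current (clean) one.
        y_noisy[i] = rng.choice(classes[classes != y[i]])
    return y_noisy


rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Corrupt only the training labels, as is usual in class-noise experiments.
y_tr_noisy = add_uniform_class_noise(y_tr, noise_level=0.20, rng=rng)

# Same base learner, trained directly and inside a One-vs-One decomposition.
direct = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr_noisy)
ovo = OneVsOneClassifier(DecisionTreeClassifier(random_state=0)).fit(X_tr, y_tr_noisy)

print("direct multi-class accuracy:", accuracy_score(y_te, direct.predict(X_te)))
print("One-vs-One accuracy:        ", accuracy_score(y_te, ovo.predict(X_te)))

The OneVsOneClassifier wrapper trains one binary classifier per pair of classes and aggregates their votes at prediction time, which corresponds to the decomposition scheme studied in the paper.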

Keywords

Noisy data · Class noise · Attribute noise · One-vs-One · Decomposition strategies · Ensembles · Classification

Copyright information

© Springer-Verlag London 2012

Authors and Affiliations

  • José A. Sáez (1)
  • Mikel Galar (2)
  • Julián Luengo (3)
  • Francisco Herrera (1)
  1. Department of Computer Science and Artificial Intelligence, University of Granada, CITIC-UGR, Granada, Spain
  2. Department of Automática y Computación, Universidad Pública de Navarra, Pamplona, Spain
  3. Department of Civil Engineering, LSI, University of Burgos, Burgos, Spain
