Oddness-Based Classifiers for Boolean or Numerical Data

  • Myriam Bounhas
  • Henri Prade
  • Gilles Richard
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9324)


The paper proposes an oddness measure for estimating the extent to which a new item is at odds with a class. A simple classification procedure is then built on it: the new item is assigned to the class with respect to which its oddness is minimal. The global oddness measure is obtained by estimating the oddness of the new item with respect to the subsets of each class having a given size. Experiments on standard benchmarks with Boolean or numerical data give good results.
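The classification scheme described above can be sketched as follows. Note that the paper's actual logic-based oddness measure is not reproduced here; the `oddness` function below is a hypothetical placeholder (squared distance to a subset's mean) used only to illustrate the skeleton of minimum-oddness classification over fixed-size class subsets.

```python
import itertools
import statistics

def oddness(item, subset):
    # Placeholder oddness: squared distance of the item from the
    # subset's componentwise mean. The paper defines a logic-based
    # measure instead; this stands in only to show the procedure.
    means = [statistics.mean(col) for col in zip(*subset)]
    return sum((x - m) ** 2 for x, m in zip(item, means))

def classify(item, classes, k=2):
    # Global oddness of `item` w.r.t. a class: average oddness over
    # all size-k subsets of that class's examples.
    scores = {}
    for label, examples in classes.items():
        vals = [oddness(item, s)
                for s in itertools.combinations(examples, k)]
        scores[label] = statistics.mean(vals)
    # Assign the item to the class it is least at odds with.
    return min(scores, key=scores.get)

# Toy numerical data with two classes (illustrative only).
classes = {
    "A": [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2)],
    "B": [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9)],
}
print(classify((0.05, 0.1), classes))  # item near class A examples
```

The subset size `k` plays the same role as the fixed subset size mentioned in the abstract; varying it trades robustness against computational cost, since the number of size-`k` subsets grows combinatorially with class size.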


Keywords: Numerical Data · Truth Table · Logical Formula · Standard Benchmark · Oddness Measure





Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Myriam Bounhas (1, 2)
  • Henri Prade (3, 4)
  • Gilles Richard (3, 4)
  1. LARODEC Laboratory, ISG de Tunis, Le Bardo, Tunisia
  2. Emirates College of Technology, Abu Dhabi, United Arab Emirates
  3. IRIT – CNRS, Toulouse, France
  4. QCIS, University of Technology, Sydney, Australia
