A Metaheuristic Perspective on Learning Classifier Systems

Chapter in: Metaheuristics for Machine Learning

Abstract

In this book chapter, we summarize Learning Classifier Systems (LCSs), a family of rule-based learning systems with a research history of more than forty years, and differentiate LCSs from related approaches such as Genetic Programming, Decision Trees, Mixture of Experts, as well as Bagging and Boosting. LCS training constructs a finite collection of if-then rules. While the conclusion (the then-part) of each rule is based on a problem-dependent local submodel (e.g., linear regression), the individual conditions and, by extension, the global model structure are optimized using a (typically evolutionary) metaheuristic. This makes the employed metaheuristic a central part of learning and indispensable for successful training. While most traditional LCSs rely solely on Genetic Algorithms, in this chapter we also explore systems that employ different metaheuristics while still being similar to LCSs. Furthermore, we discuss the different problems that metaheuristics solve when applied in this context, for example, discrete or real-valued input domains, optimization of individual rule conditions or of entire sets of rule conditions, and fitness functions that support niching or control bloat. We observe that, although the optimizer is essential, it has so far been investigated less directly than the learning components. Overall, we thus provide an analysis of LCSs with a focus on the metaheuristics used and present existing solutions as well as current challenges to practitioners in the fields of metaheuristic ML and rule-based learning.
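
To make this division of labour concrete, the following minimal sketch (hypothetical code and names such as Rule, matches, fit, and mutate; not the chapter's or any particular LCS's implementation) shows a single rule whose condition is one interval per input dimension and whose then-part is a local linear submodel. A metaheuristic such as a Genetic Algorithm would mutate and recombine the conditions, while each submodel is fitted only on the training points its rule matches:

```python
import numpy as np

rng = np.random.default_rng(0)

class Rule:
    """One LCS-style classifier: interval condition plus local linear submodel."""

    def __init__(self, lowers, uppers):
        self.lowers = np.asarray(lowers, dtype=float)  # condition: lower bounds l_i
        self.uppers = np.asarray(uppers, dtype=float)  # condition: upper bounds u_i
        self.coef = None                               # submodel parameters (fit later)

    def matches(self, X):
        # A rule matches x iff l_i <= x_i < u_i in every dimension i.
        return np.all((X >= self.lowers) & (X < self.uppers), axis=1)

    def fit(self, X, y):
        # Fit the local linear submodel only on the matched training points.
        m = self.matches(X)
        if m.sum() < X.shape[1] + 1:  # too few matched points to fit reliably
            return np.inf
        A = np.hstack([X[m], np.ones((m.sum(), 1))])  # add intercept column
        self.coef, *_ = np.linalg.lstsq(A, y[m], rcond=None)
        # In-niche error; an LCS fitness would typically also reward generality.
        return float(np.mean((A @ self.coef - y[m]) ** 2))

def mutate(rule, sigma=0.05):
    """GA-style mutation: perturb only the condition, never the submodel directly."""
    return Rule(rule.lowers + rng.normal(0, sigma, rule.lowers.shape),
                rule.uppers + rng.normal(0, sigma, rule.uppers.shape))

# Toy data: y = 3*x0 - 2*x1 + noise on the unit square.
X = rng.uniform(0, 1, size=(200, 2))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.01, 200)

rule = Rule(lowers=[0.2, 0.2], uppers=[0.8, 0.8])
print("parent error:", rule.fit(X, y))
print("child  error:", mutate(rule).fit(X, y))
```

An actual LCS such as XCSF maintains a whole population of such rules and additionally handles crossover, niching, subsumption, and the mixing of overlapping rules; the sketch only illustrates how condition search (the metaheuristic) and submodel fitting (supervised learning) interact.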


Notes

  1. At the time of this writing, a more formal treatment of the differences is under active investigation.

  2. At least as long as the LCS uses the typical binary matching functions. A matching-by-degree LCS can actually be more expressive than a comparable MoE.

  3. If matching by degree is used (which is not the case for any of the prominent LCSs), they are softly localized. This means that inputs are weighted rather than hard-assigned, which is closer in spirit to boosting (see the sketch after these notes).

  4. This is equivalent to \((l_{1} \leqslant x_1 < u_{1}) \wedge (l_{2} \leqslant x_2 < u_{2}) \wedge (l_{3} \leqslant x_3 < u_{3})\).
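
Notes 2 and 3 distinguish the typical binary matching of LCSs from matching by degree. The following small sketch (hypothetical function names; a Gaussian kernel is just one possible degree function) contrasts the two: a binary condition either matches an input or it does not, whereas a degree-based condition weights every input in [0, 1], which is closer to the soft gating of a Mixture of Experts.

```python
import numpy as np

def binary_match(x, lowers, uppers):
    """Typical LCS matching: a rule either matches x or it does not."""
    return bool(np.all((x >= lowers) & (x < uppers)))

def degree_match(x, center, spread):
    """Matching by degree: every rule matches to some extent in [0, 1],
    so inputs are weighted rather than hard-assigned."""
    return float(np.exp(-np.sum(((x - center) / spread) ** 2)))

x = np.array([0.3, 0.7])
print(binary_match(x, np.array([0.0, 0.5]), np.array([0.5, 1.0])))
print(degree_match(x, center=np.array([0.25, 0.75]), spread=np.array([0.2, 0.2])))
```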

Author information

Corresponding author: Michael Heider


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Heider, M., Pätzel, D., Stegherr, H., Hähner, J. (2023). A Metaheuristic Perspective on Learning Classifier Systems. In: Eddaly, M., Jarboui, B., Siarry, P. (eds) Metaheuristics for Machine Learning. Computational Intelligence Methods and Applications. Springer, Singapore. https://doi.org/10.1007/978-981-19-3888-7_3

  • DOI: https://doi.org/10.1007/978-981-19-3888-7_3

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-19-3887-0

  • Online ISBN: 978-981-19-3888-7

  • eBook Packages: Computer Science (R0)
