Pruning Hypothesis Spaces Using Learned Domain Theories

  • Martin Svatoš
  • Gustav Šourek
  • Filip Železný
  • Steven Schockaert
  • Ondřej Kuželka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10759)

Abstract

We present a method to prune hypothesis spaces in the context of inductive logic programming. The main strategy of our method is to remove hypotheses that are equivalent to hypotheses that have already been considered. The distinguishing feature of our method is that we use learned domain theories to check for equivalence, in contrast to existing approaches, which only prune isomorphic hypotheses. Specifically, we use such learned domain theories to saturate hypotheses and then check whether these saturations are isomorphic. While conceptually simple, the resulting pruning strategy proves surprisingly effective in our experiments, reducing both computation time and memory consumption when searching for long clauses, compared to approaches that consider isomorphism alone.
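The saturation-based equivalence check described above can be sketched as follows. This is a simplified toy, not the paper's actual algorithm: it assumes a Datalog-style representation in which a hypothesis is a set of ground literals and a domain theory is a set of Horn rules, and it uses equality of saturations as a stand-in for the full isomorphism test on saturations. All names and rules in the sketch are illustrative assumptions.

```python
# A hypothesis is a frozenset of literals: (predicate, arg1, arg2, ...).
# A domain-theory rule is (body_literals, head_literal); variables are
# strings starting with an uppercase letter, e.g. "X".

def is_var(term):
    return isinstance(term, str) and term[:1].isupper()

def unify(pattern, fact, subst):
    """Extend substitution so pattern matches fact, or return None."""
    if pattern[0] != fact[0] or len(pattern) != len(fact):
        return None
    subst = dict(subst)
    for p, f in zip(pattern[1:], fact[1:]):
        if is_var(p):
            if subst.get(p, f) != f:
                return None
            subst[p] = f
        elif p != f:
            return None
    return subst

def apply_subst(literal, subst):
    return (literal[0],) + tuple(subst.get(t, t) for t in literal[1:])

def _match(body, facts, subst=None):
    """Yield every substitution matching all body literals against facts."""
    subst = subst or {}
    if not body:
        yield subst
        return
    first, rest = body[0], body[1:]
    for fact in facts:
        s = unify(first, fact, subst)
        if s is not None:
            yield from _match(rest, facts, s)

def saturate(hypothesis, rules):
    """Forward-chain the domain-theory rules to a fixpoint."""
    facts = set(hypothesis)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            derived = {apply_subst(head, s) for s in _match(body, facts)}
            if not derived <= facts:
                facts |= derived
                changed = True
    return frozenset(facts)

def equivalent_under_theory(h1, h2, rules):
    # Simplification: identical saturations imply equivalence. The paper
    # instead checks whether the two saturations are isomorphic, which
    # also catches equivalences up to variable renaming.
    return saturate(h1, rules) == saturate(h2, rules)
```

With a symmetry rule such as bond(X, Y) → bond(Y, X), the hypotheses {bond(a, b)} and {bond(b, a)} saturate to the same set and are therefore recognized as equivalent, so only one of them needs to be kept in the search.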

Notes

Acknowledgements

MS, GŠ and FŽ acknowledge support by project no. 17-26999S granted by the Czech Science Foundation. This work was done while OK was with Cardiff University and supported by a grant from the Leverhulme Trust (RPG-2014-164). SS is supported by ERC Starting Grant 637277. Computational resources were provided by the CESNET LM2015042 and the CERIT Scientific Cloud LM2015085, provided under the programme “Projects of Large Research, Development, and Innovations Infrastructures”.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Czech Technical University, Prague, Czech Republic
  2. School of CS and Informatics, Cardiff University, Cardiff, UK
  3. Department of Computer Science, KU Leuven, Leuven, Belgium