Dynamic Programming Algorithms as Products of Weighted Logic Programs

  • Shay B. Cohen
  • Robert J. Simmons
  • Noah A. Smith
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5366)


Weighted logic programming, a generalization of bottom-up logic programming, is a successful framework for specifying dynamic programming algorithms. In this setting, proofs correspond to the algorithm’s output space, such as a path through a graph or a grammatical derivation, and are given a weighted score, often interpreted as a probability, that depends on the scores of the base axioms used in the proof. The desired output is a function over all possible proofs, such as a sum of scores or an optimal score. We describe the PRODUCT transformation, which can merge two weighted logic programs into a new one. The resulting program optimizes a product of proof scores from the original programs, constituting a scoring function known in machine learning as a “product of experts.” Through the addition of intuitive constraining side conditions, we show that several important dynamic programming algorithms can be derived by applying PRODUCT to simpler weighted logic programs.
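The abstract’s ideas can be made concrete with a small sketch. The following Python fragment is an illustration only, not the paper’s Dyna-style notation or the PRODUCT construction itself: weighted axioms are graph edges, a proof is a path, a proof’s score multiplies its edge weights, and the program computes the optimal (max-product) proof score by a naive fixpoint. Two “experts” assigning different weights to the same edges are combined by multiplying their weights edgewise, which for path proofs yields the product-of-experts score; all names (`best_proof_scores`, `e1`, `e2`) are hypothetical.

```python
def best_proof_scores(edges, start):
    """Viterbi-style fixpoint: for each vertex v, the maximum over
    paths start -> v of the product of edge weights (a toy stand-in
    for computing the optimal proof score in a weighted logic program)."""
    score = {start: 1.0}  # the trivial proof of the start axiom
    changed = True
    while changed:  # iterate to fixpoint (terminates: weights are <= 1)
        changed = False
        for (u, v), w in edges.items():
            if u in score:
                cand = score[u] * w  # extend a known proof by one edge
                if cand > score.get(v, 0.0):
                    score[v] = cand
                    changed = True
    return score

# Two "experts": the same deduction structure scored by two different
# weight assignments. Multiplying weights edgewise makes each proof's
# score the product of the two experts' scores for that proof.
e1 = {("s", "a"): 0.5, ("a", "t"): 0.4, ("s", "t"): 0.1}
e2 = {("s", "a"): 0.9, ("a", "t"): 0.5, ("s", "t"): 0.8}
product = {k: e1[k] * e2[k] for k in e1}

best = best_proof_scores(product, "s")
# The two-edge proof s -> a -> t (0.45 * 0.2 = 0.09) beats the direct
# edge s -> t (0.08) under the combined scoring.
```

The real PRODUCT transformation operates on the programs themselves, pairing items from the two input programs and adding side conditions; this sketch only shows the scoring semantics that the transformation preserves.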





Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Shay B. Cohen
  • Robert J. Simmons
  • Noah A. Smith

School of Computer Science, Carnegie Mellon University, USA
