The Role of PP Attachment in Preposition Generation

  • John Lee
  • Ola Knutsson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4919)

Abstract

This paper is concerned with the task of preposition generation in the context of a grammar checker. Relevant features for this task can range from lexical features, such as words and their part-of-speech tags in the vicinity of the preposition, to syntactic features that take into account the attachment site of the prepositional phrase (PP), as well as its argument/adjunct distinction. We compare the performance of these different kinds of features in a memory-based learning framework. Experiments show that using PP attachment information can improve preposition generation accuracy on Wall Street Journal texts.
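The memory-based learning framework mentioned above stores training instances verbatim and classifies new cases by similarity to stored neighbors. As an illustration only (the feature set and data here are hypothetical, not taken from the paper), a minimal IB1-style nearest-neighbor sketch with an overlap metric might look like this:

```python
from collections import Counter

def overlap(a, b):
    # IB1-style similarity: number of feature positions whose values match
    return sum(x == y for x, y in zip(a, b))

def predict(memory, instance, k=3):
    # memory: list of (feature_tuple, preposition) pairs stored verbatim;
    # rank stored instances by overlap and take a majority vote among the top k
    ranked = sorted(memory, key=lambda m: overlap(m[0], instance), reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy memory with invented features in the spirit of the paper:
# (attachment-site head word, PP object noun, attachment site, arg/adjunct)
memory = [
    (("depend", "outcome", "V", "arg"), "on"),
    (("rely", "data", "V", "arg"), "on"),
    (("arrive", "noon", "V", "adj"), "at"),
    (("interest", "topic", "N", "arg"), "in"),
]

print(predict(memory, ("depend", "result", "V", "arg")))  # -> on
```

The syntactic features (attachment site, argument/adjunct status) enter simply as extra positions in the feature tuple, so neighbors that agree on attachment behavior outrank those that agree only on nearby words.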



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • John Lee¹
  • Ola Knutsson²
  1. MIT Computer Science and Artificial Intelligence Laboratory, Spoken Language Systems, Cambridge, USA
  2. School of Computer Science and Communication, Royal Institute of Technology, Stockholm, Sweden
