Answer Type Identification for Question Answering

Supervised Learning of Dependency Graph Patterns from Natural Language Questions
  • Andrew D. Walker
  • Panos Alexopoulos
  • Andrew Starkey
  • Jeff Z. Pan
  • José Manuel Gómez-Pérez
  • Advaith Siddharthan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9544)


Question Answering research has long recognised that identifying the type of answer being requested is a fundamental step in interpreting a question as a whole. Previous strategies have ranged from trivial keyword matching, to statistical analyses, to well-defined algorithms based on shallow syntactic parses with user interaction for ambiguity resolution. A novel strategy combining deep NLP over both syntactic and dependency parses with supervised learning is introduced, and results that improve on extant alternatives are reported. The impact of the strategy on QALD is also evaluated with a proprietary Question Answering system, and its positive results are analysed.
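To make the idea concrete, the following is a minimal illustrative sketch of answer-type identification via a dependency-graph pattern. It is not the paper's learned patterns or algorithm: the dependency edges are hand-coded (a real system would obtain them from a parser), and the `answer_type` function and the single wh-determiner pattern it checks are assumptions for illustration only.

```python
# Illustrative sketch: finding the answer-type word of a question by
# matching a simple pattern against its dependency graph.
# The parse below is hand-coded; a real system would use a parser.

# Dependency parse of "Which river flows through Berlin?"
# Each edge is (head, relation, dependent).
parse = {
    ("flows", "nsubj", "river"),
    ("river", "det", "Which"),
    ("flows", "prep_through", "Berlin"),
}

def answer_type(edges):
    """Return the noun modified by a wh-determiner: a common
    pattern for the answer type in 'Which X ...?' questions."""
    for head, rel, dep in edges:
        if rel == "det" and dep.lower() in {"which", "what"}:
            return head
    return None

print(answer_type(parse))  # -> river
```

Here the pattern anchors on the wh-word and follows one `det` edge to its head noun; a supervised approach would instead learn many such subgraph patterns from annotated questions and rank their matches.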


Dependency Graph · Question Answering · Semantic Class · Grammatical Structure · Question Focus



This research has been partly funded by the European Commission within the 7th Framework Programme / Marie Curie Industry-Academia Partnerships and Pathways scheme / PEOPLE Work Programme 2011, project K-Drive, number 286348.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Andrew D. Walker (1), email author
  • Panos Alexopoulos (2)
  • Andrew Starkey (1)
  • Jeff Z. Pan (1)
  • José Manuel Gómez-Pérez (2)
  • Advaith Siddharthan (1)
  1. University of Aberdeen, Aberdeen, UK
  2. Expert System, Amsterdam, Netherlands
