The Design of SREE — A Prototype Potential Ambiguity Finder for Requirements Specifications and Lessons Learned

  • Sri Fatimah Tjong
  • Daniel M. Berry
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7830)


[Context and Motivation] Many a tool for finding ambiguities in natural language (NL) requirements specifications (RSs) is based on a parser and a parts-of-speech identifier, which are inherently imperfect on real NL text. Therefore, any such tool inherently has less than 100% recall. Consequently, running such a tool on an NL RS for a highly critical system does not eliminate the need for a complete manual search for ambiguity in the RS. [Question/Problem] Can an ambiguity-finding tool (AFT) be built that has 100% recall on the types of ambiguities in the AFT’s scope, such that a manual search in an RS for only the ambiguities outside the AFT’s scope is significantly easier than a manual search of the RS for all ambiguities? [Principal Ideas/Results] This paper presents the design of a prototype AFT, SREE (Systemized Requirements Engineering Environment), whose goal is to achieve 100% recall on the ambiguities in its scope, even at the cost of less than 100% precision. The ambiguities that SREE searches for by lexical analysis are those whose keyword indicators are found in SREE’s ambiguity-indicator corpus, which was constructed from studies of several industrial-strength RSs. SREE was run on two of these industrial-strength RSs, and the time to do a completely manual search of these RSs is compared to the time to reject the false positives in SREE’s output plus the time to do a manual search of these RSs for only the ambiguities not in SREE’s scope. [Contribution] SREE does not achieve its goals. However, the time comparison shows that the approach of dividing ambiguity finding between an AFT with 100% recall for some types of ambiguity and a manual search for only the other types of ambiguity is promising enough to justify more work on improving the implementation of the approach. Some specific improvement suggestions are offered.
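The core idea of the abstract, lexical matching against an indicator corpus rather than parsing, can be sketched as follows. This is a minimal illustration, not SREE's implementation: the tiny `INDICATORS` list here is hypothetical (SREE's actual corpus was built from studies of industrial-strength RSs and is far larger), but the sketch shows why pure lexical analysis gives 100% recall within scope at the cost of false positives that an analyst must reject by hand.

```python
import re

# Hypothetical mini-corpus of ambiguity indicators, for illustration only.
INDICATORS = ["all", "each", "any", "only", "and/or", "etc.", "as appropriate"]

def find_potential_ambiguities(requirement: str) -> list[str]:
    """Return every in-scope indicator occurring in the requirement.

    Pure lexical matching: no parser or POS tagger is involved, so every
    occurrence of an in-scope indicator is reported (100% recall within
    scope), including false positives the requirements engineer rejects.
    """
    found = []
    lowered = requirement.lower()
    for ind in INDICATORS:
        # Join multi-word indicators with \s+ and guard with word
        # boundaries so that, e.g., "all" does not fire inside "shall".
        pattern = (r"(?<!\w)"
                   + r"\s+".join(re.escape(w) for w in ind.split())
                   + r"(?!\w)")
        if re.search(pattern, lowered):
            found.append(ind)
    return found

print(find_potential_ambiguities(
    "The system shall log all errors and/or warnings."))
# → ['all', 'and/or']
```

Because the matcher never tries to decide whether a flagged occurrence is genuinely ambiguous, precision suffers, which is exactly the recall-over-precision trade-off the abstract describes.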







Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Sri Fatimah Tjong (1)
  • Daniel M. Berry (2)
  1. University of Nottingham, Semenyih, Malaysia
  2. Cheriton School of Computer Science, University of Waterloo, Waterloo, Canada
