The Testers’ Workbench

Chapter in Practical Software Testing

Part of the book series: Springer Professional Computing (SPC)

Copyright information

© 2003 Springer-Verlag New York, Inc.

About this chapter

Cite this chapter

(2003). The Testers’ Workbench. In: Practical Software Testing. Springer Professional Computing. Springer, New York, NY. https://doi.org/10.1007/0-387-21658-8_14

  • DOI: https://doi.org/10.1007/0-387-21658-8_14

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-0-387-95131-7

  • Online ISBN: 978-0-387-21658-4
