
Shinobi: A Novel Approach for Context-Driven Testing (CDT) Using Heuristics and Machine Learning for Web Applications

  • Duc-Man Nguyen
  • Hoang-Nhat Do
  • Quyet-Thang Huynh
  • Dinh-Thien Vo
  • Nhu-Hang Ha
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 257)

Abstract

Context-driven testing is widely used in the agile world: it optimizes testing value and provides an effective way to detect unexpected bugs. It requires the testing team to apply its full knowledge and skills to solve problems and make decisions. In this paper, we propose an approach to context-driven testing for web applications that combines heuristics and machine learning, implemented in a framework called Shinobi. The framework detects web controls, suggests sets of heuristic values, recognizes meaningful input data, and detects changes in the application under test in order to recommend test ideas. To improve testing performance, Shinobi serves as a test assistant for context-driven testers. It is a proof of concept (PoC) for the idea of using machine learning to build a virtual tester that improves test quality and helps train junior testers into responsible testers. The framework is integrated into all eCommerce projects at MeU Solutions, where it provides a value-added advantage for testing.
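As a hedged illustration of the control-detection step the abstract describes, the sketch below runs a page screenshot through a pre-trained object-detection model exported as a TensorFlow SavedModel. The model path, label map, screenshot filename, and score threshold are assumptions for illustration only, not Shinobi's actual implementation.

# Minimal sketch (an assumption, not the paper's implementation): detect web
# controls on a page screenshot with a pre-trained object-detection model
# exported as a TensorFlow SavedModel.
import numpy as np
import tensorflow as tf
from PIL import Image

# Hypothetical exported detector (e.g. a Faster R-CNN trained on web-control images).
detector = tf.saved_model.load("exported_web_control_detector/saved_model")

image = np.array(Image.open("login_page.png").convert("RGB"))
inputs = tf.convert_to_tensor(image)[tf.newaxis, ...]  # shape [1, H, W, 3], uint8

outputs = detector(inputs)  # TF Object Detection API SavedModels are directly callable
boxes = outputs["detection_boxes"][0].numpy()    # normalized [ymin, xmin, ymax, xmax]
scores = outputs["detection_scores"][0].numpy()
classes = outputs["detection_classes"][0].numpy().astype(int)

# Hypothetical label map; a real one would come from the training data.
label_map = {1: "textbox", 2: "button", 3: "dropdown", 4: "checkbox"}

for box, cls, score in zip(boxes, classes, scores):
    if score >= 0.5:  # keep confident detections only
        print(label_map.get(cls, "control"), box.tolist(), round(float(score), 2))

Each detected control could then be paired with heuristic input values (for example, boundary or malformed strings for a textbox) and compared against an earlier crawl of the application to flag changes worth exploring, in line with the capabilities the abstract lists.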

Keywords

Shinobi · MeU-Solutions · Context-driven testing · Machine learning · Exploratory testing · Software testing · Web testing

Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2019

Authors and Affiliations

  1. Duy Tan University, Da Nang, Vietnam
  2. MeU Solutions, Ho Chi Minh City, Vietnam
  3. Ha Noi University of Science and Technology, Ha Noi, Vietnam
