Deep Specification Mining with Attention

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12273)

Abstract

In this paper, we improve the deep-learning-based specification mining method proposed in [16]. We find that when the length of a single trace exceeds 25 and the number of tracked methods exceeds 15, the \(F_{measure}\) of the original model decreases significantly. Accordingly, we propose a new model with an attention mechanism to address the original model's forgetting problem in long-sequence learning. First, test cases are used to generate as many program traces as possible, each covering a complete execution path. The trace set is then used to train a language model based on Recurrent Neural Networks (RNN) and an attention mechanism. From these traces, a Prefix Tree Acceptor (PTA) is built, and features are extracted using the newly proposed model. These features are then used by clustering algorithms to merge similar states in the PTA, yielding multiple finite automata. Finally, a heuristic algorithm evaluates the quality of these automata and selects the one with the highest \(F_{measure}\) as the final specification automaton.
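To make the pipeline concrete, here is a minimal illustrative sketch (not the authors' implementation) of two steps from the abstract: building a Prefix Tree Acceptor from method-call traces, and the F-measure used to rank candidate automata. The function names and the toy traces are assumptions for illustration only; the feature extraction and state-merging steps are performed by the attention-based model and clustering algorithms described in the paper.

```python
# Hypothetical sketch: a PTA maps each distinct trace prefix to a state,
# so every trace corresponds to one root-to-leaf path in the tree.
from collections import defaultdict

def build_pta(traces):
    """Build a Prefix Tree Acceptor: transitions[state][method] -> state."""
    transitions = defaultdict(dict)
    next_state = 1  # state 0 is the root (empty prefix)
    for trace in traces:
        state = 0
        for method in trace:
            if method not in transitions[state]:
                transitions[state][method] = next_state
                next_state += 1
            state = transitions[state][method]
    return dict(transitions)

def f_measure(precision, recall):
    """Harmonic mean of precision and recall, as used to rank automata."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy method-call traces (illustrative, not from the paper's benchmarks).
traces = [["open", "read", "close"],
          ["open", "read", "read", "close"],
          ["open", "close"]]
pta = build_pta(traces)
print(len(pta))              # → 4 states with outgoing transitions
print(f_measure(0.8, 0.9))   # ≈ 0.847
```

In the paper, the states of a PTA like this one are then merged by clustering over learned features, producing the candidate finite automata among which the highest-F-measure one is selected.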

This research is supported by the National Natural Science Foundation of China under Grant Nos. 61420106004, 61732013 and 61751207.

Notes

  1. https://github.com/karpathy/char-rnn

References

  1. Abadi, M., et al.: TensorFlow: a system for large-scale machine learning. In: Proceedings of 12th USENIX Symposium on Operating Systems Design and Implementation, pp. 265–283 (2016)

  2. Ashton, E.A., Molinelli, L., Totterman, S., Parker, K.J.: Evaluation of reproducibility for manual and semi-automated feature extraction in CT and MR images. In: Proceedings of International Conference on Image Processing, vol. 3, p. III (2002)

  3. Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. arXiv:1409.0473 [cs.CL] (2014)

  4. Black, E., Hunter, A.: An inquiry dialogue system. Auton. Agent. Multi-Agent Syst. 19(2), 173–209 (2009)

  5. Chaudhari, S., Polatkan, G., Ramanath, R., Mithal, V.: An attentive survey of attention models. arXiv:1904.02874 (2019)

  6. Chorowski, J., Bahdanau, D., Serdyuk, D., Cho, K., Bengio, Y.: Attention-based models for speech recognition. Comput. Sci. 10(4), 429–439 (2015)

  7. Eckert, W., Levin, E., Pieraccini, R.: User modeling for spoken dialogue system evaluation. In: Proceedings of IEEE Workshop on Automatic Speech Recognition and Understanding, pp. 80–87 (1997)

  8. García, P., de Parga, M., López, D., Ruiz, J.: Learning automata teams. In: Proceedings of International Colloquium on Grammatical Inference, pp. 52–65 (2010)

  9. Hansen, L.K., Salamon, P.: Neural network ensembles. IEEE Trans. Pattern Anal. Mach. Intell. 12(10), 993–1001 (1990)

  10. Hinton, G.E., Srivastava, N., Krizhevsky, A., Sutskever, I., Salakhutdinov, R.R.: Improving neural networks by preventing co-adaptation of feature detectors. arXiv:1207.0580 (2012)

  11. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)

  12. Hopcroft, J.: An n log n algorithm for minimizing states in a finite automaton. In: Theory of Machines and Computations, pp. 189–196 (1971)

  13. Hovy, E., Ravichandran, D.: Learning surface text patterns for a question answering system. In: Proceedings of the 40th Annual Meeting on Association for Computational Linguistics, pp. 41–47 (2002)

  14. Komer, B., Bergstra, J., Eliasmith, C.: Hyperopt-Sklearn: automatic hyperparameter configuration for Scikit-learn. In: Proceedings of the 13th Python in Science Conference, pp. 32–37 (2014)

  15. Kumar, R., Raghavan, P., Rajagopalan, S., Tomkins, A.: Recommendation systems. J. Comput. Syst. Sci. 40(1), 42–61 (1997)

  16. Le, T.-D.B., Lo, D.: Deep specification mining. In: Proceedings of the 27th ACM SIGSOFT International Symposium on Software Testing and Analysis, pp. 106–117 (2018)

  17. Li, Y., Zhao, H., Zhang, W., Jin, Z., Mei, H.: Research on the merging of feature models. Chin. J. Comput. 36(1), 1–9 (2013)

  18. Liu, J., Wang, G., Hu, P., Duan, L., Kot, A.C.: Global context-aware attention LSTM networks for 3D action recognition. In: Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3671–3680 (2017)

  19. Liu, X.-Y., Wu, J., Zhou, Z.-H.: Exploratory undersampling for class-imbalance learning. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 39(2), 539–550 (2008)

  20. Lo, D., Khoo, S.: QUARK: empirical assessment of automaton-based specification miners. In: Proceedings of 13th Working Conference on Reverse Engineering, pp. 51–60 (2006)

  21. Lodhi, H., Saunders, C., Shawe-Taylor, J., Cristianini, N., Watkins, C.: Text classification using string kernels. J. Mach. Learn. Res. 2(3), 419–444 (2002)

  22. Newling, J., Fleuret, F.: Nested mini-batch k-means. arXiv:1602.02934 (2016)

  23. Prabowo, R., Thelwall, M.: Sentiment analysis: a combined approach. J. Informetr. 3(2), 143–157 (2013)

  24. Reiter, E., Dale, R.: Building natural language generation systems. Comput. Linguist. 27(2), 298–300 (1996)

  25. Leino, K.R.M., Müller, P.: Object invariants in dynamic contexts. In: Odersky, M. (ed.) ECOOP 2004. LNCS, vol. 3086, pp. 491–515. Springer, Heidelberg (2004). https://doi.org/10.1007/978-3-540-24851-4_22

  26. Salembier, P., Smith, J.R.: MPEG-7 multimedia description schemes. IEEE Trans. Circ. Syst. Video Technol. 11(6), 748–759 (2001)

  27. Shannon, R.V., Zeng, F.G., Kamath, V., Wygonski, J., Ekelid, M.: Speech recognition with primarily temporal cues. Science 270(5234), 303–304 (1995)

  28. Shiba, T., Tsuchiya, T., Kikuno, T.: Using artificial life techniques to generate test cases for combinatorial testing. In: Proceedings of the 28th Annual International Computer Software and Applications Conference, vol. 1, pp. 72–77 (2004)

  29. Shoham, S., Yahav, E., Fink, S.J., Pistoia, M.: Static specification mining using automata-based abstractions. IEEE Trans. Softw. Eng. 34(5), 651–666 (2008)

  30. Specht, D.F.: A general regression neural network. IEEE Trans. Neural Netw. 2(6), 568–576 (2002)

  31. Stewart, A.K., Boyd, C.A.R., Vaughan-Jones, R.D.: A novel role for carbonic anhydrase: cytoplasmic PH gradient dissipation in mouse small intestinal enterocytes. J. Physiol. 516(1), 209–217 (1999)

  32. Tan, S., Sim, K.C., Gales, M.: Improving the interpretability of deep neural networks with stimulated learning. In: Proceedings of 2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU), pp. 617–623 (2015)

  33. Wang, K., Wan, X.: SentiGAN: generating sentimental texts via mixture adversarial networks. In: Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI 2018), pp. 4446–4452 (2018)

Author information

Corresponding author

Correspondence to Nan Zhang.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Cao, Z., Zhang, N. (2020). Deep Specification Mining with Attention. In: Kim, D., Uma, R., Cai, Z., Lee, D. (eds) Computing and Combinatorics. COCOON 2020. Lecture Notes in Computer Science, vol. 12273. Springer, Cham. https://doi.org/10.1007/978-3-030-58150-3_15

  • DOI: https://doi.org/10.1007/978-3-030-58150-3_15


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-58149-7

  • Online ISBN: 978-3-030-58150-3

  • eBook Packages: Computer Science, Computer Science (R0)
