
Spatial-domain steganalytic feature selection based on three-way interaction information and KS test

  • Xiangyuan Gu
  • Jichang Guo (corresponding author)
  • Huiwen Wei
  • Yanhong He
Methodologies and Application

Abstract

To select informative features from spatial-domain steganalytic features, a feature selection method based on three-way interaction information and the Kolmogorov–Smirnov (KS) test is proposed. Three-way interaction information is employed to rank the features, and the KS test is used to remove redundant ones. The feature selection process of the proposed method proceeds as follows. First, the mutual information between each feature and the class label is computed, and the feature with the maximum value is selected. Then, in a loop, the three-way interaction information among each candidate feature, the previously selected feature and the class label is computed, and the candidate feature with the maximum value is selected. Finally, a KS test is performed between features, and the resulting parameter is compared with a predefined significance level to eliminate redundant features. To validate the performance of the proposed method, several typical feature ranking methods based on information measures and several spatial-domain steganalytic feature selection methods are adopted for comparison. Experimental results demonstrate that the proposed method achieves better feature selection performance.
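For concreteness, a minimal Python sketch of the procedure summarized above is given below. This is not the authors' implementation: the interaction-information convention I(X; Y; C) = I(X, Y; C) − I(X; C) − I(Y; C), the use of scikit-learn's mutual_info_score on discretized features, and the reading of the two-sample KS p-value as the "obtained parameter" compared against the significance level are all assumptions made for illustration.

import numpy as np
from scipy.stats import ks_2samp
from sklearn.metrics import mutual_info_score


def interaction_info(x, y, c):
    """Assumed three-way interaction information for discretized variables:
    I(x; y; c) = I(x, y; c) - I(x; c) - I(y; c), with (x, y) as one joint variable."""
    xy = [f"{a}_{b}" for a, b in zip(x, y)]  # joint variable (x, y)
    return (mutual_info_score(xy, c)
            - mutual_info_score(x, c)
            - mutual_info_score(y, c))


def select_features(X, c, k, alpha=0.05):
    """Rank k features, then drop those the KS test flags as redundant.

    X     : (n_samples, n_features) array of discretized steganalytic features
    c     : class labels (cover / stego)
    k     : number of features to rank before the redundancy check
    alpha : significance level for the two-sample KS test (assumed threshold)
    """
    n_features = X.shape[1]
    # Step 1: start with the feature having maximum mutual information with c.
    mi = [mutual_info_score(X[:, j], c) for j in range(n_features)]
    selected = [int(np.argmax(mi))]
    # Step 2: repeatedly add the candidate that maximizes the interaction
    # information with the previously selected feature and the class label.
    while len(selected) < min(k, n_features):
        prev = selected[-1]
        candidates = [j for j in range(n_features) if j not in selected]
        scores = [interaction_info(X[:, j], X[:, prev], c) for j in candidates]
        selected.append(candidates[int(np.argmax(scores))])
    # Step 3: a ranked feature is treated as redundant if the KS test cannot
    # distinguish its distribution from that of an earlier-kept feature
    # (p-value above the significance level).
    kept = []
    for j in selected:
        if not any(ks_2samp(X[:, j], X[:, i]).pvalue > alpha for i in kept):
            kept.append(j)
    return kept

In this reading, two features whose empirical distributions the KS test cannot separate at the chosen significance level carry similar information, so only the higher-ranked one is kept; the paper's exact redundancy criterion may differ.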

Keywords

Spatial-domain steganalytic features · Three-way interaction information · Feature selection · Steganalysis · KS test

Notes

Acknowledgements

The authors would like to thank Jicang Lu and his co-authors, as well as Morteza Darvish Morshedi Hosseini and his co-author, for providing their codes.

Compliance with ethical standards

Funding

This study was funded by the National Natural Science Foundation of China (61771334).

Conflict of interest

All the authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  • Xiangyuan Gu (1)
  • Jichang Guo (1, corresponding author)
  • Huiwen Wei (1)
  • Yanhong He (1)

  1. School of Electrical and Information Engineering, Tianjin University, Tianjin, China
