Research on Communication Individual Identification Method Based on PCA-NCA and CV-SVM

  • Xinghao Guo
  • Shuai Liu
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 301)


In recent years, high-dimensional data has become common in science and industry, for example in computer vision, pattern recognition, bioinformatics, and aerospace. Feature dimensionality reduction and selection reduce data from high to low dimensionality in order to reveal its underlying structure. In wireless communication, the high-dimensional features of wireless device start-up transient signals introduce feature redundancy; this paper converts these high-dimensional features into low-dimensional features favorable to classification through a feature dimensionality reduction and selection method based on PCA-NCA. In addition, this paper optimizes the parameters of the SVM classifier, and the resulting CV-SVM classifier improves classification performance. This paper also carries out simulation experiments on the measured start-up signals of ten identical walkie-talkies. When the SNR is greater than 0 dB, the recognition accuracy of the PCA-NCA algorithm is 10% higher than that of the PCA algorithm alone; when the SNR is greater than 10 dB.
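The PCA-NCA reduction followed by a cross-validation-tuned SVM described in the abstract can be sketched with off-the-shelf scikit-learn components. This is a minimal illustration, not the paper's implementation: the synthetic data, the chosen dimensions (20 PCA components, 5 NCA components), and the grid-search parameter values are all assumptions standing in for the paper's measured transient-signal features and its specific cross-validation procedure.

```python
# Sketch of a PCA-NCA + cross-validated SVM pipeline using scikit-learn.
# All dimensions and hyperparameter values below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neighbors import NeighborhoodComponentsAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline

# Synthetic stand-in for start-up transient features of 10 identical devices.
X, y = make_classification(n_samples=500, n_features=50, n_informative=10,
                           n_classes=10, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("pca", PCA(n_components=20)),   # unsupervised reduction of redundancy
    ("nca", NeighborhoodComponentsAnalysis(n_components=5, random_state=0)),
    ("svm", SVC()),                  # RBF-kernel SVM classifier
])

# "CV-SVM": choose C and gamma by k-fold cross-validation (grid search here).
grid = GridSearchCV(pipe, {"svm__C": [1, 10],
                           "svm__gamma": ["scale", 0.01]}, cv=3)
grid.fit(X_tr, y_tr)
acc = grid.score(X_te, y_te)  # accuracy on held-out samples
```

NCA is supervised, so placing it after PCA lets the pipeline first discard redundant directions cheaply and then learn a low-dimensional metric that separates the device classes before the SVM is fitted.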


Keywords: PCA · Individual identification · Feature selection



The authors would like to thank the State Key Laboratory of Complex Electromagnetic Environment Effects on Electronics and Information System for support under Director Fund CEMEE2019K0104B.



Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2019

Authors and Affiliations

  1. Harbin Engineering University, Harbin, China
  2. Inner Mongolia University, Hohhot, China
