
Electrical Characteristics and Correlation Analysis in Smart Grid


Abstract

The power grid contains a large number of data sources; the volume and variety of the data are very large, and the correlations among different kinds of data are complex. Extracting effective data sources from this massive data, simplifying the identification process, and improving the accuracy of data processing are necessary steps toward realizing an intelligent power grid. In this chapter, taking the air-conditioning circuit system of a complete building as an example, 11 features and 3 dependent variables are extracted, the correlations among the features are analyzed with three correlation analysis methods, and one dependent variable is selected. Starting from a rough extraction of all the features, three feature selection methods are then used to search for the optimal features, and the optimal feature subset for the selected dependent variable is finally obtained through a comprehensive comparative analysis.
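The workflow described above can be illustrated with a minimal sketch: compute Pearson, Spearman, and Kendall correlations between candidate features and a chosen dependent variable, then run a simple greedy forward search for a feature subset. The data file, column names, regressor, and stopping rule below are illustrative assumptions, not the chapter's actual dataset or method.

```python
# Minimal sketch, assuming a CSV of building air-conditioning measurements with
# one dependent variable column; all names here are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("hvac_measurements.csv")   # hypothetical data file
target = "cooling_load"                      # hypothetical dependent variable
features = [c for c in df.columns if c != target]

# Step 1: three correlation measures between each feature and the dependent variable.
for method in ("pearson", "spearman", "kendall"):
    corr = df[features + [target]].corr(method=method)[target].drop(target)
    print(f"{method} correlation with {target}:")
    print(corr.sort_values(ascending=False), "\n")

# Step 2: greedy forward selection as one simple wrapper-style search strategy:
# repeatedly add the feature that most improves the cross-validated score.
selected, remaining = [], list(features)
best_score = float("-inf")
while remaining:
    scores = {
        f: cross_val_score(LinearRegression(),
                           df[selected + [f]], df[target], cv=5).mean()
        for f in remaining
    }
    f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score:   # stop once no candidate improves the score
        break
    selected.append(f_best)
    remaining.remove(f_best)
    best_score = s_best

print("selected feature subset:", selected)
```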


Copyright information

© Springer Nature Singapore Pte Ltd. and Science Press 2020

Authors and Affiliations

  • Hui Liu
  1. School of Traffic and Transportation Engineering, Central South University, Changsha, China
