Abstract
This paper simulates missile engagements between enemy and friendly forces using a purpose-built simulator and conducts a machine learning study on the generated data. The simulator models the engagements and collects the resulting data, which, after preprocessing, are used to train random forest, XGBoost, and LightGBM (LGBM) models. Hyperparameter tuning was performed for each model to find optimal settings, and accuracy, F1-score, and ROC-AUC were used to compare performance. In the experiments, XGBoost achieved the best scores across the performance metrics, while LGBM was the fastest to train. The paper therefore suggests that XGBoost, which trains slowly but yields the best accuracy and overall metrics, is suitable for one-to-one interception situations, whereas LGBM, which trains quickly while maintaining strong metrics, is suitable for many-to-many interception situations.
Acknowledgement
This work was supported partly by the Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. 2020-0-00107, Development of the technology to automate the recommendations for big data analytic models that define data characteristics and problems), and partly by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2019R1A2C1009894).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Jin, S., Dahouda, M.K., Joe, I. (2023). Ensemble Machine Learning Models for Simulating the Missile Defense System. In: Silhavy, R., Silhavy, P., Prokopova, Z. (eds) Data Science and Algorithms in Systems. CoMeSySo 2022. Lecture Notes in Networks and Systems, vol 597. Springer, Cham. https://doi.org/10.1007/978-3-031-21438-7_12
DOI: https://doi.org/10.1007/978-3-031-21438-7_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-21437-0
Online ISBN: 978-3-031-21438-7