A reinforcement learning based mobile charging sequence scheduling algorithm for optimal sensing coverage in wireless rechargeable sensor networks

  • Original Research
  • Published:
Journal of Ambient Intelligence and Humanized Computing

Abstract

Mobile charging provides a new way of replenishing energy in the Wireless Rechargeable Sensor Network (WRSN), where a Mobile Charger (MC) charges nodes sequentially via wireless energy transfer according to a mobile charging sequence schedule. Mobile Charging Sequence Scheduling for Optimal Sensing Coverage (MCSS-OSC) is critical for guaranteeing network application performance; it aims to maximize the Quality of Sensing Coverage (QSC) of the network by optimizing the MC's charging sequence, and it remains challenging because of its NP-complete nature. In this paper, we propose a novel Improved Q-learning Algorithm (IQA) for MCSS-OSC, in which the MC acts as an agent that continuously explores the space of mobile charging strategies through approximate estimation and improves its charging strategy by interacting with the network environment. A novel reward function, based on the contribution to network sensing coverage, evaluates the MC's charging action at each charging time step. In addition, an efficient exploration strategy is designed by introducing an optimal experience-strengthening mechanism that regularly records the current best mobile charging sequence. Extensive simulations in MATLAB 2021 show that IQA outperforms existing heuristic algorithms in network QSC, especially for large-scale networks. This paper provides an efficient solution for WRSN energy management and new ideas for the performance optimization of reinforcement learning algorithms.
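To make the ingredients listed above concrete, the sketch below shows how a tabular Q-learning scheduler of this general shape can be put together: the MC is the agent, a state is the set of nodes charged so far plus the last node visited, an action picks the next node to charge, the per-step reward is a coverage estimate, and the best charging sequence found so far is recorded across episodes (echoing the optimal experience-strengthening idea). This is a minimal illustrative sketch only, not the paper's IQA: the random node layout, the lattice-based coverage estimate, the reward definition, and every constant (N_NODES, SENSE_R, ALPHA, GAMMA, EPS, EPISODES) are assumptions made for this example.

```python
"""Minimal sketch of Q-learning for charging-sequence scheduling.
All network quantities below are simplified placeholders, not the paper's model."""
import random

random.seed(0)

N_NODES = 10       # rechargeable sensor nodes (assumed)
AREA = 50.0        # side length of the square sensing field (assumed)
SENSE_R = 12.0     # sensing radius (assumed)
GRID = 25          # coverage evaluated on a GRID x GRID lattice

nodes = [(random.uniform(0, AREA), random.uniform(0, AREA)) for _ in range(N_NODES)]

def coverage(charged):
    """Fraction of lattice points covered by at least one charged node."""
    covered = 0
    for i in range(GRID):
        for j in range(GRID):
            x = (i + 0.5) * AREA / GRID
            y = (j + 0.5) * AREA / GRID
            if any((x - nodes[k][0]) ** 2 + (y - nodes[k][1]) ** 2 <= SENSE_R ** 2
                   for k in charged):
                covered += 1
    return covered / (GRID * GRID)

ALPHA, GAMMA, EPS, EPISODES = 0.1, 0.9, 0.2, 300
Q = {}  # Q[(frozenset of charged nodes, last charged node)] -> list of action values

def q_row(state):
    return Q.setdefault(state, [0.0] * N_NODES)

best_seq, best_return = None, float("-inf")

for ep in range(EPISODES):
    charged, current, seq, ep_return = frozenset(), None, [], 0.0
    while len(charged) < N_NODES:
        state = (charged, current)
        candidates = [a for a in range(N_NODES) if a not in charged]
        if random.random() < EPS:                      # explore
            action = random.choice(candidates)
        else:                                          # exploit learned Q-values
            action = max(candidates, key=lambda a: q_row(state)[a])
        new_charged = charged | {action}
        reward = coverage(new_charged)                 # QSC after this charging step
        next_state = (new_charged, action)
        if len(new_charged) < N_NODES:
            next_best = max(q_row(next_state)[a]
                            for a in range(N_NODES) if a not in new_charged)
        else:
            next_best = 0.0                            # episode ends: no future value
        q_row(state)[action] += ALPHA * (reward + GAMMA * next_best
                                         - q_row(state)[action])
        charged, current = new_charged, action
        seq.append(action)
        ep_return += reward
    # "experience strengthening": remember the best charging sequence seen so far
    if ep_return > best_return:
        best_return, best_seq = ep_return, seq

print("best charging sequence:", best_seq)
print("cumulative coverage score:", round(best_return, 3))
```

In the actual IQA and in a real WRSN, the state, reward, and charging model would also have to reflect node energy consumption, charging time, and MC travel cost, which this toy example deliberately omits.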

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (NSFC) under Grant 62173032, the Foshan Science and Technology Innovation Special Project under Grant BK22BF005, and the Regional Joint Fund of the Guangdong Basic and Applied Basic Research Fund under Grant 2022A1515140109.

Author information

Corresponding author

Correspondence to Wendong Xiao.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Li, J., Wang, H. & Xiao, W. A reinforcement learning based mobile charging sequence scheduling algorithm for optimal sensing coverage in wireless rechargeable sensor networks. J Ambient Intell Human Comput 15, 2869–2881 (2024). https://doi.org/10.1007/s12652-024-04781-3
