
An Interpretable Environmental Sensing System with Unmanned Ground Vehicle for First Aid Detection

  • Chapter
Interpretable Cognitive Internet of Things for Healthcare

Part of the book series: Internet of Things (ITTCC)


Abstract

Advances in remote sensing have made on-board sensing an increasingly important capability of unmanned ground vehicles (UGVs). Modern sensor technologies are applied in many areas, from natural-disaster response to the defense industry. Among these scenarios, providing immediate medical first aid during a disaster is especially critical: a UGV that recognizes its environment through its sensors and relays accurate data to the relevant people or institutions can help prevent worse outcomes. However, many of today's autonomous robots offer insufficient mobility or sensing capability and remain costly for individuals and institutions. This study therefore aims to develop a UGV that can easily reach environments where people needing first aid are to be detected and that performs its task more effectively by recognizing that environment through the sensing techniques employed; keeping the cost of the vehicle low is an equally important goal. In the developed system, a LIDAR laser scanner models the environment around the vehicle: 3D environmental modeling is carried out using a 2D LIDAR. The environment description is further enriched with image processing and an infrared camera. An Arduino microcontroller drives the motors and controls various peripherals, while the LIDAR and camera run on a Raspberry Pi embedded computer. All data from the LIDAR, camera, motor driver, and other peripherals are displayed and controlled through a single interface in the developed mobile application.
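The 3D model is built from a 2D LIDAR, which implies each planar scan is acquired at a known orientation and lifted into 3D. The chapter's exact mechanism is not given here; the sketch below assumes the scan plane is tilted by a known angle (e.g. by a servo) and rotates each beam accordingly. Function name, range limits, and the tilt-servo assumption are illustrative, not from the chapter:

```python
import math

def scan_to_points_3d(ranges, angle_min, angle_inc, tilt_rad):
    """Lift one 2D LIDAR scan, taken at a known tilt angle, into 3D points.

    ranges:    list of measured distances in meters, one per beam
    angle_min: angle of the first beam in the scan plane (rad)
    angle_inc: angular step between beams (rad)
    tilt_rad:  pitch of the scan plane about the sensor's y-axis (rad)
    """
    points = []
    for i, r in enumerate(ranges):
        if not (0.1 < r < 12.0):   # drop out-of-range readings (typical RPLIDAR span)
            continue
        theta = angle_min + i * angle_inc
        # beam endpoint in the (untilted) scan plane
        x, y = r * math.cos(theta), r * math.sin(theta)
        # rotate the scan plane about the y-axis by the tilt angle
        points.append((x * math.cos(tilt_rad), y, x * math.sin(tilt_rad)))
    return points
```

Accumulating the points returned for a sweep of tilt angles yields the 3D point cloud of the environment.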
The result is an ergonomic, safe, integrated robot design that reduces costs for the organizations expected to deploy UGVs and lets the user remotely monitor and recognize dangerous environments. Moreover, through the resulting IoT synergy, people needing first aid in otherwise unreachable places can be detected easily with an unmanned solution.
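The infrared camera enriches the environment description and supports detecting people who need first aid. The chapter's detection algorithm is not reproduced here; the sketch below illustrates one common approach, thresholding a low-resolution thermal frame in the human skin-temperature band. All names, thresholds, and the 8x8 frame size are assumptions:

```python
def detect_warm_region(frame, t_min=30.0, t_max=40.0, min_pixels=12):
    """Flag a possible person in a low-resolution thermal frame (e.g. an 8x8 IR array).

    frame: 2D list of per-pixel temperatures in degrees Celsius
    Returns True when at least min_pixels pixels fall inside the
    human skin-temperature band [t_min, t_max].
    """
    hits = sum(1 for row in frame for t in row if t_min <= t <= t_max)
    return hits >= min_pixels
```

In practice such a cheap threshold test would only pre-filter frames; confirmed detections would then be pushed to the mobile interface alongside the LIDAR map.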



Author information


Corresponding author

Correspondence to Ali Topal.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Topal, A., Ersoy, M., Yigit, T., Kose, U. (2023). An Interpretable Environmental Sensing System with Unmanned Ground Vehicle for First Aid Detection. In: Kose, U., Gupta, D., Khanna, A., Rodrigues, J.J.P.C. (eds) Interpretable Cognitive Internet of Things for Healthcare. Internet of Things. Springer, Cham. https://doi.org/10.1007/978-3-031-08637-3_9


  • DOI: https://doi.org/10.1007/978-3-031-08637-3_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-08636-6

  • Online ISBN: 978-3-031-08637-3

  • eBook Packages: Engineering (R0)
