Abstract
Border surveillance plays a critical role in ensuring national security by detecting and preventing illegal activities. This chapter presents a novel approach that uses a lighter-than-air (LTA) unmanned aerial vehicle (UAV) equipped with an ultra-high-resolution multisensor payload for border surveillance (relevant to transnational organized crime), search and rescue (SaR), and rough terrain detection. The BorderUAS Project proposes a payload platform equipped with a synthetic aperture radar (SAR), shortwave/longwave infrared (SWIR/LWIR) cameras, and acoustic sensors, coupled with optical and hyperspectral cameras intended for indirect detection. To achieve its goals, BorderUAS further capitalizes on border police infrastructure (command and control centers), innovative data models for identifying irregular crossing patterns and routes, event detection, and advanced audio/video analytics, data fusion, and storage. First results show that the proposed solution offers increased operational flexibility, rapid deployment, and comprehensive multisensory coverage and analysis of the surveyed areas.
Keywords
- Unmanned aerial vehicle (UAV)
- Multisensor payload
- Border surveillance
- Ultra-high-resolution camera
- Thermal camera
- Hyperspectral camera
- Acoustic sensor
- Synthetic aperture radar (SAR)
Introduction
At the onset of the 2020s, EU Member States and their non-EU (third-country) neighbors share 14,647 km of land borders and 67,571 km of maritime borders (coastline) [1]. The extent, landscape diversity, and seasonal terrain adversities of the EU's borderlands have long complicated the frontier surveillance duties of the agencies operating there, namely national police units, gendarmerie, and Frontex. Faced with such territorial hardships and an ever-increasing number of illegal activities, such as drug and human trafficking, irregular border crossings, and state-backed hybrid threats, border and coast guard authorities have become increasingly reliant on state-of-the-art equipment (e.g., UAVs, mobile equipment, sensors, machine learning) to counter constantly shape-shifting threats [2]. Recent technological advances in unmanned aerial vehicles (UAVs), often referred to as drones, have made them a promising tool for border surveillance, given their versatility and mobility coupled with AI-backed applications and a multiplicity of integrated sensors. UAVs can thus provide EU border security practitioners with significantly improved and timelier situational awareness and decision-making capacity. To this end, our work explores an ultra-high-resolution multisensor surveillance payload supporting border surveillance, search and rescue, and rough terrain detection on a lighter-than-air (LTA) UAV, complementing the efforts of border police agencies during and after the project [3].
By employing a synthetic aperture radar (SAR), shortwave/longwave infrared (SWIR/LWIR) cameras, and acoustic sensors for direct target detection, as well as optical and hyperspectral cameras for indirect detection (via vegetation disturbance), the project will use innovative AI data models to identify illegal crossing patterns and preferred routes, together with advanced audio/video analytics and storage in the C2 center, to make borderlands safer and more secure. The remainder of this chapter comprises the literature review, an analysis of the BorderUAS methodology expanding on the payload components and their performance, and the proofs of concept (SAR, acoustic sensor) along with preliminary results. The final section touches upon a series of concluding points and future work until the project's completion.
Related Work
Border surveillance and control (land, air, and maritime) is conducted by the European or national authorities as depicted in Fig. 32.1 [4].
In particular, airborne border surveillance has garnered significant interest from EU agencies. This interest is fueled by remarkable progress in the drone industry, coupled with advancements in computer vision technology. A comprehensive overview of this trend is given in [5], where it is stated, among others, that Frontex and the European Maritime Safety Agency began using drones for border surveillance in 2018 after many years of research and pilot projects. The EU-funded project FOLDOUT [6] presented an under-foliage method for detecting illegal activity at borders and tracing movements and routes prior to arrival in border areas. To achieve this, a holistic system employing fused advanced ground, aerial, and space-based sensors mounted on Stratobus™ and satellite platforms was deployed in different countries to test performance under different weather conditions. In the same context, [7] examines the use of UAVs to support maritime safety and search and rescue operations by providing live video streams and images from the area of operations, within the EFFECTOR project. The authors developed an embedded system that employs machine learning algorithms, allowing a UAV to autonomously detect objects in the water and track their changing position over time. Another innovative project in the field of sea border surveillance is [8], which develops unmanned vehicles (UxVs)—aerial, sea surface, and submarine—to enhance current maritime border surveillance operations concerning the detection of irregular migrants and narcotics smugglers. Going beyond this state of the art, BorderUAS combines for the first time a multirole lighter-than-air (LTA) unmanned aerial vehicle (UAV) with an ultra-high-resolution multisensor surveillance payload, including object and event detection, providing real-time data streaming and storage. BorderUAS also expands on the harmonization of data shared among different countries; Ref. [9] developed a knowledge-based model that defines the entities of the land border surveillance domain and establishes potential associations among them, following the preceding work of the EUCISE-OWL project (http://www.eucise2020.eu).
BorderUAS Project Methodology and Payload
The project at hand is novel in mounting all payload components on a single platform to allow for interoperability, setting it apart from the like-minded ventures mentioned above. The BorderUAS payload carries two arrays of sensors, each component intended to support one or more of the border surveillance functions that BorderUAS seeks to empower. Table 32.1 lists the sensors, their functionality, and their positioning.
The footage secured by the payload's components is then processed algorithmically. This feed processing enables end users to determine the degree of threat each detected event may represent. In a nutshell, the techniques applied to the acoustic and visual feeds, based on component functionality, are as follows:
- Data acquisition, storage, and streaming
- Object detection, tracking, characterization, and reidentification
- Data and metadata fusion for the reduction of false positives or negatives
- Geo-localization of detections
- Photogrammetry and multimodal terrain mapping
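As an illustration of the fusion item in the list above, the sketch below shows one simple way detections from different sensors could be corroborated to suppress false positives. All names, distances, and thresholds here are hypothetical; the chapter does not specify the project's actual fusion logic.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g., "LWIR" or "acoustic" (illustrative labels)
    lat: float
    lon: float
    confidence: float  # in [0, 1]

def fuse(detections, max_dist_deg=0.001, threshold=0.8):
    """Naive late fusion: detections from *different* sensors that fall
    within a small geographic radius corroborate each other, boosting the
    fused confidence (noisy-OR); uncorroborated low-confidence hits are
    dropped as likely false positives."""
    fused, used = [], set()
    for i, a in enumerate(detections):
        if i in used:
            continue
        score = a.confidence
        for j, b in enumerate(detections):
            if j <= i or j in used or b.sensor == a.sensor:
                continue
            if abs(a.lat - b.lat) < max_dist_deg and abs(a.lon - b.lon) < max_dist_deg:
                # independent sensors agree: combine confidences via noisy-OR
                score = 1 - (1 - score) * (1 - b.confidence)
                used.add(j)
        if score >= threshold:
            fused.append((a.lat, a.lon, score))
    return fused
```

Two 0.6-confidence hits at (almost) the same location fuse to 1 − 0.4 × 0.4 = 0.84 and survive, while an isolated 0.6-confidence hit is discarded.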
The payload components, along with the algorithmic design applied to their feeds, will be used in a set of field-trial scenarios designed to test them under real-life conditions. The scenario concepts were the outcome of end-user-directed questionnaires [10] and feature detection/tracking of individuals/objects in open areas, hardly accessible areas, or under canopy, as well as indirect (non-real-time) detection of individuals and events through observations of the land cover.
BorderUAS Payload-Related Functionalities
The payload components of BorderUAS are briefly described below.
Acoustic Sensor (Proof of Concept)
For the purpose of mapping hardly accessible border areas, especially during SaR operations, BorderUAS features an acoustic sensor (AS) designed and developed to capture, from far distances, acoustic signals that can be related to specific events and observations on the ground, such as working machinery, passing vehicles, or yelling people. The AS comprises a microphone array and uses a beamforming algorithm to generate, in real time, a visual presentation of the acoustic field, that is, an acoustic map of the UAV's borderland surroundings.
Besides the acoustic map, which is georeferenced using coordinates from the UAV's GPS, the sensor also allows real-time augmented listening to specific locations on the ground. The overall concept of acoustic map acquisition is depicted in Fig. 32.2, where detected ground noise has been geo-localized based on the GPS position of the sensor and the individual results of the beamforming algorithm.
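The delay-and-sum principle behind such an acoustic map can be sketched as follows. This is a minimal far-field illustration, not the project's actual beamformer; the array geometry, sampling rate, and angle scan are assumptions made for the example.

```python
import numpy as np

def delay_and_sum(signals, mic_xy, fs, angle_deg, c=343.0):
    """Steer a microphone array toward `angle_deg` (azimuth, far-field
    assumption): delay each channel so a plane wave from that direction
    adds coherently, then average. `signals` is (n_mics, n_samples);
    `mic_xy` is (n_mics, 2) in meters; `fs` is the sampling rate in Hz."""
    theta = np.deg2rad(angle_deg)
    direction = np.array([np.cos(theta), np.sin(theta)])
    delays = mic_xy @ direction / c             # per-mic delay in seconds
    shifts = np.round(delays * fs).astype(int)  # delay in whole samples
    out = np.zeros(signals.shape[1])
    for ch, s in zip(signals, shifts):
        out += np.roll(ch, -s)                  # compensate the delay
    return out / len(signals)

def acoustic_map(signals, mic_xy, fs, angles):
    """Scan candidate directions and return beam power per angle; the
    peak indicates the dominant bearing of the ground noise."""
    return np.array([np.mean(delay_and_sum(signals, mic_xy, fs, a) ** 2)
                     for a in angles])
```

Simulating a tone arriving broadside (0°) on a 4-microphone line array, the beam power at 0° clearly exceeds the power at 90°, which is the basic mechanism the acoustic map visualizes.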
LWIR/SWIR Novelty
BorderUAS will also feature two IR sensors (LWIR, SWIR) to enhance border surveillance under nighttime conditions. The LWIR sensor comprises two cameras, one with a fixed and one with a zoom lens, operating in the 7–14 μm wavelength range; it will be used for direct detection of individuals/objects through thermal vision. The SWIR sensor comprises a camera with a fixed lens operating in the 0.9–1.7 μm range; it uses night vision to detect individuals/objects in ambiently lit terrain and through glass surfaces (e.g., a car's windshield). The three cameras' outputs will be fused (LWIR/SWIR) to allow freely switching between thermal and night vision while tracking the same target.
Hyperspectral Cameras
Hyperspectral (HS) cameras are used for terrain mapping and for indirect detection via vegetation disturbance (e.g., trampled branches or vines). Their output is a mosaic-type snapshot that captures 16 bands arranged in a 4 × 4 grid, covering the visible as well as near-infrared spectra (see Fig. 32.3). The information captured at the different wavelengths enhances vegetation imaging by discerning between fresh, dry, or different types of greenery and foliage. An example is the normalized difference vegetation index (NDVI), used to quantify vegetation greenness, density, and health. Ultimately, the HS sensor will be used in conjunction with other payload components to produce multilayered maps of a surveyed area that can be visually assessed by the LEA.
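The NDVI mentioned above is a simple per-pixel ratio, NDVI = (NIR − Red) / (NIR + Red), computable directly from two of the camera's bands; a small epsilon is added here only to avoid division by zero:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index per pixel, in [-1, 1].
    Dense healthy vegetation gives high values (strong NIR reflectance,
    strong red absorption); bare soil and water give low values."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)
```

For example, a vegetated pixel with NIR = 0.5 and Red = 0.1 yields NDVI ≈ 0.67, while equal NIR and Red reflectance yields 0.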
SAR (Proof of Concept)
Land observation through optical imaging has received a lot of attention due to its high spatial resolution and low cost. However, cloudy weather and nighttime observation are critical limitations of optical imagery. A solution is provided by active remote sensing with synthetic aperture radar (SAR), whose signal wavelengths penetrate clouds and smoke.
The most important factor of SAR instruments is the backscattering coefficient, which depends on vegetation, humidity, ground terrain, transmit frequency, and signal polarization. Some of the most common active SAR systems operate in the L (1–2 GHz), C (4–8 GHz), and X (8–12 GHz) frequency bands. A collection of ground-based SAR images is given below: Fig. 32.4 shows how the soil and vegetation response varies across the three frequencies, and Fig. 32.5 shows the contrast between the backscattering coefficients of a water surface and an urban area. The larger the wavelength (i.e., the lower the frequency), the better the penetration through foliage or ground. To this end, an L-band SAR is deployed on the UAV platform, allowing stationary and continuous monitoring of an area of interest at the time a specific event occurs.
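The band-to-wavelength relation underlying the penetration argument is simply λ = c / f; a one-line helper makes the contrast between the bands concrete (the band labels in the comment follow the ranges quoted above):

```python
def wavelength_cm(freq_ghz):
    """Free-space wavelength in cm for a radar frequency in GHz
    (lambda = c / f). L-band (1-2 GHz) gives ~15-30 cm, X-band
    (8-12 GHz) only ~2.5-3.7 cm, hence L-band's better foliage
    and ground penetration."""
    c_cm_ghz = 29.9792458  # speed of light expressed in cm * GHz
    return c_cm_ghz / freq_ghz
```

For instance, a 1.5 GHz L-band signal has a ~20 cm wavelength, roughly seven times that of a 10 GHz X-band signal.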
Raw data is processed to generate both full-resolution SAR images and stacks of real aperture radar (RAR) profiles with range resolution only. Both products are georeferenced and updated about every 5 min. Detection involves spotting targets via position changes made visible in a pair of SAR images through anomaly detection. Additionally, RAR profile stacks generated from a raw dataset allow target movement to be detected within the time window of raw data file collection.
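A textbook proxy for the pairwise anomaly detection described above is a log-ratio change map between two co-registered acquisitions; the project's actual detector is not detailed in this chapter, so the threshold rule below (k-sigma on the log-ratio) is only an illustrative assumption.

```python
import numpy as np

def change_map(img_a, img_b, k=3.0):
    """Flag pixels whose log-ratio between two co-registered SAR
    intensity images deviates more than k standard deviations from the
    scene mean - a simple stand-in for anomaly detection of targets
    that appeared, vanished, or moved between the two acquisitions."""
    ratio = np.log(img_b + 1e-6) - np.log(img_a + 1e-6)
    mu, sigma = ratio.mean(), ratio.std()
    return np.abs(ratio - mu) > k * sigma
```

On a synthetic pair where a single bright scatterer appears between acquisitions, only that pixel is flagged.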
Optical High-Resolution RGB Sensor
To best acquire video streams from very large distances, a set of optical sensors supporting beyond-standard resolution was deployed. Two high-resolution cameras are used, along with a set of interchangeable lenses and one unit that serves as the control and processing unit (see the configuration in Fig. 32.6). Camera functionalities include, but are not limited to, data storage, preprocessing, live streaming, and perception for object tracking and characterization of detections.
After preprocessing, the image data must be arranged for automatic feature extraction and simultaneous detection and localization of objects. Due to the ultra-high resolution of each input image and the use of a resource-limited embedded system, real-time processing is challenging; most modern DNN-based real-time detection algorithms are designed for inputs of much lower resolution. We implemented a three-step optimized scheme for real-time object detection on ultra-high-resolution images on our embedded system: we split the input image into K subframes, apply YOLOv5 to each subframe, and finally merge all intermediate detections to determine the final detections on the initial input image (for details, see Antonakakis et al. [11]).
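The split/detect/merge idea can be sketched generically as below. The grid size and the `detector` callback are placeholders: in the actual system the detector is YOLOv5, and [11] describes further optimizations (overlap handling, detection merging across tile borders) that this minimal sketch omits.

```python
import numpy as np

def split_into_tiles(image, k):
    """Split an H x W image into a k x k grid of tiles, returning
    (tile, y_offset, x_offset) so detections can be mapped back."""
    h, w = image.shape[:2]
    th, tw = h // k, w // k
    tiles = []
    for i in range(k):
        for j in range(k):
            tiles.append((image[i * th:(i + 1) * th, j * tw:(j + 1) * tw],
                          i * th, j * tw))
    return tiles

def detect_tiled(image, k, detector):
    """Run `detector` (any callable returning [(x, y, w, h, score), ...]
    in tile coordinates) on each tile and translate the boxes back to
    full-image coordinates before merging."""
    merged = []
    for tile, oy, ox in split_into_tiles(image, k):
        for (x, y, w, h, score) in detector(tile):
            merged.append((x + ox, y + oy, w, h, score))
    return merged
```

Each tile is small enough for a real-time detector, while the offset bookkeeping preserves localization in the original ultra-high-resolution frame.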
Preliminary Results
The BorderUAS platform is an event-driven system of systems. The sensors' outputs create events upon data processing and fusion, which are enriched with information from the analytics subsystems. All events and detections are displayed within the Command and Control Centre (C2) for assessment by a LEA operator.
The C2 is a web-based, platform-independent application designed to enable the border guard authorities involved in the future field trials to interact with the UAV platform's payload and effectively handle events unfolding during scenario execution. The major components of the C2 are the side menu, real-time video stream visualization, event list and management, and a GIS interface where events, targets, and points of interest are visualized at their global location (see Fig. 32.7). Additional functionalities, supported through dedicated views, cover sensor and video management, business intelligence, and exporting or reporting on found incidents.
The components constituting the payload of the airborne vessel have already entered their individual testing phases, with some initial results briefly outlined below.
Acoustic Sensor
Work with the AS so far has involved a beamforming algorithm coupled with the microphone array to improve the signal-to-noise ratio of the output, along with proper microphone EMI and wind shielding. Moreover, an LMS (least mean squares) algorithm can help cancel noise coming from the engines of the airborne vessel. Analysis of the received feed will entail k-NN, k-means, and GSOM algorithms for noise classification, to spot the events of interest.
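The LMS-based engine-noise cancellation can be sketched as a standard adaptive noise canceller: a reference channel that picks up engine noise only is filtered to predict the noise component in the primary channel and subtracted. The filter length, step size, and zero-lag noise path below are assumptions for the illustration, not the sensor's actual configuration.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=16, mu=0.005):
    """Adaptive LMS noise canceller. `primary` = wanted signal plus
    engine noise; `reference` = engine noise pickup. The FIR weights
    `w` adapt to predict the noise in `primary`; the prediction error
    `e` is the cleaned output."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # newest sample first
        noise_est = w @ x
        e = primary[n] - noise_est                 # error = cleaned output
        w += 2 * mu * e * x                        # LMS weight update
        out[n] = e
    return out
```

On synthetic data (a sinusoid buried in correlated noise), the residual error after convergence is a fraction of the input noise power, which is the effect sought for the UAV engine noise.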
Hyperspectral Camera
Testing involved applying principal component analysis (PCA) and support vector machine (SVM) algorithms to perform image segmentation with the HS camera, providing information about the land cover (see Fig. 32.3). Along with the false-color visualizations, these will be used to evaluate the accuracy of the image segmentation.
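The PCA step of this pipeline can be sketched with plain linear algebra; for the classification stage, a nearest-centroid rule stands in below for the SVM actually used in the project (an SVM would replace the centroid rule with a margin classifier in the same PCA feature space). The spectra and noise levels in the example are synthetic.

```python
import numpy as np

def pca_fit_transform(pixels, n_components=3):
    """Project hyperspectral pixels (n_pixels x n_bands) onto their top
    principal components, decorrelating and compressing the bands."""
    mu = pixels.mean(axis=0)
    centered = pixels - mu
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T, mu, vt[:n_components]

def nearest_centroid_segment(features, labels, query):
    """Stand-in pixel classifier in PCA space (the project uses an SVM;
    this centroid rule only illustrates where it plugs in). Assigns each
    query pixel the label of the closest class centroid."""
    classes = np.unique(labels)
    centroids = np.stack([features[labels == c].mean(axis=0)
                          for c in classes])
    d = np.linalg.norm(query[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]
```

With two well-separated synthetic land-cover spectra across 16 bands, the pixels are classified essentially perfectly in the 3-component PCA space.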
MIMO SAR
The added value of the proposed MIMO radar is twofold. First, it can provide SAR images while the LTA UAV hovers stationary over the area of interest; conventional space- and airborne SAR systems need the platform's movement to generate the synthetic aperture. In addition, the ability to generate both a SAR image and stacks of RAR profiles provides a means of detecting anomalies in the scene via the backscattering coefficient every 5 min and of tracking moving targets within the 5-min window needed to complete the raw data acquisition (see Figs. 32.4 and 32.5).
EOS Sensors (IR and High-Resolution RGB)
Initial test sessions have used the video feeds with optimized YOLOv4 and YOLOv5 algorithms for real-time airborne detection of targets. Detection is followed by object tracking, assigning each target a unique ID; this uses a 2D Kalman filter along with an association function (intersection over union) to relate the filter's predictions to YOLO's detections. So far, the main objective is to improve training on detections using the public Stanford dataset and a separate dataset tailored to the needs of the BUAS scenarios to follow (see Fig. 32.6).
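The two building blocks of this tracking stage, IoU association and a constant-velocity 2D Kalman filter over the box centre, can be sketched as follows. The state layout and noise covariances are assumptions for the illustration; the project's tuned values are not given in the chapter.

```python
import numpy as np

def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2);
    used to associate a track's predicted box with a new detection."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

class KalmanTrack2D:
    """Constant-velocity Kalman filter on a box centre:
    state s = [x, y, vx, vy], measurement z = [x, y]."""
    def __init__(self, x, y, q=1e-2, r=1.0):
        self.s = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                      # initial uncertainty
        self.F = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q, self.R = np.eye(4) * q, np.eye(2) * r  # process/meas. noise

    def predict(self):
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.s     # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

Feeding the filter a target moving one pixel per frame along the diagonal, the estimated position and velocity converge to the true motion, and `iou` returns 1 for identical boxes and 0 for disjoint ones.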
Conclusions
The BorderUAS Project seeks to provide border surveillance authorities with a lighter-than-air UAV platform with 24/7 operational autonomy, featuring a multiplicity of payload components. The UAV's payload output is further analyzed and refined by means of algorithmic processes revolving around machine learning and AI. End users (LEAs) designated to participate in field trials testing the concept's operational added value in borderland surveillance will be able to use a state-of-the-art Command and Control (C2) system both to access the payload component feeds and to help automate, in an interoperable fashion, the content of alerts, events, and border mission guidelines for in-field units to follow.
Future research expanding on the given EU project’s area of expertise may unfold along the lines of the following topics:
- Organizing and carrying out field trials to validate the solution at hand
- Building an ecosystem including industry partners that may wish to employ the payload's capabilities in the private sector
- Testing the bundled payload components, after the project's completion, on other unmanned or manned airborne/ground vessels to check the interoperability, versatility, and viability of the solution at hand
- Training the models (e.g., for detection, tracking, reidentification) on additional (public or other) datasets for improved sensor/camera output analysis.
References
Buzmaniuk, S. (2021). The Union’s external borders: A European debate revisited. Fondation Robert Schuman, 585(European issues), 1–7.
Barnes, S., Georgadze, A., et al. (2023). Cosmic-ray tomography for border security. Instruments, 7, 13.
Chatzis, P., & Stavrou, E. (2022). Cyber-threat landscape of border control infrastructures. International Journal of Critical Infrastructure Protection, 36, 100503.
Koslowski, R. (2021). Drones and border control: An examination of state and non-state actor use of UAVs along borders. In Research handbook on international migration and digital technology (pp. 152–165). Edward Elgar Publishing Limited.
Bolakis, C., Mantzana, V., et al. (2022). Correction to: FOLDOUT: A through foliage surveillance system for border security. In Technology development for security practitioners (pp. C1–C2). Springer.
Vasilopoulos, E., Vosinakis, G., et al. (2022). Autonomous object detection using a UAV platform in the maritime environment. In International conference on research challenges in information science (pp. 567–579). Springer.
Bauk, S., Kapidani, N., et al. (2019). Autonomous marine vehicles in sea surveillance as one of the COMPASS2020 project concerns. Journal of Physics: Conference Series, 1357(1), 012045.
Kontopodis, I., Leskovsky, P., et al. (2022). BorderUAS: A knowledge-based representation of the border surveillance domain. In IEEE international conference on imaging systems and techniques (IST 2022) (pp. 1–6). IEEE.
Kampas, G., Vasileiou, A., et al. (2022). Design of sensors’ technical specifications for airborne surveillance at borders. Journal of Defence & Security Technologies, 5, 58–83.
Antonakakis, M., Zervakis, V., Petrakis, M., et al. (2022, June). Real-time object detection using an ultra-high-resolution camera on embedded systems. Paper presented at the 2022 IEEE International Conference on Imaging Systems and Techniques (IST 2022), pp. 1–6.
Acknowledgments
This research received funding from the European Union’s Horizon 2020 Research and Innovation Framework Programme under grant agreement no. 883272, project BorderUAS.
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2025 The Author(s)
Cite this chapter
Athanasakis, I. et al. (2025). BorderUAS Project: Semiautonomous Border Surveillance Platform Combining a Lighter-Than-Air (LTA) Unmanned Aerial Vehicle (UAV) with Ultra-High-Resolution Multisensor Surveillance Payload: A Comprehensive Overview. In: Gkotsis, I., Kavallieros, D., Stoianov, N., Vrochidis, S., Diagourtas, D., Akhgar, B. (eds) Paradigms on Technology Development for Security Practitioners. Security Informatics and Law Enforcement. Springer, Cham. https://doi.org/10.1007/978-3-031-62083-6_32
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-62082-9
Online ISBN: 978-3-031-62083-6