Towards A Standardized Aerial Platform: ICUAS’22 Firefighting Competition

This paper presents an Unmanned Aerial Vehicle (UAV) platform used for the competition held at the International Conference for Unmanned Aircraft Systems 2022 (ICUAS'22). The scenario envisioned in the competition is delivering an extinguishing ball to a fire with a UAV. With this in mind, the qualifying round of the competition was organized in a realistic simulation environment. The competitors are required to navigate through a cluttered environment into a free space, where they perform a search for the target and deliver the ball as close as possible to that target. The competition finals were organized as a part of the conference, with the same goal as in the simulation qualifiers. The UAVs used for the competition are prepared and provided by the organizers. A detailed description of the UAV software and hardware is outlined in the paper as a step towards a standardized aerial research platform.


Introduction
Robotics competitions exist in different shapes and forms and have been growing more popular every year. From humble beginnings in the 1970s to recent multi-million dollar competitions, they have been used to showcase state-of-the-art technology, develop new materials and algorithms, engage the general public, and promote STEM education and research. More recently, robotics competitions have been recognized as drivers towards repeatability and reproducibility of research within the robotics community, and multiple instances have been organized to bridge the gaps between competition scenarios and scientific experiments. One such initiative was Robot Competitions Kick Innovation in Cognitive Systems and Robotics (RoCKIn) [2] where, in contrast to previous competitions (such as early RoboCup [3] competitions), the performance of the robotic system was not assessed by comparing the results between teams on high-level tasks; rather, several benchmarks were introduced that evaluated the performance of the robots in more detail. A similar approach was undertaken in the European Robotics Challenges [4], and was later further advanced through the European Robotics League (ERL), which was built on top of euRathlon [5]. Currently, the METRICS project [6] is at the forefront of robotics competitions based on functional and task benchmarks.
Nowadays, the most famous robotics competitions (DARPA [7], ERL [8], MBZIRC [9]) are multi-domain competitions aimed towards well-equipped research laboratories, and usually require a large team to develop multiple custom robotic platforms over relatively long periods of time. The requirement of having to buy or develop their own platforms and commit multiple years towards development poses a high entry barrier to newcomers and smaller, more focused research groups. Having participated in EuRoC, MBZIRC and ERL, we have identified a niche which could bring aerial robotics competitions to a larger audience. This niche consists of sprint-like competitions that do not require large teams, take a couple of months to complete, and use a standardized open-source aerial robot platform. Such a competition could lower the barrier to entry and engage more research groups, especially undergraduate students, if the following conditions are met:
• The competition is highly visible and can reach a large number of research groups
• Most of the development for one instance of the competition can be performed in a realistic simulation environment
• The software stack for the robot enables a seamless transition from simulation to the real platform
• The aerial platform is modular and can be extended with various sensors and actuators
• The aerial platform is affordable and available
• The scoring is transparent and the results are reproducible.

Published online: 10 July 2023

Fig. 1 The flight arena at ICUAS 2022 held in Dubrovnik, Croatia. A large audience gathers as the UAV navigates the obstacles to find the target. The UAV is captured just before launching the ball towards the target on the right. Obstacles in the arena are composed of balloons and poles made of soft material to minimize the damage in case of a crash. The safety net around the flight arena ensures safety for the observers. The YouTube stream of the entire competition is available in [1]
Based on the Aerial robotics control and perception challenge we organized at the Mediterranean Conference on Control and Automation in 2018 [10], we believe that tying a competition to an aerial robotics conference can bring the competition into the spotlight and expose it to a large number of research groups. In addition to increased visibility, having a competition collocated with and held as a part of a conference such as ICUAS provides the competitors with the ability to meet not just other competitors but also researchers in the field (as evidenced by Fig. 1), further increasing the attractiveness of the competition.
With the COVID-19 pandemic still ongoing with various degrees of severity in different parts of the world, the uncertainty of travel imposes a significant constraint on competitions. Various competitions employed different strategies for remote participation: a) fully virtual competitions using real-world data [11]; b) competitions in which the teams send their equipment to the location but participate remotely [12]; c) fully remote participation on a robot provided by the competition organizers [13]; d) hybrid events in which teams are allowed to participate either on site or by performing tasks in their own laboratory remotely [14]. For the ICUAS'22 UAV Competition, we adopted an approach similar to [13] by providing the UAV platform that the teams would use in the finals at the conference venue and concentrating on a seamless transition from simulation to the real world. This enables teams to rapidly deploy their solutions on-site, but also facilitates remote participation if required. The use of a standard platform is not new in robotics competitions, as the RoboCup Standard Platform League 1 is one of the oldest still-running competitions. More recently, RoboCup@Home [15] also adopted two standard platforms, and we believe a similar approach can be beneficial for aerial robotics competitions by leveling the playing field.
In this work, we describe the development of a software stack for an aerial robot that satisfies the aforementioned conditions, ensuring a lower barrier to entry and higher engagement in aerial robotics competitions. The aerial platform presented in this paper integrates components from multiple research projects conducted in the Laboratory for Robotics and Intelligent Control Systems at the University of Zagreb Faculty of Electrical Engineering and Computing, coupled with the best practices from other research laboratories such as [16][17][18]. A number of components described herein are beyond the scope of the ICUAS'22 UAV Competition, but open up the possibility of different scenarios for future competitions and increase the overall attractiveness of the platform.
Following the introductory section, Section 2 describes the aforementioned UAV platform, both in terms of hardware and software. Section 3 showcases the deployment of the developed platform in the ICUAS'22 UAV Competition, while Section 4 discusses the reach of the ICUAS'22 UAV Competition. Section 5 brings forth concluding remarks and briefly discusses the future of ICUAS UAV Competitions.

UAV Platform
Our hypothesis is that a standardized platform serves as an enabler of a successful competition, allowing the teams to participate with little or no additional cost. It also serves to level the playing field for the competitors, allowing them to focus on the control and software side of the problem. In this section, we describe the proposed UAV platform. First, the hardware components of a general-purpose UAV platform for research and development are outlined. In addition, the software modules that enable the execution of autonomous missions in a complex environment are described. Finally, the paper describes how all the components are integrated within the Gazebo software-in-the-loop (SITL) environment to facilitate an easier transition from the simulator to the real world.

Hardware Components
Although there are many off-the-shelf UAVs available, their hardware is usually tightly integrated and closed for modifications. One of the goals of this paper is to provide a standardized list of "must-have" hardware components of an open aerial platform capable of autonomous navigation.

Frame and Propulsion
The most basic component of the aerial vehicle is its frame. It must be strong enough to withstand crashes, but also light enough to ensure longer flight times. Furthermore, the propulsion system is another integral element of aerial vehicles. A general rule of thumb is that the drone should hover at around 35% thrust at maximum payload weight. Once this is achieved, adding more propellers only increases battery consumption, providing little additional benefit other than an increase in fault tolerance. The vehicle presented in this paper has a carbon fiber frame with four arms arranged in a cross structure. The propulsion system consists of four 15-inch T-Motor 1503 propellers and T-Motor MN4014 KV400 motors, resulting in a hover state at 38% of maximum thrust. The system is powered by a 12,000 mAh, 25 V battery, which provides up to 20 minutes of flight time. It is important to note that these components must be integrated in a highly modularized manner, as they typically bear the brunt of the impact during a crash and should therefore be easily replaceable.

The Autopilot
The main job of the autopilot is to achieve stable vehicle flight by integrating all of the major vehicle components, such as motors, battery, RC receiver, and optional sensors. Among the necessary functions that such a hardware piece should provide are attitude (rate) control and state estimation using the available integrated sensors, such as an inertial measurement unit (IMU) and a barometer. The final and perhaps most important requirement is that the autopilot is able to communicate with MAVLink 2 messages. This feature enables the UAV platform to communicate via the Mavros 3 interface, which is essential for smooth integration with any state-of-the-art autonomy package based on the Robot Operating System (ROS) framework. The UAV platform presented in Fig. 3 uses Cube Orange autopilot hardware placed on a ProfiCNC/HEX Kore high-current power board. It runs the ArduPilot Copter 4 firmware, a full-featured, open-source UAV controller enabling flight in various control modes, mission planning, sensor integration, etc. Although the firmware has a large feature set, the proposed UAV platform uses only one part of it: attitude (rate) and altitude control. This further relaxes the hardware constraints and allows other autopilot systems, such as PX4, BetaFlight, etc., to be used interchangeably.
The Onboard Processing Computer

The onboard processing computer is an integral hardware piece, on which the software modules shown in Fig. 2 are executed. Starting from the left, there is a mapping module that provides information about the occupancy state of the environment. A global path planner uses this information to generate a collision-free path through the desired waypoints. The trajectory generator's task is to provide feasible trajectory points based on the given path and given dynamic constraints. The state estimation module provides the complete UAV body state, both translation and rotation, based on the available sensor data such as IMU measurements, visual SLAM pose, LiDAR SLAM pose, etc. Finally, the onboard controller is responsible for tracking the generated trajectory by providing attitude (rate) and thrust commands to the UAV, i.e., the low-level attitude controller running on the Ardupilot firmware. The transmission of incoming control inputs and outgoing sensor data to and from the UAV system is handled by the Mavros package. The implementation of the specified pipeline is presented in this paper and is described in the corresponding uav_ros_stack [19] ROS packages.
Such a modular framework is capable of performing autonomous missions using a diverse sensor suite. As explained earlier, the importance of the Mavros package is reiterated here: it is the bond that connects the low-level autopilot component to the onboard computer. All data sent to and received from the autopilot need to go through the Mavros interface. This includes the attitude (rate) and thrust control inputs, sensor data such as GPS and IMU, and RC input values, among others.
An Intel NUC, with the Ubuntu Linux 20.04 operating system running ROS Noetic, is used as the onboard computer on the presented UAV platform (Fig. 3).
The Peripherals

Peripherals include anything that is not part of the vehicle's basic functions but adds some value to the overall system. Their utility may take the form of enhanced navigation or environmental awareness, such as LiDAR or visual SLAM, or the ability to perform a unique task, such as payload delivery, aerial manipulation, visual inspection, etc. It is important to note that the frame of the vehicle should be designed to allow the mounting of additional peripherals. To enable autonomous navigation, the Intel NUC is tightly integrated with the Cube Orange autopilot in the core of the UAV body. Reflective markers combined with the Optitrack motion capture system provide precise and stable localization, while the Intel Realsense D435 camera provides environment sensing capabilities. Finally, the magnetic gripper provides support for a variety of ferromagnetic payloads. The presented UAV platform is envisioned as a general-purpose payload delivery vehicle capable of executing autonomous flight missions in complex environments. A magnetic gripper is attached to the bottom of the frame, allowing the UAV to carry a useful payload such as a fire extinguisher ball. Furthermore, to enhance the environmental awareness capabilities, an Intel D435 RGB-D camera is mounted on the front of the frame.

Low-Level Control
The main objective of the low-level control is to reliably stabilize the vehicle during flight, both in terms of attitude and altitude. This is done in three steps: filtering setpoints, control loop, and thrust mixing. Thrust mixing refers to the distribution of the calculated controller outputs among the available rotors.
The proposed platform uses the controllers included in the ArduPilot Copter firmware. The structure of the attitude controller, shown in Fig. 4, follows the form of a cascaded PID controller with a single angle feed-forward term. An interesting feature is that the first proportional term of the angle controller has a nonlinear square-root behavior for large angular errors. The altitude controller has a similar cascaded PID structure. The low-level controller has several different modes of operation, both for attitude and altitude control. The three ways to control attitude are shown in Table 1. The first and last rows correspond to roll, pitch, and yaw rate control and complete rate control, respectively. These modes are typically used when an external type of state estimation is used, such as Optitrack Motion Capture (MOCAP) or simultaneous localization and mapping (SLAM). When using an external localization source that is not connected to the autopilot's internal localization, it is advantageous to use commands that refer to the drone's body reference frame. On the other hand, if the autopilot's internal estimation is used, the preferred mode is the angle control shown in the second row of Table 1.

Table 1 If rate targets are used, only the inner loop is utilized, whereas when angle targets are employed, both loops are engaged

As for the altitude control, there are two types of thrust interpretation, both scaled to the [0, 1] range. By default, the thrust control input is interpreted as a climb rate, i.e., the vehicle descends and ascends at the prescribed maximum speed when the value is 0 and 1, respectively. Meanwhile, by setting the appropriate firmware parameter (GUID_OPTIONS=8), the controller interprets "thrust-as-thrust": the commanded value maps directly to collective thrust, so the value required to hover depends on both the UAV mass and the battery voltage.
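The square-root behavior of the angle P term mentioned above can be illustrated with a short sketch. This is not the actual ArduPilot code, only a minimal reconstruction of the described behavior: below a linear region the response is proportional, and for large errors the output grows with the square root of the error, which limits the commanded rate while keeping the response continuous at the boundary.

```python
import math

def sqrt_p(error, kp, second_order_lim):
    """P term with square-root response for large errors (illustrative
    sketch of the behavior described above, not the ArduPilot source).
    second_order_lim bounds the second derivative of the response."""
    linear_dist = second_order_lim / kp ** 2  # error below which response is linear
    if abs(error) <= linear_dist:
        return kp * error
    # square-root region: output grows with sqrt of the error
    return math.copysign(
        math.sqrt(2.0 * second_order_lim * (abs(error) - linear_dist / 2.0)), error)
```

At `error = linear_dist` both branches return the same value, so the controller output is continuous; beyond that point the output grows sub-linearly, preventing aggressive rate demands for large angle errors.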

Onboard Control
The onboard controller is responsible for tracking the commanded trajectory, which consists of position, velocity, acceleration, and heading. In general, the controller compares the current trajectory point with the latest vehicle state obtained from the estimation module and produces control inputs based on the state error. The control inputs are sent via the Mavros topic mavros/setpoint_raw/attitude according to the guidelines explained in Section 2.2. The main package used for onboard position control of the proposed UAV platform is uav_ros_control 8 . Here, a commonly used position control method, resembling a cascaded PID structure with feed-forward elements similar to Fig. 4, can be found. Additionally, a standard model predictive controller is implemented, which can prove favorable when faced with noisy and unreliable state estimates. As for control methods for nonstandard UAVs with additional sources of actuation, such as moving masses or tilting propellers, a nonlinear geometric approach exploiting the variable center of gravity property [20] is included.
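To make the command interface concrete, the sketch below builds the payload of a mavros/setpoint_raw/attitude message from a desired roll, pitch, yaw, and normalized thrust. The helper function and the plain-dict representation are ours for illustration; a real node would fill a mavros_msgs/AttitudeTarget message and publish it with a ROS publisher.

```python
import math

def attitude_setpoint(roll, pitch, yaw, thrust):
    """Build the fields of a mavros/setpoint_raw/attitude command as a dict
    (hypothetical helper; a real node would populate mavros_msgs/AttitudeTarget).
    Angles are in radians, thrust is scaled to [0, 1] as in Section 2.3."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return {
        "orientation": (  # (x, y, z, w) quaternion from roll-pitch-yaw
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy,
            cr * cp * cy + sr * sp * sy,
        ),
        "thrust": max(0.0, min(1.0, thrust)),  # clamp to the [0, 1] range
    }
```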

Trajectory Generation
The task of the trajectory generator is to produce a smooth trajectory across all vehicle states, based on the desired path. The generated trajectory is sampled and sent consecutively to the onboard control. The main method for generating trajectories used on the presented UAV platform is the Time-Optimal Path Parameterization by Reachability Analysis (TOPP-RA) [21] algorithm found inside the topp_ros 9 package. It uses a constrained numerical integration approach that operates on the bang-bang principle of the actuator control value, which is acceleration in the case of the aerial vehicle. An additional method, inspired by the research conducted in [22], is a predictive, model-based trajectory generation method located in the uav_ros_tracker 10 package. Given the desired path, it uses a model-predictive approach to generate trajectory points that are feasible with respect to the model dynamics and given constraints.
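The bang-bang principle mentioned above can be illustrated on a single path segment: the time-optimal profile accelerates at the maximum rate, optionally cruises at the velocity limit, and brakes at the maximum rate. The sketch below computes the minimum segment duration under these assumptions; it only illustrates the principle TOPP-RA builds on and is not the TOPP-RA algorithm itself.

```python
def trapezoidal_duration(d, v_max, a_max):
    """Minimum time to cover distance d with bang-bang acceleration:
    accelerate at +a_max, cruise at v_max if reachable, brake at -a_max.
    Illustrative sketch of the time-optimal principle, not TOPP-RA."""
    d_acc = v_max ** 2 / a_max  # distance to reach v_max and brake again
    if d <= d_acc:
        # triangular profile: v_max is never reached
        return 2.0 * (d / a_max) ** 0.5
    # trapezoidal profile: accelerate + cruise + brake
    return 2.0 * v_max / a_max + (d - d_acc) / v_max
```

TOPP-RA generalizes this idea to arbitrary geometric paths by numerically integrating the velocity profile along the path subject to the actuator constraints.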

State Estimation
The state estimation module is possibly one of the most important pipeline components for autonomous operations. It is responsible for providing a steady stream of UAV state estimates, i.e. position, velocity, acceleration, and heading. A general state estimation module usually uses a variant of the Kalman filter with the IMU as the baseline measurement and additional sources such as a barometer, GPS, visual SLAM, LiDAR SLAM, ultra-wideband (UWB) positioning, etc. The quality of the state estimate, which implies a low level of noise and time delay, strongly correlates with the flight stability of the UAV. Therefore, to obtain the best state estimate it is important to know the sensor characteristics and behavior in different environments. For example, GPS does not work well indoors, visual SLAM usually does not perform well in changing light conditions, LiDAR SLAM performs poorly in featureless environments, etc. The presented UAV platform includes several variants of state estimation. The most reliable source of localization measurements is a motion capture system. For the purposes of this paper, a UAV with reflective markers is tracked by the Optitrack system utilizing the ros_vrpn_client 11 . A Luenberger state observer is included to provide accurate and stable model estimates.
Sometimes it is simply not feasible to set up an indoor motion capture system due to the size of the arena and the obstacles within it. Therefore, an alternative is to use multiple sources of measurement with a sensor fusion framework. This is done with the sensor_fusion 12 package [23], successfully deployed in GPS-denied environments. To achieve this, an error-state Kalman filter with position and orientation drift estimation is implemented, and various sensors such as visual and LiDAR SLAM and UWB positioning are used as correction measurements. Finally, for outdoor competitions, the safest localization option is GPS. Since the receiver is directly connected to the autopilot, the UAV state estimate can be easily obtained by listening to the Mavros topic mavros/global_position/local.
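Both the Luenberger observer and the Kalman-filter variants mentioned above share the same predict-correct structure. The one-dimensional sketch below shows one observer step for a position-velocity state; the gains and the constant-acceleration model are illustrative assumptions, while the actual filters in the stack estimate the full 3D state and sensor drifts.

```python
def luenberger_step(x, v, z, u, dt, l1, l2):
    """One predict-correct step of a position-velocity Luenberger observer.
    x, v: current estimates; z: position measurement; u: acceleration input.
    Gains l1, l2 are illustrative, not the tuned values from the stack."""
    # prediction with a constant-acceleration model
    x_pred = x + v * dt + 0.5 * u * dt ** 2
    v_pred = v + u * dt
    # correction driven by the measurement residual
    err = z - x_pred
    return x_pred + l1 * err, v_pred + l2 * err
```

A Kalman filter follows the same pattern, but computes the correction gains online from the process and measurement noise covariances instead of using fixed values.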

Global Path Planning
In general, the goal of global path planning is to find a feasible, collision-free path between the vehicle's initial and target states, given a map of the environment. Therefore, the input to such a system consists of an occupancy grid of the environment and a set of waypoints, while the output is a piecewise straight line connecting the provided waypoints. The package used for this purpose is larics_motion_planning 13 . It uses the Rapidly-exploring Random Tree (RRT) [24] method to generate a collision-free path. The RRT algorithm starts from the vehicle's initial state and performs random sampling in multiple directions. Valid states from the random sample are added to the tree, and the same search is performed again for the new nodes. This way, the tree grows in all directions trying to reach the goal, which makes it suitable for exploring large environments. The RRT-based path is then parameterized using the TOPP-RA algorithm, yielding a trajectory suitable for the UAV to execute.

11 A ROS VRPN interface - https://github.com/ethz-asl/ros_vrpn_client
12 A multi-sensor error-state EKF implementation for UAV localization - https://github.com/larics/sensor_fusion
For more information about potential use cases for the path planning algorithm, the reader is referred to parabolic airdrop experiments presented in [25] and its model-based path planning variant from [26].
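The RRT growth loop described above can be sketched in 2D. This is an illustrative minimal version, not the planner in larics_motion_planning, which plans in 3D against an occupancy grid; the `is_free` collision predicate and all parameter values here are assumptions for the example.

```python
import random

def rrt(start, goal, is_free, bounds, step=0.5, goal_tol=0.5,
        max_iters=5000, seed=0):
    """Minimal 2D RRT sketch: sample, extend toward the sample, check the
    new state, and backtrack once the goal region is reached."""
    rng = random.Random(seed)
    nodes, parent = [start], {0: None}
    for _ in range(max_iters):
        sample = (rng.uniform(*bounds[0]), rng.uniform(*bounds[1]))
        # nearest tree node to the random sample
        i = min(range(len(nodes)),
                key=lambda k: (nodes[k][0] - sample[0]) ** 2
                            + (nodes[k][1] - sample[1]) ** 2)
        nx, ny = nodes[i]
        dx, dy = sample[0] - nx, sample[1] - ny
        d = (dx * dx + dy * dy) ** 0.5 or 1e-9
        new = (nx + step * dx / d, ny + step * dy / d)  # extend one step
        if not is_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if (new[0] - goal[0]) ** 2 + (new[1] - goal[1]) ** 2 <= goal_tol ** 2:
            path, k = [], len(nodes) - 1
            while k is not None:  # backtrack from the goal node to the root
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None  # no path found within the iteration budget
```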

Mapping
The vehicle needs a way to understand its environment, which requires the introduction of some kind of mapping into the system. A map is usually represented as an occupancy grid, meaning that the environment is divided into fixed-size cubes, also known as voxels, that mark space as either occupied or unoccupied. This leads to performance and memory issues for larger environments. Therefore, an efficient algorithm for 3D space decomposition, called OctoMap [27], is employed. A visual example of an OctoMap is presented in Fig. 5. This format of environment representation is well suited for collision checking and subsequent path planning. Another important aspect to consider is localization. Assuming perfect localization, the mapping task is straightforward if a sensor provides depth information. However, poor localization may require a SLAM approach. The only suitable mapping sensor mounted on the presented UAV platform is the Intel Realsense D435 depth camera; we opted for a simple depth camera to lower the cost of the platform. However, a Light Detection and Ranging (LiDAR) sensor is also very popular in UAV applications and could be mounted in a different scenario. The Cartographer [28] SLAM algorithm is tightly integrated with the described software modules using the uav_ros_cartographer 14 package. It is useful both for path planning and state estimation purposes [29].
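The occupancy-grid idea and its use for collision checking can be sketched as follows. This flat-grid version is only illustrative: OctoMap stores the same information hierarchically in an octree to save memory, and real collision checking also inflates obstacles by the UAV radius. Function names and the sampling-based segment check are our assumptions.

```python
def voxel_index(p, resolution):
    """Map a 3D point to its voxel index in a fixed-resolution grid
    (flat-grid sketch; OctoMap stores this hierarchically in an octree)."""
    return tuple(int(c // resolution) for c in p)

def segment_in_collision(a, b, occupied, resolution, samples=50):
    """Check a straight path segment against a set of occupied voxel
    indices by sampling points along it (illustrative, no inflation)."""
    for i in range(samples + 1):
        t = i / samples
        p = tuple(a[k] + t * (b[k] - a[k]) for k in range(3))
        if voxel_index(p, resolution) in occupied:
            return True
    return False
```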

Gazebo Simulation
Up to this point, the software modules described are all part of, or closely related to, the uav_ros_stack [19] package. These modules are capable of running natively on the computer onboard the UAV platform. In this section, it is shown how these modules are integrated into the Gazebo simulation environment, providing a software-in-the-loop framework. The obvious advantage is a seamless transition of new features from the simulation to the experimental validation.
All of the software modules as well as the required simulation components are integrated under the uav_ros_simulation [30] package. The outline of the simulation pipeline is shown in Fig. 6. The core of the Gazebo UAV model is the rotor model plugin from the rotors_simulator [17] package, which realistically simulates the forces and moments generated by the propellers. The connection to the Ardupilot firmware, running the Copter-SITL binaries, is established via UDP. The firmware needs to receive all of the available sensor data, such as IMU, position, and velocity measurements, in order to output the target rotor velocities. Finally, the MAVProxy terminal, which is a fully-fledged command-line ground control station (GCS), relays the information stream to and from the Mavros node. All of the necessary components for running the Ardupilot Gazebo SITL are found in the ardupilot_gazebo 15 package.

Fig. 6 This figure shows the pipeline connecting the Arducopter firmware to the Gazebo simulator and ROS, enabling the software-in-the-loop simulations. On one end there is a Gazebo UAV model that can control its motor velocities, while on the other end is the connection to the pipeline shown in Fig. 2 via the Control Inputs. In both the simulation and real-world deployment, Mavros provides a generalized interface to the Gazebo model or the actual vehicle, respectively. This allows a seamless transition of software modules from the simulation to the real world

Competition Scenario: Firefighting UAVs
In this section, a use case for the proposed UAV platform is showcased in the form of a competition. First, the competition tasks and their original motivation are stated. Furthermore, each task is described in detail and the scoring scheme is given. Finally, a discussion about the challenge setup, the experiences, and the results of the simulation qualifiers along with the Dubrovnik arena finals is presented.

Competition Description
The competition is geared towards students and provides them with a unique opportunity to test their knowledge and expertise against their peers from all around the world. The initial qualifying stages were conducted in the Gazebo simulation environment while the finals were held onsite at the ICUAS'22 conference in Dubrovnik, Croatia. The competition is based on the challenges faced by firefighting UAVs. In the envisioned scenario of a burning building, UAVs can be deployed to extinguish the fires by throwing a fire extinguisher ball from a safe distance at the fire. Such a scenario can be divided into three separate tasks: exploration, target detection, and precision payload delivery. With respect to the competition layout shown in Fig. 7, exploration takes place in Zone 2, while target detection and precision payload delivery are performed in Zone 3. The layout dimensions are provided to the competitors, both for the simulation and the onsite arena. To successfully complete the competition challenge, participants must complete each task sequentially and autonomously during a single flight. The run starts as soon as the UAV takes off and ends either after a predetermined timeout duration or when the payload is released from the magnetic gripper.

Exploration
The first task evaluates the UAV's ability to navigate through a set of static obstacles. To successfully complete it, the UAV must traverse the cluttered Zone 2 and reach Zone 3 without crashing. There are two levels of difficulty for this task. Participants can choose to use a pre-made map of the arena environment and plan a collision-free path based on the given map. Alternatively, they can use the sensor data available on the drone, namely the RGB-D camera, to navigate through the obstacles. Regardless of the total point count, those who choose the former option are ranked below the participants who chose the latter.

15 A repository for ArduPilot & Gazebo Software In Loop Simulation Interfaces, Models - https://github.com/larics/ardupilot_gazebo

Target Detection
In this task, an ArUco marker is placed somewhere in Zone 3. To successfully complete this task, the UAV needs to find the marker and reconstruct its position. The reconstructed position, corresponding to the center of the marker, must be published to a predefined ROS topic to evaluate the competitors' target detection solution.

Precision Payload Delivery
Once the UAV finds the ArUco marker, it can begin the execution of the final task. The goal of this task is to launch the payload and hit the located ArUco marker. It is important to note that the orientation of the marker can be anywhere from horizontal to vertical. This means that a simple airdrop solution is not sufficient to accomplish this task. Instead, a series of trajectory points should be generated for the UAV in such a way that when the magnetic gripper releases the payload, it flies in a parabolic trajectory and hits the desired target.
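A back-of-the-envelope ballistic model shows what the release planning has to solve: given the ball's position and velocity at release, drag-free projectile motion predicts where it lands. The sketch below is our simplification for a horizontal target plane; the full parabolic airdrop planning, including vertically oriented targets, is described in [25].

```python
G = 9.81  # gravitational acceleration, m/s^2

def impact_point(p, v, target_z):
    """Predict where a ball released at position p with velocity v crosses
    the horizontal plane z = target_z, ignoring drag (illustrative model;
    the actual planner handles arbitrary marker orientations)."""
    # solve target_z = p_z + v_z * t - g * t^2 / 2 for the positive root
    a, b, c = -0.5 * G, v[2], p[2] - target_z
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ball never reaches that plane
    t = (-b - disc ** 0.5) / (2.0 * a)  # positive time of flight
    return (p[0] + v[0] * t, p[1] + v[1] * t, target_z)
```

Inverting this relation, i.e. choosing a release state whose predicted impact point coincides with the marker, is what the generated release trajectory has to achieve.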

Scoring
Teams receive points only if the run is considered valid. The simulation score is calculated as an average of several runs, while in the on-site final the team with the highest score in a single run wins. The scoring scheme is based on three criteria: the accuracy of the reconstructed target position, the accuracy of the ball impact with respect to the target, and the time required to complete the run.

Penalties and Disqualifying Behaviour
The competitors do not receive points for the run if:
• The submitted solution code cannot be executed either on the evaluation machine or the UAV platform,
• The UAV crashes after a collision with obstacles at any point during the run,
• The run exceeds the specified timeout duration,
• The ball is released before reaching Zone 3.
In case the UAV touches the obstacles but continues to complete the task, teams are penalized by deducting 15% of total points achieved in that run for each contact with the obstacle.
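The contact penalty above can be written out as a small helper. The multiplicative form and the zero floor are our reading of the rule (15% of the run's total points deducted per contact); they are assumptions for illustration, not quoted from the rulebook.

```python
def penalized_score(raw_score, contacts, penalty=0.15):
    """Apply the obstacle-contact penalty: deduct penalty * raw_score per
    contact, floored at zero (the floor is our assumption)."""
    return max(0.0, raw_score * (1.0 - penalty * contacts))
```

For example, a run worth 100 points with two obstacle contacts would be scored as 70 points.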

Gazebo Simulation Qualifiers
During the initial stage of the simulation qualifiers, participating teams received the rulebook and the icuas22_competition [31] repository, where they found installation and simulation startup instructions. The competition backend includes uav_ros_simulation [30], which participants can either install natively, build with Docker, or download as a pre-built Docker image from DockerHub 16 . Participants are able to raise issues or start a discussion directly on the icuas22_competition GitHub repository. Furthermore, a Tmux/Tmuxinator-based routine is provided to easily run both the competition simulation and the teams' solutions. The simulation includes a Gazebo maze-like environment and a single UAV with a mounted magnetic gripper, as shown in Fig. 8. After starting the simulation, the UAV takes off and the magnetic payload is spawned under the UAV, signaling the beginning of the challenge. As for the solution, there are no software limitations. Teams can use any number of existing algorithm implementations or create their own. Each node used for the solution is expected to run autonomously, without external support, using the Tmux environment mentioned earlier. The only requirement is that teams combine all their packages into a single ZIP file or alternatively provide a working Dockerfile.

Fig. 8 This figure shows the layout for the simulation stage of the competition. This includes an Arducopter SITL Gazebo simulation of the proposed aerial vehicle platform in a maze-like environment with a hidden ArUco marker. The UAV is equipped with similar sensors and payload as shown in Fig. 3

Dubrovnik Arena Finals
The finals were held in Dubrovnik, Croatia, during the ICUAS'22 conference. Five teams qualified for the finals. The only requirement for the teams was to bring their own laptop in order to improve and modify their solution during the competition, while everything else was provided by the organizing committee. The complete setup of the competition arena is shown in Fig. 9. The first part of the arena consists of obstacles such as vertical poles, balloons and boxes. The second part of the arena has an ArUco marker over the box opening, which represents the payload dropoff area. The Optitrack motion capture cameras were placed at the edges of the arena to provide a consistent localization source for the UAV. There is a safety margin as well as a net placed around the designated flight area to provide a safe spectator experience. The finals were organized over three days:
• Day 1 - Integration workshop: Teams were invited to the competition arena to meet the organizing committee and judges, and to familiarize themselves with the arena setup, the UAV, and the procedure of launching their code on the UAV. Additionally, the teams' solutions were checked for build errors and whether they could be deployed on the UAV without issues.
• Day 2 - Practice day: Each of the five teams was allocated two 30-minute slots to test their solution.
• Day 3 - Competition day: Similar to the practice day, each team was granted two 30-minute slots to perform as many runs as they could fit in the slot.
Similar to the simulation phase, the UAV's onboard computer came prepared with both native and Docker installations of the uav_ros_stack [19]. The competitors were able to either install their solution natively, in separate workspaces, or use a containerized approach. The latter is preferred because it minimizes the risk of dependency collisions and makes it easier for competitors to deploy their solutions. The only restriction imposed on the competitors concerned the low-level controller: they were not allowed to change its structure, but they were allowed to calculate and use different parameters. Before each run, participants had 5 minutes to install or update their solution on the UAV's onboard computer. Afterwards, they spent the remaining 25 minutes trying to complete the entire challenge in a single run. A live stream of the entire competition can be seen at [1].
To ensure UAV safety, additional collision checks were imposed on the generated trajectory inputs against an OctoMap model of the competition layout. Fortunately, no UAVs were harmed in the making of this competition.
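A safety check of this kind can be sketched as follows. This is a hypothetical illustration only, not the organizers' actual code: a real system would query the OctoMap itself (e.g. through its ROS interface), whereas here a plain set of occupied voxel indices stands in for the map, and the voxel resolution and inflation margin are assumed values.

```python
# Hypothetical sketch of a pre-flight trajectory collision check against an
# occupancy map. A set of occupied voxel indices stands in for the OctoMap.

OCCUPIED = {(2, 3, 1), (2, 4, 1), (5, 5, 2)}  # example occupied voxels
RESOLUTION = 0.5  # assumed voxel edge length in metres

def voxel_of(point):
    """Map a metric (x, y, z) point to its integer voxel index."""
    return tuple(int(c // RESOLUTION) for c in point)

def trajectory_is_safe(waypoints, margin_voxels=1):
    """Reject the trajectory if any sampled point falls inside an occupied
    voxel or within `margin_voxels` of one (a simple inflation margin)."""
    for p in waypoints:
        vx, vy, vz = voxel_of(p)
        for dx in range(-margin_voxels, margin_voxels + 1):
            for dy in range(-margin_voxels, margin_voxels + 1):
                for dz in range(-margin_voxels, margin_voxels + 1):
                    if (vx + dx, vy + dy, vz + dz) in OCCUPIED:
                        return False
    return True
```

In this sketch the generated trajectory is simply sampled at its waypoints; checking the inflated neighborhood of each sample trades a little conservatism for robustness against discretization error.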

Scoring in the finals
The scoring for the finals was adapted from the simulation phase, but still focused on the target position reconstruction and ball launch accuracy:
• Target reconstruction error less than 0.5 m: 25 points
• Launching the ball on the top surface of the target box (shaded blue in Fig. 1): 25 points
• Launching the ball on the white border surrounding the target tag (shaded green in Fig. 1): 50 points
• Hitting the hole in the box that was placed behind the tag: 75 points
The final score for the team was the score achieved in their best run, and time to finish the run was used as a tie breaker.
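The rubric above can be encoded compactly. The sketch below is illustrative, not the organizers' actual scoring code; the event names are invented, and it assumes that only the best launch outcome of a run counts (the rubric does not state explicitly whether launch outcomes stack).

```python
# Hypothetical encoding of the finals scoring rubric. Event names are
# illustrative; only the highest-valued launch outcome is counted per run.

SCORES = {
    "reconstruction_under_0.5m": 25,
    "hit_box_top": 25,
    "hit_white_border": 50,
    "hit_hole": 75,
}

def run_score(events):
    """Score one run: reconstruction points plus the best launch outcome."""
    score = 25 if "reconstruction_under_0.5m" in events else 0
    launch = [SCORES[e] for e in events if e != "reconstruction_under_0.5m"]
    return score + (max(launch) if launch else 0)

def best_run(runs):
    """Team result: the best-scoring run, with run time as the tie breaker
    (lower time wins). Each run is a tuple (events, time_seconds)."""
    return max(runs, key=lambda r: (run_score(r[0]), -r[1]))
```

Encoding the tie breaker as a negated time inside the sort key reproduces the stated rule: among runs with equal scores, the faster run ranks higher.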

Competition Timeline And Overall Reach
The competition was announced on the ICUAS'22 website 17 on November 23rd 2021. Interested teams were asked to fill out a form to pre-register for the competition and officially declare their interest in competing. In total, more than 50 teams from 22 countries declared their interest. The competition kicked off on February 1st 2022 (coinciding with the ICUAS'22 paper submission deadline) with the release of the software, several practice Gazebo worlds, and a rulebook. Starting from April 1st, teams were allowed to submit their solutions for evaluation. The evaluation was performed in a Gazebo world not shared with the teams. Results from these evaluations were shared with all teams via a public web page, with the teams kept anonymous, giving the submitting team valuable feedback while also setting a target for the other teams. Along with the score for the run, teams were provided with a detailed log of timestamped events in the run. The timeline of team interest, registration, and solution submissions is shown in Fig. 10. In total, 62 valid solutions were evaluated in the simulation phase. Teams were also instructed to complete their registration by providing proof of student status; 16 teams, consisting of 125 people, completed this step throughout April. Of those 16 fully registered teams, only one did not submit a valid solution by the end of the simulation phase. The simulation phase ended on May 10th 2022, with 15 teams submitting valid solutions that were then evaluated. Simultaneously, the Gazebo worlds and automated scripts used for evaluation were released to the teams so they could validate the achieved results on their own platforms. No complaints were received from the teams. The finals were held at the ICUAS'22 conference venue in Dubrovnik, Croatia. Five teams successfully qualified for the finals and received invitations to participate onsite.
Consequently, all of the finalist teams were present at the conference, accounting for a total of 14 individuals in attendance. The competition arena was visited by more than 200 ICUAS participants, and live streams of the practice and the competition have been viewed more than 400 times. The competition GitHub repository [31] has been cloned more than 200 times and visited by more than 1000 unique visitors.

Conclusion
This paper attempts to distinguish and standardize both the hardware and software modules of an aerial vehicle capable of autonomous operations. With a modular approach in mind, all of the requirements are clearly stated so that each component can readily be replaced with an equivalent one. Furthermore, a concrete example of such an aerial platform, with all of its components outlined in a standardized manner, is presented. A complementary ArduPilot SITL Gazebo simulation structure is introduced to provide a seamless transition from the simulation environment to real-world deployment. Finally, a use case of the presented UAV platform is demonstrated in the form of a firefighting UAV competition organized as a part of ICUAS'22. Leveraging deployment tools such as Docker, the platform proved accessible to the competitors both in the simulation environment and in the onsite challenge arena.

17 http://www.uasconferences.com/2022_icuas/uav-competition/

Fig. 10 The timeline of the ICUAS'22 UAV Competition simulation phase. Starting from November 23rd 2021, when the competition was announced (labeled A), most of the teams' interest was generated up until the competition kickoff (label K) on February 1st 2022. Following the competition kickoff (K) and rulebook release (RR), and with the passage of the ICUAS'22 paper deadline, no further interest was generated, and the pre-registration phase was closed on March 15th. On April 1st 2022, teams were allowed to submit their solutions for evaluation (label SO), which prompted a few teams to submit their preliminary work. Following the rulebook update on April 13th (label RU), which finalized the scoring scheme, the number of submitted solutions rose towards the final deadline on May 10th (label D). Results of the simulation stage and the finalists were announced on May 14th (label R)
The scoring scheme is set up to facilitate a fair and transparent environment in which the competitors can easily evaluate their own solutions. In total, 125 people across 16 teams attempted the Gazebo simulation qualifiers, and five teams qualified for the finals. Each of the five teams managed to successfully deploy their solution on the presented UAV platform, and three teams successfully completed the entire challenge autonomously in a single run.
Tying the competition to the ICUAS conference has been a major success, as it attracted a large number of teams. As evidenced by the interest graph, without the conference promoting the call for papers that briefly presents the competition, it is generally difficult to generate additional interest in the competition without large financial incentives. The most notable drawback of this approach is the amount of logistics required to construct a safe arena that enables fair, transparent and fast scoring at the conference venue. To further increase safety, but also the availability of the aerial platform, we aim to replace the large UAV used at ICUAS'22 with a smaller frame. This, coupled with the ongoing development and refinement of the UAV platform, will also open up different scenarios for future competitions. The scoring at the finals was also difficult to perform, since only small parts of it were automated and benchmarked. Therefore, our aim for the next competition at ICUAS'23 is to join forces with the METRICS project to bring more benchmarks from the simulation phase to the competition arena at the conference venue.

in 2008 at the University of Zagreb, Croatia. He is currently an assistant professor at the University of Zagreb Faculty of Electrical Engineering and Computing (UNIZG-FER). As a researcher, he participated in several national and international research projects in the fields of robotics, control, and automation. In 2011/2012, he worked as a visiting researcher at Drexel University, Philadelphia, USA, as a recipient of a Fulbright exchange grant. He has authored and coauthored over 30 scientific and professional papers, including journal and conference papers, as well as a monograph and a book chapter, in the field of unmanned aerial systems and robotics. His main areas of interest are autonomous systems, robotics and intelligent control systems.
Stjepan Bogdan received his Ph.D.E.E. in 1999, M.S.E.E. in 1993 and B.S.E.E. in 1990 at the University of Zagreb, Croatia. He is a Full Professor at the Laboratory for Robotics and Intelligent Control Systems (LARICS), Department of Control and Computer Engineering, University of Zagreb Faculty of Electrical Engineering and Computing (UNIZG-FER). His main areas of interest are autonomous systems, aerial robotics, multi-agent systems, intelligent control systems, bioinspired systems and discrete event systems. He spent one year as a Fulbright researcher at the Automation and Robotics Research Institute, Arlington, USA, in Prof. Frank Lewis' lab. He is a coauthor of four books and numerous papers published in journals and proceedings.