1 Introduction

Communication security is vital for ensuring personal privacy, commercial confidentiality, government integrity, and defence. Current communication network encryption infrastructures are built upon public-key encryption methods whose security relies on the computational complexity of certain mathematical problems. However, their security is increasingly under threat from advances in cryptanalysis and, most notably, from the imminent arrival of large-scale quantum computers. Even if current public-key encryption methods are not vulnerable now, transmitted information with long-term value will need “forward security” to guard against future threats.

Quantum Key Distribution (QKD) is a quantum cryptography protocol whose security is guaranteed by the laws of physics; it offers secret communication against current and foreseeable threats by exploiting the quantum properties of light. More precisely, the no-cloning theorem ensures that an unknown quantum signal cannot be copied or amplified with arbitrary precision, so any eavesdropping attempt leaves detectable traces [1].

QKD was first proposed in 1984 in the seminal paper [2] and has since undergone many theoretical developments [3–5] and groundbreaking demonstrations in real-world implementations [6–8]. Channel losses, which restrict the key rate, increase exponentially with distance at about 0.2 dB/km in optical fibre, limiting the effective range of terrestrial systems. Consequently, quantum repeaters are required to extend the range of fibre systems; however, these are far from being technologically mature, and the no-cloning theorem itself will severely bound their performance [9–11]. Currently, the maximum ground-based communication range achieved is 421 km in fibre [12] and 144 km in free space [13].

Recently, Satellite Quantum Key Distribution (SatQKD) has proven able to overcome these range limits, enabling secure communication globally. The Chinese mission Quantum Experiments at Space Scale (QUESS) [14], also known as Micius, successfully demonstrated various quantum communication protocols in space [15–17]. These groundbreaking results have spurred an international space race aiming not only to establish the first global quantum communication network but also to develop and deploy the architecture needed to merge different quantum technologies, such as sensing and computing, into the future quantum internet. Besides China, other countries and space agencies are designing SatQKD missions involving one or more satellites, including Japan [18], Canada (QEYSSat) [19], Luxembourg (QUARTZ led by SES) [20], UK (QKDSat led by ArQit, QUARC) [21], Austria/France (NanoBob) [22], Germany (QUBE) [23], and UK–Singapore (QKD-Qubesat) [24]. The projects mentioned above mostly target Low Earth Orbit (LEO) satellites, but there is growing interest in higher orbits as well [25, 26]. Furthermore, satellites smaller and cheaper than Micius, such as nano-satellites (with a mass of ∼10 kg), have been in the spotlight [27–29] for the possibility of establishing quantum communication services using a constellation [30].

1.1 Contributions

The main contributions of the paper are listed below.

  1.

    We propose a formulation for the SatQKD scheduling problem. It schedules optical downlinks from one satellite to a network of ground stations, allocating each station enough time to download a number of keys proportional to its importance in the system, expressed as a weight. To the best of our knowledge, such a problem has not been considered before in the literature. Solving the model indicates the optimal performance achievable for the given satellite orbit parameters and ground station locations. Consequently, the formulation we propose could be used as an evaluation framework for potential designs of a future SatQKD system. In the past, some related studies have been conducted in the field of optical communication to estimate the availability for communication or the amount of data that can be downloaded to a given network of optical ground stations [31, 32]. However, the critical difference in our approach is distinguishing which ground station receives the transmission. The previous studies considered all ground stations equivalent, and it was not important which one communicated with the satellite. In contrast, the ground stations in our study effectively compete for access to the satellite, and each of them must be supplied with keys for the communications system to remain operational.

  2.

    We include relevant constraints to model the availability of the optical link and its throughput. We disallow communication when the satellite is in sunlight. To the best of our knowledge, this requirement has not been considered in the literature on downlink scheduling and the selection of optical ground stations. Furthermore, the transfer rate changes with the elevation angle rather than remaining constant. Modelling this phenomenon was suggested as future work in [33].

  3.

    We solve the above-mentioned SatQKD formulation and conduct a performance study of a hypothetical but realistic communications system with ground stations located in a single country at high latitudes and profoundly affected by changing cloud cover. We solve the model with a rolling horizon of one year using historical observations of cloud cover available for the period between the years 2013 and 2019. This time frame is long enough to capture the influence of seasonal weather patterns and changes in satellite illumination on the number of keys transferred to ground stations. Whenever possible, visualisations of the network properties and changes in the environment complement the study.

  4.

    Finally, we perform a design-of-experiments study to find the satellite orbit parameters that provide the best performance of the communications system, measured in terms of the keys delivered. Subsequently, we compute the number of keys that could be consumed weekly by each node in the communications system at a predefined service level guarantee, ensuring the key buffer of a node will not be overdrawn. The example demonstrates that precise selection of the satellite orbit is critical for the performance of the communications system.

1.2 Paper’s structure

The paper is structured as follows. Section 2 reviews the literature on scheduling satellite operations. Section 3 describes SatQKD from an optimisation perspective, introduces the network of ground stations selected for the study, and defines the index for quantifying the performance of the communications system. Results and discussion of the numerical study, including the long-term performance evaluation of the system, are the subject of Sect. 4. Concluding remarks are given in Sect. 5.

Additional details on problem modelling and formulation of the optimisation problem are provided in the appendix, in Sect. A and Sect. B, respectively. In particular, Sect. A covers aspects related to modelling the dynamics of the system and its environment. We explain the model used to estimate the transfer rate between a satellite and a ground station, the position and the movement of the satellite, the duration of communication windows with ground stations, and the impact of cloud cover on the transfer rate. Section B presents the formulation of the optimisation problem and briefly explains the solution method.

2 Literature review

This section discusses concepts and literature relevant to scheduling tasks for satellites in the LEO regime. Special attention is devoted to scheduling downlinks. We cover both radio and optical communication because the modelling paradigms and the solution procedures are the same.

Researchers and industry experts agree on the need for optimisation in the scheduling of satellite operations. The approach currently practised in real applications relies on human intervention, which is not considered sustainable for the number of satellites expected in the future and the complexity of large constellations [34]. The case study [35] describes an example operational system for the management of satellite optical communication.

Conceptually, a task to be executed by a satellite, regardless of the details of the operation (i.e. manoeuvre, data collection, health check, command uplink, payload downlink, etc.), has a release date, due date and estimated duration [36]. The problem of scheduling a satellite's operations belongs to the class of Machine Scheduling [37]. Some researchers [34] who consider task scheduling in a constellation of satellites prefer to model a spacecraft as a resource instead of a machine and use Multiprocessor Scheduling [38] as the baseline problem.

Tasks may have different priorities or rewards for their execution. This reflects a situation where multiple clients of varying importance compete for access to the satellite. If task priorities are considered, then the scheduling problem is an example of Resource-Constrained Project Scheduling [39]. That analogy was suggested by [40], who studied scheduling the download of images with priorities and deadlines. The scheduling system considered was oversubscribed, hence postponing the deadlines of some tasks or cancelling them was unavoidable. The authors applied state-of-the-art meta-heuristics to a real problem instance. The number of completed downloads in optimised schedules increased twofold compared to the mission schedule used in practice. It will become apparent shortly that, for scheduling downloads which do not require handling priorities, optimisation models are effectively solvable to optimality. Hence the lack of priorities can be exploited to derive more computationally attractive problem formulations.

Regardless of the problem taxonomy, what makes satellite applications unique are the external constraints which either must be satisfied to commence a task or must remain valid for the entire duration of the task execution [36]. For instance, a ground station must be visible to a satellite to establish and maintain a downlink. Such conditions are commonly defined using time windows.

Historically, the dominant techniques for scheduling satellite communication were based on genetic algorithms [41], heuristics and meta-heuristics [36, 40], often combined with Local Search (LS) and constraint propagation. Nowadays, these approaches are increasingly superseded by exact methods, in particular modelling the scheduling problem as a Mixed-Integer Linear Program (MILP) with time discretisation, which has been demonstrated to have a strong linear programming relaxation bound [33, 42, 43]. These models can be solved using commercial solvers without the need to devise custom solution procedures. Such solvers provide a certificate of optimality, which is not available for heuristic procedures.
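To make the time-discretisation paradigm concrete, the following toy sketch (written in Python with PuLP; the modelling tool and the data are our illustrative choices, not taken from the cited works) selects which ground station, if any, receives the transmission in each discretised time slot. The formulations cited above add buffers, energy and priority structure on top of this skeleton.

```python
import pulp

stations = ["A", "B"]
rate = {                      # keys deliverable per visible (slot, station) pair; toy data
    (0, "A"): 3, (1, "A"): 5, (2, "A"): 2,
    (2, "B"): 4, (3, "B"): 6, (4, "B"): 3,
}

prob = pulp.LpProblem("toy_downlink", pulp.LpMaximize)
# x[t, g] = 1 if the satellite transmits to ground station g during slot t.
x = pulp.LpVariable.dicts("x", rate.keys(), cat="Binary")

# The satellite can serve at most one ground station in any time slot.
for t in {t for t, _ in rate}:
    prob += pulp.lpSum(x[t, g] for g in stations if (t, g) in rate) <= 1

prob += pulp.lpSum(rate[k] * x[k] for k in rate)   # maximise keys delivered
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({k: int(x[k].value()) for k in rate})
```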

The problem of scheduling downloads from a single satellite to multiple ground stations was studied by [42]. The researchers included data and energy acquisition, which made the scheduling problem more realistic for Earth Observation (EO) applications combined with downlink over radio communication. Tracking data and energy dynamics already makes the problem NP-hard. It was demonstrated in [42] that a Mixed-Integer Program (MIP) with time discretisation is easy to solve to optimality for problem instances based on real data. Interestingly, the solution process for these instances did not require branching. The formulation was extended to a scenario with multiple satellites by [33], who demonstrated that such a model remains easy to solve.

Apart from the generic models considered in the literature, the need to accommodate specific constraints may arise when a given formulation is adapted to solve a real-world problem. For instance, a satellite may have two antennas and support several operational modes with different energy-to-bitrate thresholds. Fortunately, the model with time discretisation facilitates the incorporation of additional constraints [34]. The initial version of the problem proposed by [42] already supported different power manager configurations.

The model with time discretisation is typically solved standalone, without resorting to reformulations or decompositions [33, 42]. The size of the formulation can be considered its major limitation; this was emphasised in an application to a constellation of 30 satellites with a fine discretisation interval solved on a low-spec machine. For such a configuration, the researchers devised a heuristic based on Lagrangian relaxation. The heuristic applied a sequence of subgradient minimisations and progressively fixed decision variables. Interestingly, the solution procedure can be used without a MIP solver, albeit with an adverse impact on the cost of the final solution.

If the orientation of a spacecraft can be changed, the satellite is called agile to emphasise that attitude control manoeuvres can be included in the execution of a schedule. This feature is desirable in EO, as the spacecraft can perform image acquisition several times during a single visibility window [43]. The introduction of attitude manoeuvres elevates the complexity of the scheduling problem. For such problem instances, it has been shown that a better alternative to solving a model standalone is the application of a column generation scheme [43]. The researchers applied it at the root node of the branch-and-bound tree and then solved the remaining nodes without extending the pool of columns.

In the real world, satellites significantly outnumber ground stations, which leads to competition for access to the ground station infrastructure. The process of resolving such conflicts is known as deconfliction. The initial progress in this area was driven by solving the Satellite Range Scheduling problem [44, 45], which allocates antennas to spacecraft for some desired time within a communication window. The problem instances are defined to accurately reflect the United States Air Force Satellite Control Network (AFSCN), which operates more than 100 satellites using 16 antennas located in nine ground stations. The scheduling system is oversubscribed, and two variants of the objective function were studied: minimisation of the number of cancelled tasks and minimisation of the total time during which any antenna is assigned to more than one task simultaneously. The latter variant allows for scheduling long tasks and is preferred by human operators, who resolve remaining conflicts by negotiation with clients [45]. The best results for realistic problem sizes were obtained using a genetic algorithm [44, 45]. Nowadays, human schedulers who supervise operations of the AFSCN are supported by a proprietary heuristic algorithm which schedules the most restricted tasks first, preventing further depletion of resources [46]. Instead of cancelling requests which initially cannot be satisfied, the system attempts to resolve remaining conflicts automatically by relaxing operational constraints following a set of predefined business rules. A formal study of the computational complexity of conflict resolution was conducted by [47] using a hypothetical scenario with multiple satellites competing for access to a single ground station. Interestingly, the complexity of the case allowing task preemption and no setup cost remains an open problem. If a setup cost is present or task preemption is disallowed, the problem is NP-hard in general [47].

For scheduling data transfers, a network of ground stations is given as an input. Such an assumption is also valid for traditional monolithic radio frequency ground stations, whose locations worldwide are well known (e.g. Estrack). On the other hand, designing a network of optical ground stations is an open problem and can be the subject of optimisation as well. The practical approach is to find a subset of ground stations from a list of possible candidates [48]. The available budget restricts the number of ground stations. Consequently, the problem is similar to the Warehouse Selection Problem [49]. A typical objective function applied in the literature is the Maximum Percentage Data Transferred (MaxPDT) [32, 48]. It evaluates how much data collected by an example EO mission can be downloaded using the given network of ground stations. Computing such an objective involves finding a schedule of data transfers. Fortunately, the inner problem admits pseudopolynomial complexity and can be solved using dynamic programming [31, 32]. The subset of ground stations is found either through direct enumeration for small problem sizes or using LS.

Studies of optical ground station selection have been carried out for different regions: the entire world [31, 48], Europe [31, 50] or Germany [31]. These analyses account for cloud cover over multiple years. However, researchers do not consider satellite illumination, which is relevant to ground stations located at high latitudes. In a similar vein, the transmission rate does not change with the elevation angle between the ground station and the satellite. To increase the performance of the ground station network and make it more resilient to cloud cover, [51] considered more complex setups involving a geostationary satellite as a relay, a high altitude platform, or a combination of both radio and optical communication. So far, such analyses were limited to a single ground station.

3 Problem statement

Satellites are currently the only platform allowing QKD to achieve intercontinental range. SatQKD can be realised using a satellite as a transmitter (downlink configuration) or as a receiver (uplink configuration). A downlink configuration requires the development of a space-qualified optical assembly for precise pointing of the order of a few μrad, quantum sources such as lasers, and a quantum random number generator [27, 30]. The uplink configuration arguably requires a less complicated payload consisting of single-photon detectors. However, in the uplink configuration the optical beam encounters turbulence early in its path, leading to larger angular deviation and therefore higher losses, of about 20 dB with respect to the downlink [52].

Assuming that the eavesdropper cannot access the satellite, we employ the trusted-node architecture. In trusted-node SatQKD, a satellite distributes keys sequentially, first to a ground station A, then to a ground station B. Subsequently, the satellite broadcasts over a public channel the XOR of both keys, thus allowing A and B to obtain a shared key with which they can employ (quantum-safe) symmetric key encryption protocols [52]. In this work, we focus on the downlink and trusted-node scenario due to its comparative simplicity.
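The key relay can be illustrated with a minimal Python sketch (our own, with 256-bit keys represented as byte strings): the public broadcast of the XOR reveals nothing about either key on its own, yet lets station A recover the key it shares with B.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# The satellite establishes independent 256-bit keys with A and with B via QKD.
key_a = secrets.token_bytes(32)   # shared between the satellite and station A
key_b = secrets.token_bytes(32)   # shared between the satellite and station B

# The satellite broadcasts the XOR of both keys over a public channel.
broadcast = xor_bytes(key_a, key_b)

# Station A combines its own key with the broadcast to obtain B's key,
# giving A and B a shared key for symmetric encryption.
key_b_at_a = xor_bytes(key_a, broadcast)
assert key_b_at_a == key_b
```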

Before proceeding, we wish to highlight that this method may also be applied to the uplink scenario. The link topology and the formulation of the scheduling problem would remain the same; however, the attenuation is about 20 dB higher in that case.

On the other hand, the current analysis would need some modification for untrusted-node operation. Firstly, keys would need to be generated pairwise between communicating nodes, which requires simultaneous visibility of both ground stations by the satellite for an extended portion of the pass. Secondly, the full network traffic graph would be needed for the optimisation to establish the pairwise key generation demand, rather than the summed values at each node. Furthermore, untrusted-node SatQKD is much more challenging to implement than the trusted-node variant and is not thought likely in the near to medium term.

The communication system consists of a QKD satellite and a network of ground stations spread across cities of the United Kingdom. The satellite operates as a “trusted node” to mediate the distribution of secure encryption keys pairwise between ground stations. The locations were selected based on their importance for the country and their geographical dispersion. From a practical standpoint, the distance between satellite-linked ground stations should exceed 100 km, because communication over shorter distances could be handled using a fibre optic link. Ten ground stations were shortlisted. Their geographical dispersion is illustrated in Fig. 1, and the motivation behind establishing a ground station in each city is explained below.

Figure 1

Artist’s impression of the communications system. We consider ten locations throughout the UK to represent potential nodes in a national quantum-secured communications network. Conventional communication between these nodes is encrypted with symmetric key pairs distributed via a satellite placed in a Sun-Synchronous Orbit. The satellite has a single quantum transmitter that can send single-photon level signals to optical ground stations. The dashed arc is the ground track the spacecraft follows. The satellite passes over the UK mainland territory South to North around local midnight

Belfast: Largest city in Northern Ireland,

Birmingham: Second largest population in the country,

Bristol: Largest city in south-west England,

Cambridge: Science centre and fibre optic communication hub,

Glasgow: Largest city in Scotland,

Ipswich: British Telecommunications headquarters,

London: Largest population and urban zone in the United Kingdom,

Manchester: Second largest urban zone in the country,

Thurso: Northernmost city in the network; it has the weakest correlation of weather conditions with other ground stations,

York: Railway network hub.

A satellite transfers cryptographic keys to ground stations, which store them in buffers. The keys will later be used for encrypting ground-to-ground communication between stations. To establish a secure connection between two parties, each has to use the same key for encryption and decryption. Then, after the connection is closed, each party removes the key from its buffer due to the security requirement that a key cannot be reused.

Operational requirements of ground stations for a given week should be satisfied using the keys stored in the ground station's buffer before the start of the week. Consequently, the keys delivered to the ground station throughout the week will not be used for encryption immediately. Their availability is delayed until the next week because the raw transmitted keys require post-processing performed in batches after enough raw keys are collected [53]. Apart from privacy amplification, buffering keys mitigates scenarios in which overcast weather or satellite illumination prevent the delivery of new keys to individual ground stations.

Before we present how SatQKD is modelled as a scheduling problem, let us introduce the assumptions and terminology. A schedule is a sequence of tasks a satellite should execute. We consider a problem setting with one satellite and a set of tasks of the same kind: optical data transfers. Therefore, we use the terms task and data transfer interchangeably. Each task in a schedule has a start time when its execution should commence, a duration, and a target ground station. Tasks are executed from start to completion without preemption. Consequently, a data transfer is a single continuously attempted optical link between a given ground station and the satellite. The satellite can communicate with at most one ground station at a time. For the convenience of presenting the results, and due to the technical considerations explained in Sect. A.1, we aggregate the volume of data transferred into 256-bit unit blocks and refer to them as keys.
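For concreteness, a minimal Python sketch of this terminology (field names and example values are ours and purely illustrative):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

KEY_BITS = 256   # transferred data is aggregated into 256-bit key blocks

@dataclass(frozen=True)
class DataTransfer:
    """A single, non-preemptive optical downlink attempt in a schedule."""
    start: datetime          # when execution of the task commences
    duration: timedelta      # the task runs from start to completion
    ground_station: str      # the single station targeted by this transfer

# A schedule is a sequence of such tasks (times below are made up).
schedule = [
    DataTransfer(datetime(2018, 1, 8, 0, 12), timedelta(minutes=2), "London"),
    DataTransfer(datetime(2018, 1, 8, 0, 16), timedelta(minutes=3), "Thurso"),
]
```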

The period for which a schedule is computed is called the planning horizon. We are interested in building a long-term schedule and proving its optimality. Doing so for the whole six-year period would lead to a computationally challenging problem due to the size of its formulation. Hence, we restrict our attention to a sequence of one-year-long periods. The final state of the communication system for a given planning horizon, i.e., the position of the satellite and the size of the key buffer of each ground station, is the initial state for the subsequent planning horizon. This technique is known as the rolling horizon in the literature on scheduling and inventory control [54]. In scheduling optical satellite communication, the length of a planning horizon usually ranges between two days and one week, which is justified by the availability of accurate weather forecasts. Since we consider scheduling using historical weather information, and our model remains effectively solvable, we extended the planning horizon to one year. Ultimately, this leads to more efficient schedules, as the optimisation model considers weather seasonality patterns and changes in satellite illumination throughout the entire year. Hence, the solver is capable of making decisions which bring long-term benefits. Such a schedule can then be used to assess the best possible performance that the design of the future communications system allows. It should be a valuable indicator to support decisions involving the locations of ground stations and the selection of the orbital parameters of the satellite. Finally, our model could easily be scaled down to shorter planning horizons without any inherent difficulty, with corresponding computational savings.
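Schematically, the rolling-horizon procedure carries the end state of one yearly problem into the next. In the sketch below (Python, our own naming), solve_one_year is only a placeholder for the full one-year MILP of Sect. B and here merely advances the epoch.

```python
from dataclasses import dataclass, replace

GROUND_STATIONS = ["Belfast", "Birmingham", "Bristol", "Cambridge", "Glasgow",
                   "Ipswich", "London", "Manchester", "Thurso", "York"]

@dataclass
class SystemState:
    epoch: str        # start of the planning horizon (UTC)
    buffers: dict     # keys currently stored in each ground station's buffer
    raan_deg: float   # satellite RAAN at the epoch

def solve_one_year(state: SystemState) -> SystemState:
    # Placeholder for the one-year MILP of Sect. B: it would schedule all data
    # transfers for the year and return the final buffer sizes and satellite
    # state, which seed the next planning horizon. Here we only bump the year.
    next_epoch = str(int(state.epoch[:4]) + 1) + state.epoch[4:]
    return replace(state, epoch=next_epoch)

state = SystemState(epoch="2013-01-01T00:00:00Z",
                    buffers={city: 64 for city in GROUND_STATIONS},
                    raan_deg=109.5)
for _ in range(6):    # six consecutive one-year horizons covering 2013-2019
    state = solve_one_year(state)
```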

The objective of the space-to-ground data transfer optimisation is to maximise the minimum usable number of keys buffered by ground stations. To recognise where keys should be distributed, the ground stations are assigned non-negative weights corresponding to the desired activity of a given node in the communication system. The methodology of using weights to measure the importance of nodes in a network was developed for the analysis of graphs with weights assigned to edges, known as weighted networks [55, 56]. Several measures for studying interactions between nodes in such systems have been proposed. For instance, the sum of weights assigned to incident edges, referred to as a vertex's strength [56], is considered a generalisation of a vertex's degree in an unweighted graph.

Table 1 presents the weights assigned to the nodes in the hypothetical network we consider. Their values are proportional to the number of premises with high-speed broadband access (300 Mbit/s download speed or higher) located in a given city. The numbers were derived from datasets compiled by the Office of Communications [57], the governmental body that regulates the telecommunications sector in the UK. We assigned Thurso a higher weight than the data indicates (0.01 vs. \(3\times 10^{-5}\)) because the city could be the hub for the Highlands and Islands region. The economic development of these remote areas increasingly depends on reliable and secure communication links [58]. The final weights were normalised so that the sum of all weights assigned to nodes equals one, and then rounded to three digits after the decimal point.

Table 1 Importance of ground stations in the network. First two columns contain the name of the city and the number of premises with High-Speed Broadband (HSBB) access located in that administrative area. The third column displays weights proportional to the number of premises from the second column. The last column contains the final weight of the ground station used in the study

We aim to distribute keys so that network traffic is maximised uniformly across all links. For that reason, we are interested in a measure which quantifies the traffic globally for the entire communication network. Maximising such a performance index could become the objective of an optimisation problem. Furthermore, maintaining its value above a certain threshold could be the subject of a service level agreement. In the following subsection, we provide a possible definition of such an index. For the sake of convenience, we call it the traffic index.

3.1 Traffic index

To formally define the traffic index, we adopt the following notation and symbols. Capital letters represent sets. The syntax \(|S|\) stands for the cardinality of the set S. Small boldface letters denote vectors. The i-th element of a vector \(\mathbf{v}\) is accessed using the subscript notation \(v_{i}\).

Consider the following symbols.

N:

Set of ground stations in the problem definition. We refer to them as regular ground stations. Besides, there is an auxiliary station 0 observed by the satellite when no data transfer can be in progress. \(\bar{N}\) refers to the set of all ground stations including the auxiliary one, i.e., \(\bar{N}:= N\cup \{0\}\).

\(\mathbf{b}^{n}\):

Vector indexed by time \(t \in T\) tracking the number of keys the ground station n stores in its buffer.

\(\underline{b}\):

Number of keys reserved for authentication of a ground station by the satellite.

w:

Vector indexed by ground stations \(n \in N\) storing the assigned weights.

T:

Set of points partitioning the planning horizon into smaller periods. We evaluate the traffic index for each element of T. The sum of traffic indices over T is the objective function of the optimisation problem to maximise. For example, with no loss of generality, the traffic index could be evaluated every Monday throughout the planning horizon.

α:

Desired service level, defined as the probability that the size of the key buffer of every ground station does not fall below the threshold \(\underline{b}\) at time \(t \in T\).

A feasible solution to the following optimisation problem, parametrised by α, satisfies the properties of the index described in the previous section. Henceforth, we refer to it as the traffic index and denote it by the symbol \(\lambda _{\alpha ,t}\), where \(t \in T\).

$$\begin{aligned} \begin{aligned} &\max{\sum _{t \in T}\lambda _{\alpha ,t}} \\ &\quad \mathbb{P} \bigl( b^{n}_{t} - w_{n}\lambda _{\alpha ,t} \geq \underline{b} \bigr) \geq \alpha &&\forall n \in N, \forall t \in T \\ &\quad \lambda _{\alpha ,t} \in \mathbb{R}^{+} \cup \{0 \} &&\forall t \in T \end{aligned} \end{aligned}$$
(1)

Let \(\boldsymbol{\lambda }_{\alpha }^{*}\) be the vector of optimal values for the problem above, indexed by time \(t \in T\). Intuitively, \({\frac{w_{n}}{|T|}\sum_{t \in T}\lambda _{\alpha ,t}^{*}}\) is an upper bound on the constant, periodic key consumption rate at node \(n \in N\) which can be maintained at the service level α. Note that maximising \(\sum_{t \in T}\lambda _{\alpha ,t}\) goes hand in hand with maximising the number of keys a ground station receives while simultaneously ensuring this number is consistent with the weight assigned to the communications node. In effect, we maximise the minimum number of keys sent to any ground station in any given time period, where this minimum is taken over the weight-adjusted numbers of usable keys.

The complexity of solving Problem (1) increases when the probability distribution governing its constraint is taken into account, as it can be unknown or imprecise. However, to find the maximum key consumption rate for a given communications node, there is no need to handle probabilistic constraints directly. Instead, we reformulate the optimisation problem by removing the dependence on the service level α and replacing the probabilistic constraints with their deterministic equivalents, leading to the following formulation.

$$\begin{aligned} \begin{aligned} &\max{\sum _{t \in T}\lambda _{t}} \\ &\quad b^{n}_{t} - w_{n} \lambda _{t} \geq \underline{b} &&\forall n \in N, \forall t \in T \\ &\quad \lambda _{t} \in \mathbb{R}^{+} \cup \{0 \} &&\forall t \in T \end{aligned} \end{aligned}$$
(2)

To stress the independence from the service level α, we removed the subscript from the traffic index notation \(\lambda _{t}\). Now suppose we temporarily ignore key consumption. Consequently, buffer sizes are non-decreasing over time. Then the maximum constant periodic key consumption rate for a given node \(n \in N\), guaranteed at some desired service level α, can be found by analysing the marginal increase rates of \(b^{n}_{t}\) over time, which is the approach we adopted in Sect. 4. For the complete model of the optimisation problem, including the formal definition of all relevant constraints, see Sect. B.2.
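For fixed buffer contents, the constraints of Problem (2) decouple across the evaluation points, so each \(\lambda _{t}\) is simply the smallest weighted headroom above the authentication reserve. A minimal Python sketch (the numerical values shown are illustrative only):

```python
def traffic_index(buffers: dict, weights: dict, reserve: int = 64) -> float:
    """Largest lambda_t with b_t^n - w_n * lambda_t >= reserve for every station n."""
    return max(0.0, min((buffers[n] - reserve) / weights[n] for n in weights))

# Illustrative buffer contents at a single evaluation point t.
buffers = {"London": 8391, "Thurso": 274, "York": 80}
weights = {"London": 0.393, "Thurso": 0.010, "York": 0.001}
print(traffic_index(buffers, weights))   # limited by the tightest station (York here)
```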

3.2 Impact of the environment

Current physical and technological limitations restrict the opportunities for a successful space-to-ground data transfer. Section A.1 explains the key transfer rate estimation model in a cloud-free line of sight. For simplicity, we assume the satellite transfers keys at the rate a ground station can receive them, and all ground stations have the same transmission capabilities.

A transmission must happen during night hours due to stray light restrictions, and only when a ground station is visible to the satellite. We assume that the satellite can connect to the ground station when its elevation angle, relative to the ground station, is greater than 15°. The larger the elevation angle, the higher the transmission rate. Section A.2 describes the satellite propagation model and the procedure to calculate the elevation angle. The opportunity for communication is further restricted to periods when the satellite experiences a total eclipse in Earth's shadow (umbra). Section A.3 provides some insights on the duration and frequency of communication windows.

Besides sunlight, local weather conditions, cloud cover in particular, may adversely impact the transmission rate. Periods of clear sky are intermittent in the weather patterns observed in the UK. As a result, we assume the communications system remains operational in the presence of clouds and the transfer rate declines linearly as the cloud cover percentage increases. We explain the relationship between cloud cover and the transmission rate in more detail in Sect. A.4.1. This simplistic approximation was chosen for ease of exposition because it is commonly used to evaluate the amount of data that can be downloaded in the optical ground station selection problem [32, 48]. The formula is not exploited to make the optimisation process easier or more efficient. Finally, the formulation of the problem and the solution procedure could remain unchanged after the transfer loss function is updated to a different model.
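The availability and attenuation rules of the last two paragraphs can be combined into a single link model. A minimal sketch (Python; the function name is ours), where the clear-sky rate at the given elevation would come from the model of Sect. A.1:

```python
def effective_key_rate(clear_sky_rate: float, elevation_deg: float,
                       satellite_in_umbra: bool, station_in_darkness: bool,
                       cloud_fraction: float) -> float:
    """Keys per second deliverable under the simplified link model above."""
    if elevation_deg <= 15.0 or not satellite_in_umbra or not station_in_darkness:
        return 0.0                                   # no optical link possible
    return clear_sky_rate * (1.0 - cloud_fraction)   # linear cloud attenuation

# Example: 60% cloud cover retains 40% of the cloud-free rate.
print(effective_key_rate(50.0, 35.0, True, True, 0.6))   # -> 20.0
```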

4 Results and discussion

In this section, we analyse results of the space-to-ground data transfer optimisation for the network of ground stations introduced in Sect. 3.

After explaining the configuration parameters of the optimisation problem, we present an example solution of a one-week scheduling problem. This should help the reader to develop an understanding of the behaviour of the communications system at the level of individual data transfers. Next, we analyse aggregated results of scheduling data transfers over six years. In particular, we focus on the performance of the communications system considering the number of keys which can be consumed weekly by a ground station at the desired service level. We conclude the section by providing empirical evidence that our solution method and the results discussed are resilient to random perturbations of the input data in the definition of the optimisation problem. All results presented below were obtained by solving the formulation described in Sect. B. The dataset of problem instances is available for testing and benchmarking purposes [59].

4.1 Configuration parameters

The schedules were computed for six years starting from the 1st of January 2013 with a rolling horizon of one year. The planning horizon T was partitioned into one-week-long segments. Keys delivered to a ground station in a given week were released on the Monday of the following week. Time was discretised into 15-second-long periods.

Every city received an initial buffer of 64 keys reserved for authentication. These keys could not be used for communication between ground stations, and the actual threshold of the authentication reserve did not affect the service levels computed in the simulations.

The formulation was solved using a workstation with an AMD Ryzen 7 2700X eight-core processor and 32 GB of RAM. The criterion for stopping computations was reaching an optimality gap below 1% between the objective value of the current best solution and the estimated upper bound. Hence, improving the final objective by more than that amount is provably impossible. The total time for computing a schedule for one year ranged between 2 and 43 minutes. The theoretical computational complexity of solving the formulation follows from its definition presented in Sect. B.2.

4.2 Orbital parameters

The satellite's orbit is circular with an altitude of 566.897 km above Earth's surface. As a result, the spacecraft completes exactly 15 orbits within 24 hours; one revolution takes 96 minutes. The inclination for the selected altitude is 97.658°, chosen to counterbalance the Earth's nodal precession causing the drift in the Right Ascension of the Ascending Node (RAAN). For the given inclination, the cumulative RAAN drift throughout a year is 360°. Hence, the time windows when a ground station is visible to the satellite on a given night recur with the same characteristics every year. An orbit with such a property is called a Sun-Synchronous Orbit (SSO). The argument of latitude set to 46°, together with the initial epoch at 00:00:00 UTC on the 1st of January 2013, aligns one of the visibility windows around midnight. This is crucial during the summer months, as throughout visibility periods shifted away from midnight the spacecraft may remain in sunlight despite it being night at the ground station. This phenomenon significantly affects the ability to communicate with ground stations located at high latitudes, such as Thurso, which has no contact with the spacecraft for several weeks during summer.
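The stated orbital parameters can be cross-checked with textbook two-body and J2 nodal-precession formulas; the constants and formulas below are standard values and are not taken from the paper's appendix.

```python
import math

MU = 398600.4418    # Earth's gravitational parameter [km^3/s^2]
R_E = 6378.137      # Earth's equatorial radius [km]
J2 = 1.08263e-3     # Earth's oblateness coefficient

altitude_km = 566.897
a = R_E + altitude_km                           # semi-major axis of the circular orbit
period_s = 2 * math.pi * math.sqrt(a**3 / MU)   # ~96 minutes
revs_per_day = 86400 / period_s                 # ~15 complete orbits per day

# Sun-synchronous condition: J2 nodal precession of ~360 degrees per year.
target_raan_rate = math.radians(360 / 365.2422) / 86400    # [rad/s]
mean_motion = 2 * math.pi / period_s                        # [rad/s]
cos_i = -target_raan_rate / (1.5 * J2 * (R_E / a) ** 2 * mean_motion)
inclination_deg = math.degrees(math.acos(cos_i))

print(f"{period_s / 60:.1f} min, {revs_per_day:.2f} rev/day, {inclination_deg:.2f} deg")
# -> 96.0 min, 15.00 rev/day, 97.66 deg
```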

It was not immediately apparent to us how the RAAN parameter should be set to obtain the configuration that yields the most efficient schedules. To answer this question, we narrowed the RAAN values to the interval [\(90.5^{\circ}, 115.5^{\circ}\)] and computed a Service Level (SL) for every configuration obtained by incrementally iterating over values within that interval with a step of 1°. The Service Level at a given threshold, e.g. 99%, is defined as the maximum number of keys a ground station may consume weekly such that, for the fraction of weeks given by that level within the time frame considered, it does not deplete the keys reserved for authentication. The service level is estimated by analysing marginal changes in the size of the key buffers of all ground stations in the network. Therefore, every ground station could consume a similar number of keys relative to its weight. Furthermore, whichever ground station is selected, its SL gives the same perspective on the performance of the system.
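The sweep itself is a plain grid search; the sketch below (Python) only shows its skeleton, with the evaluation function left as a placeholder for the full orbit-propagation and optimisation pipeline.

```python
def weekly_keys_at_99_sl(initial_raan_deg: float) -> float:
    """Placeholder: run the six-year rolling-horizon optimisation with this
    initial RAAN and return the maximum weekly key consumption in London
    maintainable at the 99% service level (actual values are given in Table 2)."""
    return 0.0   # stands in for orbit propagation + scheduling + buffer analysis

candidate_raans = [90.5 + step for step in range(26)]   # 90.5, 91.5, ..., 115.5 deg
results = {raan: weekly_keys_at_99_sl(raan) for raan in candidate_raans}
best_raan = max(results, key=results.get)   # the paper's sweep identifies 109.5 deg
```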

Table 2 contrasts the initial RAAN value with the maximum weekly key consumption in London maintainable at a 99% global service level over six years. The difference in the output service level between configuration variants is significant. Setting the initial RAAN to 109.5° yields the best performance, and the efficiency would decrease more than twofold if the value of 95.5° were used instead. The example demonstrates the utility of schedule optimisation to evaluate the best possible performance of the system and rule out suboptimal configurations. In the remaining text, we use the RAAN setting that yields the best performance.

Table 2 Sensitivity analysis of the initial RAAN on the performance of the communications system. Columns display the value of the RAAN at the initial epoch and the maximum weekly key consumption in London maintainable at 99% SL throughout six years

4.3 Weekly scheduling

Figure 2 illustrates a fragment of the optimal schedule for the first week of the year 2018.

Figure 2

Sample weekly schedule. Data transfers scheduled to ground stations during the week of the 1st of January 2018. Blue curves indicate the expected transfer rate to a ground station, adjusted for the cloud cover forecast. Black segments denote the time slots in which the satellite communicated with a given ground station

The visualisation helps to develop intuition about communication windows, their duration and the behaviour of the transfer rate. The length of a communication window varies significantly within the week: the shortest windows are less than 2 minutes, whereas the longest exceed 6 minutes. The highest transfer rate is attained in the centre of a communication window. Furthermore, under the same cloud cover conditions, the longer the communication window, the higher the maximum transfer rate reached. Some communication windows are adversely affected by weather conditions; for instance, on the 2nd of January around 10 PM cloud cover was at least 75% across all locations. Finally, during a communication window, the satellite switches between no more than a few ground stations. In the presented example, at most four data transfers were performed (London, Manchester, Glasgow and Thurso on the 8th of January around midnight). Similar considerations generalise to any other week of the year.

4.4 Long-term scheduling

The long-term performance profile of the communications system was obtained by solving scheduling problems with a rolling horizon between the years 2013 and 2019.

Figure 3 displays the number of keys transferred to a ground station every week. The plot presents the results of a single run of the schedule optimisation for every year. There was no point in repeating the optimisation for the same input problem because the solution method is deterministic; solving a given model always returns the same result.

Figure 3

Long-term simulation of the SatQKD optimisation. The number of keys delivered weekly to every ground station between the years 2013 and 2019

London, which is the most significant ground station, receives a substantial number of new keys compared to the remaining ground stations almost every week. Birmingham, which is the second most important station in the network, has a comparable key delivery profile. York is on the opposite end of the spectrum: the city receives keys only in a few weeks per year. Cambridge and Ipswich, which are also assigned small weights, obtain keys similarly, albeit they experience data transfers more frequently. On the other hand, Thurso, which is the least significant station in the network, does not share such a communication profile. The city benefits from its location in a remote area and from not sharing part of its communication windows with any other ground station. Therefore, the satellite can send keys to Thurso because no other city would be able to receive them. The remaining ground stations (Glasgow, Belfast, Manchester, and Bristol) belong to a group of cities whose weight is between 0.08 and 0.105. A distinctive pattern in this group is observed for Bristol, which is the southernmost ground station in the network and thus the first location visible to the satellite during its pass around midnight. We elaborate on the frequency and duration of communication windows in Sect. A.3.

Taken together, ground stations exhibit different patterns of communication with the satellite, which conceivably depend on the weights assigned in the network. During summer, all ground stations experience a notable decrease in the number of keys delivered. In particular, Thurso and Glasgow lose the ability to communicate with the satellite. Consequently, to maintain connectivity with other ground stations over summer, they must be topped up in advance. Roughly, the pattern of key distribution to a given ground station repeats every year.

Due to the vast disproportion in the number of keys delivered to each ground station every week, we computed the maximum constant key consumption rate which could be maintained over time without overdrawing the key buffer. Figure 4 displays the relation between the maximum key consumption rate for London and the Service Level obtained. The shape of the plot is similar for any other ground station.

Figure 4

Number of keys consumed by a ground station weekly at the desired Service Level guarantee. The weekly key consumption rate for London maintained without utilising keys beyond the threshold reserved for authentication. If the service level α is lower than one, then some keys reserved for authentication will be depleted in a \((1 - \alpha )\) fraction of weeks

Considering the number of keys the ground station received each week, London could consume up to 8327 keys weekly without drawing its key buffer below the authentication reserve for 99% of weeks. If the key consumption rate is raised above that threshold, then the service level will drop because there will not be enough keys to meet the demand in some additional weeks. It is important to emphasise that the service level is inferred accounting for the key buffer sizes of all ground stations in the network. Table 3 reports the maximum number of keys that can be consumed weekly at the 99% SL for every ground station.

Table 3 Maximum key consumption rate maintainable weekly at 99% Service Level by a given ground station. The key consumption at other levels for the London ground station can be inferred from Figure 4. The graphs for the remaining ground stations follow a comparable trend

Intuitively, the weight assigned to a ground station and its key consumption are related. For instance, Glasgow and Manchester, which are assigned similar weights, can consume keys at comparable rates. However, a stronger observation holds for all ground stations: the 99% SL of a given ground station equals its weight multiplied by a coefficient of \(21000 \pm 75\).

4.5 Evidence of numerical stability

The scheduling problem is defined by the orbital parameters of the satellite, the locations and weights of the ground stations, and the planning horizon. Among these settings, the weights are arguably the most difficult to know precisely, due to their arbitrary nature. For that reason, we study how introducing noise into the weights affects the final results, in particular the number of keys a ground station receives every week and the key consumption at the 99% SL.

Individual weights of nodes in the communications network were perturbed by introducing noise following the formula below.

$$ \hat{w}^{\prime }_{i} = \hat{w}_{i} \cdot (1 + r \cdot \tilde{n}). $$

Here \(\hat{w}^{\prime }_{i}\) and \(\hat{w}_{i}\) are the updated and the initial weight of ground station i, r is the perturbation threshold, and ñ is a random variable following the standard normal distribution. The perturbation threshold was set to 0.1. Subsequently, the new weights were normalised so that their sum equals one.
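A minimal sketch of this perturbation and renormalisation step (Python with NumPy; the seed and the excerpt of weights shown are our illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def perturb_weights(weights: dict, r: float = 0.1) -> dict:
    """Multiply each weight by (1 + r * n), n ~ N(0, 1), then renormalise to sum 1."""
    noisy = {k: w * (1.0 + r * rng.standard_normal()) for k, w in weights.items()}
    total = sum(noisy.values())
    return {k: v / total for k, v in noisy.items()}

weights = {"London": 0.393, "Thurso": 0.010}   # excerpt; see Table 1 for all stations
print(perturb_weights(weights))
```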

Figure 5 presents box-plots of the number of keys London received every week between the years 2013 and 2019 depending on the weight assigned to the station.

Figure 5

Descriptive statistics of the number of keys sent to a ground station after its weight was perturbed. The figure illustrates the number of keys transferred to London every week between the years 2013 and 2019. Box plots correspond to independent runs of the SatQKD optimisation over six years with a rolling horizon of one year. In every run, the weight of the ground station was randomly changed by a small amount. The plot was drawn according to the following conventions: a box represents the area between the first and third quartiles; the horizontal line splitting the box denotes the median; whiskers extend to the last datum below or above 1.5 times the distance between the first and third quartiles

The introduction of moderate noise in the weights does not seem to significantly affect the number of keys delivered to a ground station. Predictably, the median increases as the weight rises, and the boxes and whiskers move up accordingly.

Conceivably, the perturbation of weights influenced the number of keys which could be consumed by a ground station at some desired service level. Table 4 presents the maximum number of keys consumed weekly at 99% SL guarantee in London depending on the weight assigned.

Table 4 Weight of the ground station and its impact on the service level. Columns of the table display the weight assigned to the London ground station followed by the maximum number of keys possible to consume at the 99% SL. The box-plots presenting descriptive statistics of the number of keys delivered weekly to the ground station for every weight’s setting are displayed in Figure 5

The numbers of keys consumed after the perturbation of weights do not differ significantly from the results obtained using the initial weights. Predictably, the key consumption rate changes by a small amount following the weight. Overall, the weekly key consumption remains in the interval \([7364, 9048 ]\) for weights between 0.352 and 0.424. Recall that the key consumption for London at the 99% SL with a weight of 0.393 was 8327, which falls approximately in the centre of the interval for the weekly key consumption.

5 Conclusions

We formulated SatQKD as a mathematical program for the first time. We then modelled a hypothetical but realistic network of optical ground stations and solved the scheduling problem with a rolling horizon of one year for the period between the years 2013 and 2019 using a state-of-the-art commercial solver. The computational results give insights into the number of keys which could be delivered to the network. The provided estimates of the weekly key consumption rate attainable at different locations of the communications system are intended to serve as a guideline on whether future investments in the development of this technology could meet the operational demands of telecommunication providers.

In the interest of the community investigating the optical ground station selection problem and optical downlink scheduling, we modelled two additional assumptions which are relevant but not commonly used in the literature. Firstly, we model a variable transmission rate of the optical link which changes with the elevation angle between the satellite and the ground station. Secondly, we disallow communication when the spacecraft is in sunlight. Furthermore, we believe the key concept applied in our formulation, allocating downlink time proportionally to the importance of a given node in the communications system, can be translated to optical downlink scheduling for independent clients with different priorities.

The short computation times are due to the strong linear programming relaxation of the model with time discretisation that we have applied. As a result, reaching an optimality gap below 2% requires no branching for the problem instances used in this study. Exploration of the branch-and-bound tree was necessary to achieve the optimality gap of 1%. The observed computation times are in line with results reported by researchers who consider downlink scheduling using models with time discretisation [33, 42, 43]. Since these problems are effectively solvable to optimality, we suspect the community will eventually deprecate applications of heuristic, metaheuristic and evolutionary algorithms, which do not provide information about the optimality gap and are inherently prone to premature convergence.

Our future work is focused on moving from historical weather observations to medium-term weather forecasts. This step requires changes to the problem formulation to incorporate the uncertainty of cloud cover predictions, and the development of an optimisation approach to solve the new problem definition in a reasonable computational time. Another research direction could explore the benefits of considering a constellation of satellites.