ESA Voyage 2050 white paper -- GrailQuest: hunting for Atoms of Space and Time hidden in the wrinkle of Space-Time

GrailQuest (Gamma-ray Astronomy International Laboratory for Quantum Exploration of Space-Time) is an ambitious astrophysical mission concept based on a fleet of small satellites, whose scientific objectives are discussed below. Within Quantum Gravity theories, different models of space-time quantisation predict an energy-dependent speed for photons. Although the predicted discrepancies are minuscule, Gamma-Ray Bursts, occurring at cosmological distances, could be used to detect this signature of space-time granularity with a new concept of modular observatory of huge overall collecting area, consisting of a fleet of small satellites in low orbits, with sub-microsecond time resolution and a wide energy band (keV-MeV). The enormous number of collected photons will allow an effective search for these energy-dependent delays. Moreover, GrailQuest will perform temporal triangulation of high signal-to-noise impulsive events with arc-second positional accuracy: an extraordinarily sensitive X-ray/gamma-ray all-sky monitor, crucial for hunting the elusive electromagnetic counterparts of Gravitational Waves. A pathfinder for GrailQuest is already under development through the HERMES (High Energy Rapid Modular Ensemble of Satellites) project: a fleet of six 3U cube-sats to be launched by the end of 2021.

magnetosphere, launched in 2000 and recently extended to the end of 2020. In the near future, a constellation of three satellites flying in formation is planned for the LISA mission, to detect gravitational waves from space. Very recently, two extremely successful experiments, of paramount importance for fundamental physics, have involved the combined use of several ground-based detectors. One is the LIGO/Virgo Collaboration (involving the two US-based LIGO facilities and the European Virgo facility), which achieved the first detection and localisation of gravitational waves. In one case, temporal triangulation techniques, conceptually similar to those proposed for the GrailQuest constellation and described in this work, effectively constrained the position of the event in the sky, allowing the fast subsequent localisation, in the electromagnetic window, of a double Neutron Star merger. The other is the Event Horizon Telescope (the combined use of eight radio/microwave observatories spread all over the world), which obtained the first image of the event horizon around a black hole. We consider these compelling results as proof that modular astronomy, which benefits from the combined use of distributed detectors (to increase the overall detecting area and to allow for unprecedented spatial resolution, in the case of the Event Horizon Telescope and of the GrailQuest project), is the new frontier of cutting-edge experimental astronomy. The GrailQuest project is a space-based version of this epochal revolution.

Results: We performed accurate Monte-Carlo simulations of thousands of light curves of Gamma-ray Bursts, based on real data obtained from the scintillators of the Gamma-ray Burst Monitor on board the Fermi satellite.
We produced Gamma-ray Burst light curves in consecutive energy bands in the interval 10 keV-50 MeV, for a range of effective areas. We then applied cross-correlation techniques to these light curves to determine the minimum accuracy with which potential temporal delays between them can be determined. As expected, this accuracy depends, in a complicated way, on the temporal variability scale of the Gamma-ray Burst considered, and scales roughly with the square root of the number of photons in the considered energy band. We determined that, for temporal variabilities in the millisecond range (expected in at least 30% of the observed Gamma-ray Bursts), with an overall effective area of ∼ 100 square meters, the statistical accuracy of these delays is always smaller (for redshifts ≥ 0.5) than the delays predicted by a dispersion law for the propagation of photons in vacuo that depends linearly on the ratio between the photon energy and the Planck energy.
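The delay-recovery step can be sketched numerically. The following minimal example uses illustrative numbers only (a single Gaussian pulse, two Poisson-sampled energy bands, and a hypothetical 0.5 ms injected lag) and recovers the lag from the peak of the cross-correlation function:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative burst profile: a single millisecond-scale Gaussian pulse.
dt = 1e-4                      # 0.1 ms time bins
t = np.arange(0.0, 0.2, dt)    # 0.2 s of data
profile = np.exp(-0.5 * ((t - 0.1) / 2e-3) ** 2)

true_lag = 5e-4                # hypothetical 0.5 ms injected delay
shifted = np.exp(-0.5 * ((t - 0.1 - true_lag) / 2e-3) ** 2)

# Poisson-sample two energy bands (count rates scale with effective area).
low = rng.poisson(1e7 * profile * dt)   # low-energy band counts per bin
high = rng.poisson(1e7 * shifted * dt)  # high-energy band counts per bin

# Cross-correlate the two light curves and locate the peak.
lags = np.arange(-len(t) + 1, len(t)) * dt
ccf = np.correlate(high - high.mean(), low - low.mean(), mode="full")
est_lag = lags[np.argmax(ccf)]
print(f"recovered lag: {est_lag * 1e3:.2f} ms")  # close to the injected 0.5 ms
```

In the real analysis the accuracy of the recovered lag scales roughly with the square root of the collected photons, which is why the overall effective area of the constellation is the critical parameter.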
This proves that the GrailQuest constellation is able to achieve the ambitious objectives outlined above, within the budget of a European Space Agency M-class mission.
Keywords: constellation of satellites · quantum gravity · Gamma-ray Bursts · γ-ray sources · all-sky monitor

1 Introduction: Was Zeno right? A brief summary of Quantum Gravity and the in-depth structure of Space and Time

According to Plato, the great Greek philosopher, around 450 BC Zeno and Parmenides, disciple and founder of the Eleatic School, visited Athens [1] and encountered Socrates, who was in his twenties. On that occasion Zeno discussed his world-famous paradoxes, "four arguments all immeasurably subtle and profound", as Bertrand Russell described them in 1903 [2].
In essence, Zeno's line of reasoning used for the first time a powerful logical method, the so-called reductio ad absurdum, to demonstrate the logical impossibility of the endless division of space and time in the physical world.
Indeed, in his most famous paradox, known as Achilles and the tortoise, Zeno states that if one admits as true the endless divisibility of space, in a race the quickest runner can never overtake the slowest, which is patently absurd, thus demonstrating that the original assumption of infinite divisibility of space is false.
The argument is as follows: suppose the tortoise starts ahead of Achilles. In order to overtake the tortoise, Achilles first has to reach it. In the time Achilles takes to reach the original position of the tortoise, the tortoise has moved forward by some distance, and therefore, after that time, the tortoise is still ahead of Achilles (although by a shorter distance). In the second step the situation is the same, and so on, apparently demonstrating that Achilles cannot even reach the tortoise.
Despite the sophistication of this logical reasoning, today we know that Zeno's error was the implicit assumption that an infinite number of tasks (the infinite steps that Achilles has to cover to reach the tortoise) cannot be accomplished in a finite time interval. This is not true if the infinite time intervals spent to accomplish all the tasks form a sequence whose sum is a convergent mathematical series.
However, the line of reasoning reported above exerts a certain fascination on our brains, which reluctantly accept the fact that, in a finite segment, an infinite number of separate points may exist.
The mighty intellectual edifice of Mechanics developed by Newton rests on the convergence of mathematical series, which serves to define the concept of derivative (fluxions, to use the name originally proposed by Newton), ubiquitous in physics. Classical Physics has this idea rooted in the postulate (often implicitly accepted) that physical quantities can be conveniently represented and gauged by real numbers.
At the beginning of the last century, the development of Quantum Mechanics revolutionised this secular perspective. Under the astonished eyes of experimental physicists, Nature acted incomprehensibly when investigated at microscopic scales. It was the genius of Einstein that fully intuited the immense intellectual leap our minds were obliged to accomplish to understand the physical world. In a seminal paper of 1905 [3], the then-unknown clerk of the Patent Office in Bern shattered forever the world of Physics by definitively proving, with an elegant explanation of Brownian motion, that matter is not a continuous substance but is rather constituted by lumps of mass, dubbed Atoms by the English physicist Dalton in 1803. The idea that matter is built up by adding together minuscule indivisible particles is very old, sprouting from a surprising insight of the Greek philosophers. The word itself, Atom, which literally means indivisible, was coined by the ancient Greek philosophers Leucippus and Democritus, master and disciple, around 450 BC, in the same period in which Zeno was questioning the endless divisibility of space and time!
In 1905 Einstein completed the revolution in the physics of the infinitely small by publishing another milestone of human thought [4] in which he argued that light is composed of minuscule lumps of energy that were dubbed photons by the American physicist Troland in 1916.
The idea that the fundamental "bricks" of matter are indivisible particles, characterised by universal properties like mass and electrical charge, progressively settled in the physics world thanks to the spectacular discoveries of distinguished experimental physicists. In a quick overview of this hall of fame we have to mention (without claiming to perform a comprehensive review) Thomson, who discovered the electron in 1897, Rutherford, who discovered in 1909 that the positive charge of the Atom is concentrated in a small central nucleus, and who discovered the proton in 1919, Chadwick, who discovered the neutron in 1932, Reines, who discovered the neutrino in 1956, following Pauli, who postulated its existence in 1930, Gell-Mann and Zweig, who proposed the existence of quarks in 1964, Glashow, Salam and Weinberg, who proposed in the 1960s the electroweak theory predicting the W and Z gauge bosons, discovered by Rubbia and van der Meer in 1983, and Higgs, Brout, Englert, Guralnik, Hagen, and Kibble, who postulated the existence of the Higgs boson in 1964, discovered at the CERN laboratories in 2012 by the teams led by Gianotti and Tonelli.
Summarising, by the beginning of the third millennium physicists had developed and experimentally verified a quite coherent and theoretically robust picture of the world at small scales, dubbed with the rather unprepossessing expression Standard Model of Particle Physics, where the central role of the indivisible fundamental bricks that build up the world is alluded to in the word "Particle". After 2,500 years, the formidable intuition of the Greek philosophers has been confirmed: Democritus was right! But what about Zeno? The mighty and flawless edifice of Calculus, developed by giants of human thought like Archimedes, Newton and Leibniz, and the elegant and audacious construction of Cantor, who demonstrated that even the endless divisibility of fractional numbers is not powerful enough to describe the immense density of real numbers - and the name "real", used by mathematicians for this type of numbers, alludes to the idea that they are essential to adequately gauge the objects of the physical world - seemed to have finally relegated the sophisticated logical arguments of the philosopher from Elea to the endless graveyard of misconceptions.
However, the inverse square law, discovered by Newton for gravitation, successfully extended by Coulomb to the realm of electricity, and effectively generalised by Yukawa in 1935 for a massive scalar field, contained the seed that would resurrect Zeno's old proposal within the vivacious crowd of modern scientific thought.
The crucial point is that the indivisible discreteness of some fundamental properties, like mass or charge - which allowed the development of the very concept of elementary particle, cornerstone of Quantum Field Theory, the mathematical formulation behind the Standard Model - is at odds with the generalised Yukawa potential widely used, at least in the lowest-order formulation, for the interaction of a pair of fermions in Quantum Field Theory. The crucial role of the Yukawa potential in the development of Quantum Field Theory is evident when using Feynman Diagrams (first presented by Feynman at the Pocono Conference in 1948) to represent the interaction of a pair of fermions. In simple words, the Yukawa potential diverges as r → 0 and is therefore incompatible with the existence of point-like particles.
In our opinion, the essence of the conflict between the "granular" world of Quantum Particles (excited states of the fields) and the continuous manifold used to represent the Minkowski Space-Time over which the fields live has to be ascribed to the difficulty of inserting, in the same logical scheme, the indivisible nature of elementary particles and the infinite divisibility of the Space-Time on which Quantum Fields are defined.
To fully grasp this important aspect, we must quickly summarise the stages through which the Fields, and the Space-Time on which they are defined, have become "actors" on the stage of physics, playing a supporting, if not dominant, role with respect to that of the Particles just discussed.
Together with Quantum Mechanics, General Relativity radically changed our understanding of Space and Time. According to the great philosopher Immanuel Kant, both these quantities are necessary a priori representations that underlie all other intuitions. Indeed, in his Critique of Pure Reason, Kant says: "Now what are space and time? Are they actual entities? Are they only determinations or also relations of things, but still such as would belong to them even if they were not intuited? Or are they such that they belong only to the form of intuition, and therefore to the subjective constitution of our mind, without which these predicates could not be ascribed to any things at all?" These fundamental issues, raised by the German philosopher, outline the sense of the immense epistemological revolution bravely fought by the audacious physicists of the nineteenth and twentieth centuries. Indeed, the seminal work of Maxwell and Einstein, just to mention the most prominent actors, revealed that (electromagnetic) fields, space, and time are not a priori categories of human thought, but physical objects, susceptible to experimental investigation. Their physical properties would turn out, in the years to come, to be very different from those that our intuition could suggest. The initial albeit crucial point of this investigation can be identified in Maxwell's proposal of adding the "displacement current" term to one of the electromagnetic laws already established by Coulomb, Faraday, and Ampère. The addition of this term determines a complete feedback between the electric and magnetic fields, in the absence of charges or currents, and therefore establishes a physical reality for electromagnetic fields that is independent of the presence of the charges and currents that generated them. Fields are no longer convenient mathematical tools to compute the forces acting on particles, but constitute physical objects endowed with their own independent existence!
From the wave equation implied by these new laws, Maxwell obtained the constant that expresses the speed of propagation of these fields in vacuum. The genius of Einstein understood that the combination of the constancy of the speed of light with the principle of relativity, proposed in 1632 by Galilei in his Dialogue Concerning the Two Chief World Systems, was to unhinge our Newtonian conception of absolute Space and Time, independent of each other. This led him to the extraordinary conception of a deformable Space-Time, subject to the constraint of Lorentz invariance. However, the price to pay for this epistemological revolution was the acknowledgement that, operationally - in the Bridgmanian sense of the term [5] - it is impossible to synchronise clocks, and/or to define distances, instantaneously or, in any case, faster than imposed by the speed of light in vacuum. This led Einstein to the intuition that Gravity (the only other field known at the time) should also propagate through a wave equation, at the same speed determined by Maxwell's equations. Indeed, in their weak-field limit, the field equations of General Relativity resemble Maxwell's equations in the presence of the so-called Gravito-magnetic Field, a field generated by matter currents, in perfect analogy with the Magnetic Field generated by charge currents. Again, through the complete feedback determined by the equations relating temporal and spatial variations of the Gravitational and Gravito-magnetic Fields, a wave equation proved capable of describing the propagation of Gravitational Fields through the vacuum, at the very same speed as the Electromagnetic Fields! The overall coherence of this epistemological revolution, imposed by Special Relativity, was guaranteed by acknowledging that Space-Time is a physical entity, subject to oscillations in its texture, and not a couple of philosophical a priori categories, as discussed by Kant.
In summary, in modern physics space and time have progressively changed their role: from mere passive containers of events (in line with the Kantian idea of mental categories) to physical quantities that, combined in the unique hyperbolic geometry implied by the constancy of the speed of electromagnetic waves, are able to deform under the gravitational action of fields and particles. With due caution, the Space-Time of General Relativity can be considered, for all intents and purposes, a field with its associated quantum particles (excited states of the field): the gravitons. In this unifying picture, macroscopic coherent states of a huge number of gravitons are the gravitational waves recently detected by the LIGO and Virgo observatories.
The tension between the granularity of quantum particles and the continuity of fields (defined over real variables) has been alleviated by renormalisation techniques, fully applicable in gauge Quantum Field Theories, as shown by Gerard 't Hooft, for all fundamental forces except gravity. Renormalisation techniques have proved extremely effective in solving the problem of the infinities that arise when, in Quantum Field Theory, we try to combine point-like particles with fields diverging as r → 0. This approach is based on the existence of "charges" of opposite sign capable of producing, in the calculations of the associated physical quantities, terms of opposite sign which, although diverging, cancel each other out when treated with sufficient care.
Despite their success, renormalisation techniques seem to be inadequate when gravity comes into play. Because of the mass-energy equivalence predicted by Special Relativity, the natural generalisation of the source "charge" of the gravitational field is the entire energy density and not only that associated with the rest mass of the particles. This implies that any type of field attempting to prevent gravitational collapse acts, through the energy density (usually positive) associated with it, as a further source of gravitational field, preventing, in fact, an effective renormalisation.
This last feedback is difficult to eliminate within the framework just described and makes clear, in our opinion, the conceptual stalemate that prevents, at the present time, the unification of the two most revolutionary physical theories of the twentieth century: General Relativity and Quantum Mechanics. In this perspective, Extended Theories of Gravity represent an approach to overcome the lack of a final theory of Quantum Gravity [6].
To overcome this formidable impasse, theoretical physics is today exploring more radical approaches that require a new conceptual revolution, a paradigm shift, to use Kuhn's words.
Here we just mention two approaches that tackle the irresolvable dichotomy of particles and fields from somewhat opposite perspectives. String Theories (see e.g. Smolin for reviews and later criticism of this approach) eliminate the point-like nature of the particles by assigning to each of them a (mono)-dimensional extension: the string. Loop Quantum Gravity (see e.g. Rovelli for reviews) questions the smoothness of Space-Time, quantising it into discrete energy levels, like those observed in classical quantum-mechanical systems, to form a complex pregeometric structure (to use Wheeler's words) dubbed a Spin-Network.
In both proposed theories (although with different and somewhat opposite theoretical approaches) a minimal length for physical space (and time) emerges. Atoms of Space and Time - to use an efficacious and vivid expression adopted by Smolin in 2006 - are a necessary consequence of this definitive quantisation of Space-Time.
However, the spatial (and temporal) length-scales associated with this quantisation are minuscule in terms of standard units, as already suggested in a pioneering and visionary work of Planck in 1899 [64]: ℓ_P ∼ √(ħG/c³) ∼ 10⁻³³ cm and t_P ∼ √(ħG/c⁵) ∼ 10⁻⁴³ s for the Planck length and time, respectively. For comparison, the shortest distance (Compton wavelength) directly measured to date, at the Large Hadron Collider at CERN, is ∼ 10⁻²⁰ cm (for colliding energies of a few 10¹² eV). The shortest time intervals ever measured are just above attoseconds, ∼ 10⁻¹⁸ s (see e.g. Hentschel, Nature 2001). Experimentally, at the present moment, we are more than ten orders of magnitude above the theoretical limit we would like to probe to effectively constrain our theoretical speculations! For a quick (and not exhaustive) overview of the variety of theoretical approaches exploring the possibility of fundamental limits in the ability to measure (and therefore to define, in the Bridgmanian sense) arbitrarily small intervals of space and time, we follow, almost verbatim, a recent work by some of us [8] and the references therein.
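These order-of-magnitude figures follow directly from dimensional combinations of ħ, G and c; a quick numerical check in CGS units:

```python
# Order-of-magnitude check of the Planck scales quoted above (CGS units).
hbar = 1.0546e-27   # reduced Planck constant [erg s]
G = 6.674e-8        # gravitational constant [cm^3 g^-1 s^-2]
c = 2.9979e10       # speed of light [cm/s]

l_planck = (hbar * G / c**3) ** 0.5   # Planck length [cm]
t_planck = (hbar * G / c**5) ** 0.5   # Planck time [s]

print(f"l_P ~ {l_planck:.1e} cm")   # ~ 1.6e-33 cm
print(f"t_P ~ {t_planck:.1e} s")    # ~ 5.4e-44 s
```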
Several thought experiments have been proposed to explore fundamental limits in the measurement of time and space intervals (see e.g. [16] for an updated and complete review). In particular, Mead [9] "postulate[d] the existence of a fundamental length" (to use his own words) and discussed the possibility that this length is the Planck length, ℓ_min ∼ √(Għ/c³) = ℓ_P, which results in limitations on the measurement of arbitrarily short time intervals, originating relations similar to the Space-Time Uncertainty relation proposed in [8]. Moreover, in a subsequent paper [9], Mead discussed an in-principle observable spectral broadening, a consequence of postulating the existence of a fundamental length of the order of the Planck length. More recently, in the framework of String Theory, a space-time uncertainty relation has been proposed with the same structure as the uncertainty relation discussed in this paper ([10], [11]; see e.g. [12] for a discussion of the possible role of a space-time uncertainty relation in String Theory). The relation proposed in String Theory constrains the product of the uncertainties in the time interval c∆T and the spatial length ∆X to be larger than the square of the string length ℓ_S, which is a parameter of String Theory. However, in Yoneya's own words [12], this relation is "speculative and hence rather vague yet". Indeed, in the context of Field Theories, uncertainty relations between space and time coordinates similar to the one proposed here have been discussed as an ansatz for the limitation arising in combining Heisenberg's uncertainty principle with Einstein's theory of gravity [14].
In 1995 Garay [13] postulated and discussed, in the context of Quantum Gravity, the existence of a minimum length of the order of the Planck length, but followed the idea that this limitation may have a meaning similar to the speed limit defined by the speed of light in Special Relativity, in line with what had already been pointed out previously (see e.g. [15] and references therein). In the framework of Loop Quantum Gravity (see e.g. [17], [18] and [19] for a review), a minimal length characteristically appears in the form of a minimal surface area ([20], [21]): indeed, the area operator is quantised in units of ℓ_P² [22]. It has sometimes been argued that this minimal length might conflict with Lorentz invariance, because a boosted observer could see the minimal length further Lorentz contracted.
Indeed, some of the proposed theories allow for this Lorentz Invariance Violation (LIV, hereinafter) at some small scales (see e.g. [23], [24], [25] for reviews). Essentially, in these scenarios the presence of a granular structure of the space in which electromagnetic waves (i.e. photons, from the quantum point of view) propagate determines the emergence of a dispersion law for light in vacuum, in close analogy with what happens for the propagation of photons in a crystal lattice.
We should stress that not all ways of introducing space-time granularity produce these dispersive effects. In particular, in Loop Quantum Gravity the granularity is mainly reflected in a minimum value for areas which, however, is not a fixed property of geometry, but rather corresponds to a minimal (nonzero) eigenvalue of a quantum observable; this minimal area ℓ_P² is the same for all boosted observers (what changes continuously under a boost transformation is the probability distribution of seeing one or the other of the discrete eigenvalues of the area; see e.g. [26]). Still, also in Loop Quantum Gravity there are results amenable to testing with gamma-ray telescopes, the most studied possibility being an anomalous dependence of frequency on distance, producing a flattening of the cosmological redshift [27].
The energy scale at which dispersion effects become manifest can easily be computed, e.g. by equating the photon energy, E = hν, to the energy associated with ν ∼ 1/t_P, which gives the Planck Energy E_P ∼ √(ħc⁵/G) ∼ 10²⁸ eV: a huge energy for the particle world, corresponding to the mass of a paramecium (∼ 0.02 mg). Again, frustratingly, this energy scale is well beyond any possibility of direct investigation with any kind of collider in the foreseeable future. It is worth noting that, in the simplest models, at lowest order, the dispersion law for the photon speed v_phot is dominated by the linear term: ∆v_phot/c ∝ hν/E_P, with constant of proportionality ξ ∼ 1.
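A back-of-envelope check of these numbers (CGS units; the 1 MeV reference photon energy is just an illustrative choice):

```python
# Planck energy, its macroscopic mass equivalent, and the linear-dispersion
# fractional speed change for a 1 MeV photon (CGS units, xi ~ 1).
hbar = 1.0546e-27   # erg s
G = 6.674e-8        # cm^3 g^-1 s^-2
c = 2.9979e10       # cm/s
erg_per_eV = 1.602e-12

E_planck_erg = (hbar * c**5 / G) ** 0.5
E_planck_eV = E_planck_erg / erg_per_eV      # ~ 1.2e28 eV
mass_equiv_mg = E_planck_erg / c**2 * 1e3    # ~ 0.02 mg: a paramecium

# Fractional speed change |v_phot - c|/c ~ E/E_P for a 1 MeV photon:
delta_v_over_c = 1e6 / E_planck_eV           # ~ 8e-23
print(f"E_P ~ {E_planck_eV:.1e} eV, mass equivalent ~ {mass_equiv_mg:.3f} mg")
```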
In our opinion, this unprecedented situation, in which the scale of the expected experimental phenomena is very far from the current possibilities of experimental verification, is hampering any significant progress in our understanding of the ultimate structure of the world. Physics is, after all, an experimental discipline in which continuous comparison with experimental data is essential, even to draw unexpected clues from which to develop new theories. This was the case for the development of Relativity and Quantum Mechanics, in which bold physicists and epistemologists had to develop new logical models to account for unexpected experimental results that were unimaginable for the classical conception of nature developed by the Greek philosophers. Indeed, the fatal blow to the classical conception of physics developed up to Newton and Maxwell was given by the experimental impossibility of determining the speed of the Earth with respect to the Cosmic Aether (the medium in which electromagnetic waves were thought to propagate), as firmly established by the null result of the Michelson and Morley experiment [28].
Indeed, in the context of Quantum Gravity, we are witnessing a flourishing of countless elegant, mathematically daring theories, which testify to the lively interest of brilliant minds in problems of undoubted physical and epistemological relevance but which, sadly, at the moment lack the invigorating and vitalising confrontation with constraining experimental data.
For comparison, the recent discovery of the Higgs boson, which confirmed and strengthened the Standard Model of Particle Physics, the detection of Gravitational Waves, which confirmed what General Relativity predicted a century ago, and the recent spectacular interferometric image of the event horizon around a supermassive black hole, which confirmed the formation of trapped surfaces in the Space-Time fabric, have vitalised these very interesting fields of research, opening the doors to new disciplines such as Multi-Messenger Astronomy [29].
However, we believe that a giant leap is now possible also in the difficult experimental task of investigating the texture of Space on the minuscule scales relevant to Quantum Gravity. In the following we show how technological progress in Space Sciences, and the enormous reduction in the costs necessary to bring detectors into space, allow us to conceive an ambitious experiment to verify, for the first time, directly, some of the most important consequences of the existence of a discrete structure in the texture of space. To put it suggestively, twenty-five centuries after the meeting of the Eleatic philosophers with Socrates in Athens, we are able to investigate the problem raised by Zeno in a quantitative way.
In particular, in line with the suggestions outlined in some pioneering works in the field of experimental investigation of Quantum Gravity [30] [31], we propose an ambitious albeit robust experiment to directly search for the tiny delays in the arrival times of photons of different energies determined by the dispersion law discussed above. Given the hugeness of the Planck Energy we expect, as will be shown in § 7, delays of a few µs for Gamma-ray Burst (GRB) photons that have travelled for more than ten billion years! These numbers show, in themselves, the difficulty and ambitiousness of the proposed experiment. We would like to emphasise here, however, that even a null result, that is, solid proof of the non-existence of a linear term in the photon dispersion law for energies normalised to the Planck scale, would constitute a result of capital importance for the progress of fundamental physics. After all, the Michelson and Morley experiment [28], decisive for the acceptance, in an understandably conservative scientific community, of the revolutionary ideas on space and time implied by the Theory of Relativity, provided a null result with respect to the possibility of identifying motion with respect to the Cosmic Aether! Indeed, we think that the first-order dispersion relation has not yet been investigated with due accuracy. In particular, our major concern is possible intrinsic delays (characterising the emission process) imprinted over the tiny quantum delays. This is particularly evident in the caveats discussed in [32] on GRB 090510 and, more recently, in the papers by [33] and [34], who set a robust constraint on LIV, of a few 10¹⁷ GeV, using Fermi-LAT GRB data. Further indications of no LIV come from the HESS collaboration, in particular from the spectral analysis of the blazar Mrk 501 [35], although in this case too a spectral shape and hypotheses on the emission process are assumed.
Moreover, all these analyses assume a dependence of the effects on redshift that was conjectured in the pioneering paper [36]; however, as theorists acquire the ability to test the Jacob-Piran conjecture in explicit models, it is often found that other forms of redshift dependence apply [37]. In our opinion, given the importance of the question, a robust direct measurement cannot be based on the analysis of a single object: what is required is a robust statistical analysis of a rich sample of data in which the natural timescale of the LIV-induced delays in the gamma-ray band (one microsecond) is thoroughly searched. None of the experiments discussed above had the right combination of time resolution and collecting area to effectively scrutinise this regime.
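The microsecond figure quoted above follows from simple arithmetic; a back-of-envelope sketch, with ξ = 1 and illustrative choices of a 100 keV band separation and ten billion years of light travel:

```python
# Back-of-envelope estimate of the Planck-scale delay for GRB photons
# (linear dispersion, xi = 1; all numbers illustrative).
E_planck_eV = 1.22e28            # Planck energy
delta_E_eV = 1e5                 # 100 keV separation between energy bands
travel_time_s = 10e9 * 3.156e7   # ten billion years of light travel [s]

delay_s = (delta_E_eV / E_planck_eV) * travel_time_s
print(f"expected delay ~ {delay_s * 1e6:.1f} microseconds")  # ~ 2.6 us
```

This is why sub-microsecond time resolution, combined with a huge collecting area, is the key requirement of the proposed experiment.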

GrailQuest and its scientific case in a nutshell
The coalescence of compact objects, neutron stars (NS) and black holes (BH), and the sudden collapse to form a BH, hold the keys to investigate both the physics of matter under extreme conditions, and the ultimate structure of space-time. At least three main discoveries in the past 20 years prompted such studies.
First, the arcmin localisation of GRBs (sudden and unpredictable bursts of hard-X/soft-γ rays with huge fluxes, up to 10^-2 erg/cm^2/s), enabled for the first time by the instruments on board BeppoSAX, allowed the discovery of their X-ray and optical afterglows [38,39], which led to the identification of their host galaxies [40]. This definitively confirmed the extragalactic nature of GRBs and assessed their energy budget, thus establishing that they are the most powerful accelerators in the Universe. Even accounting for strong beaming, the energy released can indeed attain 10^52-53 erg, a large fraction of the Sun rest-mass energy, in ≈ 0.1 − 100 seconds, produced by bulk acceleration of plasmoids to Γ ≈ 100 − 1000 [41,32].
Second, the Large Area Telescope (LAT) on board the Fermi satellite established GRBs as GeV sources, confirming their capability to accelerate matter up to Γ ≈ 100 − 1000 and allowing us to apply for the first time the program envisioned by Amelino-Camelia and collaborators at the end of the '90s [30] to investigate quantum space-time using cosmic sources.
Third, the recent discovery of the gravitational signal from several BH-BH and one NS-NS merger by Advanced LIGO and Virgo [42,43,44] opened a brand new window to investigate the astrophysics of compact objects as well as fundamental physics. The gravitational signal carries a huge amount of information on the progenitors and on the final compact object (masses, spins, luminosity, distance, etc.). Moreover, the current merger rate (in excess of 12 Gpc^-3 yr^-1) implies that the number of Gravitational Wave Events (GWEs hereafter) associated with the merging of two compact objects is significant.
These scenarios and limits will be further constrained and improved in the coming few years, when the sensitivity of the interferometers will be further improved and the corresponding volume for BH-BH events further enlarged. The coming on-line of a third interferometer, Advanced Virgo, in August 2017 has already greatly improved the localisation capability of the Advanced LIGO/Virgo system, producing error boxes of a few hundred deg^2, 10-100 times smaller than those provided by Advanced LIGO alone [44]. The localisation will shrink to a few tens of deg^2 with the advent of KAGRA.
In August 2017 the first NS-NS event was discovered by LIGO/Virgo [46], with an associated short GRB seen off-axis and detected first by Fermi/GBM and Integral/SPI-ACS [47] and, only nine days after the prompt emission, by Chandra [48]. The GBM provided a position with an uncertainty of ∼ 12 deg (statistical, 1σ, to which a systematic uncertainty of several deg should be added). The LIGO/Virgo error boxes led to the first identification of an optical transient associated with a short GRB and a GWE, opening de facto the window of multi-messenger astrophysics [49]. This can clearly add further astrophysical and cosmological key information on the GWE and GRB phenomena (e.g. [50]).
These considerations show that, in the near future, the prompt and accurate localisation of the possible transient electromagnetic counterparts of GWEs will be mandatory in order to fully exploit the power of scientific investigation of Multi-messenger Astronomy. Indeed, high sensitivity to transient events in the X-ray/Gamma-ray window, and their fast localisation with accuracies in the arc-minute range or below, are mandatory in order to point narrow-field instruments to scrutinise the electromagnetic counterparts of GWEs in the whole electromagnetic band.
Finally, GRB light-curves in different energy bands, in the X-ray/Gamma-ray window, with temporal resolution ≤ 1µs can be used to investigate a dispersion law for photons, predicted in some of the proposed theories of Quantum Gravity, as discussed in Section 1.
In summary, there are at least three broad areas that can/must be tackled in the next few years: 1. the accurate (arc-min/arc-sec) and prompt (seconds/minutes) localisation of bright transients; 2. the study of transients' hard X-ray temporal variability (down to the micro-second domain and below, i.e. three orders of magnitude better than the best current measures), as a proxy of the inner engine activity; 3. the use of fast high-energy transients to investigate the structure of space-time.
We will discuss these three broad themes in the next Sections. We devote the last Sections to describing our proposed approach to tackling the three main science themes listed above; it consists of a distributed instrument, a swarm of simple but fast hard X-ray detectors hosted by small/micro-satellites in low Earth orbit, the GrailQuest mission, specifically conceived to provide precise measurements on these themes.

Gamma-ray Burst fast variability
GRBs are thought to be produced by the collapse of massive stars and/or by the coalescence of two compact objects. Their main observational characteristics are their huge luminosity and fast variability, often as short as one millisecond, as shown by [51], both in isolated flares and in lower-amplitude flickering. These characteristics soon led to the development of the fireball model, i.e. a relativistic bulk flow where shocks efficiently accelerate particles. The cooling of the ultra-relativistic particles then produces the observed X-ray and γ-ray emission. One possibility to shed light on their inner engines is through GRB fast variability. Early numerical simulations [52,53,62] suggested that the GRB light-curve reproduces the activity of the inner engine. More recently, hydrodynamical simulations of GRB jets showed that, to reproduce the observed light-curves, fast variability must be injected at the base of the jet by the inner engine, while longer variations may be due to the interactions of the jets with the surrounding matter [54].
The most systematic searches for the shortest timescales in GRBs so far are those of [51], [55] and [56]. The first two works exploit rather sophisticated statistical (wavelet) analyses, while the latter performs a parametric burst deconvolution into pulses. [51] conclude that the majority of the analysed BATSE GRBs show rise times faster than 4 ms, with 30% of the events having rise times faster than 1 ms (observer frame). [55] use Fermi/GBM data binned at 200 µs (the original bin size of GBM data is 2 µs) and report somewhat longer minimum variability timescales than [51], but conclude that variability of the order of a few ms is not uncommon (although they are limited by the wider temporal bin size adopted, 200 µs, and much poorer statistics than in the BATSE sample). Systematically longer timescales are reported by [56], using data binned at 1 ms. This is not surprising, because direct pulse deconvolution requires the best statistics, which can hardly be obtained for the shortest pulses.

Synthetic Gamma-Ray Bursts
To estimate the accuracy obtainable from cross-correlation analysis, E_CC, we started by creating synthetic Long and Short GRBs with the following characteristics. The Long and Short GRBs considered have durations ∆t_Long = 25 s and ∆t_Short = 0.4 s, respectively. To simulate GRB variability with a timescale of ∼ 1 ms we assumed that each GRB results from the superposition of a great number of identical exponential shots of decay constant τ_shot ∼ 1 ms, randomly occurring at an average rate of λ_shot = 100 shots/s during the whole GRB duration. The amplitude of each exponential shot is normalised to give a flux of 8.0 counts/s/cm^2 in the energy band 50 ÷ 300 keV, while the background photon flux in the same energy band has been fixed at 2.8 counts/s/cm^2 (consistent with the typical background observed by Fermi GBM).
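The shot-superposition model described above can be sketched as follows (a minimal Python sketch; function and variable names are ours, not from any mission software):

```python
import numpy as np

rng = np.random.default_rng(42)

def synthetic_grb(duration_s, shot_rate=100.0, tau_shot=1e-3, dt=1e-4):
    """Superpose identical exponential shots (decay constant tau_shot),
    arriving as a Poisson process of average rate shot_rate over the
    whole burst duration, on a regular time grid of step dt."""
    t = np.arange(0.0, duration_s, dt)
    flux = np.zeros_like(t)
    n_shots = rng.poisson(shot_rate * duration_s)
    for t0 in rng.uniform(0.0, duration_s, n_shots):
        tail = t >= t0
        flux[tail] += np.exp(-(t[tail] - t0) / tau_shot)
    return t, flux

# Short GRB: 0.4 s of shots with ~1 ms decay time
t_short, flux_short = synthetic_grb(0.4)
```

The resulting profile can then be rescaled so that the source flux is 8.0 counts/s/cm^2 in the 50 ÷ 300 keV band, with a constant 2.8 counts/s/cm^2 background added.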

Fermi GBM Gamma-ray Bursts
To further investigate the method, we applied the same techniques to real data. Simulating, on short time scales (∼ 0.1 ms), unique transient events such as GRBs, starting from observed light curves, can be challenging when the effective area of the detector is so small that the statistics are fully dominated by the Poissonian fluctuations that unavoidably characterise the (quantum) detection process. In particular, if the number of counts detected within the given time scale is ≤ 1, quantum fluctuations of the order of 100% are expected. If, naively, the number of counts per bin is simply rescaled to account for an increased effective area, these quantum fluctuations can introduce a false imprint of 100% variability with respect to the original signal. No definitive cure is available for this problem, but it can be alleviated by rebinning and/or smoothing techniques. Although smoothing techniques allow the creation of light curves at any desired temporal resolution, correlation between subsequent bins is unavoidable. Cross-correlation techniques are strongly biased by these effects; therefore we opted for a more conservative method implying standard rebinning, in which the number of photons accumulated in each (variable-width) bin is fixed. After several trials and Monte-Carlo simulations, we found that 6 photons per bin preserves the signal variability while introducing undesired fluctuations not larger than ∼ 30%. Applying this rebinning technique to the GBM light curve (at the maximum time resolution of 2 µs) discussed above, we generated a variable-bin-size light curve. In order to produce a template for Monte-Carlo simulations, usable on any time scale, we linearly interpolated this light curve to create a functional expression (template) for the theoretical light curve.
We note explicitly that linear interpolation between subsequent bins is the most conservative approach, since it does not introduce spurious variability on any time scale.
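The fixed-counts rebinning and the interpolated template can be sketched as follows (our own illustrative implementation, assuming a plain array of photon times of arrival):

```python
import numpy as np

def rebin_fixed_counts(toas, counts_per_bin=6):
    """Variable-width rebinning: every bin contains exactly counts_per_bin
    photons, so Poisson fluctuations per bin are the same everywhere."""
    toas = np.sort(np.asarray(toas, dtype=float))
    n_bins = toas.size // counts_per_bin
    # left edges of each group of counts_per_bin photons, plus the last photon
    edges = np.append(toas[: n_bins * counts_per_bin : counts_per_bin],
                      toas[n_bins * counts_per_bin - 1])
    centres = 0.5 * (edges[:-1] + edges[1:])
    rates = counts_per_bin / np.diff(edges)      # count rate in each bin
    return centres, rates

def template(t, centres, rates):
    """Linear interpolation between bin centres: a light-curve template
    that can be evaluated on any time grid."""
    return np.interp(t, centres, rates)
```

A constant photon rate then yields a flat template, while bursts of closely spaced photons produce narrow bins with high rates.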
For a given temporal bin size, we amplified the GRB template previously described in order to take into account the overall effective area of the detector(s), and we used this value as the expected number of photons within the bin. Poissonian randomisation was then applied to produce a simulated light curve. The insets of Fig. 2 show the results of this process for the Long and Short GRBs described above, for a 10^-4 s time scale and an overall effective area of 100 square meters.

Cross-Correlation technique and Monte-Carlo simulations
Starting from the GRB light curves described above, we applied cross-correlation techniques to determine the time delay between two signals. Fig. 3 shows an example of a cross-correlation function obtained by processing two GRB light curves simulated using the template of the Short GRB observed by Fermi GBM (GRB120323507) previously described, rescaled to mimic a detector(s) with 100 square meters of effective area. In order to extract the temporal information on the delay, we fitted a restricted region around the peak of the cross-correlation function with an ad hoc model consisting of an asymmetric double exponential component (see inset in Fig. 3).
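The delay-extraction step can be sketched as follows (an illustrative Python implementation of the cross-correlation peak fit with an asymmetric double-exponential model; names and the size of the fitting window are our assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def asym_double_exp(tau, a, tau0, t_rise, t_decay, c):
    """Ad hoc peak model: exponential rise before tau0, exponential decay after."""
    return np.where(tau < tau0,
                    a * np.exp((tau - tau0) / t_rise) + c,
                    a * np.exp(-(tau - tau0) / t_decay) + c)

def cc_delay(lc1, lc2, dt, half_window=50):
    """Cross-correlate two evenly sampled light curves and fit the region
    around the CCF peak to extract a (sub-bin) delay estimate in seconds."""
    x = lc1 - lc1.mean()
    y = lc2 - lc2.mean()
    ccf = np.correlate(x, y, mode="full")
    lags = (np.arange(ccf.size) - (lc2.size - 1)) * dt
    i = int(np.argmax(ccf))
    sel = slice(max(0, i - half_window), i + half_window + 1)
    p0 = (ccf[i], lags[i], 10 * dt, 10 * dt, 0.0)
    popt, _ = curve_fit(asym_double_exp, lags[sel], ccf[sel], p0=p0,
                        maxfev=20000)
    return popt[1]                     # fitted peak position = delay estimate
```

Fitting the peak rather than taking the lag of the CCF maximum is what allows delay estimates finer than the bin size.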
To investigate the accuracy achievable by the method, for each GRB and a specific instrument effective area, we performed 1000 Monte-Carlo simulations in which two light curves generated by randomisation of the template are cross-correlated. For each cross-correlation function we then fitted the peak, extracting the delay between the light curves. From the overall distribution of delays we calculated the standard deviation, which we interpret as a realistic estimate of the accuracy of the time delay measured with the cross-correlation method. The left panel of Fig. 4 shows the distribution of delays obtained from 1000 Monte-Carlo simulations performed for the Long (GRB130502327) and the Short (GRB120323507) GRBs, assuming a total collecting area of 100 square meters. To proceed further in the analysis of the technique, we investigated the dependence of the cross-correlation accuracy on the effective area of the instrument, which determines the number of photons collected from the GRB. To do that, we performed 1000 Monte-Carlo simulations for two Short (one synthetic and one real) and two Long (one synthetic and one real) GRBs, simulating four different instrument collecting areas, i.e. 1, 10, 50 and 100 square meters, for a total of 16000 simulations. We emphasise that each simulation performed on time scales of microseconds requires the creation of tens to hundreds of millions of photons, to be allocated in light curves with tens of millions of bins, which are then cross-correlated in pairs. The overall process required a substantial computational effort, which translated into more than 6000 hours of CPU time on a multi-core (128 logical processors) server and several terabytes of storage.
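The Monte-Carlo estimate of E_CC can be sketched as follows (illustrative code, assuming a template given as an expected count rate per bin; for brevity the delay is estimated here at the bin level from the CCF maximum rather than with the peak fit):

```python
import numpy as np

rng = np.random.default_rng(7)

def ccf_delay(lc1, lc2, dt):
    """Bin-level delay estimate: lag of the cross-correlation maximum."""
    x = lc1 - lc1.mean()
    y = lc2 - lc2.mean()
    ccf = np.correlate(x, y, mode="full")
    return (int(np.argmax(ccf)) - (lc2.size - 1)) * dt

def ecc_montecarlo(template_rate, dt, area_cm2, n_sim=1000):
    """Standard deviation of the delays measured between pairs of Poisson
    realisations of the same template, rescaled to a detector of area
    area_cm2: an estimate of the cross-correlation accuracy E_CC."""
    expected = template_rate * area_cm2 * dt     # expected counts per bin
    delays = np.empty(n_sim)
    for k in range(n_sim):
        lc1 = rng.poisson(expected).astype(float)
        lc2 = rng.poisson(expected).astype(float)
        delays[k] = ccf_delay(lc1, lc2, dt)
    return delays.std()
```

Since the two realisations share the same template, the true delay is zero and the spread of the measured delays directly gauges the statistical accuracy.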
From the simulations of the synthetic GRBs (in the band 50 ÷ 300 keV) we obtained the following relations between the cross-correlation accuracy E_CC and the number of photons in the light curves N_ph: E_CC^Long = 0.014 µs ×

GrailQuest localisation capabilities
GrailQuest is designed to provide prompt (within seconds/minutes), arcmin-to-(sub)arcsec localisations of bright hard X-ray transients. This is the key to enabling the search for the faint optical transients associated with GWEs and GRBs, because their brightness quickly fades after the event. In the GrailQuest concept, localisation is achieved by exploiting the delays between the transient's photon arrival times on different detectors, separated by hundreds/thousands of km. Delays are measured by cross-correlating the source signals detected by the different instruments.
The working principle of GrailQuest can be easily understood by considering the analogy with radio interferometry.
In the case of radio interferometry, with N observing radio telescopes of average spatial separation d, the theoretical spatial resolution of the interferometric array results from the combination of N_tot = N × (N − 1)/2 statistically dependent couples of interferometers, each having an angular resolution capability of σ_θ,i ≈ f(α; δ)_i × σ_φ,i × λ / d, where f(α; δ)_i ∼ O(1) is a function that depends on the position of the source in the sky (α and δ are the right ascension and declination, respectively) with respect to the orientation of the baseline connecting the couple of antennas of the i-th interferometer, σ_φ,i is the uncertainty in the phase difference measurable by each couple of antennas, λ is the wavelength of the observation, and i = 1, ..., N_tot. It is important to note that the number of statistically independent couples is N_ind = N − 1. In practice, however, it is useful to consider the whole set of N_tot equations to minimise the a priori unknown systematic effects on one or more radio telescopes. This system of N_tot equations can be solved for the 2 unknowns α and δ, giving a statistical accuracy of σ_α ∼ σ_δ ≈ g(α; δ) × σ_φ × λ / (d × √(N − 3)), where g(α; δ) ∼ O(1) and σ_φ are suitably weighted averages of f(α; δ)_i and σ_φ,i, respectively. The factor σ_φ × λ represents the accuracy in the determination of the phases of the radio signal.
In the case of GrailQuest we can imagine that, because of the intrinsic variability of the signal of transient sources, we are able to determine the analogue of the factor σ_φ × λ by cross-correlating the signal recorded by each couple of detectors of the GrailQuest constellation and determining the cross-correlation delay ∆t_i. Indeed, since λν = c, and φ = ν dt ∼ ν ∆t for short signals (where c is the speed of light and ν is the light frequency), σ_φ × λ = ν σ_∆t λ = c σ_∆t, where σ_∆t is a suitably weighted average (over the whole ensemble of detectors) of the accuracy in the determination of ∆t_i. Therefore, the accuracy in the source position obtainable with a constellation of N satellites is σ_α ∼ σ_δ ≈ g(α; δ) × c σ_∆t / (d × √(N − 3)). Finally, we have to add in quadrature all the statistical errors in the determination of σ_∆t. In particular we have σ_∆t^2 = E_CC^2 + E_POS^2 + E_time^2, where E_CC is the error on the delay time given by the cross-correlation between the light curves recorded by two detectors, E_POS is the error induced by the uncertainty in the spatial localisation of the detectors, and E_time is the error on the absolute time reconstruction. For large N, we adopt the reasonable values g(α; δ) ∼ 1 and N − 3 ∼ N, and σ_α ∼ σ_δ = σ_θ, where σ_θ is the positional accuracy (PA hereinafter): σ_θ ≈ c σ_∆t / (d √N). The position and absolute time reconstruction provided by commercial GPS are of the order of ∼ 10 meters and 10-30 nanoseconds, respectively (the former corresponding again to a few tens of nanoseconds). Most likely, the error on the delay time inferred from the cross-correlation analysis is the largest term in the time delay uncertainty.
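As a proof-of-concept sketch, the inversion of the pairwise delays for the two unknowns (α, δ) can be written as a least-squares problem (illustrative Python; a plane wavefront, noiseless delays and an initial guess near the true position are assumed):

```python
import numpy as np
from scipy.optimize import least_squares

C_CM_S = 2.998e10   # speed of light in cm/s

def source_versor(alpha, delta):
    """Unit vector towards a source at right ascension alpha and
    declination delta (radians)."""
    return np.array([np.cos(delta) * np.cos(alpha),
                     np.cos(delta) * np.sin(alpha),
                     np.sin(delta)])

def triangulate(positions, delays, pairs, guess):
    """Fit (alpha, delta) to measured pairwise delays. positions is an
    (N, 3) array of satellite positions in cm; delays[k] is the delay
    measured between satellites pairs[k] = (i, j), with the convention
    delta_t_ij = (r_i - r_j) . n_hat / c."""
    def residuals(p):
        n_hat = source_versor(*p)
        model = np.array([(positions[i] - positions[j]) @ n_hat / C_CM_S
                          for i, j in pairs])
        return model - delays
    return least_squares(residuals, guess).x
```

With N satellites one can build all N(N − 1)/2 pairs, as in the text, so that a priori unknown systematic errors on individual detectors can be averaged down.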
Adopting N = 100 N_100 satellites for the constellation and d = 3 × 10^8 d_3000km cm (a mean separation of 3000 km), the formula above gives σ_θ ≈ 2 × (σ_∆t / 1 µs) × d_3000km^-1 × N_100^-1/2 arcsec. The PA calculated above includes statistical errors only. Systematic errors are likely to be important, but at this proof-of-concept stage we can conclude that localisation below the arc-minute level is feasible with the above parameter settings.
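The quoted scaling can be checked with a two-line estimate (our own back-of-envelope, using σ_θ ≈ c σ_∆t / (d √N) with g(α; δ) ∼ 1):

```python
import math

C_CM_S = 2.998e10                      # speed of light, cm/s

def positional_accuracy_arcsec(sigma_dt_s, d_cm, n_sat):
    """sigma_theta ~ c * sigma_dt / (d * sqrt(N)), converted to arcsec."""
    sigma_rad = C_CM_S * sigma_dt_s / (d_cm * math.sqrt(n_sat))
    return math.degrees(sigma_rad) * 3600.0

# 1 us delay accuracy, 3000 km mean baseline, 100 satellites
pa = positional_accuracy_arcsec(1e-6, 3e8, 100)    # ~2 arcsec
```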

High energy Transient localisation in the Multi-messenger Era
As of today, the observatories dedicated to the search and study of hard X-ray transients are the NASA Swift and Fermi, and the ESA INTEGRAL satellites.
Swift was launched in 2004 and is equipped with the wide field of view (FoV) Burst Alert Telescope (BAT), to localise transients, and with the narrow-field X-ray Telescope (XRT) and UV/Optical Telescope (UVOT), high-sensitivity telescopes for detailed observations of transient afterglows. BAT is a coded-mask instrument with a FoV of ∼ 1/6 of the full sky and a collecting area of about 0.5 m^2. It can provide GRB positions with 3-10 arcmin accuracy, depending on the GRB strength and position in the FoV. XRT is a Wolter-I X-ray telescope, with FoV ∼ 30 arcmin^2 and collecting area ∼ 200 cm^2, that can provide positions with arcsec accuracy for sources down to fluxes ∼ 10^-14 erg/cm^2/s. Swift has the unique capability of slewing from its original pointing position to the position of the transient in tens of seconds/minutes, to study the transient with its narrow-field telescopes.
INTEGRAL was launched in 2002 and is equipped with the wide field of view IBIS camera, with FoV ∼ 1000 deg^2 and collecting area ∼ 1 m^2. IBIS has a smaller FoV than BAT, but a better sensitivity, allowing the detection of fainter transients. Its positional accuracy is also slightly better than that of BAT (a few arcmin). In addition to IBIS, the anti-coincidence scintillators of SPI, the high-energy spectrometer on board INTEGRAL, can be used as an all-sky monitor to detect GRBs, although with essentially no localisation capability.
Fermi was launched in 2008 and hosts on board the GBM experiment, consisting of 12 NaI and 2 BGO scintillators, each with about 120 cm^2 of collecting area [57]. The GBM can provide GRB positions with an accuracy of ≳ 10 deg. Swift, INTEGRAL and Fermi are working nominally after more than 12, 14 and 9 years from launch, respectively, providing 3-10 arcmin positions (Swift, INTEGRAL) or 10-20 deg positions (Fermi) over a large fraction of the sky. Their predicted lifetimes would extend the missions through the second decade of the 2000s, but of course all their equipment is ageing and it is not known how long they will survive after 2020. This time window is crucial for two main reasons.
1. Advanced LIGO/Virgo will reach their final sensitivity and best localisation capability for GWEs in a few years. KAGRA will join the network by the end of 2019. However, the coming on-line of a fifth interferometer, LIGO-India (expected in 2025), will be necessary to provide positions of a large fraction of GWEs with accuracy smaller than 10 degrees. On the other hand, the improved sensitivity will increase the distance at which an event can be observed to several Gpc for BH-BH events and hundreds of Mpc for NS-NS events, thus increasing the cosmic volume probed. The number of optical transients in such huge volumes is from many tens to several hundreds, making it difficult to identify the one associated with the GWE. The number of high-energy transients in the same volume is much smaller, greatly helping the identification. It is instructive to consider the first identification of an electromagnetic transient with a GWE, which occurred on August 17 2017. The Fermi GBM observed a gamma-ray burst within a few seconds of the GW detection. The combined LIGO/Virgo error box was of the order of 30 deg^2 (Abbott et al. 2017d). However, the LIGO/Virgo detection indicated a very close event (∼ 40 Mpc), greatly limiting the number of target galaxies. An optical transient from one of these nearby galaxies was soon discovered. Two key elements thus allowed the discovery and localisation of the optical transient associated with the GWE: a) the prompt γ-ray detection by the Fermi GBM (and also INTEGRAL), and b) the relatively limited volume that had to be searched. For fainter events, further away, such as those that will likely be provided by ground-based interferometers during the 2020s, the volume to be searched will be much larger.
The third observing run of LIGO and Virgo already showed that, for events more distant than GW170817 and with large sky-localisation uncertainties, a well-localised high-energy counterpart becomes crucial to detect the multi-wavelength signal and identify the host galaxy. The third generation of gravitational wave detectors, e.g. the Einstein Telescope, is expected after 2030; at that time the localisation of possible GRB counterparts will be crucial (see [65]). GrailQuest will be fundamental in this respect. 2. At the end of the 2020s, ESA will launch its L2 mission Athena, carrying the most sensitive X-ray telescope and the highest energy resolution detector (XIFU) ever built. Among the core Athena science goals are spectroscopic observations of bright GRBs, used as light-beacons to X-ray the intergalactic medium (IGM). These observations may lead to the discovery and characterisation of the bulk of the baryons in the local Universe, in the form of a warm IGM (a few million K), through absorption-line spectroscopy (e.g. Fiore et al. 2000). Athena will also target high-z GRBs, to assess whether they are the end points of the elusive Pop III stars (through measurements of the abundance pattern expected from the explosion of a star made only of pristine gas).
For these reasons several missions aimed at localising fast high energy transients have been and will be proposed to NASA (Midex class) and ESA (M class), to guarantee that the study of these elusive sources can be operative and efficient during the next decades. GrailQuest will offer a fast-track and less expensive fundamental complement to these missions, since it will be an all-sky monitor able to spot transient events everywhere in the sky and to give a fast (within minutes) and precise (from below 1 deg to arcsec, depending on the GRB flux and time variability) localisation of the event. This is extremely important to allow follow up observations of these events with sensitive narrow field instruments of future complex and ambitious missions in all the bands of the electromagnetic spectrum (from radio to IR/Optical/UV and to X and gamma-rays).
The main parameters affecting the discovery space in this area are: 1) the number of events with good localisation; 2) the quality of the localisation; 3) the promptness of the localisation. GrailQuest will ensure all three and will be fundamental to thoroughly study the electromagnetic counterparts of GWEs.
6 GrailQuest constellation as a single instrument of huge effective area
Once the times of arrival (ToAs) of the photons in each detector of the GrailQuest constellation are corrected for the delays induced by the position of the GRB in the sky, as deduced from the optical identification of the counterpart, it is possible to add all the photons collected by the N detectors of the constellation to obtain a single light curve, equivalent to that of a single detector of effective area A_tot = N × a, where a is the effective area of each detector. In doing this, an error in the ToA of each photon is introduced, because of the error in the position in the sky. However, since the optical counterpart will be known to within 1 arcsec or below, the induced errors in the ToAs are negligible.
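The ToA correction and stacking described here can be sketched as follows (illustrative Python; a plane wavefront from the known source direction is assumed):

```python
import numpy as np

C_CM_S = 2.998e10   # speed of light in cm/s

def stack_photons(toa_lists, positions, n_hat):
    """Refer each detector's photon ToAs to a common origin by removing the
    plane-wave term: a satellite at position r sees the wavefront
    r . n_hat / c earlier than the origin, so that term is added back.
    The merged, sorted list behaves like the output of one detector of
    effective area A_tot = N * a."""
    corrected = [np.asarray(toas, dtype=float)
                 + (np.asarray(pos) @ n_hat) / C_CM_S
                 for toas, pos in zip(toa_lists, positions)]
    return np.sort(np.concatenate(corrected))
```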
7 Transients as tools to investigate the structure of space-time
As discussed in § 1, several theories proposed to describe quantum space-time predict a discrete structure for space on small scales, ℓ_min ∼ ℓ_P. For a large class of these theories this space discretisation implies the onset of a dispersion relation for photons, which could be related to a possible break or violation of Lorentz invariance on such scales. Special Relativity postulates Lorentz invariance: all observers measure the same speed of light in vacuum, independent of photon energy, which is consistent with the idea that space is a three-dimensional continuum. On the other hand, if space is discrete on very small scales, it is conceivable that light propagating in this lattice exhibits a sort of dispersion relation, in which the speed of photons depends on their energy. These LIV models predict a modification of the energy-momentum "dispersion" relation of the form E^2 = p^2 c^2 + m^2 c^4 + p^2 c^2 ∆_QG(E, p^2, M_QG), where E is the energy of a particle of (rest) mass m and momentum p, and M_QG = ζ M_P is the mass at which quantum space-time effects become relevant, where ζ ∼ 1, and (since Special and General Relativity were thoroughly tested in the last century) lim_{E/(M_QG c^2) → 0} ∆_QG(E, p^2, M_QG) = 0 (see e.g. [67]). In a very general way, the equation above can be used to determine the speed of a particle (in particular a photon). Moreover, when two photons of different energies, E_2 − E_1 = ∆E_PHOT, emitted at the same time, travel over a distance D_TRAV (short with respect to the cosmic distance scale, i.e. a distance over which cosmic expansion can be neglected, see below), because of the dispersion relation above they exhibit a delay ∆t_LIV.
It is conceivable to express this delay as a series expansion around its limit value 0 (since t_2 = t_1 = D_TRAV/c implies ∆t_LIV = 0) as ∆t_LIV = ± ξ × (D_TRAV/c) × [∆E_PHOT/(M_QG c^2)]^n, where ξ ∼ 1 is the coefficient of the first relevant term in the series expansion in the small parameter ∆E_PHOT/(M_QG c^2), and the sign ± takes into account the possibility (predicted by different LIV theories) that higher energy photons are slower or faster than lower energy photons (discussed as the subluminal, +1, or superluminal, −1, case in [58]). Note that ξ = 1 in some specific LIV theories (see e.g. [30,58], in particular their equation 13). The index n = 1 or 2 gives the order of the first non-zero term in the expansion. When the distance travelled by the photons is comparable to the cosmic distance scale, the term D_TRAV/c must be changed into D_EXP/c to take into account the propagation of a particle in an expanding Universe. The comoving trajectory of a particle is obtained by writing its Hamiltonian in terms of the comoving momentum ([36]). The distance travelled by the photons, in a general Friedmann-Robertson-Walker cosmology, is determined by the different mass-energy components of the Universe. These energy contents can be expressed in units of the critical energy density ρ_crit = 3H_0^2/(8πG) = 8.62(12) × 10^-30 g/cm^3, where H_0 = 67.74(46) km/s/Mpc is the Hubble constant (see Planck Collaboration, 2015, for the parameters and related uncertainties). Considering the different dependencies on the cosmological scale factor a, it is possible to divide the energy components of the Universe into matter (Ω_Matter ∝ a^-3), radiation (Ω_R ∝ a^-4), curvature (Ω_k ∝ a^-2) and dark energy (Ω_Λ ∝ a^-3(1+w)). With this notation it is possible to express the proper distance D_P at present time (or comoving distance) of an object located at redshift z as D_P = (c/H_0) ∫_0^z dz'/E(z'), where E(z) = [Ω_Matter (1+z)^3 + Ω_R (1+z)^4 + Ω_k (1+z)^2 + Ω_Λ (1+z)^{3(1+w)}]^{1/2}. On the other hand, the term D_EXP has to take into account the fact that the proper distance varies as the Universe expands.
Photons of different energies are affected by different delays along the path; because of cosmological expansion, a delay produced further back along the path amounts to a larger delay on Earth. This effect of relativistic dilation introduces a factor (1 + z) into the above integral ([36]).
In particular, in the so-called Lambda Cold Dark Matter (ΛCDM) cosmology the following values are adopted (Planck Collaboration, 2015): H_0 = 67.74(46) km s^-1 Mpc^-1; Ω_k = 0 (curvature k = 0), which implies a flat Universe; Ω_R = 0 (negligible radiation), which implies a cold Universe; w = −1, a negative-pressure Equation of State for the so-called Dark Energy, which implies an accelerating Universe; Ω_Λ = 0.6911(62) and Ω_Matter = 0.3089(62). Adopting as a firm upper limit for the distance of any GRB the radius of the visible (after recombination) Universe, D_P/c ≤ R_V/c = 1.4 × 10^18 s (in the ΛCDM cosmology), we find ∆t_LIV ≲ 1.1 × 10^-4 × ξ ζ^-1 × ∆E_PHOT^MeV s for n = 1, and ∆t_LIV ≲ 10^-26 s for n = 2, where ∆E_PHOT^MeV = ∆E_PHOT/(1 MeV). This shows that first-order effects (n = 1) would result in potentially detectable delays, while second-order effects are so small that it would be impossible to detect them with this technique.
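The expansion-weighted delay can be evaluated numerically (a sketch using the Planck values quoted above; ξ = ζ = 1 assumed):

```python
import numpy as np

H0 = 67.74 * 1.0e5 / 3.0857e24     # Hubble constant in s^-1
OMEGA_M, OMEGA_L = 0.3089, 0.6911  # flat LambdaCDM, Planck 2015
MP_C2_MEV = 1.22e22                # Planck energy in MeV

def expansion_weighted_path(z, n_steps=20000):
    """K(z) = int_0^z (1+z') dz' / E(z'): the (1+z') factor accounts for
    the cosmological dilation of delays accumulated along the path."""
    dz = z / n_steps
    zp = (np.arange(n_steps) + 0.5) * dz       # midpoint rule
    e = np.sqrt(OMEGA_M * (1.0 + zp) ** 3 + OMEGA_L)
    return np.sum((1.0 + zp) / e) * dz

def dt_liv(z, delta_e_mev, n=1, xi=1.0, zeta=1.0):
    """LIV delay (s) between photons separated by delta_e_mev (MeV) from a
    source at redshift z, at order n in delta_e / (zeta * M_P c^2)."""
    return xi * (delta_e_mev / (zeta * MP_C2_MEV)) ** n \
              * expansion_weighted_path(z) / H0

# e.g. a 1 MeV separation from z = 1 gives a delay of a few tens of us (n = 1)
```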
Therefore, it is possible to detect (or constrain) first-order effects in space-time quantisation by detecting (or setting upper limits on) time delays between GRB light curves in different energy bands. These quantum-space-time effects on the propagation of light are extremely tiny, but they accumulate along the way. GRBs are among the best candidates to detect the expected delays, since: i) the signal travels over cosmological distances; ii) the prompt spectrum covers more than three orders of magnitude in energy; iii) fast variability of the light curve is present at or below the one-millisecond level (see e.g. [30]). Such a detection could directly reveal, for the first time, the deepest structure of quantum space-time by gauging its structure in terms of the photon energies.
To better quantify this possibility, we considered a broad band, 5 keV − 50 MeV, covering a relevant fraction of the prompt emission of a typical GRB and within the energy range covered by NaI and BGO scintillators. Based on BATSE observations of GRB prompt spectra, an empirical function describing the photon intensity energy distribution, the so-called Band function, has been developed (Band et al., 1993). Note that single satellite failures will not be a problem, since these can easily be replaced with high-performance new versions. With the temporal triangulation technique described in the paragraph above, the position determination will be possible within minutes of the prompt event, allowing the prompt search for its counterpart at other wavelengths. Swift-BAT allows the localisation of GRBs occurring in the BAT field of view with an accuracy of tens of arcsec (FoV of 17 arcmin), and subsequent optical localisation (with the UVOT on board Swift) results in the determination of the redshift of the host galaxy for most long GRBs. In the same way, the fast and precise GRB localisation offered by GrailQuest will allow the determination of the optical counterpart and redshift for most long GRBs, and for those short GRBs for which an optical counterpart can be revealed. Since the counterparts of the furthest GRBs may fall in the IR band because of their high redshift, once a precise localisation of the source is given, they can be effectively searched for thanks to the synergy with e.g. the James Webb Space Telescope (operating in the IR band); this will allow the detection of GRBs with z > 10 (the current record is just above z = 9, [77]), opening a brand new window for high-redshift cosmology.
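For reference, the Band function mentioned above can be sketched in its standard parametrisation (Band et al. 1993); the parameter defaults below are purely illustrative:

```python
import numpy as np

def band_function(e_kev, amp=1.0, alpha=-1.0, beta=-2.3, e0_kev=300.0):
    """Band et al. (1993) empirical GRB photon spectrum (photons/s/cm^2/keV):
    a low-energy power law with exponential cutoff, joined smoothly to a
    high-energy power law at E_break = (alpha - beta) * e0_kev."""
    e = np.asarray(e_kev, dtype=float)
    e_break = (alpha - beta) * e0_kev
    low = amp * (e / 100.0) ** alpha * np.exp(-e / e0_kev)
    high = (amp * (e_break / 100.0) ** (alpha - beta)
            * np.exp(beta - alpha) * (e / 100.0) ** beta)
    return np.where(e < e_break, low, high)
```

The normalisation of the high-energy branch is chosen so that the two branches and their first derivatives match at the break energy.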
Moreover, if a dedicated mission such as THESEUS (selected as a candidate ESA M5 mission) is approved by ESA, it would be totally synergic with GrailQuest, since THESEUS may follow up both the soft X-ray localisations (obtained by THESEUS itself) and the harder X-ray (or soft gamma-ray) localisations obtained with GrailQuest. Moreover, in temporal resolution and effective area GrailQuest will be unique (by several orders of magnitude!) with respect to missions like THESEUS.
- Given the huge effective area, GrailQuest will be the ultimate experiment for prompt GRB physics. In this context we plan to produce a catalogue of GRB dynamic spectra over more than three orders of magnitude in energy (from 20 keV to 10 MeV) with unprecedented statistics and moderate energy resolution. Again, the combination of huge effective area and high time resolution will allow us to have enough photons in the high-energy band to follow the spectral evolution of the prompt emission on short timescales. This is particularly important to shed light on the complex and poorly studied details of the fireball models, on the mechanism through which ultra-relativistic colliding shocks release the huge amount of gamma-ray photons observed, and on the GRB inner engine. GRBs are thought to be produced by the collapse of massive stars and/or by the coalescence of two compact objects. Their main observational characteristics are their huge luminosity and fast variability, often as short as one millisecond. These characteristics soon led to the development of the fireball model, i.e. a relativistic bulk flow where shocks efficiently accelerate particles. The cooling of the ultra-relativistic particles then produces the observed X-ray and gamma-ray emission. While successful in explaining GRB observations, the fireball model implies a thick photosphere, hampering direct observations of the hidden inner engine that accelerates the bulk flow.
We are then left in the frustrating situation where we daily see at work the most powerful accelerators in the Universe, but are kept in the dark about their operation. One possibility to shed light on their inner engines is through GRB fast variability. Early numerical simulations [52,53], as well as modern hydrodynamical simulations [54] and analytic studies (e.g. [60]), suggest that the GRB light curve reproduces the activity of the inner engine. GRB light curves have been investigated in some detail down to 1 ms or slightly below [51,55], but the sub-ms window is basically unknown, as is the real duration of the prompt event. We do not know how many shells are ejected from the central engine, what the ejection frequency is, or what their length is. Pushing GRB timing capabilities by more than three decades should help in answering at least some of these questions. - To add polarimetric information on the sample of GRBs detected. [78] proposed to measure the linear polarisation of GRBs by comparing the asymmetry in the count rates, in different BATSE detectors, of the delayed component of photons Compton-backscattered by the Earth atmosphere. This technique might be applied to GrailQuest data by comparing the photons detected by different satellites oriented in different directions with respect to the Earth, and by exploiting the timing capabilities of its instruments; in this case the method will be much more effective. Polarisation will provide further valuable information of extreme interest for the fireball models. - To scrutinise the whole sky in search of X-ray and gamma-ray transients, even of very short duration. Despite its lack of imaging capabilities, GrailQuest will benefit from the fact that the background is relatively low at energies above a few tens of keV.
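To illustrate how a huge collecting area opens the sub-ms timing window, the following sketch (all rates and the analysis band are illustrative assumptions, not figures from this paper) simulates a Poisson light curve containing a 0.2 ms spike and shows that it produces a clear excess over the flat noise level of a Leahy-normalised power spectrum:

```python
import numpy as np

rng = np.random.default_rng(42)

# 1 s of data in 10 us bins: steady background plus one 0.2 ms spike.
dt = 1e-5                 # bin size [s]
n = 100_000               # number of bins (1 s total)
rate_bkg = 2e5            # counts/s: a bright GRB seen by a huge area
rate_spike = 5e6          # counts/s during the 0.2 ms spike
mu = np.full(n, rate_bkg * dt)
mu[50_000:50_020] += rate_spike * dt   # 20 bins x 10 us = 0.2 ms
counts = rng.poisson(mu)

# Leahy normalisation: pure Poisson noise averages to 2; coherent
# sub-ms structure adds excess power at low frequencies.
power = 2.0 * np.abs(np.fft.rfft(counts)) ** 2 / counts.sum()
excess = power[1:500].mean()   # average over the 1-499 Hz band
```

With only a small detector the spike would contribute a handful of counts and the excess would drown in the noise level of 2; the quadratic dependence of the signal power on the collected counts is what makes effective area the decisive parameter.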
The huge area will guarantee an unprecedented sensitivity, allowing the detection (signal-to-noise ratio > 1) of transient phenomena even on the shortest temporal scales, mitigating the effects of the quantum-detection process that blind our sensitivity when the number of detected photons is small. A large class of fast transients might exist that has remained undiscovered to date because of the small fluence associated with their short duration. In the radio band this has been the case for the recently discovered Fast Radio Bursts (FRBs, see [79] for a review). Indeed, some theories predict a high-energy counterpart to these compelling phenomena, and GrailQuest is the right instrument to search for these counterparts. In particular, high-energy counterparts are predicted in the context of Quantum Gravity [27]. In the same context it is possible that black holes hide a core of Planckian density, sustained by quantum gravitational pressure. As a black hole evaporates, the core remembers the initial mass and the final explosion occurs at a macroscopic scale. Under several rough assumptions, several short gamma-ray events per day, at energies around 10 MeV and with isotropic distribution, can be expected from a region of a few hundred light years around us. Further predictions can be made; in particular, the wavelength of the signal depends on the size of the black hole at the moment of the explosion: the farther the explosion, the smaller the black hole, and therefore the shorter the wavelength and the smaller the emission fluence. - To monitor all kinds of high-energy transients, both galactic and extra-galactic, such as the flaring activity of magnetars, outbursts of black hole or neutron star transients, and so on. The monitoring of the high-energy sky has been very important in the last years to discover new events and/or peculiar behaviours, and for a detailed characterisation of already known sources.
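The area-versus-duration argument can be made quantitative with a back-of-the-envelope Poisson signal-to-noise estimate; the sketch below uses purely illustrative numbers (the in-band background rate and fluxes are assumptions, not values from this paper):

```python
import math

def snr(flux_ph, area_cm2, duration_s, bkg_rate_per_cm2=0.3):
    """Poisson S/N for a transient: source counts over sqrt(total counts).

    flux_ph: source photon flux [ph / cm^2 / s];
    bkg_rate_per_cm2: illustrative in-band background [counts / cm^2 / s].
    """
    src = flux_ph * area_cm2 * duration_s
    bkg = bkg_rate_per_cm2 * area_cm2 * duration_s
    return src / math.sqrt(src + bkg)

# A 1 ms spike of 10 ph/cm^2/s: marginal for a 100 cm^2 detector,
# a strong detection for a 10^5 cm^2 distributed constellation.
small = snr(10.0, 100.0, 1e-3)
large = snr(10.0, 1e5, 1e-3)
```

Since both source and background counts scale linearly with area, the S/N grows as the square root of the total collecting area, which is exactly the quantity a modular fleet can increase almost without limit.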
GrailQuest will be a large-area all-sky monitor with good temporal and moderate energy resolution, able to add important information for the full understanding and thorough study of high-energy transients, whose behaviour may yield important advances in fundamental physics regarding strong gravity and extremely high-density matter. - To monitor the onset of Tidal Disruption Events (TDEs, hereafter) with fast variability. Tidal disruption events [81] are generally very luminous (often above Eddington) in the soft X-ray band, with an X-ray spectrum usually dominated by a thermal component at a few keV [82]. However, a sub-class of TDEs, called "jetted TDEs", is characterised by a much harder, non-thermal spectrum extending up to the gamma-ray band (see the prototypical case of Swift J1644+57 [83]). They are a fundamental tool to study the "onset" of AGN-like activity in otherwise quiescent black holes. Since most of the emission arises close to the black hole, they can be used to study relativistic phenomena such as precession induced by the black hole spin [84]. Also, they can serve as an important probe of hidden, sub-pc black hole binaries that are in the process of merging and are thus progenitors of LISA events [85]. Finally, TDEs also produce dim, but potentially detectable, gravitational wave emission [86] and might thus be important electromagnetic counterparts to a sub-class of gravitational wave sources. - To perform high-quality timing studies of known high-energy pulsators. The most interesting window in this field is certainly the population of millisecond pulsars (accreting and/or transitional and/or rotationally powered, see e.g. [80]) and the enigmatic gamma-ray pulsars. Millisecond pulsars often show (transient) X-ray and gamma-ray emission whose properties are not yet completely understood.
This emission may be caused by intra-binary shocks of the pulsar emission (consisting of both radiation and high-energy particles) with a wind of matter from the companion star. In this case, a modulation of the X-ray and gamma-ray emission with the orbital period is expected and may be searched for with GrailQuest. Also, the orbital period evolution of these systems is very important to address in order to investigate their formation history and their connection with Low Mass X-ray Binaries, as envisaged by the recycling scenario. Orbital evolution may also be studied in high-inclination X-ray binary systems (containing black holes or neutron stars) where periodic signatures (such as dips and/or eclipses) are observed. Despite the lack of imaging capabilities and of any possibility of background rejection, GrailQuest is capable of detecting any (quasi-)periodic signal of known period thanks to folding techniques coupled with its huge collecting area. This makes this instrument an ideal tool to perform timing studies of any kind of high-energy (quasi-)periodic signal.
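The folding technique mentioned above can be sketched in a few lines: photon arrival times are folded at the known period and the resulting profile is tested against a flat distribution. All numbers below (period, rates, modulation) are illustrative assumptions:

```python
import numpy as np

def fold(times, period, nbins=16):
    """Epoch-fold photon arrival times at a known period; return the pulse profile."""
    phases = (times / period) % 1.0
    profile, _ = np.histogram(phases, bins=nbins, range=(0.0, 1.0))
    return profile

# Toy data: photons from a 400 Hz millisecond pulsar with a strong
# sinusoidal modulation, accepted/rejected by thinning a uniform stream.
rng = np.random.default_rng(0)
period = 0.0025
t = np.sort(rng.uniform(0.0, 10.0, 200_000))
keep = rng.random(t.size) < 0.5 * (1.0 + np.sin(2.0 * np.pi * t / period))
profile = fold(t[keep], period)

# Chi-square against a flat profile: large values flag a detection even
# when the folded signal is invisible in the raw light curve.
chi2 = ((profile - profile.mean()) ** 2 / profile.mean()).sum()
```

Because the chi-square statistic grows linearly with the number of folded photons, a huge collecting area directly translates into sensitivity to ever weaker periodic signals, with no imaging or background rejection required.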

Detector description
The key requirements for a detector in the GrailQuest context are an active area of the order of 1000 cm², precision event timing down to 10-100 ns, a continuous energy band for X-rays and soft gamma-rays from a few keV to some MeV, and moderate energy resolution, all in a robust assembly suitable for the space environment. A technique for X/gamma detectors widely used in countless space experiments, but continuously renewed thanks to evolving technology, is based on scintillator materials coupled to suitable photodetectors and electronics. Nowadays, inorganic scintillator materials like Lanthanum Bromide (LaBr3:Ce) or GAGG (Gadolinium Aluminium Gallium Garnet) combine high scintillation light output with fast response (tens of ns) and high efficiency. The choice of the scintillator can already today span a number of materials whose characteristics, when combined with a fast and efficient photodetector, fulfil the GrailQuest project requirements. The selection criteria can then take into account parameters like the intrinsic low background of the material, non-hygroscopicity, cost, and radiation damage. A fast photodetector for the readout of the scintillation light can be a Photomultiplier Tube (PMT) or a solid-state Silicon Photomultiplier (Si-PM), both devices having a response to a light pulse that can be contained within a few ns. Alternatively, Silicon Drift Detectors (SDDs) can be used to read out the scintillation light, with timing capabilities of the order of tens of ns. Despite their relatively slower response to light pulses, SDDs have several advantages with respect to Si-PMs, namely higher robustness against the radiation environment and higher efficiency (90% against 20-30%). Both kinds of devices, when optically coupled to the above-mentioned scintillators, allow efficient detection of X-rays down to ∼ 10 keV and even below this energy.
The criteria for the choice of the photodetector can take into account the size and ruggedness of the device, its ageing in the space radiation environment, and its availability for mass production. The architecture of a GrailQuest detector can be organised so that the detector itself is divided into modules of 10-100 cm² each, so that the whole detector can be assembled to the necessary size by adding modules; this will also ease the processing of intense impulsive events by reducing the pile-up of signals in the same module.
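The pile-up benefit of modularity can be estimated with a simple Poisson-arrival model: the chance that a second photon lands in the same module within one resolving window falls exponentially with the per-module rate. The window length and rates below are illustrative assumptions, not design figures from this paper:

```python
import math

def pileup_fraction(total_rate_hz, n_modules, window_s=100e-9):
    """Probability that a count in one module overlaps another count
    in the same module within the resolving window (Poisson arrivals)."""
    rate_per_module = total_rate_hz / n_modules
    return 1.0 - math.exp(-rate_per_module * window_s)

# A very bright burst delivering 10^6 counts/s over the whole detector:
monolithic = pileup_fraction(1e6, 1)      # single large detector
modular = pileup_fraction(1e6, 100)       # 100 independent modules
```

Splitting the same area into 100 independently read-out modules reduces the piled-up fraction by roughly two orders of magnitude, which is why the modular architecture keeps the timing clean precisely during the brightest, scientifically most valuable events.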

Conclusion: GrailQuest mission concept
The scientific and technological requirements presented above naturally drive the design of the GrailQuest concept. The full GrailQuest constellation will include of the order of hundreds to thousands of detectors hosted on micro/small satellites in Low Earth Orbit, with an average separation between the modules of thousands of km. The detectors will consist of scintillators with high sensitivity in the keV-MeV band and a temporal resolution of 10-100 ns. The field of view of each detector will be several steradians. A trade-off between the number of units recording the same event and the portion of the sky monitored must be performed.
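These numbers fix the order of magnitude of the achievable localisation accuracy. For delay-based triangulation, the width of the positional annulus scales as δθ ≈ c·σ_t/d in the best case (source near the plane normal to the baseline); the sketch below plugs in a 100 ns timing accuracy and a typical low-Earth-orbit baseline (both taken from the ranges quoted in this paper, the geometry factor being an assumption):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def loc_accuracy_arcsec(sigma_t_s, baseline_m):
    """Best-case annulus width for delay triangulation: delta_theta ~ c*sigma_t/d."""
    return math.degrees(C * sigma_t_s / baseline_m) * 3600.0

# 100 ns delay accuracy over a ~7000 km baseline between two units:
acc = loc_accuracy_arcsec(100e-9, 7.0e6)
```

The result is just below one arcsecond, consistent with the arc-second positional accuracies claimed for the full constellation; with many baselines the intersecting annuli pin down a single sky position.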
The biggest advantages of GrailQuest with respect to standard High Energy Astrophysics experiments are three: 1. modularity; 2. timing accuracy; 3. limited cost and quick development. The first allows: a) flying first a reduced version of GrailQuest (say 4-12 units, the GrailQuest pathfinder) to prove the concept; b) avoiding single (or even multiple) point failures: if one or several units are lost, the constellation and the experiment are not lost; c) testing the hardware with the first launches and then improving it, if needed, with the following launches. The second allows GrailQuest to open the new window of microsecond variability in bright transients. Finally, GrailQuest will exploit commercial off-the-shelf hardware and the trend of decreasing costs for both manufacturing and launching nano-satellites over the next years. GrailQuest would naturally fit a scheme where the production of identical units follows the development and testing of a first test unit. The development of engineering and qualification models, and all tests at the level of critical components, will be performed only for the test unit. For the other units only the flight model will be realised, and these units will be tested only at the system level. All this will bring costs down and speed up the realisation of the full mission.

Synergy with other on going projects
Some of the authors of this paper are developing the High Energy Rapid Modular Ensemble of Satellites, HERMES, pathfinder experiment. HERMES pathfinder consists of six nano-satellites of the 3U class, each equipped with a payload consisting of GAGG scintillators coupled with SDDs, with a collecting area of about 55 cm² per payload. The main goals of HERMES pathfinder are to prove that GRB prompt events can be efficiently and routinely observed with detectors hosted by nano-satellites, and to test GRB localisation techniques based on the study of the delays in photon arrival times at detectors in low Earth orbit. The HERMES pathfinder payload will also test the fast timing techniques that are at the core of the GrailQuest project. The design performance of the HERMES pathfinder detectors is at the level of 300 ns, 5-10 times better than most current and past GRB experiments. HERMES pathfinder is funded by the Italian Space Agency and by the European Community through the HERMES-SP H2020 SPACE grant. More information on HERMES pathfinder can be found at www.hermes-sp.eu and hermes.dsf.unica.it.
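The delay-measurement technique that HERMES pathfinder will test can be illustrated with synthetic data: the same burst profile is observed by two satellites with a relative time shift, and the shift is recovered by cross-correlating the binned light curves. The burst shape, count rates, and bin size below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two light curves of the same burst, 0.1 ms bins, Gaussian-shaped pulse
# on a flat background; satellite B sees it 37 bins (3.7 ms) later.
x = np.arange(2000)
rate = 50.0 + 2000.0 * np.exp(-(((x - 800) / 40.0) ** 2))
shift = 37
a = rng.poisson(rate)
b = rng.poisson(np.roll(rate, shift))

# Cross-correlate over a range of trial lags; the peak gives the delay.
lags = np.arange(-100, 101)
cc = [np.dot(a, np.roll(b, -k)) for k in lags]
best = int(lags[int(np.argmax(cc))])
```

With bright bursts the cross-correlation peak recovers the delay to within a bin or two; the achievable sub-bin accuracy improves with the photon statistics and with the sharpness of the burst variability, which is why huge area and fast time tagging are the two drivers of the triangulation performance.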