1 Introduction

1.1 Quantum technologies

Even though quantum physics is over a century old, the last twenty years of progress in science and technology have led to a tremendous level of control over quantum systems at the most elementary level. It is now possible to routinely prepare, trap, manipulate and detect single quantum particles, such as artificial and natural atoms, electrons, and photons. Together with the ability to create and control distinctly quantum states, such as superposition and entangled states, this second quantum revolution facilitates the engineering of new classes of sensors, communication techniques, and computers with unprecedented capabilities.

Quantum Technologies (QT) allow the engineering of novel devices and infrastructures that promise many new applications across a number of domains and can contribute to solving some of today’s most pressing social and economic challenges. These technologies offer capabilities beyond any classical technique, for example higher sensitivity, lower power consumption, and secure, automatic, maintenance-free, quantum-referenced operation for more reliable industrial facilities. Furthermore, QT paves the way for novel methods, for instance for Earth surveys in times of climate change, for the exploration of natural resources, and for information transmission and processing; with respect to the last item in particular, it enables unprecedented security in communication. QT-based applications are approaching the market and will be a pivotal factor for success in a wide and diverse range of industries and businesses. These technologies are vital to European independence and safety, as they affect information processing, storage, transmission and security at large.

We argue in this article that standardisation, and the mapping of standardisation opportunities, at a relatively early stage of the technology value chain is beneficial. Among research communities we sometimes observe a reluctance to engage in standardisation activities at an early stage of the Technology Readiness Level (TRL) scale. Indeed, in bridging research and innovation with markets, standardisation plays a fundamental role in the valorisation of research results and in knowledge transfer, particularly for mature research fields. However, for many scientists it remains unclear how standardisation can benefit science at a low TRL. The establishment of the CEN-CENELEC Focus Group on Quantum Technologies (FGQT), which was tasked with roadmapping, therefore took place at a comparatively early moment: the TRL varies starkly among the different technologies packed under the umbrella of quantum technologies, with several still at a fairly low level.

This article highlights those domains in which focus-group stakeholders see that standardisation activities would be beneficial. While traditionally standards are made by industry for industry, in the context described here science is also an important beneficiary of standardisation activities. Using standardisation as a means of valorisation and knowledge transfer, researchers are motivated to view their research results in a technology context in which interoperability and system integration are important factors.

1.2 Standardisation as a catalyst for innovation

Traditionally, standardisation has often been perceived as standing in contradiction to innovation [1]. On the contrary, standardisation is one of the most adequate and powerful tools to quickly capitalise on and disseminate knowledge, have it implemented in industry, and thus transfer research results to the market. In addition, the standardisation process as such is a knowledge-sharing and knowledge-production process, because it serves as a common platform for actors with heterogeneous backgrounds, capacities and knowledge, i.e. research, industry, academia, public administration, and the wider society.

The European standards developing organisations CEN-CENELEC and ETSI define a standard as “a document, established by consensus and approved by a recognised body that provides, for common and repeated use, rules, guidelines or characteristics for activities or their results, aimed at the achievement of the optimum degree of order in a given context. Standards should be based on consolidated results of science, technology and experience, and aimed at the promotion of optimum community benefits” [2].

Standards bring a number of benefits. They enable cost reduction and efficiency improvements; they ensure the quality, safety, and security of products and/or services; and they support compliance with relevant legislation, including EU regulations. Standards satisfy customer expectations and requirements, and enable access to markets and to customers in other countries. Standards achieve compatibility and interoperability between products and components and increase knowledge about new technologies and innovations [3].

Generally, standards are developed by groups of experts from industry and research. However, other interested parties, such as from policy and administration, or environmental or consumer protection, also participate in the development of relevant standards. The development can take place at different levels: If the experts convene on a national level, national standards are developed by a consensus-based process in National Standardisation Bodies (NSBs). However, the NSBs may also delegate experts to the responsible committees at European level (CEN, CENELEC) or international level (ISO, IEC), where they develop the technical content as representatives for the respective countries in consensus with other delegated experts. This makes it possible to combine the positions of different countries in one standard, securing the best possible outcome for all involved parties. Other standards developing organisations, like ETSI and ITU-T, form groups and committees from delegates of commercial companies and research organisations.

There are different types of standards, and the type actually being developed often depends on the maturity [4] of the innovation. For example, terminology standards and measurement standards are more likely to be developed at the beginning. As the TRL increases, performance, benchmark or interface standards are required. At the end of the innovation process, when the product has reached a certain market maturity, interface or certification standards become necessary. This is not a rule, but it provides a basis for considering which standards can be developed in which order. Nevertheless, standardisation is a public and voluntary process. This means that anyone with knowledge in a specific area can be empowered by their national standardisation body to set informed standards. The actual questions of which topic to standardise, and when, are decided independently by the community itself in a consensus-based process.

1.3 CEN-CENELEC focus group on quantum technologies

In 2018, the European Commission launched its large-scale (up to one billion euro) and long-term (10 years) Quantum Flagship research initiative to “(…) kick-start a competitive European industry in Quantum Technologies and to make Europe a dynamic and attractive region for innovative research, business and investments in this field” [5]. Its Strategic Research Agenda stresses that to achieve the goals of the Quantum Flagship “(…) it is necessary to accelerate the development and take-up by the market, which would be further enhanced through dedicated standardisation and certification efforts” [6].

In order to coordinate and support the development of relevant QT standards, the European standards developing organisation CEN-CENELEC kicked off its Focus Group on Quantum Technologies (FGQT) in June 2020 [7]. The group is developing its FGQT Standardisation Roadmap (publication planned for early 2023) to systematically address ongoing and prospective standardisation efforts. This activity evolves in conjunction with an identification of relevant use cases, potential QT-related transactions and supply chains, and specifically includes an analysis of which aspects of QT would benefit most from standardisation, and within which time frame. The FGQT currently has more than 100 members from industry, research, and administration, and operates on a European level. Nevertheless, it aims at interaction with other standards developing organisations and QT alliances world-wide, including ETSI, ITU-T, ISO/IEC, IEEE, IRTF, and QuIC. Another objective of the FGQT is the definition of terms of reference that would trigger actual standards development in technical committees. The authors of this article are all delegates and contributors to the FGQT.

1.4 This review article

This article is a review of insights gained by the FGQT during its work on standardisation relevant to quantum technologies.

Section 2 and Appendix A review related work, including ongoing and planned standardisation and pre-standardisation activities by other Standards Developing Organisations (SDOs) and industry forums. Section 3 provides a view on how the FGQT aims to structure the standardisation discussions. Next, it provides initial analyses in the areas of quantum communication, quantum computing & simulation, and quantum metrology, sensing & imaging. Section 4 and Appendix B present views from individual FGQT delegates, explaining their rationale of why their organisations are contributing to standards work for quantum technologies. Section 5 concludes by highlighting the early-stage work-in-progress nature of standardisation for quantum technologies.

2 Related work

Whereas the CEN-CENELEC FGQT is likely the first standards developing organisation (SDO) aiming at a standardisation roadmap for the entire spectrum of quantum technologies, it is definitely not the first organisation to address standardisation of quantum technologies. Appendix A provides an overview of past and current activities in the field being undertaken by SDOs and other organisations, on a European level and beyond, as well as references to published QT standards. This section summarises this related work, see also Fig. 1.

Figure 1: Standardisation activities on quantum technologies. Solid lines: standards development. Dotted lines: pre-standardisation activities

Quantum-computing standards development has started within ISO/IEC, focussing on quantum-computing terminology. IEEE is developing standards for quantum-computing performance metrics and technical architecture.

Quantum-communication standards are being developed by several SDOs. Both ETSI and ITU-T are telecom-industry driven SDOs. Both have a focus on Quantum Key Distribution (QKD), which may be the first application of quantum communication in general. Security standards for QKD are being developed by ETSI, ITU-T as well as ISO/IEC. IEEE and IRTF work on the management and control of a wider category of quantum-communication networks, including entanglement-based quantum networks and quantum internet.

Quantum-metrology standardisation is still in an emergent phase. EURAMET has recently started coordination activities for this.

Other relevant initiatives in the European context are QuIC, the European Quantum Industry Consortium, which supports European quantum-technology standardisation with industry surveys, and StandICT.eu, which provides funding to individual European ICT standardisers including some of the co-authors.

At the time of writing this article, DIN, the German National Standardisation Body (NSB), proposed to establish a European CEN-CENELEC Joint Technical Committee on Quantum Technologies, JTC22-QT, based on intermediate results from the FGQT. This new technical committee will develop and coordinate European standardisation activities on quantum computing, quantum communication, and quantum metrology, sensing and imaging. JTC22-QT will kick off in Berlin in March 2023.

3 Analysing standards needs for quantum technologies

3.1 General

This section provides ingredients for the standardisation roadmap that FGQT is currently developing.

As mentioned in the Introduction, the FGQT has been analysing standards needs for QT since mid-2020, and publication of a first release of its standardisation roadmap is planned for early 2023. Developing a standardisation roadmap is usually a complex process. There are many inputs to consider, including standards development elsewhere (Appendix A), other relevant roadmaps such as that of the European Quantum Flagship [6], and the interests of a plethora of stakeholders. All the work is carried out by volunteers, in the sense that CEN-CENELEC does not pay any of the delegates for their contributions. This means the work should be seen as an investment by the organisations that provide delegates. In some cases, this investment is augmented by European or national grants. The purpose of the roadmap is to coordinate and align interests between delegates to the point that next steps can be made, such as the start of actual standards development.

The “ingredients” to the FGQT roadmap development are of different types. Sections 3.2, 3.3, and 3.4 look at standardisation from the technological perspective for quantum computing & simulation, quantum communication, and quantum metrology, sensing & imaging, respectively. Section 3.5 looks at standardisation from the use-case perspective, discussing a selection of specific applications of quantum technologies. Section 3.6 provides an abstract market model that aims to connect standards needs to concrete products and services, as well as to their vendors and purchasers. Section 3.7 attempts to converge the storyline of the FGQT roadmap and to provide some structure to the various aspects of quantum technologies.

As all of this remains work-in-progress, none of the insights and views are complete or definitive.

3.2 Quantum computing & simulation

Quantum Computing and Simulation as an area covers many different implementations, and several enterprises are developing solutions for a mature quantum computer. The concept of a “Modular Quantum Computer”, well known from digital computing, has created a new market which has attracted many small enterprises to develop dedicated modules, competing with more monolithic full-stack organisations. The availability of a supply chain of such modules from different vendors will enable research teams to concentrate on breaking new ground, without spending much effort on duplicating known solutions. This is where standardisation can play an important role.

From a standardisation point of view, this market requires a subdivision of the field of Quantum Computing and Simulation into a variety of modules that can interwork with each other through well-defined interfaces (hardware and software), and a consensus on the functional and performance requirements of each module of interest. Instead of communicating such requirements to a single or small number of local suppliers, research teams can save effort by communicating them to relevant standardisation bodies. In return, this will increase the availability of mature hardware and software solutions, as well as knowledge of requirements and solutions from others.

It may be worth noting that a Quantum Simulator is a dedicated Quantum Computer, designed for solving specific problems or studying well-defined quantum systems. Quantum Simulators may be programmable up to a certain level. The modularity described in this paper covers both Quantum Computers and Quantum Simulators, since they use the same hardware components.

Hardware Stack. A first standardisation challenge that has been tackled is achieving consensus on how to subdivide the field of Quantum Computing into smaller and less-complex chunks. A convenient way of doing that is via a stack of layers. Figure 2 below illustrates the present state of consensus within FGQT of CEN-CENELEC, covering mainly the lowest-level (hardware) layers. The layering is chosen in such a manner that their functionality can be described independently and that the interworking between different layers can be described through well-defined interfaces at their boundaries. A module from a single supplier may cover functionality within a single layer or span multiple layers. In the latter case, interfaces may be virtual (hidden internally within a module).

Figure 2: A possible break-down of quantum computing into layered stacks, accounting for different architectures

So far, the following low-level hardware layers have been identified:

  1. Quantum device(s), which may include the housing, shielding, magnets, I/O connection, etc. around one or more holders with quantum devices.

  2. Control Highway, which may include the wiring and/or fibres, free-space optics, active devices (amplifiers), passive devices (attenuators, filters, couplers), opto-electronics (photo diodes), thermalisation means, vacuum feed-throughs and I/O connectors to implement a full I/O chain between quantum devices and control electronics.

  3. Control Electronics/Optics, which may include output generators, input analysers, signal processing, I/O connection, as well as low-level firmware to guide the generation of signals and the read-out of their response.

  4. Control Software, which may include calibration means and low-level code to translate instructions from higher software layers into commands for guiding the control electronics/optics.

The software layers above the control software are described later in this section. Some of the hardware layers are a mix of hardware and software, which is illustrated via different colours in Fig. 2.

The layered approach allows different hardware stacks to be used for specifying the requirements of dedicated architecture families. Each architecture family can have multiple members (A, B, C, …) and the description of its hardware layers (1, 2, 3, 4) may account for differences between these members. This is illustrated symbolically via different “boxes” per layer and family; boxes may be merged when different members share the same functionality.
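The family/layer structure described above can be captured in a small data model. The following Python sketch is purely illustrative; all class names, layer keys, and module descriptions are our own, not FGQT terminology:

```python
from dataclasses import dataclass, field

# The four low-level hardware layers identified so far.
HARDWARE_LAYERS = (
    "quantum_devices",      # layer 1
    "control_highway",      # layer 2
    "control_electronics",  # layer 3
    "control_software",     # layer 4
)

@dataclass
class ArchitectureFamily:
    """One architecture family with per-member layer descriptions.

    Members (e.g. "A", "B") may share a layer description, mirroring
    the merged "boxes" in Fig. 2.
    """
    name: str
    # layer -> {member -> module description}
    layers: dict = field(default_factory=lambda: {l: {} for l in HARDWARE_LAYERS})

    def set_module(self, layer, member, description):
        self.layers[layer][member] = description

trapped_ions = ArchitectureFamily("Trapped Ions")
trapped_ions.set_module("quantum_devices", "A", "room-temperature ion trap")
trapped_ions.set_module("quantum_devices", "B", "cryogenic (4K) ion trap")
# Members A and B share the same control-software box in this sketch:
for member in ("A", "B"):
    trapped_ions.set_module("control_software", member, "shared pulse compiler")

print(trapped_ions.layers["quantum_devices"]["B"])  # cryogenic (4K) ion trap
```

Such a registry makes explicit where two members diverge (and hence where separate interface requirements are needed) and where they coincide.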

So far, the following architecture families have been identified (in arbitrary order):

Cryogenic Solid State based. This family covers superconducting solutions (transmons, flux qubits), semiconductor spin qubits, topological qubits and artificial atoms in solids.

Room-Temperature Solid State based, which covers artificial atoms in solids (such as NV centres) and quantum dots.

Trapped Ions, covering both room-temperature and cryogenic (4K) solutions such as optical qubits, Raman qubits and spin (microwave) qubits.

Neutral Atoms, covering both collision-based and Rydberg-based solutions.

Photonic Quantum Computing, covering solutions based on continuous variables, cluster states/measurement-based schemes, the Knill-Laflamme-Milburn scheme, and Boson sampling.

This classification is currently work-in-progress within FGQT of CEN-CENELEC, and may be improved/revised as new insights arise.

A next step in standardisation is identifying functional requirements for modules within these layers, and there is no need to wait for the field to fully mature before starting this work. If we take cryogenic solid-state quantum computers as an example, one may consider the following types of requirements:

  • Quantum devices: The modules of this architecture family are typically operating at cryogenic temperatures and may be implemented stand-alone, as chip and/or on printed circuit board. Relevant requirements of general nature are related to shielding, materials compatibility, operating temperature, electrical and magnetic aspects, vacuum properties, and interconnectivity.

  • Control highway: These modules cover all infrastructure needed for routing light and microwave signals, RF and DC signals between the control electronics at room temperature and the quantum device at cryogenic temperatures. It is a mix of transmission lines, filtering, attenuation, amplification, (de)multiplexing, etc. Relevant requirements of general nature are related to low thermal conductance, good thermalisation, materials with low outgassing, vacuum feed-throughs, small footprint (for high-density i/o channels), interconnection, bandwidth, filtering properties, low noise, etc.

  • Control electronics: This covers all electronics for generating, receiving, and processing microwave, RF and DC signals. Some implementations make use of routing/switching and/or multiplexing of control signals at room temperature and at cryogenic temperatures. It may also include firmware on chip/board for signal preparation, control, and processing/read-out. Relevant requirements concern signal shapes and levels, sensitivity and functionality, as well as the instruction set and the software interface to higher layers.

  • Control software: This covers a mix of hardware and low-level driver software for instructing the control electronics, and should provide means for calibration. Relevant requirements concern the software interface to higher layers for receiving sequences of instructions about when, where and what pulses or signals are to be generated, how to process and read out the response, and means for performing calibration.
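As an illustration of the control-software requirements listed above (receiving instructions from higher layers, translating them into commands for the control electronics, and providing calibration means), here is a minimal Python sketch; every class name, method, and parameter value in it is hypothetical:

```python
# Hypothetical sketch of a control-software layer: it translates abstract
# gate instructions from higher layers into pulse commands for the control
# electronics, and exposes a calibration hook. Values are illustrative only.

class ControlSoftware:
    def __init__(self):
        # calibration table: gate -> (pulse amplitude, duration in ns)
        self.calibration = {"X": (0.5, 20), "CZ": (0.8, 40)}

    def calibrate(self, gate, amplitude, duration_ns):
        """Update the stored pulse parameters for one gate."""
        self.calibration[gate] = (amplitude, duration_ns)

    def compile_instruction(self, gate, qubits):
        """Translate one gate instruction into a pulse-command record."""
        amplitude, duration = self.calibration[gate]
        return {"channel": tuple(qubits), "amplitude": amplitude,
                "duration_ns": duration, "gate": gate}

ctrl = ControlSoftware()
ctrl.calibrate("X", 0.48, 18)             # e.g. after a re-calibration run
pulse = ctrl.compile_instruction("X", [0])
print(pulse["amplitude"], pulse["duration_ns"])  # 0.48 18
```

Standardising the shape of such an interface (instruction format in, pulse commands out, calibration hooks) is precisely what would let modules from different vendors interwork at this layer boundary.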

Software Stack. Another standardisation challenge that has been tackled is achieving consensus on how to subdivide the software stack on top of the hardware stack. The software layers above the control software layer may include operating system, communication primitives, software drivers, hardware abstractions, assembly / register level programming, high-level programming environments and applications/services supporting use cases. The higher software layers are assumed to be more agnostic to differences in hardware.

Figure 3 illustrates the present state of consensus within FGQT of CEN-CENELEC, covering the layers in the software stack. This layering is currently work-in-progress within FGQT of CEN-CENELEC, and may also be improved/revised as new insights arise. So far, the following software layers have been identified:

  • Control Software: The lowest layer in the software stack in Fig. 3 (control software) is the same layer as the highest layer in the hardware stack in Fig. 2. It may include calibration means and low-level code to translate instructions from higher software layers into commands for guiding the control electronics/optics, and it comprises the techniques used to define error-robust physical operations and the associated supporting protocols designed to tune up and stabilise the hardware. Control software for quantum hardware is typically stored on digital computers, i.e. there is a very strict separation between the place where the control software is stored and the quantum registers. In the long term, control software may work in concert with Quantum Error Correction (QEC), which is expected to sit at the assembly / register level programming layer, to provide broad coverage of various error types. More specifically, control software could improve the efficiency of QEC, i.e., reduce the resource overheads required for encoding, by homogenising error rates and reducing error correlations.

    Figure 3: A possible break-down of the software stack into layers

  • Operating system / communication systems: A quantum computer must be provided with an operating system (OS), which is a resource manager for the underlying quantum hardware, with built-in networking functions allowing multiple users and applications to use the resources as remote clients. To an application, it appears as if it has its own resources and is protected from other applications. Applications can only make use of facilities offered by the OS. For example, the OS provides communication primitives (e.g., based on the POSIX standard for the sockets interface [8]), and only by means of these primitives should it be possible to pass messages between client applications and the quantum computer.

  • Software drivers: These are components that are plugged into the operating system and allow hardware-abstraction programs to call the control software of the underlying quantum hardware. If the hardware changes, the software drivers must change as well.

  • Hardware Abstraction Layer (HAL): The hardware abstraction layer (HAL) should allow quantum computer users, such as application developers, platform and system software engineers and cross-platform software architects, to abstract away the quantum computer implementation details while keeping the performance. The hardware may change, but programs written in a quantum assembly language such as QASM, or even written in a register level programming layer, should still be able to work. Among all software layers for quantum computing, this is the one that requires the most urgent standardisation effort. The hardware abstraction layer should provide Application Programming Interfaces (APIs) to the upper layer, decoupling from the different types of quantum hardware technologies.

  • Assembly/register level programming: This layer concerns QASM (i.e., quantum assembly) languages that describe quantum computations according to one specific model (e.g., circuit model, measurement-based model, quantum annealing model), with a per-architecture instruction set. An example is OpenQASM [9], which targets IBM Q devices and enables experiments with small-depth quantum circuits. OpenQASM can define universal circuits, built up from gates like CNOT, and supports straight-line code that includes measurement, reset, fast feedback, and gate subroutines. OpenQASM possesses a dual nature as an assembly language and as a hardware description language. A different example is NetQASM [10], which is a platform-independent and extendable universal instruction set with support for local quantum gates, digital logic, and quantum networking operations for remote entanglement generation. NetQASM consists of a specification of a low-level assembly-like language to express the quantum parts of quantum network program code. Due to the huge diversity of quantum computing architectures, it is not likely that a unique, widely accepted QASM will emerge and later become a standard.

  • Programming layers: The specification of quantum algorithms using QASM languages is not easy for programmers. Indeed, QASM programs are usually generated by a software library, from a piece of code written in a common programming language, such as Python. In general, the programming layers include all the languages, libraries, and software development facilities (e.g., software development kits, debugging tools, quantum compilers) used by developers for coding quantum algorithms or high-level applications that use predefined quantum algorithms as subroutines. Quantum compilation is the problem of translating an input quantum circuit into the most efficient equivalent of itself, considering the characteristics of the device that will execute the computation and minimizing the number of required two-qubit gates. The most advanced quantum compilers are noise-adaptive, i.e., they take the noise statistics of the device into account.

  • Applications / Services supporting use cases: To effectively support industrial and research use cases, quantum applications must be executed in suitable environments. Currently, some vendors provide access to quantum devices via user-friendly cloud platforms. The quantum programs must be locally compiled for a specific device and submitted for batch processing to the remote platform. However, other paradigms are emerging. For example, the Quantum Internet will enable networked quantum applications, whose execution will involve multiple quantum nodes and will be characterised by interleaved digital and quantum message passing.
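The gate-minimisation task of quantum compilation, mentioned under the programming layers above, can be illustrated with a toy optimisation pass that cancels adjacent identical CNOT gates (CNOT is self-inverse, so two in a row act as the identity). This sketch is ours and is far from a real noise-adaptive compiler; the gate-tuple encoding is an assumption for illustration:

```python
def cancel_adjacent_cnots(circuit):
    """Remove pairs of identical adjacent CNOT gates (CNOT·CNOT = identity).

    `circuit` is a list of gates, each a tuple like ("CNOT", control, target)
    or ("H", qubit). Repeats until no further cancellation applies.
    """
    changed = True
    while changed:
        changed = False
        out = []
        i = 0
        while i < len(circuit):
            if (i + 1 < len(circuit)
                    and circuit[i][0] == "CNOT"
                    and circuit[i] == circuit[i + 1]):
                i += 2          # drop the cancelling pair
                changed = True
            else:
                out.append(circuit[i])
                i += 1
        circuit = out
    return circuit

prog = [("H", 0), ("CNOT", 0, 1), ("CNOT", 0, 1), ("CNOT", 1, 2)]
print(cancel_adjacent_cnots(prog))  # [('H', 0), ('CNOT', 1, 2)]
```

Real compilers apply many such rewrite rules, constrained by the device's qubit connectivity and, in the noise-adaptive case, by per-gate error statistics.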

3.3 Quantum metrology, sensing & imaging

Quantum Metrology & Sensing and quantum-enhanced Imaging (QMSI) exploit the properties of quantum states and peculiar phenomena, such as entanglement and non-classical correlations, to significantly improve the accuracy and precision with which parameters of a wide range of systems can be estimated, and to overcome limitations of conventional classical measurement strategies, such as the environment-induced noise from vacuum fluctuations (the so-called shot noise), the dynamically induced noise in position measurement (the standard quantum limit), or the diffraction limit [11–14].

Every aspect of physical reality, including the measuring devices used to extract information about our world, is governed by quantum mechanics. Although this imposes unavoidable fundamental limits on each measurement process, specifically those defined by the Heisenberg uncertainty principle together with other quantum constraints on the speed of evolution, the aforementioned conventional semi-classical bounds on measurement precision are not the ultimate limits and can be beaten using suitable quantum strategies. To this end, these new QMSI paradigms require the development of techniques robust to noise and imperfections, i.e. fit for real-world scenarios, spanning from very fundamental to very practical applications, and ranging from the nanoscale (by means of localised spins) to the planetary scale (based on photons).

QMSI systems and devices are very promising; in recent years an exponential growth of interest in applications of these technologies has been observed, now encompassing a broad range of topics [3, 15–18]. However, the diffusion of QMSI devices into the industrial or commercial world is still extremely limited. Prototypes already exist, but a mass market and standardisation of these prototypes do not. This is mainly due to the lack of reliable tools and structured facilities to characterise, test and validate current and future prototype devices. Moreover, readiness levels vary greatly among the different technologies in this field, and the maturity level is technology dependent.

From the standardisation perspective, as stated in [3], standards in the field of QMSI are understood in different ways. Quantum technology has enabled the measurement of fundamental quantities at new levels of precision and a redefinition of the primary reference standards of the International System of Units (SI) [19, 20]. To create a common ground of understanding, a first important aspect to clarify is that the metrology vocabulary defines the fundamental reference standards of weights and measures (SI), while the standardisation vocabulary generally refers to documentary standards.

In general, documentary standards represent a key step in fostering the spread of QMSI technology, both for stand-alone sensors and for those to be integrated into more complex systems. The standardisation process that has just begun aims to pave the way for the possible construction of a general hardware platform at a reasonable cost, a step that would represent a key enabler for these QTs, in particular in fields in which commercial products are emerging, e.g. quantum sensors based on NV centres in diamond, atomic-based sensors, etc.

So far, the two major domains of applications and related standardisation needs currently envisaged by the CEN-CENELEC FGQT are: (a) novel applications enabled by QMSI devices; (b) characterisation, benchmarking, and evaluation for reliable QT.

In analogy with the approach adopted for Quantum Computing and Simulation, the following layers can be established to subdivide the ensemble of standardisation issues concerning QMSI:

  1. Quantum device(s)

  2. Control electronics / optics / opto-mechanics

  3. Control software

The first focus would be the standardization of tools and technologies related to the quantum device, i.e. the quantum sensor. The central concept of a quantum sensor is that a probe interacts with an appropriate system, and this interaction changes the state of the probe. This effect, known as quantum back action, is intrinsically related to the physical nature of the measurement process: measurements of the probe reveal the parameters that characterise the system.

In quantum-enhanced sensors, the probe is generally prepared in a particular non-classical state. The best classical sensors exhibit a precision that scales as 1/√N, where N is the number of particles in the probe (i.e., the Standard Quantum Limit, SQL), whereas the best quantum sensors can in principle attain a precision that scales as 1/N (i.e., the Heisenberg limit).
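The scaling advantage can be made concrete with a small numerical sketch (numbers and function names are purely illustrative and not drawn from any standard):

```python
import math

def sql_uncertainty(n: int) -> float:
    """Phase uncertainty at the standard quantum limit, proportional to 1/sqrt(N)."""
    return 1.0 / math.sqrt(n)

def heisenberg_uncertainty(n: int) -> float:
    """Phase uncertainty at the Heisenberg limit, proportional to 1/N."""
    return 1.0 / n

# Compare the two limits for probes of increasing size.
for n in (100, 10_000, 1_000_000):
    gain = sql_uncertainty(n) / heisenberg_uncertainty(n)
    print(f"N={n:>9}: SQL={sql_uncertainty(n):.1e}  "
          f"Heisenberg={heisenberg_uncertainty(n):.1e}  gain={gain:.0f}x")
```

The quantum advantage thus grows as √N: a probe of a million particles could in principle be a thousand times more precise than its classical counterpart.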

From the point of view of the quantum device, a useful classification [15] of the different domains and applications for implementing QMSI protocols (over various QT platforms, ranging from atomic systems to solid-state and photonic devices) is schematized in the following:

  • Quantum Electronics: e.g. single-electron sources (for the SI-realization of the ampere, single-electron quantum optics, and realization of the “quantum metrological triangle”), Josephson junctions (for the quantum voltage standard), and the quantum Hall effect (for the quantum resistance standard)

  • Quantum Clocks: e.g. ultra-cold atoms used for optical atomic clocks and optical lattice clocks (for new time standards) and chip-scale atomic clocks (to address Global Navigation Satellite System resilience, network synchronization, time stamping, basic research, etc.)

  • Atomic Sensors: e.g. atom interferometers for gravimetry (for climate research, civil engineering, hydrocarbon and mineral exploration, GNSS-free navigation), magnetometers based on cold atoms or NV-centers in diamonds and nanodiamonds (for brain imaging, heart imaging, metrology, navigation), and atomic vapour cells (for high-precision electric, magnetic and RF measurements)

  • Quantum Photonics: e.g. single-photon sources and detectors, and twin-beam and squeezed light (for ultraprecise quantum interferometers, phase discrimination for quantum communication, super-resolution, sub-shot-noise imaging, quantum-enhanced microscopy, quantum-enhanced displacement sensing, quantum illumination and quantum radar/LiDAR, quantum reading, quantum ghost imaging and spectroscopy, quantum photometry and quantum-physics-based primary standards).

Each individual QT platform can cut across the various domains: e.g. the NV-center platform is adopted both for magnetometer atomic sensors and for single-photon sources in quantum photonics; ultracold-atom optical lattice clocks are at the same time genuine atom interferometers and atomic sensors leading to improved time and frequency standards; etc. Moreover, it is worth mentioning that the examples above include both SI measurement standards used for traceability and novel QT-based measurement devices.

The requirements for proceeding towards standardization are mostly platform-specific; for instance, for NV-centers in diamond one can outline the following list:

  • Quantum device(s): requirements relate to the synthesis technique of the base material (native defect concentration, spatial distribution, lifetime, coherence, lattice orientation), the doping technique and the resolution of the implantation (a major breakthrough being the achievement of deterministic implantation), and photo-luminescence properties (photon flux, spectrum)

  • Control Electronics/Optics/Opto-mechanics: requirements relate to the parameters for DC and RF signal generation and processing, microwave power and polarization, the geometry of the antenna delivering the microwave, and possible excitation pulse sequences

  • Control Software: requirements relate to the programming of measurement protocols and sequences, and to data display and storage

Concerning the domain of characterization, benchmarking, and evaluation for reliable QMSI devices, the CEN-CENELEC FGQT notes as a general aspect the complexity of these devices, which rely on the interplay of components from different partners. Constructing complex quantum devices from such components requires dependable characterization that yields a “reliable data sheet” for the component under test. This can be thought of in terms of a supply chain, one typical example being a foundry on one side and a system integrator on the other. Only with such a reliable data sheet can the quality and interoperability of QT components be guaranteed, a prerequisite for practical applications with a commercial perspective. Currently, offers for such independent characterization and testing capabilities are being developed in several national and European programmes [21, 22], NMIs [23] and the EMN-Q [15]. Testing and characterization capabilities need to be harmonised between testbeds. To put all this on a firm basis, standardised measurement protocols are needed. Ideally, this development goes hand-in-hand with the development of measurement capabilities and metrology for QT components, which in many cases do not yet exist.

Benchmarking these devices can be non-trivial, since the choice of performance indicators is not obvious and the indicators may be incompatible between different architectures. A set of standardised parameters and figures of merit allows different approaches to be benchmarked and compared with respect to the application at hand. Quantum sensing devices can then, for instance, be measured against physical standards. Furthermore, standardised measurement protocols facilitate the comparison of quantum sensing/imaging against classical devices, which is crucial for evaluating the prospects of the technology. Advancing the capabilities above further facilitates future certification of QT components.

In conclusion, from an organisational point of view, given the general and cross-cutting character of the QMSI field, the CEN-CENELEC FGQT emphasizes the need for a rationalization of efforts, to ensure that documentary standards already prepared by existing SDO WGs/TCs for first-generation quantum technologies (e.g. stable lasers, cryogenics, fibre technology, chip-scale optical frequency combs, high-speed phase-coherent control of RF/microwave fields, integrated circuits, micro- and nano-fabrication capabilities for diverse materials, etc.) will be considered and upgraded to take quantum-enabled technologies into account. In parallel, novel Technical Committees dedicated to quantum technologies should be created to oversee the standardisation process for the ‘native’ second-generation QT (e.g. miniature atomic clocks, chip-scale magnetometers, microwave field imaging, atomic gyroscopes and Rydberg gas sensors, waveguide photonics, etc.) [3, 15].

3.4 Quantum communication

A general feature of all classes of quantum communication is the utilisation of a quantum channel for communication purposes. A quantum communication channel enables the spatial distribution of quantum states with high fidelity over an optical fibre or, with the help of optical telescopes, through free space. Quantum communication can serve two purposes: either to perform a certain “classical” communication task more efficiently using a quantum channel, or to transfer quantum mechanical states over a distance for subsequent use in quantum computing or quantum metrology. While the first of these tasks in principle allows multiple applications, it has turned out that the only realistically efficient application to date is the simultaneous generation of cryptographic secrets at both ends of the communication link. In contrast to the usual methods of conventional cryptography, these secrets (keys) remain secure against potential attackers independently of the attackers’ resources. This cryptographic primitive is well known as Quantum Key Distribution (QKD). In this sense, the two seemingly different quantum communication tasks are both based on the distribution of specific resources over a distance using quantum channels. Both build on the transmission of quantum states; in QKD, however, these states have an auxiliary nature: it is the correlations between distant measurements of these states that are used to create the main resource, the cryptographic key, while the states themselves are discarded. In the second case, by contrast, the quantum states themselves are the resource relayed between the two communicating parties, either directly or indirectly. It should be underlined that the term “quantum communication” might sound misleading, as if it were a replacement for classical digital communication. Quite the opposite is true: quantum communication always needs to be supported by classical digital communication in order to enable the distribution of additional quantum-enabled resources.

Quantum Key Distribution. Quantum Key Distribution is a general method [24–26] entailing two steps:

  1. Utilisation of a quantum channel: distribution of quantum states end-to-end between two parties and ensuing distant measurements, yielding non-classical correlations in measured random data;

  2. Classical post-processing: subsequent iterative communication rounds between the distant parties, involving local processing of the measurement data to distil identical random bit strings, completely unknown to external third parties. These strings can then be used as cryptographic keys in conventional cryptographic applications.
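As an illustrative and heavily idealised sketch of these two steps, the following simulates the basis choices and the sifting part of post-processing for a BB84-style protocol. The channel is noiseless, there is no eavesdropper, and error correction and privacy amplification are omitted; all names are ours:

```python
import random

def bb84_sift(n_pulses: int, seed: int = 0):
    """Return Alice's and Bob's sifted bit strings (idealised model)."""
    rng = random.Random(seed)
    # Step 1: quantum channel -- Alice encodes random bits in random
    # bases; Bob measures in independently chosen random bases.
    alice_bits  = [rng.randint(0, 1) for _ in range(n_pulses)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_pulses)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n_pulses)]
    # Matching basis: Bob recovers Alice's bit exactly (noiseless);
    # mismatched basis: his outcome is uniformly random.
    bob_bits = [b if ba == bb else rng.randint(0, 1)
                for b, ba, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Step 2 (sifting only): publicly compare bases and keep the
    # positions where they match; on average half the pulses survive.
    keep = [i for i in range(n_pulses) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

alice_key, bob_key = bb84_sift(1000)
print(len(alice_key), alice_key == bob_key)
```

In a real system, comparison of a subsample of the sifted key would follow to estimate the error rate, and then error correction and privacy amplification would distil the final secret key.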

There is a huge diversity in QKD systems, which are realised and even industrially produced already today (several of these can be seen in the equipment rack in Fig. 4). With QKD, there are two main challenges: to ensure that the channel is really quantum (rather than some potentially flawed approximation), and to perform the classical post-processing stage faithfully and efficiently. Advanced protocols often utilise clever combinations of both stages to ensure the final objective: an efficient generation of an end-to-end secure key.

Figure 4

One of the central nodes in the Madrid QKD testbed, showing several QKD systems by different vendors in an equipment rack (Courtesy Vicente Martín, UPM Madrid)

Traditionally, QKD systems have used a point-to-point communication medium (e.g. an optical fibre) to transfer quantum states (pulses of light). For light, ensuring the quantum nature of the signal means that very weak pulses need to be used. Light is unavoidably attenuated with communication distance, which restricts the range to typically a few hundred kilometres at best. The distribution of cryptographic keys over longer distances can only be achieved by means of composite systems, connecting several QKD links through “secure locations” (or trusted nodes) as interconnection points. These composite infrastructures are known as QKD networks of the first generation [27, 28].
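The range limitation can be quantified with a back-of-the-envelope calculation, assuming a typical telecom-fibre loss of about 0.2 dB/km (the numbers are illustrative only):

```python
def survival_probability(distance_km: float, loss_db_per_km: float = 0.2) -> float:
    """Fraction of photons surviving a fibre of the given length."""
    return 10 ** (-loss_db_per_km * distance_km / 10)

# Survival probability falls exponentially with distance.
for d in (100, 300, 500, 1000):
    print(f"{d:>5} km: {survival_probability(d):.1e}")
```

At 500 km only about one photon in ten billion survives, so key rates collapse and trusted nodes or, eventually, quantum repeaters become necessary.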

Technologically still in the future, but otherwise conceptually realistic, is QKD over long distances by creating indirect quantum channels. This requires long-distance end-to-end quantum teleportation supported by end-to-end entanglement distribution relying on so-called quantum repeaters. These composite systems are the prospective Quantum Communication Networks of the second generation.

Quantum State end-to-end Distribution. At first glance, this appears to be a task identical to the first step of the general QKD method. Unfortunately this is an oversimplification: even if a quantum state is successfully distributed to a receiving party, there is presently no quantum mechanism for this state to be “taken over” (actually received in the direct sense) by that party. The existing approaches rely on the mentioned quantum teleportation, in which the receiving party creates a state identical to the one sent, at the price of destroying the original and thus complying with the quantum no-cloning theorem. As discussed above, an indirect quantum channel needs to be created first, and only then can the input state be teleported from the sender to the receiver.

Technically this is identical to the creation of an end-to-end quantum communication channel, as outlined above in the description of Quantum Communication Networks of the second generation. For such networks, the second step of the general QKD method, namely the measurement of the “transmitted” state and the subsequent post-processing, is not needed.

It is, however, not actually reasonable to distinguish the design of (long-distance) end-to-end quantum communication channels for QKD from that for “just” quantum state transfer: they are fundamentally identical. In this sense, all variants of quantum communication networks can be subsumed as second-generation networks (also known as Quantum Information Networks, or more expressively, although also somewhat misleadingly, as the “Quantum Internet” [29]).

From the point of view of quantum technology, it is the development of these networks, involving in turn the development of quantum repeaters, entanglement purification and teleportation protocols, that still represents a future challenge of major importance.

It was in the field of quantum communication that the standardisation of quantum technologies started, in 2008, with the foundation of the ETSI Industry Specification Group for QKD (ETSI ISG-QKD), after the need for respective standards had arisen in a European funded research project: in the FP6 Integrated Project SECOQC, a first attempt toward security certification of QKD technologies was undertaken, which identified missing standards as a major roadblock. Since then, the group has produced several base standards, standards for interfaces, for components of QKD systems, for security proofs and security certification, and in other areas usually subject to standardisation. For about ten years, ETSI was the only standards developing organisation active in this field, until other SDOs began to join in. Today, QT standardisation has become a major issue, with many SDOs active in the field, among them ISO/IEC, ITU-T, CEN-CENELEC, IEEE and the IRTF, as well as industry consortia like the QuIC, and several metrology institutes. See Appendix A for a summary of ongoing standardisation activities.

While several areas in quantum communication standardisation are already substantially covered, there are still gaps where much-needed standards are missing. This is especially true for the field of QKD security certification. Here, the ETSI ISG-QKD has produced a sample ISO/IEC 15408 “Common Criteria” protection profile for a prepare-and-measure QKD link, and ISO/IEC JTC1 SC27 WG3 has produced two standards on requirements and on test and evaluation methods for QKD products. Any security specification for a QKD product will, however, need to rely on additional so-called background documents, especially where cryptographic protocols and algorithms need to be specified. All these cryptography-related choices will need to be based upon widely recognised and accepted background documents, such as standards or technical specifications. The specification of the quantum optical subsystem, too, will need to rely on compliance with external standards, e.g. for the employed QKD protocol, including its security proof, for specific components like the photon sources and detectors required by the QKD protocol, for random number generators, for attack methods and for attack rating methodologies. This issue has only arisen recently, and the timely identification of required background documents, i.e. standards, as well as the coordination of their efficient generation by standards developing organisations active in the field, has become a major issue to prevent a potential roadblock on the way to certified QKD products. Here the European CEN-CENELEC FGQT, as well as its envisioned CEN-CENELEC Joint Technical Committee successor, intends to assume a coordination function through the maintenance and dissemination of its QT standardisation roadmap.

Coordination of standardisation activities is thus becoming increasingly important in quantum communication (QKD) standardisation. It seems that most of the involved standards developing organisations are trying to cover the entire field of QKD standardisation by themselves, without looking much at the activities of other SDOs. This has already led to incompatible standards for security certification in ETSI and ISO/IEC, as well as to several competing developments towards standards for cryptographic key management in QKD networks. If no effective steps are taken to counteract this trend, the situation might end up as in the sector of IT cloud standardisation, where dozens of standards developing organisations have produced virtually hundreds of competing standards, while the most widely used (de-facto) standards are set by major industrial players (e.g. Amazon AWS) outside the participative processes of actual standards developing organisations.

While in some fields multiple parallel approaches represent a potential “danger” and a source of inefficiency, other areas, besides that of the mentioned background documents, remain a blank slate: currently there are no standards available, e.g., for QKD protocols or for specific classes of photon sources and detectors. Networks for transferring quantum mechanical states (“full quantum networks”) are not covered in QT standardisation at all. There is therefore ample room for additional work in quantum communication standardisation, with a substantial need for active coordination of activities to avoid the pitfalls of inefficiency and duplicated work.

3.5 Use cases and applications

Quantum technologies are a very broad field, and so are their possible application areas. Developments in all areas of quantum technology bring these applications ever closer. However, even the best technology has little use if it is impractical in operational settings. It is therefore important that different aspects of quantum technology can properly interact and that they align with the classical technology already in place.

We identify two ways to achieve seamless interaction of new quantum technology with itself and with already existing technology. The first is one party developing everything themselves, an option reserved for only a small handful of parties. The far more common option is one party developing a part of a larger technology pipeline. This requires the inputs and outputs of the technology developed by one party to align with those of another party, as otherwise the parts cannot directly interoperate. Standards help interoperability, as shown in the previous section.

This directly poses a challenge: given the high pace of development, interoperability and standards are usually a secondary priority, and specifications can moreover change rapidly. Additionally, if applications are still unclear, standard requirements may be hard to define.

Use cases can help identify future applications of quantum technology and different functionalities this technology should offer. Use cases also allow users to envision ideal situations in which the technology is used and the accompanying functionalities.

Below we present a few use cases and explain why they might help in developing future standards.

Historically, secure-key distribution for enabling subsequent secure communication was the first use case of quantum technologies. From a practical point of view, key distribution has to be performed without distance restrictions. To avoid the intrinsic limitation that absorption poses on quantum signals, the enabling quantum technology has been and will be augmented by the utilisation of composite quantum communication networks (of different generations, as outlined above). First realisations are already near practical utilisation in the telecom and government domains. For practical usage, it is also important how the generated secure key material is used in securing subsequent communication, as incorrect usage might leak (part of) the generated key.

Cloud-based quantum computing is expected to become important in a number of future use cases. The quantum computer will act as a secondary processor hosted in the cloud and integrated with classical high-performance computers, allowing many computationally hard problems to be solved efficiently. This requires standard protocols for the interaction between the quantum computer and classical devices. We also require standard methods to decompose workflows into smaller steps, each of which can be implemented on its respective device [30]. Finally, different components of the underlying quantum hardware have to interact, which requires standards if these components come from different vendors.
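The interaction pattern of such a decomposed hybrid workflow can be sketched as a classical optimiser repeatedly submitting small parameterised jobs to a quantum backend through a narrow interface. The `QuantumBackend` class and its `expectation` method are hypothetical names of our own, and the backend here is a purely classical stand-in rather than a real quantum service:

```python
import math

class QuantumBackend:
    """Classical stand-in for a cloud-hosted quantum processor."""
    def expectation(self, theta: float) -> float:
        # A mock energy landscape; a real backend would execute a
        # parameterised circuit and return a measured expectation value.
        return math.cos(theta) + 0.1 * math.cos(3 * theta)

def minimise(backend: QuantumBackend, theta: float = 0.5,
             lr: float = 0.1, steps: int = 200) -> float:
    """Classical outer loop: finite-difference gradient descent,
    treating the backend as a black box behind the interface."""
    eps = 1e-4
    for _ in range(steps):
        grad = (backend.expectation(theta + eps)
                - backend.expectation(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta

theta_opt = minimise(QuantumBackend())
print(f"optimal parameter: {theta_opt:.3f}")
```

A standardised version of such an interface would fix, for example, the job format, the parameter encoding and the shape of the returned results, so that the same classical outer loop could target backends from different vendors.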

Examples of applications of quantum metrology & sensing include: (i) quantum magnetometers (based on cold atoms or nitrogen-vacancy centres in diamonds and nanodiamonds) as highly sensitive magnetic-field sensors, which have the potential to improve navigation and various imaging applications in arenas such as healthcare and brain imaging [31]; and (ii) quantum-enhanced imaging, exploiting non-classical states of light to improve imaging performance, which is at the basis of ghost imaging (improving the signal-to-noise ratio for applications at the few-photon level), quantum multiphoton microscopy, quantum coherence tomography, quantum interferometry and quantum lithography [16]. Standards for these applications include metrics to quantify the quality of the devices, and calibration routines.

3.6 Supply-chain model for quantum technology standards

This section provides an abstract supply-chain model for QT standards. Its purpose is to provide inspiration for the initiation and development of standardisation for quantum technologies.

The markets for quantum technologies are mostly still embryonic, and it is not known what these different markets or their full supply chains may look like. However, some markets are already emerging. In particular, there are emerging markets of scientists and technology developers who are building quantum platforms and systems from components that implement enabling technologies. Associated are markets for measurement equipment and measurement methods.

Let us start with an example from Sect. 3.2 on quantum computing.

Example Supply Chain

(Cabling of quantum computers)

Cryogenic quantum computers (e.g. of the transmon type) require a lot of cabling to control and read out the potentially thousands of qubits. Whereas cabling and connectors are existing products, this extreme environment and specific application pose requirements that differ from regular (e.g. computer) cabling. Transmission requirements are different (e.g. regarding attenuation of control signals and possible embedded passive or active functions), and there are additional requirements on signal density, thermal isolation, vacuum insulation, outgassing and likely more.

As a consequence, this cabling (“control highway”) is a specialised product with its specialised vendors, and it has a dedicated supply chain to purchasers that are developing and integrating cryogenic quantum computers.

Standards will help both vendors and purchasers to characterise the relevant parameters of this product. Vendors can use the standard for marketing through benchmarking of their product against competitors. Purchasers refer to the standard to reduce the risk of a mispurchase.

This example can be abstracted into a more general supply-chain model for quantum technologies, see Fig. 5. This model includes two or more counterparties (“Party A”, “Party B”) that engage in a transaction, e.g. a purchase. The transaction involves a product or service. Several aspects of that product or service may be standardised.

Figure 5

Abstract supply-chain model: a business transaction related to standards

The following are further examples, albeit a bit more abstract, of where standards could be relevant in supply chains for quantum technologies.

Example 1

(Purchasing an ion trap)

Ion traps are a versatile quantum-technology component that can be applied to a variety of platforms and systems. A hypothesis is that there will be a market, where ion traps can be purchased as a unit that could be integrated into a platform or system. Standards may help enhance this emerging market for ion traps. One standardisable aspect is the characterisation of ion traps that enable comparing and benchmarking between different ion-trap products and vendors. Another standardisable aspect may be the connection interfaces to integrate an ion trap into a platform or system.

The product in this example is an ion-trap product. The counterparties are the vendors and purchasers of these ion traps. Standardisable aspects are technical characterisation and interface specification of ion traps. A benefit of standardisation would be the creation and growth of a market for ion-trap products.

NOTE: This example may be applicable to many more enabling technologies that could be productised.

Example 2

(Configuring a quantum simulator)

Quantum simulators enable the simulation of e.g. chemical processes, the study of electronic and magnetic properties of particular systems, or even the design of new materials. A quantum simulator needs to be configured for a specific simulation task. Standardisation of the hardware components of a quantum simulator may provide information on its technical performance (e.g. quantifying the fidelity with which the simulator can prepare an initial state and reproduce a specific Hamiltonian). On a more general level, non-system-dependent standards may provide a language to configure a quantum simulator.

The product in this example is a configurable quantum simulator. The counterparties are a chemical engineer who orders a quantum simulation, and a quantum engineer who configures/constructs the requested quantum simulator. A standardisable aspect is the language that describes the configuration. A benefit of standardisation would be the creation and growth of a market for quantum simulators.
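To make the idea of a configuration language tangible, here is a purely hypothetical sketch: a declarative description of a transverse-field Ising simulation together with a minimal validator. The schema and all field names are invented for illustration and do not follow any existing standard:

```python
# Hypothetical, system-independent simulator configuration.
config = {
    "model": "transverse_field_ising",
    "num_sites": 8,
    "couplings": {"J": 1.0, "h": 0.5},   # ZZ coupling and transverse field
    "boundary": "periodic",
    "initial_state": "all_zero",
    "observables": ["magnetization_z"],
}

REQUIRED_FIELDS = {"model", "num_sites", "couplings"}

def validate(cfg: dict) -> list:
    """Return a list of problems; an empty list means the config is valid."""
    problems = [f"missing field: {k}" for k in sorted(REQUIRED_FIELDS - cfg.keys())]
    if cfg.get("num_sites", 1) < 1:
        problems.append("num_sites must be a positive integer")
    if cfg.get("boundary", "open") not in ("open", "periodic"):
        problems.append("boundary must be 'open' or 'periodic'")
    return problems

print(validate(config))  # [] -- the example configuration is valid
```

A real standard would additionally pin down units, admissible models and the semantics of each field, so that the same configuration produces comparable simulations on different platforms.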

Example 3

(Developing an algorithm for a cloud quantum computer)

Similar to the previous example would be quantum-computing capacity that is offered as a cloud service by multiple competing cloud-quantum-computing service providers.

The service in this example is a cloud-computing service. The counterparties are a developer with a quantum-computing task and a cloud-quantum-computing service provider. A standardisable aspect is an application programming interface. A benefit of standardisation would be the creation and growth of a market for cloud-quantum-computing services.

Example 4

(Installing a QKD ground station)

Quantum Key Distribution (QKD) enables a highly secure generation and distribution of cryptographic keys that does not rely on the unbreakability of mathematical algorithms. QKD has already been demonstrated in the space domain, involving orbital satellites. It is envisioned that a QKD network could be constructed with multiple satellites, multiple ground stations and multiple operators that operate different parts of the QKD network.

The product in this example would be the installation of a new QKD ground station. The counterparties are the operator of the ground station and the equipment vendor that provides the QKD equipment for the ground station. A benefit of standardisation would be the creation and growth of a market for QKD equipment, as well as avoidance of vendor lock-in towards a specific QKD technology provider.

Example 5

(Purchasing quantum-technology measurement equipment)

The specifications of quantum-technology platforms and systems, as well as of their enabling technologies, need to be verifiable. Measurement equipment is required for this. Standards would specify the measurable aspect and the technical requirements that a specific measurement should satisfy.

The product in this example would be measurement equipment that is used in a quantum-technology context. The counterparties are the purchaser and vendor of the measurement equipment. Standardisable aspects are the measurement method itself, as well as characterisations of the fidelity of the measurements. A benefit of standardisation would be the creation and growth of a market for measurement equipment for quantum technology set-ups.

Example 6

(Submitting a scientific article for peer review)

Related to the previous example are scientific articles that involve measurements. If a measurement method is specified in a standard, then the author can reference that standard, instead of completely detailing it.

The product in this example is the submitted scientific article. The counterparties are the author and the peer reviewer. The standardisable aspect would be the applied measurement method. A benefit of this standardisation is that it would make scientific articles on a specific measured feature better comparable, enabling benchmarking the results from the scientific article against its cited references.

3.7 Structuring standardisation of quantum technologies

This section attempts to bring together the storyline of the FGQT roadmap and to provide some structure to the various aspects of quantum technologies.

Quantum technologies are commonly structured in domains. The European Quantum Flagship roadmap [6, 17, 18], for instance, features four domains or ‘pillars’ covering the whole range of QT. For the identification of standardisation needs, however, this classification has the disadvantage that there are strong commonalities between those pillars; for example, some enabling quantum technologies are equally applicable to all domains, with associated parallels in their standardisation perspective. Within discussions in FGQT, this challenge was tackled and an overarching framework for the identification of standardisation needs was developed that underlies the work-in-progress roadmap. The remainder of this section highlights the view on this framework developed by FGQT during 2021.

A common way to categorise the range of QT is to think in terms of pillars, namely quantum communication, quantum computing & simulation and quantum metrology, sensing & enhanced imaging. This is, however, an oversimplification due to many matrix-like connections as we discuss below.

Indeed, common to all QT is the underlying hardware, or “enabling technology”, that facilitates design and prospective manufacturing in the QT pillars/domains mentioned above. Likewise, the tools, for instance software, used for controlling quantum states are typically universal. Combining elements from enabling technologies and tools facilitates the assembly of subsystems, which can be combined to create QT platforms, systems and higher-level composite systems or infrastructures. These various technological levels then give rise to societally relevant applications, which can be grouped into general use cases.

We propose to consider these horizontal layers, common to several or all fields of QT and representing different levels of technological complexity and proximity to applications and use cases, jointly with the traditional pillars. Naturally, the latter are still highly relevant, but their connections to each individual horizontal layer, forming a “matrix structure”, need to be included. In Fig. 6, we illustrate this idea of a hierarchy or matrix of QT with the architecture of a temple. In this structure, standardisation needs can naturally be identified through the connections between the different horizontal and vertical layers. For instance, interoperability between the different layers requires well-defined interfaces. This idea is further reflected in Fig. 7, where we explicitly show some connections that are natural points for identifying standardisation needs. Working in this picture, which includes all fields of QT, requires a group of experts with the corresponding breadth of expertise. We note that these connections strongly depend on the development of QT components and might not be obvious to identify, but the matrix structure helps to guide the development. Furthermore, the nature of the standardisation need might also depend on the layer in Fig. 7. For instance, whereas in the “enabling technology” layer specifying characteristic properties (for example, material properties like NV-center purity or density, the heating rate of an ion trap, or the noise figure of a TWP amplifier) might be typical, in the “platform” layer a performance characteristic, such as the sensitivity of a sensing device, might play a significant role.

Figure 6: Quantum technology “temple” structure (Source: [32])

Figure 7: Connections between different technology levels (Source: [32])

There is currently a multitude of standardisation activities worldwide in quantum technology, covering some but not all sub-areas. In many cases, these activities are constrained to a certain sub-field and, correspondingly, to a specialised expertise. While this has the advantage of providing a relatively straightforward means to arrive at specific standardisation actions, the larger picture of QT as sketched in this document cannot be addressed, and the underlying matrix structure discussed above cannot easily be represented. In particular, standardisation needs common to several of the sub-fields might be addressed in incompatible ways, potentially leading to roadblocks later on. Consider a component from the “enabling technologies” layer that is used in two QT domains/pillars. For example, a certain material, such as an NV-center material, might require standardised parameters to qualify for a quantum sensing application. The same material could be used for a quantum computing application, but the relevant parameters are not necessarily identical. Another example could be control software, or a device (e.g. a laser) for manipulating a quantum state. In all these cases, one common standardised set of parameters (of course, with value ranges corresponding to the application at hand) constitutes a sufficient and complete description. At the same time, an incoherent, incompatible or incomplete description, valid for one application but not for the other, is prevented. In the worst case, different specialised standards are developed independently for each application and domain, which might limit their usefulness and thus create a roadblock for effective standardisation.
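The idea of one common parameter set with application-specific value ranges can be illustrated with a minimal sketch. The parameter name, unit and ranges below are invented for illustration and are not taken from any published standard; the point is only that the description is shared while the acceptable values differ per domain.

```python
# Hypothetical sketch: a single standardised component parameter, with
# per-application acceptable value ranges. All concrete values are invented.
from dataclasses import dataclass, field

@dataclass
class ParameterSpec:
    name: str
    unit: str
    # Per-application acceptable ranges; the parameter set is shared,
    # only the required value ranges differ between QT domains.
    ranges: dict = field(default_factory=dict)  # application -> (min, max)

    def qualifies(self, application, value):
        """Check whether a measured value qualifies for a given application."""
        lo, hi = self.ranges[application]
        return lo <= value <= hi

# The same material parameter, qualified differently for sensing vs computing.
nv_density = ParameterSpec(
    name="nv_density",
    unit="ppm",
    ranges={"quantum_sensing": (1.0, 10.0), "quantum_computing": (0.01, 1.0)},
)
print(nv_density.qualifies("quantum_sensing", 5.0))   # True
print(nv_density.qualifies("quantum_computing", 5.0)) # False
```

A single description of this kind keeps the two domains coherent: neither application needs its own incompatible standard, only its own value ranges.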

4 Organisations’ perspectives

4.1 General

Organisations may have many reasons to contribute to standardisation. Standardisation is a good place for networking with other organisations that have similar interests. Academic and research organisations may use standardisation to showcase their technologies and competence. Organisations that purchase technologies may use standardisation to coordinate use cases and requirements, making sure that upcoming products address their market needs and reducing the need for expensive proprietary solutions and their associated vendor lock-in risk. Service providers may use standardisation to coordinate with their suppliers, using the regulated SDO environment to assure fair competition. Suppliers of technologies may coordinate with competitors on non-competitive aspects of their products, to reduce market fragmentation and achieve critical mass for new product categories. Regulators may contribute to standardisation to ensure that regulatory requirements can be met technologically. Other organisations may have other reasons, including consultancy and patent licensing.

Section 4.2 provides some statistics on FGQT and its delegates. Section 4.3 summarises the rationale of the co-authors and their organisations for actively contributing to FGQT. Appendix B provides more detailed insights into why these organisations have been contributing to FGQT and what they would like to achieve.

4.2 FGQT statistics

When FGQT kicked off in mid-2020, the delegates were asked about the affiliation from which they participate in FGQT. More than 50% of the delegates had an academic affiliation, 30% an industry affiliation, and 11% a standards affiliation; see Fig. 8. This highlights the early stage that QT standardisation is still at, as standardisation for mature markets is dominated by delegates with an industrial affiliation, contributing to standards with a specific product or service interest.

Figure 8: Background of FGQT delegates (mid-2020)

Most European countries are represented in FGQT, with some dominance of Germany, Italy, the UK, Switzerland and the Netherlands; see Fig. 9.

Figure 9: FGQT interest from the various European countries (mid-2020)

As of mid-2022, the FGQT member list counts over 110 delegates. FGQT has held 27 meetings to this point, mostly via video, plus two physical meetings. There have been over 250 contributions to the FGQT roadmap and use-cases deliverables.

4.3 Rationale of FGQT delegates

Figure 10 summarises the main areas of contribution of the most active FGQT contributors, the co-authors of this article. Appendix B provides more detail about each. Note that the organisations labelled as “academic/research” each contribute to one or more of the areas quantum computing, quantum communication and/or quantum metrology.

Figure 10: Active FGQT contributors (mid-2022)

5 Conclusion: work in progress

This review article makes clear that standardisation for quantum technologies is still at a very early stage. Most quantum technologies still have low technology readiness levels. Nevertheless, supply chains are already emerging for some quantum technologies. For example, research infrastructures for quantum computing are quite capital intensive, and standardisation may help rationalise and speed up this research. Commercial products are also becoming available in quantum sensing, metrology and imaging, where standardisation may help structure and grow the market. Published standards on quantum key distribution are already available. On the other hand, entanglement-based interconnection between quantum computers only exists as a concept.

More analyses are needed to determine the “standardisation readiness” of the different quantum technologies, and when the market is expected to be ready for formal SDO development of technical guidelines (best practices), technical reports (use cases, requirements), and technical specifications (interoperability). For this purpose, the CEN-CENELEC FGQT has developed a roadmap for standardisation relevant to quantum technologies. Its work includes coordination with other SDOs and industry forums, and actual standardisation work will be spun off from it in CEN-CENELEC JTC22 on Quantum Technologies.