1 Introduction

Quantum computing adoption maturity varies across industrial and technical domains. Industries dealing with large uncertainties and materials simulation have been the first to invest seriously in adopting quantum computing technology. Intractable problems with many variables, missing data, and unknowns, such as risk and demand prediction in the finance and utilities industries, are difficult to solve with classical techniques. Manufacturing industries have reached the limits of current computational simulation capabilities and are looking to improve simulation precision so they can move away from costly physical hardware development cycles. In contrast, quantum computing applications in databases and data science are at an earlier stage of development.

This article introduces key technologies and discussion points revolving around the evaluation of quantum computing technology readiness and adoption. Until recently, quantum computing seemed an unattainable concept, owing to concerns that disturbances at the macroscopic level would never allow entanglement to be realized in a quantum computer. The logic and experimental observations of entanglement, which shifted critics' rhetoric from "it will never happen" to "it will never be commercially viable in my lifetime", were published around 2015 and were the subject of the Nobel Prize in Physics in 2022.

In the coming sections, we provide an introduction to key concepts, insights into the rapid advancement of the state of the art, an overview of how companies in Germany are approaching the development of competency, and an overview of resources and networks to start exploring. The paper is organized into the following sections: Sect. 2) fundamental concepts of quantum computing, with more detail on the data stages; Sect. 3) what to consider when selecting a technology; Sect. 4) latest technology advancements, with a focus on superconducting qubit technologies; Sect. 5) use case areas; Sect. 6) the steps companies and governments have taken internally to start building competency, and learning references with a focus on materials for German speakers.

In conjunction with the other articles in this edition of Datenbank-Spektrum, we intend to provide inspiration for new research and development topics in database engineering and science that could be augmented through quantum computing. A version of this article will be made available in German on arxiv.org at a later date.

2 Fundamental Concepts of Quantum Computing

Gate-based quantum computation relies on five key concepts: programming quantum information using gate operations, superposition, measurement, interference, and entanglement. Quantum gate operations, analogous to logic gates in classical computers, are used to write information into the qubit, the quantum equivalent of a bit. A quantum gate operation enables the creation of superposition, where instead of holding a single piece of digital information, such as a 0 or 1, the location holds information that exists as a combined probability of being 0 or 1.

When we measure, the quantum information collapses into a readable value, such as a 0 or 1; quantum computation therefore requires the calculation to be repeated multiple times to build up statistics. This allows any value between 0 and 1 to be represented at that physical location. (This is somewhat analogous to a toss of a weighted coin: the coin is both heads and tails while in the air, and the toss is repeated to estimate the probability of it landing heads-up.)
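As a minimal sketch of superposition and measurement statistics, the following Python snippet (using the open-source Qiskit framework discussed later in this article; the exact API may differ between versions) applies a single gate operation to one qubit and samples it repeatedly:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.h(0)  # gate operation placing the qubit in an equal superposition of 0 and 1

# repeated measurement builds up the statistics, like the weighted coin toss
counts = Statevector(qc).sample_counts(shots=1000)
print(counts)  # roughly {'0': 500, '1': 500}
```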

The loss of qubit information due to unwanted interactions is known as decoherence and is quantified by two lifetimes, T1 and T2: T1 is the time in which the excited state, used to represent 1, relaxes into the ground state, 0; T2 is the time for which a superposed state can last. The calculations must be completed within these times.

The information exists in a wave-like state in the qubit. We can therefore engineer constructive and destructive interference of information with gate operations, which is not possible in classical digital computers. This opens the possibility of new algorithms, such as Grover's search algorithm, where a query programmed into the computer relies on interference to search all possible solutions simultaneously and amplify the correct one.
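As a hedged sketch of the idea, the following hand-built two-qubit Grover iteration (the marked state |11> is our own choice of example) uses an oracle to flip the phase of the marked state and a diffusion operator to amplify it through interference:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# one Grover iteration searching for the marked state |11> among four candidates
qc = QuantumCircuit(2)
qc.h([0, 1])   # uniform superposition over 00, 01, 10, 11
qc.cz(0, 1)    # oracle: flip the phase of |11>
qc.h([0, 1])   # diffusion operator: reflect all amplitudes about their mean
qc.x([0, 1])
qc.cz(0, 1)
qc.x([0, 1])
qc.h([0, 1])

print(Statevector(qc).probabilities_dict())  # {'11': 1.0} up to rounding
```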

A different type of quantum gate operation can be used to create entanglement of information between qubits. This special kind of correlation, not available in classical computers, results in a linear combination of all permutations of the qubits' readout values. Entangled qubits share a state that cannot be decomposed into a product of individual states. In other words, three entangled qubits can have weighted probabilities of the combinations 000, 001, 010, 100, 110, 101, 011, and 111. Therefore, we can store all eight of these combinations using only 3 physical locations. This exponential increase in storage and the unique behavior of entangled qubits open a new realm of computational capability.
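A minimal sketch of such an entangled state is the three-qubit Greenberger-Horne-Zeilinger (GHZ) state, which appears again in Sect. 3.3:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(3)
qc.h(0)        # superposition on the first qubit
qc.cx(0, 1)    # entangling gates spread the correlation...
qc.cx(1, 2)    # ...so all three qubits share one inseparable state

# the GHZ state (|000> + |111>)/sqrt(2): a weighting over the 2^3 combinations
print(Statevector(qc).probabilities_dict())  # {'000': 0.5, '111': 0.5}
```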

These are the concepts that differentiate gate-based quantum computing, currently the most versatile type of quantum computing and the focus of this paper.

The combination of these fundamentally different concepts and their application to computation provides new ways to view and interact with information. Assimilating these skills can take anywhere from one to three years, and continuous learning is required due to the fast pace of developments. In the past year, claims of "quantum utility" have been reported, where quantum computers perform comparably to High Performance Computing (HPC) on simulations of specific physical models. It is expected that the first quantum applications will be hybrid algorithms, also known as embedded algorithms, where a quantum computation is a subcomponent of a larger calculation and the remainder is calculated classically.

Technology has advanced significantly in the past two years. According to the current roadmaps of multiple vendors, it is optimistically expected that quantum computing hardware capable of computations of business value will arrive within the next 5 years. Scenario planning by governments recognizes the risk that data encrypted with standard algorithms could be decipherable by quantum computers as soon as 2030, and new encryption algorithm standards are being pursued [1].

Some problems will continue to be more suitable for classical computers. Post-quantum cryptography, also referred to as "Quantum Safe" cryptography, develops encryption algorithms to protect data on classical computers using mathematical functions that are difficult to map efficiently onto a quantum computer.

The quantum industry and society are also conscious of the ethics of the technology's development and have adopted best practices, processes, and guidelines from Artificial Intelligence. The potential applications, and the potential benefit for society, are a tremendous prospect.

2.1 Insights on Data Workflows

At an abstract level, the data workflow in Quantum Computing resembles that of classical computing: it consists of Data Acquisition and Preparation, Data Analysis, and Data Validation.

2.1.1 Data Acquisition, Preparation, and the Need for Application Knowledge

Data is acquired in classical form; the quantum properties are then applied to the data in the quantum computer. Data that will be used in subsequent quantum computations has a limited lifetime; the information degrades with time. The current state of quantum computing is referred to as the noisy intermediate-scale quantum (NISQ) era. These processors are sensitive to their environment, prone to quantum decoherence, and not yet capable of continuous quantum error correction. This is improving significantly with advancements in materials, and there are techniques that can be applied to refresh the information, such as "Dynamical Decoupling".

Data encoding is the process of mapping classical data to quantum states that can be manipulated in the qubits. Various encoding schemes exist, such as amplitude encoding, phase encoding, and quantum support vector machine (QSVM) encoding.
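As a hedged sketch of two such schemes, with toy data of our own choosing (Qiskit's initialize routine handles the underlying state preparation):

```python
import numpy as np
from qiskit import QuantumCircuit

# amplitude encoding: a normalized 4-element vector becomes the amplitudes
# of a 2-qubit state, exploiting the 2^n storage discussed later
data = np.array([0.5, 1.0, 2.0, 3.0])
amplitudes = data / np.linalg.norm(data)
qc_amplitude = QuantumCircuit(2)
qc_amplitude.initialize(amplitudes, [0, 1])

# phase/angle encoding: each feature value sets the rotation angle of one qubit
features = [0.1, 0.7]
qc_angle = QuantumCircuit(2)
for i, x in enumerate(features):
    qc_angle.ry(2 * x, i)
```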

Example

Take the example of simulating the ground state energy of a molecule, or the spins in a simple diatomic molecule. How do you map this problem into qubits? For example, spin orbitals have different shapes; how do you decide how many qubits are needed to represent them? There are different rules for doing this, called transformations or mappings, which are typically named after their inventors. Two common methods for mapping a fermionic system into a spin system are the Jordan-Wigner mapping and the Bravyi-Kitaev mapping [2, 3].
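A minimal sketch of both mappings, assuming the qiskit-nature package and its current second-quantization API (the toy hopping term between two spin orbitals is our own choice):

```python
from qiskit_nature.second_q.operators import FermionicOp
from qiskit_nature.second_q.mappers import JordanWignerMapper, BravyiKitaevMapper

# a toy fermionic hopping term between two spin orbitals
op = FermionicOp({"+_0 -_1": 1.0, "+_1 -_0": 1.0}, num_spin_orbitals=2)

# the same fermionic operator becomes different Pauli (qubit) operators
print(JordanWignerMapper().map(op))
print(BravyiKitaevMapper().map(op))
```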

In the original experiments performed in 2017 [4], the experts were able to simplify the algorithm by understanding symmetries in the molecule and applying domain knowledge.

2.1.2 Data Validation

Data validation is an essential step in any data preparation process. It involves checking the quality, consistency, and integrity of the data to ensure that it is suitable for subsequent computations. Many of the methods used to validate data are not implementable in a quantum computer, because quantum information cannot be copied and reading a qubit collapses its quantum state.

2.1.3 Interoperability and Standards

The technology and standards required for hybrid calculations, where the HPC system and the quantum computer need to share data resources, is a topic with much potential for development. It may be too early to set standards for architectures and technologies, as this may stifle the flexibility that is required. International, EU, and national bodies are developing a taxonomy and common vocabulary (see Footnote 1) in order to establish a shared baseline from which all can build.

2.1.4 Data Preparation, Processing and Storage

Classical data is stored in bits; only one bit of information, either 0 or 1, can be stored in a location. A qubit, the quantum computer's equivalent, can store a linear superposition of both 0 and 1, and an ensemble of qubits can hold even more information: 2^n states, where n is the number of qubits.

Present-day systems have serial data inputs. Because of these input restrictions, the increased data capacity of a qubit does not translate into an ability to process Big Data in short amounts of time.

Real-world data needs to be encoded into qubit logic, analogous to the way words are converted into binary strings in a classical computer. The "state preparation" of an entangled state to represent the initial conditions of a simulation can represent the initial state either exactly or approximately. State preparation is a popular research field, as advancements could reduce processor resource requirements [5,6,7].

Quantum Machine Learning (QML) is one of the most popular areas for data scientists entering quantum computing. Structured examples of data preparation for a quantum computer bear similarities to classical methods: feature selection, normalization, and dimensionality reduction are also applied. The way the data is prepared [8] significantly affects the effectiveness of the QML algorithm [9,10,11]. For example, the data encoding affects the expressivity of quantum machine learning models expressed as partial Fourier series [12]. Examples of data prepared for a QML use case in antibiotic drug discovery [13], and for the publicly available sonar and diabetes data sets [14], are available.
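A hedged sketch of this preparation pipeline, assuming toy data and Qiskit's built-in ZZFeatureMap as the encoding circuit:

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap

# classical preprocessing first: standardize each feature of the toy data
X = np.array([[2.0, 10.0], [1.0, 3.0], [4.0, 7.0]])
X = (X - X.mean(axis=0)) / X.std(axis=0)

# encode one 2-feature sample into a parameterized entangling circuit;
# the choice of feature map shapes what the QML model can express
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
encoded_sample = feature_map.assign_parameters(X[0])
```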

An advantage of QML is the ability to map the data into other dimensions [15]. A reduction of effective dimensionality allows data classification that would not have been possible in a classical arrangement [16]. Methods to train a quantum model are under development [17, 18]. They are suggested to have the future potential to speed up the training of models [19], which would reduce the carbon footprint of the initial training of models.

An additional challenge is that a full quantum state cannot be cloned, making it difficult or impossible to copy information for random access memory. In certain quantum computing technologies, for example ion-based computers, quantum information can be transported and temporarily stored in another physical location as a form of Quantum Random Access Memory (QRAM). A realization of a fast read-write QRAM would allow calculations to be made with fewer qubits. Current concepts and challenges are illustrated in [20].

2.2 Types of Quantum Computing Technologies

There are many different technologies for quantum computing hardware. Quantum computers can be made from many different materials and systems, such as superconductors, ions, cold atoms, silicon dots, topological materials, photonic crystals, nitrogen vacancies, and nuclear magnetic resonance. The two most advanced are superconducting and ion-based systems [21].

They all fundamentally have qubits (two-level energy systems) that can exhibit superposition and entanglement. Systems with more energy levels, called qutrits [22,23,24] for three levels and qudits [25,26,27] for higher numbers of levels, are being investigated in some material systems. Quantum computers can roughly be categorized into four types according to the way they reach the final solution:

  1. Gate-based quantum computing uses discrete gate operations and measurements to calculate the logical solution of a quantum algorithm;

  2. Analog quantum computing [28, 29], where the quantum state is physically represented using continuous variables and continuous transformations, e.g., fermionic atoms are trapped in a physical lattice to behave like electrons;

  3. Measurement-based quantum computing, a more nascent technology; in a popular implementation, a large entangled state is created in a photonic lattice, and the extraction of photons from the lattice behaves as a gate [18, 30];

  4. Quantum annealers, which are restricted to solving specific types of optimization problems, namely those describable by Ising Hamiltonians [31,32,33]; a minimal sketch of such an Ising cost function follows this list.
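As a hedged illustration of the last category, the following sketch (with toy coefficients of our own choosing) writes down an Ising cost function of the kind an annealer minimizes, and checks it by brute force:

```python
import numpy as np

# toy Ising cost function: E(s) = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j,
# with each spin s_i in {-1, +1}; an annealer searches for the minimum of E
h = np.array([0.5, -1.0, 0.2])
J = {(0, 1): 1.0, (1, 2): -0.8}

def ising_energy(spins):
    energy = float(np.dot(h, spins))
    for (i, j), coupling in J.items():
        energy += coupling * spins[i] * spins[j]
    return energy

# brute force over all 2^3 spin configurations (feasible only for toy sizes)
best = min(
    ([s0, s1, s2] for s0 in (-1, 1) for s1 in (-1, 1) for s2 in (-1, 1)),
    key=ising_energy,
)
print(best, ising_energy(best))
```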

When entering the field of quantum computing for the first time, it can be daunting to understand which technologies to use and which areas to focus on. Different quantum computers arrive at the solution using different mechanisms. For example, ion-based computers use ions as qubits that can physically be moved from one location to another, whereas in a superconducting qubit processor the information travels between static qubits along resonators. Each of these technologies comes with its own advantages, and they are at varying degrees of maturity. In the next section, we discuss the main metrics and features to consider.

3 Choosing a Technology—Metrics and Features

When the first quantum computers became available on the cloud, around 2015, there was a race among manufacturers to lead in the number of qubits. There is much more to a quantum computing chip than the number of qubits: how they are addressed and connected can be equally important for successful implementation.

3.1 Metrics—Benchmarking

The challenge of comparing disparate systems through benchmarking is known from the High Performance Computing domain. Comparing systems with different architectures and accelerators is challenging because certain techniques are specific to certain architectures and may not measure the qualities most relevant to the problem you are trying to solve. For example, do you need accuracy, or speed?

Application benchmarking is particularly difficult across machines, as it requires in-depth knowledge of the layers of the stack and of special methods suited to the qubit technology. Consortia, such as the Quantum Economic Development Consortium (QED-C), are exploring ways to benchmark fundamental subroutines and functions and to qualify the lacunae in current benchmarks [34]. Business leads are of the opinion that the cost and effort of application-only benchmarking is of interest only once use cases that can generate higher revenue are found, and that efforts would be more wisely invested in developing new algorithms and conceiving new applications for use cases.

3.2 Metrics—for the Era of Utility

Quantum Volume was developed to reflect whether a qubit ensemble maintains its state long enough to complete a computation program without being disturbed. The definition of quantum volume has evolved over time [35].

It is a function of the number of qubits and the circuit depth. Circuit depth is a measure of the effective computation layers, in other words the quantum program length that can be computed before the error becomes too large due to the corruption of data [36, 37].
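As a hedged sketch of the commonly used formulation (details vary between versions of the definition [35]), the quantum volume $V_Q$ can be written as

$$\log_2 V_Q = \max_{n} \, \min\bigl(n, d(n)\bigr),$$

where $n$ ranges over subsets of qubits and $d(n)$ is the largest depth of random model circuits on $n$ qubits that the machine executes successfully; a machine that reliably runs "square" circuits of width and depth $n$ achieves $V_Q = 2^n$.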

The challenge with such a metric is that a machine with a few qubits of great quality can have the same quantum volume as a system with many qubits and a short circuit depth. Consequently, this metric will not tell you which of the two computers is best for your problem. The metric has most value when applied to a single machine with a constant number of qubits, where it can be a way of measuring progress in the quality of the system with respect to itself. New metrics, such as Error Per Layered Gate (EPLG), and extensions of metrics, such as Circuit Layer Operations Per Second (CLOPS), have recently been announced to improve scalability and coverage (video: [38]).

In the last two years, experts have been looking at a combination of metrics, in addition to quantum volume, to select a quantum computer. These are generalized as performance metrics and can be grouped into three categories:

  • Scale—number of qubits;

  • Quality—decoherence times, circuit fidelity, stability;

  • and Speed—circuit execution speed.

There is typically a trade-off between quality and speed. A qubit that interacts less with its environment, and therefore has a longer decay time, also typically takes longer to read and write information. Refer to the manufacturers' websites for the latest metric values, e.g., [39].

Additional features to consider when selecting a system are “capability” and “frictionless experience”.

3.3 Features—Capability

Quantum computing is currently in its infancy with respect to certain software engineering practices. Validation and verification concepts need to be redesigned, due to the collapse of the quantum state holding the information upon measurement. Capabilities of a quantum computer do not only require advancements in quantum technology; they have increased significantly in the past two years due to improvements in the classical components across the system, such as a redesigned software stack architecture, the introduction of new assembly languages [40], improvements in the control and system electronics, and significant developments and innovation in the technologies used to connect qubits.

Capabilities that we take for granted, such as conditional statements ("if this, then that"), are not implemented in all types of quantum computer. For example, the ability to change the course of a program based on intermediary results was introduced in IBM's computers in 2022 [41] and was used to create a large entangled Greenberger-Horne-Zeilinger (GHZ) state with a significantly shorter, fixed circuit length (without dynamic circuits, GHZ creation in a gate-based quantum computer requires the circuit length to grow with the number of qubits) [42].

Figure 1 is an evaluation, made by an independent market analyst, of the quantum computing vendors that offer public cloud access to their systems; it arranges the vendors along a scale of capabilities and strategies. The assessment also includes their roadmap and ability to deliver on it, as well as the ecosystem and services that they provide, such as learning materials.

Fig. 1

Independent analyst assessment of quantum computing manufacturers with quantum services and networks, mapped on a graph of Capabilities vs. Strategies. Reproduced with permission of IDC. The IDC MarketScape vendor analysis model is designed to provide an overview of the competitive fitness of ICT suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor's position within a given market. The Capabilities score measures vendor product, go-to-market and business execution in the short term. The Strategy score measures alignment of vendor strategies with customer requirements in a 3-5-year timeframe. Vendor market share is represented by the size of the icons (IDC MarketScape: Worldwide Quantum Computing Systems 2023 Vendor Assessment, doc #US49607923, August 2023)

3.4 Features—Frictionless

Frictionless refers to the ease of using and programming the system, the ability to integrate it with other computing systems, such as "serverless" cloud services (see Footnote 2), and the concept of open-source software. Open-source movements, such as Red Hat and OpenShift on Linux, dramatically accelerated software development and capabilities, and demonstrated the opportunity to monetize services provided on open-source systems. Customers do not wish to be locked into technologies, which is why many vendors prefer open-source frameworks such as Qiskit, a Python-based programming framework for quantum computers. It provides an easy entry point for data scientists by being based on Python, can be used as an interface for all types of quantum computer [43, 44], and has extensive functionality [45]. Qiskit is used by various vendors for frictionless integration with other systems. For example, you can program and call quantum computers from numeric computing environments such as MATLAB [46, 47]. For a more in-depth review of languages, frameworks, and tools, consult [48,49,50].

Frictionless also refers to the ease of entry, ease of use, support, predictability, and availability. As systems become frictionless, more libraries and packages will make new software functions available that people can apply to solve problems, without needing intimate knowledge of how the system works or to process data in its rawest form.

Previously, one had to download all the binary outputs of a calculation and then analyze them to convert them into eigenvalues. Now primitives, such as the estimator and sampler, return results in a more meaningful, evaluated form, such as probabilities or expectation values, instead of binary outputs requiring post-processing. This reference provides a walkthrough of the coding steps [51].
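A minimal sketch of both primitives, using Qiskit's local reference implementations (the primitives API has evolved across Qiskit versions, so details may differ):

```python
from qiskit import QuantumCircuit
from qiskit.primitives import Estimator, Sampler
from qiskit.quantum_info import SparsePauliOp

bell = QuantumCircuit(2)
bell.h(0)
bell.cx(0, 1)

# Estimator: returns an evaluated expectation value instead of raw bitstrings
observable = SparsePauliOp("ZZ")
print(Estimator().run(bell, observable).result().values)  # [1.0]

# Sampler: returns quasi-probabilities rather than raw binary shots
bell_measured = bell.copy()
bell_measured.measure_all()
print(Sampler().run(bell_measured).result().quasi_dists)  # [{0: 0.5, 3: 0.5}]
```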

3.5 Choosing a Technology—Industry Choices

The choice of technology and research area ultimately depends on a mixture of parameters: situation, specialization, resources, and priorities. In the past year we have seen increased availability of ion-based computers. These have also demonstrated fault tolerance with two qubits [52,53,54] and are used for the simulation of quantum dynamics [55]. Among the challenges for ion-based computers are the gate speeds and the scalability of the lasers used to control the qubits [21].

Rapid advancements have been made possible by lessons learned from previous generations of computing paradigms, not only in adopting concepts such as open source but also in the fabrication and development of enabling hardware. Several companies, such as IBM, Google, Rigetti, and D-Wave, are investing in superconducting qubit technologies because these are based on fabrication techniques and technologies that overlap greatly with the traditional semiconductor industry, so they can accelerate advancement by leveraging their expertise in the field. Here the main challenge is the reduction of error per gate layer. In the near term, they see the potential to find the first wave of utility from this technology. Many companies are exploring potential applications of these technologies and making initial findings publicly available; a database of over 1000 publications is available online [56].

4 Advancements

4.1 Advancements in Quantum Computing Technologies

Advancements in enabling technologies over the past two years are bringing forward the expected timeline for significant quantum advantage for various end-users.

To put the advancements into context, we take the example of simulating the ground state energy or spins in a simple diatomic molecule. In the original experiments performed in 2017, the experts were able to simplify the algorithm by understanding symmetries in the molecule (i.e., expert insight with domain knowledge). The results are easy to calculate analytically on a classical computer and to verify through chemical experiment, which allows us to check the precision of the quantum computer. The answers correlated well, though they were not as accurate [4].

In 2021, advancements in algorithms, error mitigation techniques [57], and hardware led to a 120× speedup of the calculation and an improvement in accuracy [58].

In June 2023, a further milestone was reached: IBM, in collaboration with Lawrence Berkeley National Lab's National Energy Research Scientific Computing Center (NERSC) and Purdue University, demonstrated for the first time that quantum computers can produce accurate results at a scale of 100+ qubits, comparable to, or in friendly competition with, High Performance Computing (HPC) simulations of certain material properties [59,60,61]. These results also inspired new classical algorithms for HPC simulation of materials, which in turn led to an increase in the precision of classical simulations. It is important that the HPC field advances together with quantum computation, as many material simulations require both technologies.

4.2 Error Mitigation and Correction

The ubiquity of conventional classical computers came with the scaled ability to store and process information reliably. Small fluctuations of an electric charge or current in a microchip are tolerated due to a highly redundant representation of the logical 0 and 1 states by a collective state of many electrons. Current quantum hardware is subject to various sources of noise, the most well-known being qubit decoherence, individual gate errors, and measurement errors; many redundancy techniques used in semiconductor systems are not implementable due to the fundamentally different nature of quantum computation. Error mitigation and error correction are the methods to address this [62].

Quantum error mitigation is a technique that compensates for the noise generated during a quantum computation. It is an essential ingredient for scalable quantum computing and is the path that gets quantum computing to usefulness. A common technique involves modeling the device noise at the time of execution and using software to compensate for errors in the raw results. Zero-noise extrapolation (ZNE) and probabilistic error cancellation (PEC) are two techniques that were recently implemented in the quantum utility paper (see Footnote 3).
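A minimal sketch of the extrapolation step of ZNE, with assumed (made-up) expectation values measured after deliberately amplifying the noise:

```python
import numpy as np

# hypothetical expectation values of an observable, measured while the
# circuit noise is amplified by known scale factors (e.g. by gate folding)
scale_factors = np.array([1.0, 2.0, 3.0])
noisy_values = np.array([0.82, 0.70, 0.59])  # assumed measurements

# fit the trend and extrapolate back to the zero-noise limit
coefficients = np.polyfit(scale_factors, noisy_values, deg=1)
zero_noise_estimate = np.polyval(coefficients, 0.0)
print(zero_noise_estimate)  # mitigated estimate, ~0.93 for these numbers
```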

Error mitigation reduces the error of the final result, often through post-processing. Quantum error correction guarantees that the data stays at the required value or is automatically corrected back to the value it should have at that moment. To achieve a computational advantage in larger scale systems, the qubits will need to be fault tolerant through quantum error correction mechanisms. Both hardware and software solutions for error correction are under development.

The concept of error correction is not new [63] and many advances have been made in the last 25 years [64]. The challenge with error correction is the high number of correcting (physical) qubits n compared to the productive (logical) qubits k. There are various methods attempting to improve the rate k/n. A promising class of quantum codes used for quantum error correction is known as quantum low-density parity-check (LDPC) codes. LDPC codes are designed to correct errors that occur during the transmission of quantum information. Recent codes are able to function with 10 times fewer qubits [65, 66], preserve more data by introducing mid-circuit measurements [67, 68], or improve the fidelity of state preparation [69].
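For intuition about how physical qubits protect a logical qubit, the following sketch shows the far simpler three-qubit bit-flip repetition code (an illustration only, not one of the LDPC codes above); the syndrome read out on the two ancillas locates a single flipped qubit without measuring the data itself:

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(5, 2)        # 3 data qubits + 2 syndrome ancillas
qc.h(0)                          # an arbitrary logical state on qubit 0
qc.cx(0, 1)
qc.cx(0, 2)                      # encode: a|0> + b|1> -> a|000> + b|111>

qc.x(1)                          # injected bit-flip error on data qubit 1

qc.cx(0, 3)
qc.cx(1, 3)                      # ancilla 3 records the parity of qubits 0,1
qc.cx(1, 4)
qc.cx(2, 4)                      # ancilla 4 records the parity of qubits 1,2
qc.measure([3, 4], [0, 1])       # syndrome 11 here -> the flip is on qubit 1
```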

There are many approaches under development, and scalable methods, which in time will be automatically and intelligently implemented by the systems, are an urgent requirement. "mitiq" [70] and the "Quantum Error Correction Zoo" [71] are comprehensive repositories of the state of the art.

Quantum error mitigation and correction are a collection of techniques and technological developments that will take us from today’s quantum hardware to tomorrow’s large-scale fault-tolerant quantum computers needed for quantum advantage on scale.

4.3 Mid-Circuit Measurements and Dynamic Circuits

A quantum circuit is a sequence of quantum operations, including gates, measurements, and resets, acting on qubits. In static circuits, none of those operations depend on data produced while the program is running; static circuits may only contain measurement operations at the end of the circuit. Dynamic circuits [72], on the other hand, incorporate classical processing within the coherence time of the qubits. This means that dynamic circuits can make use of mid-circuit measurements [73, 74] and perform feed-forward operations, using the values produced by measurements to determine which gates to apply next. This technology has enabled several of the newest error correction codes, as well as reducing the circuit length of algorithms.
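A minimal dynamic-circuit sketch in Qiskit (hardware or simulators with dynamic-circuit support are required to run it):

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)
qc.measure(0, 0)                      # mid-circuit measurement
with qc.if_test((qc.clbits[0], 1)):   # feed-forward on the measured value
    qc.x(1)                           # applied only if qubit 0 read out 1
qc.measure(1, 1)                      # both bits now always agree
```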

4.4 Circuit-Knitting

Circuit knitting [75] is a technique that partitions large quantum circuits into subcircuits that fit on smaller devices. It incorporates classical simulation to “knit” together the results and achieve the target answer. This method is particularly useful when we want to run a circuit consisting of more qubits than the number of qubits available on the device. By dividing the circuit into smaller subcircuits, we can execute them on multiple smaller quantum processors and then combine the results. Figure 2 contains an example of code demonstrating the advancement of runtime services and provision of multi-cloud environments for circuit-knitting.

Fig. 2

A program illustrating the advancements in, and simplification of, programming a quantum computer in a multi-cloud environment. The Qiskit Runtime service automatically uses three different cloud providers of the user's choice. In this code, a quantum circuit designed for more qubits than are available on the chip is decomposed using classical resources on an Azure backend, the results are sent to the IBM Quantum computing providers, and the IBM providers return the results to an AWS cloud to recompose the results. This is a solution one may implement to optimize the cost and services offered by different providers

4.5 Advancements Coming

In the longer-term horizon, academics are investigating ways of inputting quantum information into qubits directly from quantum sensors and sources, opening further applications of quantum computing.

In the mid-term, the creation of a computer with tens of thousands of qubits and enough error correction would progress us beyond the NISQ era into the fault-tolerant era. Such devices would be capable of implementing algorithms like Shor's algorithm at a scale that can tackle a larger set of problems, such as breaking RSA encryption. An introductory course was recently launched here: [76].

In the near term, the estimated timeline for the advent of significant quantum advantage (see Footnote 4) has been accelerated by the implementation of the advancements in this section. A Quantum Processing Unit (QPU) fidelity of 99.99% or higher is the target for early 2025, to provide a chip suitable for breaking down larger problem statements into smaller solvable subcomponents of the order of 100 qubits. The creation and implementation of novel modular architecture designs that allow parallelization (with classical communication) of circuit execution, together with novel error correction methods and the latest advancements in circuit knitting, would allow the implementation of use cases on superconducting quantum processors that are 100 qubits in size with a circuit depth of 100 layers. The community is focusing on exploring applications of quantum computing that will be possible on these enhanced systems by late 2024-2025.

5 Applications

This brings us to the key question industry leaders are racing to answer: when will we reach quantum advantage in different application domains? This will occur gradually, domain by domain, and certain areas are attracting more focus because they lend themselves well to current quantum computing paradigms.

Industry leaders are looking to gain experience in integrating these technologies safely into their workflows. They are also driven to identify problems that admit a super-polynomial quantum speedup and to advance the theory needed to design algorithms, based on intermediate-depth circuits, that can outperform state-of-the-art classical methods.

Broadly, these are categorized into three areas:

  1. Problems looking to simulate nature at the microscopic scale, where it follows quantum behavior;

  2. Linear algebra (e.g., Machine Learning & Artificial Intelligence);

  3. Certain types of optimization problems [77], but not all [78].

5.1 Emerging Research Fields in Quantum Computing for Databases

Optimization, simulation of systems with innate quantum effects, and the solution of linear systems of equations are technological areas expected to benefit from quantum computers, resulting in the ability to solve problems considered NP-hard, and in improvements in accuracy or speed, depending on the application. Industrial analyses by companies have heretofore focused on product-oriented use cases. These companies are now also beginning to expand their experience to prepare their business and data workflows.

Three reviews prepared specifically for database engineers are: a 20-minute video [79] introducing quantum computing and illustrating a use for join order optimization with a coded example; a review of opportunities for the optimization of queries and transaction schedules, which includes a table comparing 7 approaches for the optimization of databases and current estimates of the resource sizes and types required to solve the problem [80]; and a review that summarizes the literature in the fields of database search, database manipulation, database query optimization, and transaction management with quantum machines [81].

Recent advancements after this review have occurred in the areas of optimization for join reordering [82,83,84]; multi-query optimization [85, 86]; index tuning [87]; database transactions [88]; and schema matching [89]. The importance of data mapping formulations is illustrated in reference [86], where a method of mapping directly from the mathematical formulation to the quantum implementation led to an algorithm on a gate-based quantum computer that is more efficient than solutions previously suggested on quantum annealers for multi-query optimization. It shows promise that can be verified once quantum computing technology improves enough to allow larger problems to be solved.

Quantum Machine Learning (QML) [90] features heavily in database research. A comprehensive review of QML [91] identifies and classifies the algorithms and applications presented in 94 papers, from a pool of over 5000 publications. QML faces challenges in outperforming classical machine learning algorithms. Current debates and theoretical proofs probe the question of whether it is possible for quantum computers to outperform classical state-of-the-art machine learning methods [92]. The scalability of variational quantum algorithms (VQAs), a type of optimization algorithm that incorporates QML, appears to be impeded by barren plateaus in the optimization landscape; however, recent advancements propose paths to address the challenge of barren plateaus [93, 94] and the possibility of super-polynomial advantages [95], and identify a subset of problems that may have the potential to be learned faster on a quantum system [96].

I/O bottlenecks and circuit length restrictions are the main hardware bottlenecks. Alternative data representations, such as vector distance representations, would enable more data ingestion, and current solutions will focus on hybrid architectures where the activity resolved by the quantum computer can be returned within a few minutes or hours. The integration into production of research advancements in qubit device materials, quantum processor architecture, and mitigation is expected to increase total gate counts per calculation to 5000 gates by the end of 2024; 7500 gates in 2026; 10,000 gates in 2027; 15,000 gates in 2028; an estimated 100 million gates when error correction is fully implemented (planned 2030); and one billion by 2033.

5.2 Partnerships and Networks Exploring Use Cases

Industry pioneers have been developing partnerships with quantum computing service providers and industrial consortia to pool resources and to jointly explore the value and potential of these new techniques and algorithms. The modes of collaboration are influenced by the industry. An industry insights report by McKinsey [97] provides an overview of common industries.

QUTAC, the German industry consortium, has been looking at applications [98] in 1) material science, 2) engineering & design, 3) production & logistics, and 4) post-quantum security.

Use cases in finance and logistics [97, 99,100,101] may also overlap with use cases in data science and management, such as looking for better ways to deal with incomplete data or reducing the complexity of models.

Over 200 industrial and academic end-users in the IBM Quantum Partner Network have created four expert focus groups to identify the areas of most immediate potential and value, and have published their findings in four white papers:

  1. Healthcare and Life Sciences [102]

  2. High Energy Physics [103]

  3. Optimization [104]

  4. Material Science and Quantum-Centric Supercomputing [105].

They are defining the potential areas of interest and value that it will be possible to address with the enhanced capabilities of the systems expected in the next two years, and have published these open innovation topics in the white papers referenced above. A subset of topics lends itself to open innovation, for example to satisfy regulatory necessities or to share cost burdens, such as sustainability and the replacement of toxic materials.

In Germany, the Federal Ministry for Economic Affairs and Climate Action (BMWK) is centralizing the results of use-case research and accelerating economic development through the sponsored project "PlanQK" (see Footnote 5), via the creation of a platform and ecosystem activities. The BMWK invested 740 million Euros into the "Quantum Computing Initiative" (QCI) at the German Aerospace Center (DLR) (see Footnote 6) to accelerate development through new research and engineering teams and partnerships with industry, startups, and academia. The Federal Ministry of Education and Research (BMBF) has invested significantly in use cases and enabling technologies [106], as well as in a Quantum Computing User Network (QuCUN) [107], with a total investment of 1.1 billion Euros planned by the end of 2025.

Collaborative research centers have come together in the "Quantum Alliance" [108], and various regional hubs and networks have also been established, e.g., the Q-Lab at the Universität der Bundeswehr München [109], the Competence Center Quantum Computing Baden-Württemberg [110], the Center for Quantum Technology and Applications (CQTA) at DESY, Zeuthen [111], the "Munich Quantum Valley" [112], "QuantumBW" [113], Quantum Technology Brandenburg [114], Quantum Valley Lower Saxony [115], and the Berlin Quantum Alliance [116].

6 Getting Started

6.1 How to Start—Technology Adoption Journey

To help the reader interested in introducing quantum computing into their organization, we can draw parallels between the adoption journey of companies introducing quantum technologies and the adoption of data science and machine learning.

In certain industries, being the first adopter of a technology can create a significant advantage over competitors; for example, the way internet service and content streaming companies embraced machine learning for their analytics transformed the way we purchase and consume services. It takes several years before a technology is fully introduced. The process typically follows these stages:

Socialization—the leadership and/or members of the organization are made aware of the technology and its potential capabilities, and colleagues or leadership need to be convinced of the value of investing time and resources now. This typically involves advanced business development, technology and engineering teams, and the internal IT service providers. A senior executive sponsor with a long-term view who can remove procedural and organizational barriers to introducing a new technology and absorb new business stress, such as a CTO, CIO, or CEO of the organization, is vital for a serious exploration. Gaining their support is done by communicating how current investments would help gain competitive advantage in the market when quantum advantage arrives.

Exploration—a small team explores maturity and potential applications part-time or through external consultants, partnerships with quantum computing manufacturers, or startups. Tools and technology are still in a gradual maturation process. The final section of this document refers to open-source learning materials that would be of value. Further support within the organization is enabled through practical exemplary use cases published by competitors.

Pioneer/Expert Onboarding—the small team is expanded into a larger team of technical experts in the organization identified for cross-skilling. The organization sets aside resources to build internal competence. These experts receive induction courses in the technologies, skilling them to identify potential use cases in their daily work. The cross-skilling of internal domain experts and business veterans to gain quantum competence is invaluable for identifying technical problems that may soon be addressable with quantum computers and prioritizing those of significant value to the business.

Dedicated team/Partial adoption—The organization’s leadership finds enough value or recognizes that the timeframe required to have a skilled workforce that can integrate into their organization. These may still need to create their own tools and are experimental. They enable adoption in parts of their organization. Technical expertise and internal technical acumen are vital for effective algorithm development.

Change-management—The organization’s processes that are streamlined to focus efforts on existing business may impede progress. A sponsor within the organization needs to provide systems, modifications to assessment criteria and resources to create space for employees to be able to progress. Often this is supported by Human Resources or a senior director in the company with budget decision authority, in some companies there is also a culture or provision in place for continuous technical development of employees.

Widespread adoption—the tools and technology mature so that non-specialists can produce a result of value without intimate knowledge of the system. In data science and machine learning, we are beginning to see this in some organizations that started their adoption journey several years ago.

A detailed technology and application roadmap accelerates innovation by helping identify skills gaps, setting expectations, and seeding the invention of new capabilities. National advisory committees have released roadmaps for quantum computing and enabling technologies, e.g., the UK [117, 118], the Netherlands [119], Germany [120], France [121], and Spain [122]. IBM's recently updated and detailed roadmap is available here: [123]; it is the most comprehensive to date, with detailed expected gate and qubit counts of devices until 2033 and beyond. It separates research and development roadmaps to clearly communicate the research that feeds into the development roadmap of production features in hardware, software libraries, and middleware [124]. Experts and leadership in organizations consuming quantum computing services monitor these roadmaps and internalize this knowledge to identify unique value propositions and areas in which to apply quantum computing. In the next sections we provide an overview of resources to help organizations start their quantum journey.

6.2 Getting Started—Areas that do not Need an Advanced Degree in Quantum Science

There are many opportunities to contribute to this era of quantum computing with little or no training in quantum computing.

A significant share of the improvement in the speed and capabilities of the machines has come from improvements in classical control electronics, software engineering, and business processes.

6.3 Getting Started—Building a Competency Framework

The introduction of a new technology requires the development of a competency framework, change management, and learning resources. A successful framework will reflect corporate values and should also include consideration of ethical technology development, an overview of soft skills as well as technical competencies, and the levels of competency required within the different layers of the organization.

In addition to a range of classical engineering disciplines, technical production and support roles are an increasing necessity: reliable system design, product design and management, quality control, security, and hybrid cloud and compute, to name a few.

More importantly, we need to be more flexible in our hiring and role definitions. We still think in silos and layers in terms of competency and expertise, whereas there is a need for people with one specialization who can augment it with a second. For example, we require mathematicians with the skill to develop efficient code, to make algorithms that make the most of limited resources, on the boundary between middleware and circuits. We also need skilled assembly-language-level programmers and control automation experts with the ability to understand the physics of a qubit and optimize its performance, on the boundary between circuits and hardware. People with a technical background and industry knowledge are also required; this takes years of experience and is best addressed by training experienced employees in collaboration with fresh graduates. For example, a machine learning expert with knowledge of finance or of working on a production line, and the interest to upskill, could apply quantum machine learning.

6.4 German and European Activities for Competency Building and Frameworks

How to build the future quantum workforce has become a research interest in several groups in recent years, in Europe [125,126,127,128] as well as in the United States [129,130,131]. The coordination of European quantum activities is the responsibility of the Quantum Flagship Coordination Action and Support (QUCATS). Educating the workforce is one important aspect, with activities supporting the development of programs and the establishment of best practices. To this end, the European Quantum Readiness Center (see Footnote 7) is in the making, currently providing, e.g., a playlist of selected videos on quantum technology topics. Contents are structured using the European Competence Framework for Quantum Technologies [132]. This framework is intended to provide a common language for quantum technology education, to enable the planning, mapping, and comparison of trainings, study programs, and the like. It will also be the basis for developing a certification scheme within the QUCATS project to support the standardization of quantum technology training.

Other projects focus on providing courses for industry, like QTIndu (see Footnote 8). In this EU-funded project, modularized courses to "make the European industry quantum-ready" are developed by different content creator partners; for Germany these are the Technical University of Braunschweig and the Physikalisch-Technische Bundesanstalt (PTB), the national metrology institute of Germany.

For Germany, the Bundesministerium für Bildung und Forschung (BMBF) started the initiative Quantum Futur Education [133] with nine projects to develop and implement quantum educational strategies.

One big challenge for all these initiatives is understanding the concrete skilling needs of industry: identifying the different roles and training needs. In contrast to existing university courses, the aim is not necessarily understanding the mathematics and physics, or even understanding any details. Currently the industry focus is on enabling two main tracks: business and technical. Technical leads seek help from consultants and industry consortia to find ways to communicate to the leadership the value of such an unfamiliar and unrelatable technology, and the urgency of upskilling existing employees so that the workforce has the domain and organizational knowledge to successfully introduce a high technology into the organization. There is particular interest in upskilling existing technical experts in corporations with the competency and methodology for identifying suitable applications and quantum technologies. Quantum technologies (see Footnote 9) span beyond quantum computing to include sensing and communication; advances in these technologies complement each other.

6.5 Getting Started—Resources

Academia, enthusiasts, professional bodies, and companies that are developing quantum computing have created a wealth of content. Massive open online courses are available for different levels of expertise on popular learning platforms such as Coursera, edX.org, and MITxPro. There are even courses for high-school-level entry through the QubitXQubit coding school, with over 10,000 learners.

Hackathons and challenges are a playful and hands-on way to start. They offer the opportunity to work with others on interesting problems and to learn. Various other companies invite participants to solve interesting challenges. IBM publishes previous summer school content online for people to learn from through videos and exercises. It created the largest open-source textbook through Qiskit.org and has replaced it with a new interactive learning platform, learning.quantum.ibm.com, to improve the learning experience within its platform and documentation. The learning platform has newly created courses and tutorials by renowned quantum experts and the ability to earn digital badges [135].

There are also efforts, based on research originally from the University of Oxford, to make the methods for simplifying circuits more intuitive through a formally proven pictographic formalism. Effectively, you can perform complicated linear algebra and tensor mathematics using pictograms to simplify circuits [136].

For a comprehensive directory of algorithms, visit the Quantum Algorithm Zoo [137]; other tips on where to get started are referenced here: [138, 139].

A few good places to start for introductory information in German language are:

  • The open lectures by the Hasso Platner Institute [140]

  • An introduction to quantum computing for security experts [141]

  • The introductory book on Quantum computing by Matthias Homeister [142]

  • The German government page on Quantum Technologies [143]

The public has had access to real quantum computers on the cloud since 2015; recently there has been an upgrade to the offering by the largest provider of machines [144].

7 Conclusions

Myriad quantum computing technologies are available. Whilst multiple attempts have been made to compare them, the differences between them and their implementations mean that applied exploration is needed. Considerations to assist the user in deciding between technologies and identifying lucrative research areas have improved and were outlined in this article. Improvements in the precision and speed of the technology were illustrated through a chemistry example over time. Recent and impending advancements raise confidence that the estimated advent of significant quantum advantage will be brought forward. Networks and industrial consortia are working together to identify problem statements in areas of business priority. Information about resources to assist entry into quantum computing was provided.

The important point here is to recognize that it will take time for domain experts to acquire the skills to identify suitable applications and apply the technologies; depending on the application, this is likely to take around two to three years. The question is: will you have the experts ready once these technologies are mature enough to apply in your domain, and how are you preparing for this?