Heike Riel

Rainer Alt

Background information

Quantum computing is considered one of the major disruptive technologies that are expected to fundamentally influence existing as well as new applications. This refers to the technology's advances in solving complex mathematical calculations, which are important for the optimizations and simulations present in many application fields in the scientific as well as the business world. In addition to these application potentials, quantum computing is converging with other technologies to develop further momentum. For example, the acronym DARQ was coined for the convergence of distributed ledger technologies, artificial intelligence, extended reality and quantum computing (Alt, 2021). However, quantum computing is still at an early stage and many challenges remain regarding the technology's performance in terms of speed, scale and quality. To obtain insights into the evolution of quantum computing and the current status of developments regarding the technology and its applications, a leading researcher in the field shares her views. Obviously, the interview employs various terms and concepts from computer science and physics, which will not all be well-known within the information systems community. Therefore, an overview article was published in Electronic Markets' Fundamentals paper series (Rietsche et al., 2022), which explains the basics of quantum computing and distinguishes three layers of a quantum computing system (hardware, systems software, application). Besides explaining the structure of a quantum computing system, this architecture serves to discuss the implications of quantum computing for the role of data as well as at the level of the individual, the organization and the broader ecosystem. Several aspects of the hardware and the application layer are taken up in the following interview. Based on the distinction between quantum annealers and gateable quantum computers, the interview associates the former with important shortcomings regarding their scalability and their applicability to a broader set of problems. Therefore, the interview elaborates on how IBM has improved the latter (i.e., gateable quantum computers) and presents their advantages in meeting the so-called DiVincenzo criteria, which comprise functional requirements for any practical quantum computer (see Table 1). Similar to the contribution in the Fundamentals article, the interview discusses the application potentials of quantum computing and, by identifying a broad spectrum of applications in many industries, confirms quantum computing's appraisal as a broadly applicable (i.e., general-purpose) technology.

Table 1 Main concepts in quantum computing

Personal information

Dr. Heike Riel studied physics at the Universities of Erlangen-Nuremberg and Bayreuth in Germany and joined the IBM Research Lab in Zurich in 1998. In 2002 she received a Ph.D. in physics from the University of Bayreuth and in 2011 an MBA degree from Henley Business School, UK. In 2013 she was appointed IBM Fellow, the highest honor a scientist can achieve at IBM. She has authored more than 150 peer-reviewed publications and filed more than 50 patents. Her research has been recognized by several awards, including an honorary doctorate from Lund University in Sweden in 2015 and the IEEE Andrew S. Grove Award in 2022. Further, she is an elected member of the National Academy of Engineering and the German National Academy of Sciences Leopoldina. The IBM Research Lab in Zurich, Switzerland, was founded in 1956 and is one of the company's twelve research labs worldwide. The lab focuses on innovations in information technology and cultivates close relationships with academic and industrial partners.

What is IBM’s approach to quantum computing?

Let’s first clarify the types of quantum computers. A universal quantum computer is very different from a quantum annealer, which lacks gates and can be used only for specific types of problems, such as optimization tasks. At IBM we build gateable quantum computers, which allow a continuous development path from first utilizing error mitigation towards error correction of high-quality circuits that can become universal and can thus provide a fundamental and intrinsic speed-up. In our approach, the qubits fulfil specific criteria defined in 2000 by David DiVincenzo, who was at IBM Research in Yorktown Heights at that time. He formulated the characteristics that a physical system needs to have to act as a qubit for quantum computation. The definition of the so-called “DiVincenzo criteria” has been an important milestone because it describes the properties experimental physicists need to search for to build qubits. Before that, it was mainly a theoretical topic. From 1981, when Richard Feynman proposed it at a conference, quantum computing was just a brilliant idea for leveraging the laws of quantum physics to solve quantum-mechanical problems, which are omnipresent in physics and chemistry. Feynman thought this would be a much easier way to approach a quantum-mechanical problem than using classical physics, but at that time it was not clear how it should work. Until 2000, much work was done to figure out the theoretical concept of how to build such a quantum computer.

How have quantum computers evolved?

Quantum computing is a very active area of development. Different physical implementations of qubits are currently being developed and differ in their maturity. The most mature are quantum bits based on superconducting Josephson junctions, which are rather simple structures consisting of an insulator sandwiched between two superconducting metal layers. The Josephson junction is a non-linear inductor with an anharmonic energy spectrum that creates the two-level energy system needed for a qubit. Qubits based on ions and atoms have also made great progress over recent years. Other approaches use the spin of an isolated electron or hole – spin up or spin down representing the two energy levels – in a transistor or in a quantum dot as a qubit, while yet another approach envisages qubits based on Majorana fermions. These Majorana particles are a direction Microsoft has been pursuing, but such a qubit has not yet been demonstrated and many fundamental physical problems remain to be explored for this approach. IBM Research was already very active in quantum information theory and the theoretical side of quantum computing in the 1970s. In the early 2000s, we also started to work experimentally on qubits and quantum computers based on superconducting Josephson junctions. In 2015, we were able to build a five-qubit quantum processor and made it available in the IBM cloud in May 2016. This made quantum computers broadly available, and within a couple of days we saw publications on experiments with this processor from around the world, even from the Arctic. For the first time, easy access to a quantum processor was available.

What were the main application areas for this cloud quantum computer?

This open cloud access gave researchers the opportunity to run simple operations and algorithms on a real quantum processor. While before only a “paper and pencil” exercise was possible, theoretical ideas could now be tested experimentally. You can apply gate operations to qubits, run model systems, execute certain algorithms and check what the output is. For example, an early experiment was to simulate a lithium hydride molecule using six qubits, characterize the influence of the qubit noise and apply error mitigation. This real device allows much experimentation to develop and run algorithms and to test them on small model systems. In that regard, various experiments are pursued in the areas of simulating quantum systems to solve problems in physics and chemistry, quantum machine learning, quantum Monte Carlo simulations, as well as optimization problems. These five- and seven-qubit systems are still in use today, although we have steadily been able to scale the number of qubits and improve the performance of the quantum systems, which is essential to solve problems classical computers cannot solve.
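To illustrate what such access enables, the following is a minimal sketch using Qiskit (assuming the qiskit and qiskit-aer packages are installed; interfaces vary between Qiskit versions). It builds a two-qubit circuit, applies gate operations and reads out the measurement statistics on a local simulator, which is the same workflow as submitting to a cloud backend.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Two-qubit circuit: a Hadamard followed by a CNOT prepares an entangled Bell state
qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0
qc.measure_all()

# Run on a local simulator; for a cloud backend only the backend object changes
backend = AerSimulator()
job = backend.run(transpile(qc, backend), shots=1024)
counts = job.result().get_counts()
print(counts)  # roughly half '00' and half '11'
```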

How has this changed since 2016?

Since 2016 we have continuously increased the number of qubits and the performance of quantum computers: we demonstrated a 27-qubit processor called Falcon in 2019, launched the 65-qubit Hummingbird chip in 2020 – you see that we name our chips after birds – and in October 2021 we introduced a 127-qubit chip called Eagle. Just a few weeks ago the latest quantum processor in this series, called Osprey, was released with 433 qubits. For next year we have even bigger plans, namely a 1,000-qubit processor, and by 2025 we plan to scale to more than 4,000 qubits (Gent, 2022). Scaling the number of qubits is crucial to be able to solve more complex problems in the future, but having just more qubits is not sufficient. Scale is only one key metric for increasing the performance of a quantum computer; the other two are quality and speed. Let’s first discuss scale. To build working quantum processors with a high number of qubits, you need a high yield in fabricating qubits and you must solve other technical problems, like wiring and connecting all of them while concomitantly reducing crosstalk and other possible errors. This requires solving technical and fabrication challenges. For example, the Eagle processor with 127 qubits required novel packaging technologies with multi-level wiring and through-substrate vias to connect the qubits. At the same time, critical parameters like coherence time and gate fidelity must be improved further to make use of the scale. The coherence time – during which qubits behave like a quantum system – together with the gate operation speed defines how many gate operations can be done before noise masks the result. A great advantage of superconducting qubits is that their gate operations are very fast, in the range of 50–500 ns, and that the coherence time has continuously increased over the years, reaching even 1–2 ms today. The gate fidelity defines how well gate operations work: the higher the gate fidelity, the fewer errors occur. Simply put, the fewer errors occur, the easier it gets to apply error correction methods. We are now very close to an error rate of 10⁻⁴, which is a good start for testing error correction methods. The coherence time and gate fidelity contribute to the second key metric: quality. Error correction, or techniques like error mitigation that we currently use, are key for increasing the complexity of the calculations that can be done on quantum computers.
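A rough back-of-the-envelope calculation shows why coherence time and gate speed matter together. The numbers below are only illustrative values in the ranges mentioned above, not the specification of any particular IBM processor.

```python
# Illustrative only: how many sequential gate operations fit into the
# coherence time before noise starts to dominate the result.
coherence_time_s = 300e-6   # assumed coherence time, e.g. 300 microseconds
gate_time_s = 100e-9        # assumed gate operation time, e.g. 100 ns

max_sequential_gates = coherence_time_s / gate_time_s
print(f"~{max_sequential_gates:.0f} sequential gates within the coherence time")

# With a gate error rate around 1e-4, roughly 1 in 10,000 operations fails,
# which is why longer coherence times and higher fidelities extend the
# circuit depth that can be run reliably.
```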

Are there other parameters besides scale and quality?

Yes, speed is another key parameter for measuring the performance of quantum computers. Speed is measured by the number of primitive quantum circuits that can be calculated within one second, called CLOPS, which stands for Circuit Layer Operations per Second. This metric is system-agnostic and captures the full dependencies across hardware and software of circuit executions. Simply put, it defines how fast users receive the results of their calculations. There are two major opportunities with quantum computers: on the one hand, you can obtain more accuracy in your results and, on the other hand, calculations become feasible that were impossible or would take too long in classical computing environments. You must be aware that today’s quantum computers always work together with classical computers. The classical computers act as control systems that are responsible – in the case of superconducting qubits – for generating the microwave pulses to initialize the quantum state, operate the gates and read out the final calculated state. Further, certain quantum algorithms, like the variational quantum eigensolver, require a huge number of iterations between classical and quantum computation. Therefore, the interaction and orchestration between classical and quantum computers is crucial to increase speed.
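The sketch below shows the shape of such a classical-quantum iteration loop in Python. It is not IBM's implementation: `evaluate_energy_on_quantum_backend` is a hypothetical placeholder for the step that would prepare the parameterized circuit, run it on a quantum processor and return the measured expectation value; here it is stubbed with a classical toy function so the loop is runnable.

```python
import numpy as np
from scipy.optimize import minimize

def evaluate_energy_on_quantum_backend(params):
    """Hypothetical placeholder: in a real variational quantum eigensolver,
    this would build the parameterized ansatz circuit, execute it on the
    quantum processor and return the measured energy expectation value.
    A simple classical surrogate stands in here so the loop can run."""
    theta, phi = params
    return np.cos(theta) + 0.5 * np.sin(phi)   # toy energy landscape

# The classical optimizer drives the quantum evaluations: every iteration
# sends new parameters to the (quantum) evaluation and receives an energy
# back, which is why tight classical-quantum orchestration matters for speed.
result = minimize(evaluate_energy_on_quantum_backend,
                  x0=np.array([0.1, 0.1]),
                  method="COBYLA")
print("estimated minimum energy:", result.fun)
```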

What are your main expectations for these gateable quantum computers?

Currently it is still difficult to demonstrate a comprehensive practical advantage for quantum computing systems. What we can do is show new algorithms and demonstrate their advantage in model systems, also on hardware. For example, we have developed new quantum algorithms for Monte Carlo simulation where you see that the quantum approach is superior to the classical Monte Carlo algorithm. There are other algorithms that have been proven theoretically to provide an advantage on a universal quantum computer. We believe that quantum computers can offer value before we have a fault-tolerant system, and the path from today until then can be continuous. Therefore, we focus our efforts on error mitigation to understand the error and mitigate it. This can allow us to obtain noise-free results from noisy quantum computers, at a runtime cost lower than classical computers, and it can work before quantum error correction is in place. We also devise new algorithms and new ways of dividing a complex problem into smaller sub-problems. This means a large computation is divided into smaller circuits that can be run on smaller quantum processors and then stitched together classically. In general, this is called circuit knitting. In fact, finding the right problems and the right algorithms, dividing tasks into sub-pieces and using error mitigation and some other embedding ideas are key issues for science that will fundamentally influence the future of gateable quantum computers.
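To make the error-mitigation idea concrete, the sketch below illustrates zero-noise extrapolation, one widely used mitigation technique (the interview does not specify which methods IBM applies, and the numbers are invented for illustration): the same circuit is executed at artificially amplified noise levels and the noise-free value is estimated by extrapolating back to zero noise.

```python
import numpy as np

# Expectation values of the same observable measured at increasing noise
# amplification factors (1x, 2x, 3x the native noise level).
# These numbers are purely illustrative.
noise_factors = np.array([1.0, 2.0, 3.0])
measured_values = np.array([0.78, 0.61, 0.47])

# Fit a low-order polynomial to the noisy results and evaluate it at zero
# noise to estimate the noise-free expectation value.
coeffs = np.polyfit(noise_factors, measured_values, deg=1)
mitigated = np.polyval(coeffs, 0.0)
print(f"zero-noise extrapolated value: {mitigated:.3f}")
```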

What types of problems do you have in mind for universal quantum computers?

As you can imagine, there are many different complex mathematical problems in science and in business. Classical computers are very good at multiplying numbers and the like, but they are overwhelmed by complex problems. A simple example is factorization: to find the prime factors, classical computers need to go through every possible solution and, for a very large number, this takes forever. The same applies to optimization problems with many parameters, because the complexity of the problem scales exponentially with the number of parameters. Quantum computers use a much smarter way to reach the result. Large numbers can be factorized orders of magnitude faster via the well-known Shor algorithm: what would take 300 years becomes feasible in only 30 seconds! Keep in mind that today we do not yet have a quantum computer that can run Shor’s algorithm. One chooses quantum computers for problems that scale exponentially with the number of parameters. If you look closely, you will find many of them in our daily life, for example, finding new materials for curing diseases, for batteries, fertilizers, catalysts and the like. This requires the calculation of molecules where many electrons interact with each other and the corresponding equation cannot be solved by classical computers. However, quantum computers make use of the fact that a qubit is an artificial atom and you can map the problem onto the quantum computer. Take the example of caffeine, which is a rather simple molecule, a little more complex than water but not as complex as proteins: it is built of nitrogen, oxygen, carbon, and hydrogen as elements. All 99 electrons in this molecule interact with each other. Merely storing the equation would already require 10⁴⁸ bits – you see that this is an impossible task for a classical computer. An ideal quantum computer requires about 160 qubits for this. At present, using classical computers we rely on approximations, and chemically accurate calculations are only possible with very simple molecules like H2 or lithium hydride, which have only a few atoms. Solving complex tasks in physics, chemistry, and materials science with quantum computers would allow us to design new molecules, to understand reaction pathways, to calculate the full energy spectrum and to discover materials and predict their properties. This can impact applications in many industries such as the chemical industry, pharmaceutical and life science industry, energy production, fertilizer production and more.
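The connection between the two numbers can be checked with a line of arithmetic: n qubits span a state space of 2^n amplitudes, and 2^160 is on the order of 10^48, which is why roughly 160 ideal qubits suffice to represent a state whose classical description would need about 10^48 values.

```python
# n qubits describe a quantum state with 2**n complex amplitudes.
n_qubits = 160
amplitudes = 2 ** n_qubits
print(f"2^{n_qubits} = {amplitudes:.3e}")  # about 1.46e+48, i.e. on the order of 10^48
```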

Could you also think of use cases in other industries?

In many industries, we see problems where machine learning, optimization methods, differential equations and Monte Carlo simulations are applied. For example, applications of quantum computers exist in the financial industry, where risks are calculated for insurance policies or investment decisions, in the financial and retailing industries, where prices and product portfolios are optimized, or in the field of logistics and manufacturing, where supply and production plans are determined. We also looked at transaction settlement, which represents an interesting problem I was unaware of before. In this area, transactions could be bundled into packets of transactions, which allows a much faster settlement of these transactions. Today, the problem is divided into smaller problems that classical computers can manage, but with the parallelization possible with quantum computers, more real-time transactions become feasible at larger scale. In addition, we can also add the improved performance of machine learning applications in fraud detection, which is an important source of economic losses. In general, quantum computing may enhance classical algorithms, which might lack sufficient accuracy and speed. If you are able to obtain similar or even higher accuracies at faster speeds – take the example of quantum Monte Carlo simulations that require only 10,000 instead of one million iterations – then you can think of solving problems in real time, which could accelerate many decision processes. Many more examples and use cases are currently being investigated.
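The Monte Carlo comparison rests on a well-known scaling argument: the statistical error of classical Monte Carlo estimation shrinks as 1/sqrt(N) with the number of samples N, whereas quantum amplitude estimation scales roughly as 1/N, so far fewer iterations can reach a comparable accuracy. The short sketch below only demonstrates the classical 1/sqrt(N) behaviour; the quantum scaling is stated in the comment, and the distribution used is an invented example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical Monte Carlo: estimate the mean of a random payoff and watch the
# statistical error shrink as ~1/sqrt(N) with the number of samples N.
for n in (1_000, 10_000, 100_000, 1_000_000):
    samples = rng.normal(loc=1.0, scale=2.0, size=n)
    error = abs(samples.mean() - 1.0)
    print(f"N={n:>9}: error ~ {error:.4f} (expected order {2.0 / np.sqrt(n):.4f})")

# Quantum amplitude estimation improves this scaling to roughly 1/N, so an
# accuracy that classically needs on the order of a million samples can, in
# principle, be reached with orders of magnitude fewer quantum iterations.
```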

How much quantum physics knowledge is required to use a quantum computer?

Basically, you just need to know Python to use a quantum computer. We want to make it easy for people to use quantum computers in their workflow. Therefore, a full software stack is currently being developed on top of the hardware, as indicated in our IBM Quantum Development Roadmap shown in Fig. 1. We define different types of developers: kernel, algorithm and model developers, who need to know less about the specifics of the hardware the higher up they are in the stack. The kernel developers build quantum circuits and Qiskit Runtime. These tools are used by the algorithm developers to build quantum application modules for machine learning, optimization and natural sciences, whereas model developers apply these tools to solve problems in their domain. For example, you might use your optimization software where you currently apply classical solvers, but now there is a quantum solver you can select for your mathematical problem. If you aim to design a molecule, you might use the quantum solver Qiskit Nature in the tool where you calculate your molecule. This may include calculating the reaction pathways, the excited-state spectrum or the full energy spectrum of the molecule. Ultimately, users should not have to worry about the underlying algorithms, since this would require being a quantum physicist or chemist yourself. You should be able to obtain the solution “as a service” and forget about the details. In the best case, developers should be able to use the same code and the same tools they have used so far when developing their software. We have not reached this goal yet, but we are working on it.
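A minimal sketch of this "select a quantum solver" idea is shown below. All names (`Problem`, `ClassicalSolver`, `QuantumSolver`, `solve`) are hypothetical and do not correspond to an actual Qiskit or IBM interface; they only illustrate how a quantum backend could be swapped in behind the same application-level call.

```python
# Hypothetical interface, not an actual Qiskit/IBM API: the point is that the
# model developer keeps the same call and only the chosen solver changes.
from dataclasses import dataclass

@dataclass
class Problem:
    description: str  # e.g. an optimization model or a molecule specification

class ClassicalSolver:
    def solve(self, problem: Problem) -> str:
        return f"classical solution of: {problem.description}"

class QuantumSolver:
    def solve(self, problem: Problem) -> str:
        # A real implementation would translate the problem into circuits and
        # submit them to a quantum backend via a runtime service.
        return f"quantum solution of: {problem.description}"

def solve(problem: Problem, use_quantum: bool = False) -> str:
    solver = QuantumSolver() if use_quantum else ClassicalSolver()
    return solver.solve(problem)

print(solve(Problem("portfolio optimization"), use_quantum=True))
```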

Fig. 1

IBM Development Roadmap for quantum computing (updated, based on Gambetta et al., 2021)

Could you elaborate on these offerings?

These developments are included in a roadmap that describes the technology, i.e., our quantum hardware and software stack, as well as the time when we want to make it available. At the bottom of Fig. 1, you see the hardware roadmap indicating our plan to scale the number of qubits in a modular approach linking quantum chips with classical and quantum communication. An example is the quantum processor unit with more than 4,000 qubits by 2025 mentioned before. One layer above, you see the layer of the kernel developers who are familiar with this hardware. For the development environment, we offer Qiskit Runtime primitives that abstract away the details unnecessary for exploring real-world use cases. Just recently, more capabilities like dynamic circuits have been released, which seamlessly incorporate real-time classical communication into quantum circuits. This feature increases the variety of circuits that can be run on near-term quantum hardware while reducing the number of gates required. More capabilities like error suppression and mitigation techniques are planned for next year and are already available as an alpha version. Our efforts are targeted at increasing the three performance measures scale, quality and speed and at improving the available capabilities of quantum computers. On top of this layer, there are algorithm developers who use these Qiskit Runtime primitives and capabilities to implement quantum algorithms and to build circuit knitting techniques and libraries that can be utilized by the model developers above, who make use of these tools for building quantum software applications and services. This allows users with problems in the areas of natural sciences like physics or chemistry, but also in the fields of finance and machine learning, to use these tools and plug them into their workflow.
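As an illustration of how such primitives abstract away hardware details, the sketch below uses the reference Estimator primitive that ships with Qiskit to compute an expectation value locally (assuming a Qiskit version in which `qiskit.primitives.Estimator` is available; the primitive interfaces have changed across releases, and on real hardware a Qiskit Runtime service session would be used instead).

```python
from qiskit import QuantumCircuit
from qiskit.primitives import Estimator
from qiskit.quantum_info import SparsePauliOp

# Prepare a Bell state and ask for the expectation value of the ZZ observable.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

observable = SparsePauliOp("ZZ")

estimator = Estimator()              # local reference implementation
job = estimator.run(qc, observable)  # same call pattern as a cloud runtime
print(job.result().values)           # expectation value, here +1.0
```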

Finally, we also know that quantum computing is associated with risks. What is your opinion on these?

If we have a fault-tolerant universal quantum computer with millions of qubits – it is a little unclear how many qubits are required – then it will be possible to apply the Shor algorithm and break current encryption algorithms, which clearly represents a risk. It is unlikely that this occurs in the next years, since it will take some time to build fault-tolerant universal quantum computers. Nevertheless, we have already started working on mitigating that risk and on applying new encryption algorithms that cannot be broken by a universal quantum computer in the future. Quite some work has already been done by IBM Research on so-called quantum-safe cryptography. Six years ago, the National Institute of Standards and Technology (NIST) in the US initiated a competition to find encryption algorithms that cannot be broken by quantum computers and to select a new standard for encryption. The world’s cryptographers devised and submitted more than 70 algorithms, which were vetted by the global community over the last one to two years. In August, four quantum-safe cryptographic algorithms were selected and will become part of NIST’s post-quantum cryptographic standard. Congratulations to my colleagues who have contributed to three of the four selected standards. We are also actively pursuing various application scenarios for quantum-safe cryptography. This notably includes their application in the areas of systems and cloud, where we have successfully demonstrated quantum-safe access to clusters deployed in the IBM Cloud and where the IBM z16 systems contain the latest quantum-safe capabilities. It is important to start thinking about appropriate data security strategies. This also holds for cryptocurrencies like Bitcoin, where the choices of the best algorithms for encrypting and mining are still vague. Another interesting aspect in this context refers to energy consumption. You might think that a large and complex computer always consumes a high amount of energy. For example, the top-ranked high-performance computer has a peak power of 29 megawatts. Our quantum computers, in contrast, run in a lab and require about 36 kilowatts, which is clearly below that. Quantum computers do require energy for cooling to very low temperatures, but beyond that their superconducting nature makes them rather efficient. They differ from classical transistors, where thermal power is more or less wasted. Although superconducting qubits work dissipationless, when you scale your qubits, you also need to scale the control electronics. In our research, we work on the integration and miniaturization of the control electronics to reduce power consumption and costs. There is still much work to be done to fully analyze this, but there is a good chance that relevant problems can be solved with much less energy consumption.

Could we have a quantum computer under our desk in 20 or 30 years’ time?

I would not exclude it, since in the early 1970s people would not have believed that we would all use mobile phones capable of mobile banking, videoconferencing, automatic language translation and the like. I would, however, predict that it is unlikely we will have quantum computers under our desks within the next ten to twenty years. Quantum computers are still very bulky and will first be applied mainly to solve special and complex tasks. Therefore, it is best to use and fully exploit them via the cloud in the coming years. History has shown that a lot of things are possible that we would not have imagined before. Would we have believed that building chips with 10 nm gate lengths is feasible? That we can build stacked nanosheets where the gate fully surrounds the channel, on 300 mm diameter substrates, with optical lithography? Probably not, since the body of knowledge in physics and optics told us that it would not work. Over time and with many iterations of invention and engineering it nevertheless happened. Progress in research requires many smart minds and, luckily, quantum computing currently attracts many of them.

What are your expectations from information systems researchers in this context?

As we know, innovation happens at the intersections of scientific disciplines and their applications in practice. Coming up with a new computer system allows problems to be solved in different and new ways. Enabling what has not been possible in the past will surely be a source of business advantage. Therefore, we collaborate with a broad network of partners, including some of the top 500 companies in different industries as well as start-ups and academia. Bringing our computing knowledge and insights on future technological innovations together with a deep understanding of the challenges of the industry sectors provides rich opportunities for innovation and for future business solutions creating competitive advantage. This is reflected in our IBM Quantum Network and our endeavor to jointly create positive impact for businesses and societies.

Dear Heike, thank you very much for this interview!