Prof. Michielsen gave a keynote talk on integrating quantum computing in HPC at SC22. According to the talk, the requirements for quantum computing are:
- Quantum computer hardware
- Quantum software stack
- Quantum algorithms
Similar to how current HPC systems enable state-of-the-art research for academia and industry, I want to gain some insight into the state of each of the above requirements that would ultimately enable such research. Is there research being conducted in academia/industry that uses currently available quantum computing resources? What are the current pain points of using such resources? What is the learning curve to learn and use these technologies?
You bring up a number of wide-ranging questions about leveraging quantum computing. Let me first briefly comment on the three requirements for quantum computing that you listed:
- Current quantum computing hardware is limited by the number and quality of available qubits. Current devices are often referred to as "noisy intermediate-scale quantum" (NISQ) devices. Gate-based devices with several hundred qubits are available now, e.g. from IBM.
- On the software side, programming ultimately operates at the gate level (think "assembly language") and is thus very technical, although there are libraries which provide higher-level abstractions and implement quantum algorithms that can be used as building blocks to address a computational problem. Libraries include IBM Qiskit (https://qiskit.org) and higher-level meta-libraries such as CLASSIQ (www.classiq.io).
- Usually only quantum algorithms which provide super-linear speed-up over classical computation are of interest, because of the additional overheads at the I/O stage. A number of quantum algorithms offer quadratic speed-ups (amplitude amplification, phase estimation, quantum walks), while a select few provide exponential speed-up. The latter are the most promising and include the quantum Fourier transform and Hamiltonian simulation.
Which requirements must be met to enable research that benefits from quantum computing depends strongly on the type of problem under consideration. Is the application fundamentally amenable to acceleration via a suitable quantum algorithm? Here it is important to understand that quantum computers will likely never replace classical computers as faster general-purpose machines; rather, they are envisaged as specialized accelerators. If the problem can be solved by a known quantum algorithm that is expected to accelerate the classical computation, then one needs to study at which problem size the quantum advantage becomes manifest and how this relates to the capabilities of near-term quantum computing hardware.
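The crossover point where a quadratic speed-up starts to pay off can be sketched with back-of-envelope arithmetic. The per-step costs below are purely illustrative assumptions (quantum gate cycles are currently orders of magnitude slower than classical operations; the exact ratio is hardware-dependent):

```python
import math

# Hypothetical per-step costs (illustrative assumptions, not measurements).
t_classical = 1e-9  # seconds per classical search step (assumed)
t_quantum   = 1e-6  # seconds per quantum step (assumed)

def classical_time(n):
    return t_classical * n            # exhaustive search: O(N)

def quantum_time(n):
    return t_quantum * math.sqrt(n)   # Grover-style search: O(sqrt(N))

# Setting the two costs equal gives the crossover size
# N* = (t_quantum / t_classical)**2.
n_star = (t_quantum / t_classical) ** 2
print(f"crossover at N ~ {n_star:.0e}")
```

With these assumed constants the quadratic speed-up only wins beyond roughly a million items; a larger quantum-to-classical cost ratio pushes the crossover out quadratically, which is why overheads matter so much for near-term devices.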
Now let me turn to your questions about research in academia/industry using currently available quantum computing resources, the pain points of such resources, and finally the learning curve involved in using these technologies.
- Quantum simulators and current NISQ devices are being used to explore the feasibility of particular applications in academia/industry. Among the most promising are optimization problems such as the airplane "tail assignment" problem, traffic optimization, or optimization of financial portfolios, as mentioned in the SC22 keynote talk, as well as quantum simulation to determine the ground-state energy of a molecule. However, current quantum devices likely have too few qubits and do not yet implement quantum error correction, which is crucial for accurate computations. Therefore, applications are usually tested at much smaller than targeted problem sizes.
- A further fundamental limitation concerns I/O from a classical to a quantum computer and back. Classical data must first be encoded into an appropriate quantum state, which is then evolved in the quantum computation according to the rules of quantum theory, until a measurement collapses the quantum state back to a classical output. Encoding classical data can be expensive, and repeated measurements of the stochastic quantum system are necessary to obtain a solution equivalent to a deterministic classical one, unless a single scalar output from the measurement is sufficient. Therefore, quantum computers are not suited for big-data applications.
- Unsurprisingly, the learning curve for these technologies is still fairly steep. Some knowledge of quantum mechanics (and/or probability theory) is necessary to understand how quantum computers and quantum computing algorithms operate. This theoretical knowledge, paired with familiarity with the software libraries, is necessary to design and implement new applications. The design of new quantum algorithms is even more difficult.
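The measurement limitation discussed above can be illustrated with a small sampling sketch (plain NumPy, not a real device): a single run of a circuit yields one classical bitstring, so recovering the full output distribution requires many repeated runs ("shots"), with statistical error shrinking only slowly as the shot count grows.

```python
import numpy as np

rng = np.random.default_rng(42)

# Ideal output distribution of a two-qubit Bell state:
# measurement yields 00 or 11, each with probability 1/2.
probs = np.array([0.5, 0.0, 0.0, 0.5])

# One measurement gives a single outcome; estimating the distribution
# requires repeating the whole circuit many times.
for shots in (100, 10_000):
    samples = rng.choice(4, size=shots, p=probs)
    est = np.bincount(samples, minlength=4) / shots
    print(shots, np.round(est, 3))
```

The estimate converges at the usual Monte Carlo rate of about 1/sqrt(shots), which is one reason extracting more than a single scalar from a quantum computation quickly becomes expensive.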