probably quantum? NISQ and where we are today
In October, my colleagues Lars Fjeldsoe-Nielsen, Maxime Le Dantec and I were honored to co-host an awesome crowd of thinkers and builders in quantum computing at Balderton HQ, alongside the UK's National Physical Laboratory, just a night before Google announced their achievement of quantum supremacy.
I won't wade into the fray over whether Google's result amounts to supremacy or merely a speedup; this blog post by Leo at Rahko does a succinct job of summarizing the result and placing it in context. (For a more detailed take, see Scott Aaronson's post.) Needless to say, these are exciting times for the future of computing and for achieving a greater capacity to understand Nature.
Our gathering was motivated by John Preskill's paper Quantum Computing in the NISQ Era and Beyond. NISQ is an acronym describing the currently available quantum computing devices: Noisy Intermediate-Scale Quantum computers that represent a huge advance over the technology available a few years ago, but are still a far cry from a truly Universal Quantum Computer. In the paper, Preskill writes that "Now is an opportune time for a fruitful discussion among researchers, entrepreneurs, managers, and investors who share an interest in quantum computing." As capital has surged into this still highly experimental field in ever greater quantities (from $70M in total quantum-focused VC in 2015 to $560M so far in 2019), it becomes critical to gather disparate viewpoints within four walls and try to separate signal from noise. (We were also inspired by BlueYard and Google's 2017 Munich gathering, A Quantum Leap.)
Over the course of the day we were lucky to have vigorous debate from company leaders like Christopher Savoie, CEO at Zapata; Ilyas Khan, CEO at Cambridge Quantum Computing; Leo Wossnig, CEO at Rahko; and Justin Ging, CCO at Honeywell Quantum. These voices were complemented by many researchers from Oxford, Cambridge, UCL and other universities, by investors, and also by representatives of the UK government, including Roger McKinlay, the Challenge Director for Quantum Technologies at UK Research and Innovation.
During the afternoon we dug into some of the challenges of measuring progress in quantum computing. What are the right metrics? The oft-reported total qubit count is almost certainly not a fair metric on its own; one also has to look at measures of connectivity, fidelity, and circuit depth. Much as with the specs of a new laptop, there is no one metric to rule them all.
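To make that "no single metric" point concrete, here is a toy Python sketch (not something presented at the event) of a simplified, quantum-volume-style figure of merit: the useful size of a device is capped by the smaller of its qubit count and the circuit depth it can run reliably. The device specs below are invented for illustration, and the heuristic is a deliberately crude stand-in for the formal quantum volume protocol.

```python
import math

# Toy comparison of hypothetical devices: more qubits does not automatically
# mean more useful computation if noise limits how deep a circuit can run.
# This is a crude, quantum-volume-style heuristic, not the formal protocol.

def reliable_depth(layer_fidelity: float, target_success: float = 0.5) -> int:
    """Approximate number of gate layers before the overall success
    probability drops below the target, assuming independent layer errors."""
    return int(math.log(target_success) / math.log(layer_fidelity))

def effective_size(n_qubits: int, layer_fidelity: float) -> int:
    """Effective square-circuit size: capped by whichever is smaller,
    the qubit count (width) or the depth the device can run reliably."""
    return min(n_qubits, reliable_depth(layer_fidelity))

# Invented example devices -- the numbers are purely illustrative.
devices = {
    "many noisy qubits":   dict(n_qubits=72, layer_fidelity=0.95),
    "fewer better qubits": dict(n_qubits=20, layer_fidelity=0.995),
}

for name, spec in devices.items():
    print(f"{name}: effective size {effective_size(**spec)}")
# -> many noisy qubits: effective size 13
# -> fewer better qubits: effective size 20
```

Even this crude model shows a 20-qubit device with better gate fidelity out-scoring a 72-qubit device with noisier gates.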
We also debated the benefits and drawbacks of the various hardware approaches to quantum computing, including superconducting qubits, trapped ions, and spin qubits. Most notably, there was agreement that superconducting qubits are easy to design with microwave electronics, but can be inherently unstable and prone to calibration issues. Trapped-ion qubits offer high fidelity and connectivity, but can be difficult to control accurately. Spin qubits in silicon have the benefit of a pre-existing fabrication supply chain that already manufactures silicon chips at massive scale and low cost.
To varying degrees, all approaches are struggling to scale devices to many high-quality qubits. We also lack any infrastructure that would allow interoperability between quantum computers built on different types of qubits.
A recurring theme was the need for teams working on hardware, software, and end-user (customer) problems to maintain an open dialogue: a preference in one place in the stack can turn into a specification somewhere else.
On the software side, the discussion largely focused on the degree to which quantum algorithms will need to be combined with classical and machine-learning algorithms in order to be usable in the near term. Many of us were excited by the scope for quantum computing and machine learning to augment one another (for an example of a hybrid approach, see this recent paper). All that said, we still have some way to go in demonstrating concrete value to customers.
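For readers who want a feel for what "hybrid" means in practice, below is a minimal, classically simulated sketch of a variational quantum-classical loop: a classical optimizer tunes the single parameter of a one-qubit circuit to minimize a measured expectation value. The circuit, cost function, and parameter-shift gradient are standard textbook ingredients; the example is purely illustrative and is not taken from the paper linked above.

```python
import numpy as np

# Minimal hybrid quantum-classical loop on one classically simulated qubit:
# a classical optimizer tunes theta so that the "measured" expectation <Z>
# of the state RY(theta)|0> is minimized. In a real hybrid workflow the
# expectation value would come from a quantum device rather than from NumPy.

def expectation_z(theta: float) -> float:
    """<Z> for RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>, i.e. cos(theta)."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.diag([1.0, -1.0])
    return float(state @ z @ state)

def parameter_shift_gradient(theta: float) -> float:
    """Gradient of <Z> via the parameter-shift rule: two extra expectation
    evaluations, no backpropagation through the (quantum) circuit needed."""
    shift = np.pi / 2
    return 0.5 * (expectation_z(theta + shift) - expectation_z(theta - shift))

# The classical half of the loop: plain gradient descent on theta.
theta, learning_rate = 0.1, 0.4
for _ in range(25):
    theta -= learning_rate * parameter_shift_gradient(theta)

print(f"theta = {theta:.3f}, <Z> = {expectation_z(theta):.3f}")
# Converges toward theta = pi, where <Z> reaches its minimum of -1.
```

On real NISQ hardware the expectation value would come from repeated measurements on the device rather than a NumPy simulation, which is exactly where the noise, fidelity, and depth constraints discussed above start to bite.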
Finally, we discussed the need for a deeper talent pool in quantum computing; quantum chemistry and other potential areas of near-term application; and how quantum computing might best be regarded as a new frontier of generalized computation, well suited to problems requiring high dimensionality rather than high throughput.
Gathering perspectives from academia, industry, investors, and government is an important way to drive these technologies forward thoughtfully, and we look forward to continuing the conversation with all those who joined us.