State of the Art in Quantum Computing

Below are lectures on quantum computing, ordered by increasing complexity. Together they give a good overview of the science and technology as it works now.

  1. Quantum Computing: Untangling the Hype
  2. Quantum revolution in science and technology
  3. Quantum Computing with Trapped Ions
  4. Black Holes, Firewalls, and the Limits of Quantum Computers
  5. Quantum Computing and Quantum Supremacy
  6. Quantum Supremacy: Checking a Quantum Computer with a Classical Supercomputer
  7. Closing the Gap between Quantum Algorithms and Hardware
  8. Quantum Computing Standards

Quantum Computing: Untangling the Hype

Panel at the Royal Institution

Quantum technology has the potential to revolutionise whole fields of computing, from cryptography to molecular modelling. But how do quantum computers work?

Join leading experts to untangle the quantum computing hype, at this event supported by the Embassy of the Kingdom of the Netherlands.

  • Artur Ekert works on information processing in quantum-mechanical systems. His invention of entanglement-based quantum cryptography in 1991 triggered an explosion of research efforts worldwide and continues to inspire new research directions. As well as showing that Bell’s inequalities can be used to test for eavesdropping, he has contributed to many important advances in the foundations and experimental realisations of quantum communication and computation. He has played a leading role in transforming quantum information science into a vibrant interdisciplinary field.
  • Harry Buhrman got his PhD in Computer Science from the University of Amsterdam. Buhrman built the quantum computing group at CWI, which was one of the first such groups worldwide and the first in the Netherlands working on quantum information processing. Buhrman’s research focuses on quantum computing, algorithms, and complexity theory. He co-developed the area of quantum communication complexity (quantum distributed computing) and demonstrated for the first time that certain communication tasks can be solved (exponentially) more efficiently with quantum resources. This showed that quantum computers can speed up not only computation but also communication, which opened up a whole new application area of quantum information processing. Buhrman co-developed a general method for establishing the limitations of quantum computers, and a framework for the study of quantum algorithms, which is now textbook material. In 2001, Harry Buhrman became professor of algorithms, complexity theory, and quantum computing at the University of Amsterdam (UvA) and group leader of the Quantum Computing Group at the Center for Mathematics and Informatics (CWI). Buhrman co-founded QuSoft, a research center for quantum software, in 2015, and is also its co-director. During his career, Buhrman has received various prestigious awards. Buhrman also has a leading role in the national Quantum Software Consortium, which was awarded an NWO Gravity grant in 2017.
  • The event is chaired by award-winning science writer Philip Ball, whose latest book is entitled ‘Beyond Weird: Why Everything You Thought You Knew about Quantum Physics Is Different.’

Here’s a test to find out whether you are an engineer or a logician (mathematician). There are three bulbs in the room on the left and three switches in the room on the right. Can you figure out which switch is associated with which bulb? If you think it is not possible, you are a logician. Engineers, on the other hand, know how to do it. To find out how (if you are not an engineer), watch the talk…

Quantum Software Manifesto
Research Center for Quantum Software


Quantum revolution in science and technology

Talk by Edward Hinds at the University of Melbourne

Professor Edward Hinds provides a beginner’s guide to quantum signaling and outlines the current state of quantum science, and the exciting prospects for transforming society through the technical innovations that are emerging. This Miegunyah Lecture for the University of Melbourne was delivered on 26 April 2018.


Quantum Computing with Trapped Ions

Bill Phillips from NIST says that a quantum computer is more different from a classical computer than a classical computer is from an abacus. What he means is that a classical computer and an abacus are both Turing machines in the abstract: you can model them as moving classical information around, even though they look different. But quantum computers are not Turing machines; they work with quantum information via superposition and entanglement, which is radically different. Atomic systems have not yet found their way into the corporate behemoths, by which I mean Microsoft, Intel, IBM, and Google. They are trying their hand at superconducting circuits, because you can print those on a chip and design the control systems, but my bold prediction is that if they are in the field at all, they will be investing in atomic systems in the future.

Individual atoms are standards for quantum information technology: acting as qubits, they have unsurpassed levels of quantum coherence, can be replicated and scaled with atomic-clock accuracy, and allow near-perfect measurement. Atomic ions can be confined in silicon-based chip traps with lithographically defined electrodes, under high vacuum, in a room-temperature environment. Entangling quantum gate operations can be mediated with control laser beams, allowing the qubit connectivity graph to be reconfigured and optimally adapted to a given algorithm or mode of quantum computing. Existing work has shown 99.9%-fidelity gate operations, fully connected control with up to about 10 qubits, and quantum simulations with over 50 qubits. I will speculate on combining all this into a single universal quantum computing device that can be co-designed with future quantum applications and scaled to useful dimensions.
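
To make the entangling-gate language concrete, here is a minimal numpy sketch (not tied to any particular trapped-ion hardware; the gate choice is purely illustrative) of how a single-qubit Hadamard followed by a two-qubit CNOT turns two unentangled qubits into a maximally entangled Bell pair:

```python
import numpy as np

# Single-qubit Hadamard gate and the 2x2 identity.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# CNOT on two qubits (first qubit controls, second is the target).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Both qubits start in |0>, so the joint state is |00>.
state = np.kron([1, 0], [1, 0])

# Hadamard puts qubit 0 into a superposition; CNOT entangles the pair.
state = CNOT @ np.kron(H, I) @ state

# [0.707 0 0 0.707]: the Bell state (|00> + |11>)/sqrt(2).
print(state)
```

The resulting state cannot be written as a product of two single-qubit states, which is exactly the resource that the entangling gate operations described above create between ion qubits.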


Black Holes, Firewalls, and the Limits of Quantum Computers

Talk by Scott Aaronson from the University of Texas at Austin
Abstract and slides

Quantum computers are proposed devices that would exploit quantum mechanics to solve certain specific problems dramatically faster than we know how to solve them with today’s computers. In the popular press, quantum computers are often presented not just as an exciting frontier of science and technology (which they are), but as magic devices that would work by simply trying every possible solution in parallel. However, research over the past 25 years has revealed that the truth is much more subtle and problem-dependent: for some types of problems, quantum computers would offer only modest speedups or no speedups at all. These limitations are entirely separate from the practical difficulties of building quantum computers (such as “decoherence”), and apply even to the fully error-corrected quantum computers we hope will be built in the future. In this talk, I’ll give a crash course on what computer science has learned about both the capabilities and the limitations of quantum computers. Then, in a final section, I’ll describe a remarkable and unexpected connection – made just within the last five years – where the conjectured limitations of quantum computers have been applied to issues in fundamental physics. These include Hawking’s black-hole information puzzle (in its modern incarnation as the “firewall paradox”), and understanding the growth of wormholes in the so-called gauge/gravity duality that emerged from string theory.

From: Black Holes, Quantum Mechanics, and the Limits of Polynomial-Time Computability

By Stephen P. Jordan

DOI: 10.1145/2983539

Which computational problems can be solved in polynomial-time and which cannot? Though seemingly technical, this question has wide-ranging implications and brings us to the heart of both theoretical computer science and modern physics.

. . .

The insensitivity of polynomial-time computability to the details of the underlying model of computation led to the following conjecture, known as the complexity-theoretic Church-Turing thesis: “A Turing machine can simulate any realistic model of computation with polynomial overhead.”

. . .

In the 1980s, Richard Feynman observed that certain quantum systems of many particles apparently take exponential time to simulate with conventional, classical computers. Feynman suggested that these systems may be fundamentally impossible to simulate in polynomial time by conventional Turing machines, and that by harnessing these systems, one may therefore be able to perform certain computations that are intractable on conventional Turing machines. To mathematically capture the computational power of quantum systems, various formal models of universal quantum computation were defined, including quantum Turing machines and quantum logic circuits. Subsequently, these models were all proven to simulate one another with polynomial overhead. The complexity class of problems solvable in polynomial-time by any of these models is the same, and is called BQP (which stands for Bounded-error Quantum Polynomial-time). Based on this, Deutsch argued that the complexity-theoretic Church-Turing thesis was now defunct and must be replaced by a new quantum Church-Turing thesis: “A quantum Turing machine can efficiently simulate any realistic model of computation with polynomial overhead.”
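
Feynman’s observation is easy to make concrete: a straightforward classical simulation stores one complex amplitude per basis state, and the number of basis states doubles with every added qubit. A quick back-of-the-envelope sketch in Python (assuming 16 bytes per double-precision complex amplitude):

```python
def human(nbytes: float) -> str:
    # Format a byte count with binary units.
    for unit in ("B", "KiB", "MiB", "GiB", "TiB"):
        if nbytes < 1024:
            return f"{nbytes:.0f} {unit}"
        nbytes /= 1024
    return f"{nbytes:.0f} PiB"

# A brute-force classical simulator keeps 2**n complex amplitudes in
# memory, so its footprint doubles with every qubit added.
for n in (10, 30, 49):
    print(f"{n:2d} qubits -> {2**n:,} amplitudes, {human(2**n * 16)}")
```

At 10 qubits the state vector fits in 16 KiB; at 30 qubits it needs 16 GiB; at 49 qubits it would occupy about 8 PiB, which is why simulating even modest quantum systems exhausts classical resources so quickly.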

. . .

Black holes are a class of physical systems in which quantum-gravitational effects are expected to be important, and which have been studied extensively from a phenomenological point of view. In the case of black holes, the arguments that can be made by applying various well-established symmetries and physical principles seem to yield results that are in conflict. The most recent version of this conflict is called the “Black Hole Firewalls Paradox,” which descends from a long line of arguments about whether information is destroyed after falling into black holes.

Some phenomenological models of black holes, which are currently being debated in the context of the firewalls paradox, contain features that, at least at first glance, appear to have shocking consequences for polynomial-time computability. As an example, in the final-state projection model of Horowitz and Maldacena, the quantum states of black holes may evolve nonlinearly in time. Generically, nonlinear transformations of quantum states can be used to obtain polynomial-time solutions to NP-hard problems, as was earlier shown by Abrams and Lloyd. Not even quantum computers are believed to be able to solve NP-hard problems in polynomial time. A physical device achieving this would be shockingly powerful. For any problem whose solution can be verified efficiently, such a device would efficiently find a solution. In particular, algorithms for verifying formal mathematical proofs could be bootstrapped to yield efficient algorithms for finding proofs, making human mathematicians obsolete [2].

Could black holes be carrying out computations exponentially more complex than those allowed by our current theories of quantum computation? Recent analysis throws some cold water on this provocative suggestion [3]. In examining several models of black holes for which quantum mechanical principles are modified, it is found that if the models are pushed into parameter regimes that imply the efficient solution of NP-hard problems, they also imply the ability to send superluminal signals using quantum entanglement. According to special relativity, such superluminal signaling allows messages to be sent backward in time, leading to numerous logical paradoxes. Thus, most physicists believe in a principle called “causality,” which states that superluminal signaling is impossible; its presence in a theory is a sign that the theory is incomplete, wrong, or pushed beyond its limits of applicability.

Physical arguments cannot resolve the mathematical problems of complexity theory, such as proving that neither P nor BQP contains NP. However, taking for granted that these standard assumptions about complexity theory are true, one is left with an interesting physical question: Which complexity class corresponds to the problems solvable with polynomial resources in our universe? The results relating superluminal signaling to the efficient solution of NP-hard problems suggest the impossibility of the latter is a robust consequence of basic physical principles such as causality, rather than a fragile inference dependent on all of the precise details of quantum mechanics [3]. The quantum Church-Turing thesis once again comes out looking strong. Testing it further remains an outstanding open problem and a grand adventure promising to yield new insights into both physics and computer science.

  1. Jordan, S.P., Lee, K.S.M., and Preskill, J. Quantum algorithms for quantum field theories. Science 336, 6085 (2012), 1130–1133. arXiv:1111.3633.
  2. Fortnow, L. What if P = NP? Computational Complexity blog, May 25, 2004.
  3. Bao, N., Bouland, A., and Jordan, S.P. Grover search and the no-signaling principle. 2015. arXiv:1511.00657.

Quantum Computing and Quantum Supremacy

John Martinis from Google
@ Inside HPC
John Martinis research
Preview of Bristlecone, Google’s New Quantum Processor
Slides

The goal of the Google Quantum AI lab is to build a quantum computer that can be used to solve real-world problems. Our strategy is to explore near-term applications using systems that are forward compatible with a large-scale universal error-corrected quantum computer. For a quantum processor to run algorithms beyond the scope of classical simulation, it needs not only a large number of qubits; crucially, the processor must also have low error rates on readout and on logical operations such as single- and two-qubit gates.

The purpose of Google’s gate-based superconducting system is to provide a testbed for research into system error rates and scalability of our qubit technology, as well as applications in quantum simulation, optimization, and machine learning.

Bristlecone is Google’s newest quantum processor.

The guiding design principle for this device is to preserve the underlying physics of our previous 9-qubit linear array technology, which demonstrated low error rates for readout (1%), single-qubit gates (0.1%), and, most importantly, two-qubit gates (0.6%) as our best result. This device uses the same scheme for coupling, control, and readout, but is scaled to a square array of 72 qubits. We chose a device of this size to be able to demonstrate quantum supremacy in the future, to investigate first- and second-order error correction using the surface code, and to facilitate quantum algorithm development on actual hardware.

Before investigating specific applications, it is important to quantify a quantum processor’s capabilities. Our theory team has developed a benchmarking tool for exactly this task. We can assign a single system error by applying random quantum circuits to the device and checking the sampled output distribution against a classical simulation. If a quantum processor can be operated with low enough error, it will be able to outperform a classical supercomputer on a well-defined computer science problem, an achievement known as quantum supremacy. These random circuits must be large in both the number of qubits and the computational length (depth). Although no one has achieved this goal yet, we calculate that quantum supremacy can be comfortably demonstrated with 49 qubits, a circuit depth exceeding 40, and a two-qubit error below 0.5%. We believe the experimental demonstration of a quantum processor outperforming a supercomputer would be a watershed moment for our field, and it remains one of our key objectives.
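
For illustration, here is a toy Python version of the benchmarking idea (emphatically not Google's actual tool): build a small random circuit from random single-qubit unitaries and entangling CZ gates, simulate it classically, and compute a cross-entropy score from samples. On real hardware the samples would come from the device and be scored against this classical simulation; here we sample from the simulation itself, and the circuit sizes are deliberately tiny.

```python
import numpy as np

rng = np.random.default_rng(0)
n, depth = 4, 8        # toy sizes; the real target is ~49 qubits, depth ~40

def random_1q():
    # A random single-qubit unitary via QR decomposition of a complex Gaussian.
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(m)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_1q(state, u, k):
    # Apply a 2x2 unitary u to qubit k of an n-qubit state vector.
    psi = np.moveaxis(state.reshape([2] * n), k, 0)
    return np.moveaxis(np.tensordot(u, psi, axes=1), 0, k).reshape(-1)

def apply_cz(state, a, b):
    # Controlled-Z: flip the sign of amplitudes where qubits a and b are both 1.
    psi = np.moveaxis(state.reshape([2] * n), (a, b), (0, 1)).copy()
    psi[1, 1] *= -1.0
    return np.moveaxis(psi, (0, 1), (a, b)).reshape(-1)

# Run the random circuit: layers of random single-qubit gates followed by
# CZs on alternating neighboring pairs, so entanglement spreads everywhere.
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
for layer in range(depth):
    for k in range(n):
        state = apply_1q(state, random_1q(), k)
    for a in range(layer % 2, n - 1, 2):
        state = apply_cz(state, a, a + 1)

# Ideal output distribution; device samples would be scored against this.
probs = np.abs(state) ** 2
probs /= probs.sum()
samples = rng.choice(2 ** n, size=2000, p=probs)
print("cross-entropy estimate:", -np.mean(np.log(probs[samples])))
```

For a perfect device, the score from its samples matches the classical prediction; as gate errors grow, the sampled distribution drifts toward uniform and the score degrades, which is how a single system error number can be assigned.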

Google believes Bristlecone would then be a compelling proof-of-principle for building larger-scale quantum computers. Operating a device such as Bristlecone at low system error requires harmony across a full stack of technology, ranging from software and control electronics to the processor itself. Getting this right requires careful systems engineering over several iterations.

We are cautiously optimistic that quantum supremacy can be achieved with Bristlecone, and feel that learning to build and operate devices at this level of performance is an exciting challenge! We look forward to sharing the results and allowing collaborators to run experiments in the future.


Quantum Supremacy: Checking a Quantum Computer with a Classical Supercomputer

One more talk by John Martinis, at the UC Santa Barbara
Slides

As microelectronics technology nears the end of its exponential growth over time, known as Moore’s law, there is renewed interest in new computing paradigms such as quantum computing. A key step on the roadmap to building a scientifically or commercially useful quantum computer will be to demonstrate its exponentially growing computing power. I will explain how a 7 by 7 array of superconducting xmon qubits with nearest-neighbor coupling, and with programmable single- and two-qubit gates with errors of about 0.2%, can execute a modest-depth quantum computation that fully entangles the 49 qubits. Sampling of the resulting output can be checked against a classical simulation to demonstrate proper operation of the quantum computer and to compare its system error rate with predictions. With a computational space of 2^49 ≈ 5.6 x 10^14 states, the quantum computation can only be checked using the biggest supercomputers. I will show experimental data towards this demonstration from a 9-qubit adjustable-coupler “gmon” device, which implements the basic sampling algorithm of quantum supremacy for a computational (Hilbert) space of about 500 (2^9 = 512). We have begun testing the quantum supremacy chip.
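
The quoted numbers also allow a rough sanity check of the error budget. Here is a sketch that treats total circuit fidelity as the product of per-gate fidelities; the gate counts are illustrative assumptions, not those of the actual circuit:

```python
# Rough fidelity budget for a random-circuit experiment, assuming the
# total fidelity is simply the product of individual gate fidelities.
# Gate counts are illustrative, not those of the actual circuit.
n_qubits, depth = 49, 40
gate_error = 0.002                         # ~0.2% per gate, per the talk

two_qubit_gates = (n_qubits // 2) * depth  # roughly one layer of pairs per cycle
single_qubit_gates = n_qubits * depth
total_gates = two_qubit_gates + single_qubit_gates

fidelity = (1 - gate_error) ** total_gates
print(f"{total_gates} gates -> estimated circuit fidelity {fidelity:.4f}")
```

The estimate comes out at a few parts per thousand, which sounds hopeless, but a signal that small is still statistically distinguishable from uniform noise given enough repeated samples, and that is precisely what the sampling check exploits.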


Closing the Gap between Quantum Algorithms and Hardware

Talk by Fred Chong at Yale University

Quantum computing is at an inflection point: machines with 72 qubits (quantum bits) are under test, 100-qubit machines are just around the corner, and even 1000-qubit machines are perhaps only a few years away. These machines have the potential to fundamentally change our concept of what is computable and to demonstrate practical applications in areas such as quantum chemistry, optimization, and quantum simulation. Yet a significant resource gap remains between practical quantum algorithms and real machines. Programming, compilation, and control will play a key role in increasing the efficiency of algorithms and machines to close this gap. In this video, Fred Chong outlines several grand research challenges in closing the gap, including programming language design, software and hardware verification, defining and perforating abstraction boundaries, cross-layer optimization, managing parallelism and communication, mapping and scheduling computations (see the sketch below), reducing control complexity, machine-specific optimizations, learning error patterns, and many more. He also describes the resources and infrastructure available for starting research in quantum computing and for tackling these challenges.
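
One of the challenges Chong lists, mapping and scheduling computations, is easy to sketch: on most hardware only neighboring qubits can interact, so a two-qubit gate between distant qubits must be routed through the chip's coupling graph with SWAP gates. A toy Python illustration (the 2x3 grid and the gate choice are assumptions made up for the example):

```python
from collections import deque

# Hardware coupling map for a hypothetical 2x3 grid of qubits:
#   0 - 1 - 2
#   |   |   |
#   3 - 4 - 5
coupling = {0: [1, 3], 1: [0, 2, 4], 2: [1, 5],
            3: [0, 4], 4: [1, 3, 5], 5: [2, 4]}

def route(src, dst):
    # Shortest hardware path from src to dst: BFS over the coupling map.
    prev, frontier = {src: None}, deque([src])
    while frontier:
        q = frontier.popleft()
        if q == dst:
            break
        for nb in coupling[q]:
            if nb not in prev:
                prev[nb] = q
                frontier.append(nb)
    path = [dst]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    return path[::-1]

# A CNOT between qubits 0 and 5 is not directly possible; the compiler
# must first SWAP one qubit toward the other along a shortest path.
path = route(0, 5)
swaps = len(path) - 2
print(f"path {path}: insert {swaps} SWAP(s), then apply the CNOT")
```

Real compilers solve this jointly for whole gate sequences, trading SWAP overhead against parallelism, which is why mapping and scheduling remain open research problems rather than solved utilities.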


Quantum Computing Standards

Talk by William Hurley

Quantum computing is poised to pave the way for groundbreaking technological achievements across nearly all verticals. Join the IEEE Standards Association Corporate Program for a complimentary one-hour webinar in which William Hurley (a.k.a. whurley), founder and CEO of Strangeworks and chair of the IEEE Quantum Standards Working Group, provides an in-depth look at the state of quantum computing and why standards are needed.
