Limits of the Efficiently Computable

Two interesting and related talks, one on the physical limitations of computational power and the other on cryptography and computational puzzles presented to us by Nature herself.

  1. Quantum Computing and the Limits of the Efficiently Computable
  2. Breaking Codes and Finding Patterns

Quantum Computing and the Limits of the Efficiently Computable

Talk by Scott Aaronson, 2011 Buhl Lecture at Carnegie Mellon University
Shtetl-Optimized ‒ Scott Aaronson’s blog
Quantum Computing Explained
What Quantum Computing Isn’t

Where we are: After 18 years and more than a billion dollars, I’m proud to say that a quantum computer recently factored 21 into 3*7, with high probability. (For a long time, it was only 15.) Scaling up is incredibly hard because of decoherence – the external environment, as it were, constantly trying to measure the QC’s state and collapse it down to classical. With classical computers, it took more than 100 years from Charles Babbage until the invention of the transistor. Who knows how long it will take in this case? But unless quantum mechanics itself is wrong, there doesn’t seem to be any fundamental obstacle to scaling this up. On the contrary, we now know that, IF the decoherence can be kept below some finite but nonzero level, then there are very clever error-correcting codes that can render its remaining effects insignificant. So, I’m optimistic that if civilization lasts long enough, we’ll eventually have practical quantum computers.
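The factoring feat mentioned above rests on Shor's reduction of factoring to order finding; only the order-finding step needs a quantum computer. A minimal classical emulation of that reduction (the `order` function stands in for the quantum period-finding subroutine; function names here are illustrative, not from the lecture):

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n) -- the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Shor's reduction: turn an order-finding oracle into a factor of n."""
    g = gcd(a, n)
    if g != 1:
        return g                  # lucky: a already shares a factor with n
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry with another a
    return gcd(y - 1, n)

print(shor_classical(21, 2))  # -> 7, so 21 = 3 * 7
```

Classically, the `order` loop is the expensive part; Shor's algorithm computes it in polynomial time by measuring the period of $a^x \bmod n$ in superposition.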

“A quantum possibility is less real than a classical reality, but more real than a classical possibility.”

‒ Boris Tsirelson. Quantum information processing lecture notes, 1997

Notwithstanding accounts in the popular press, a decade of research has made it clear that quantum computers would not be a panacea. In particular, we still do not have a quantum algorithm to solve NP-complete problems in polynomial time. But can we prove that no such algorithm exists, i.e. that NP⊄BQP? The difficulty is that we can’t even prove no classical algorithm exists; this is the P versus NP question. Of course, we could ask whether NP⊄BQP assuming that P≠NP — but unfortunately, even this conditional question seems far beyond our ability to answer. So we need to refine the question even further: can quantum computers solve NP-complete problems in polynomial time, by brute force?
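The brute-force question has a precise answer: Grover's algorithm searches an unstructured list of $N$ items in roughly $\sqrt{N}$ queries, and the BBBV theorem shows that quadratic speedup is the best any quantum brute-force search can do. A small state-vector simulation of Grover iterations (a sketch for illustration; the function name and parameters are mine, not from the lecture):

```python
import math

def grover_success_prob(n_qubits, marked):
    """Simulate Grover search over N = 2^n items on a plain state vector."""
    N = 2 ** n_qubits
    amp = [1 / math.sqrt(N)] * N              # uniform superposition
    iters = math.floor(math.pi / 4 * math.sqrt(N))
    for _ in range(iters):
        amp[marked] = -amp[marked]            # oracle: flip the marked amplitude
        mean = sum(amp) / N
        amp = [2 * mean - a for a in amp]     # diffusion: inversion about the mean
    return iters, amp[marked] ** 2

iters, p = grover_success_prob(10, marked=42)
print(iters, p)   # ~25 queries, success probability > 0.99; classical search expects ~512
```

For an NP-complete problem with $2^n$ candidate solutions, this turns $2^n$ into $2^{n/2}$: still exponential, which is why Grover alone does not put NP inside BQP.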

LS: So you believe quantum mechanics?

Me: Of course I do!

LS: So a thousand years from now, people will still be doing quantum mechanics?

Me: Well. . . um. . . I guess so. . .

‒ Conversation between me and Lee Smolin

The major problem [with quantum computing] is the requirement that basic quantum equations hold to multi-hundredth if not millionth decimal positions where the significant digits of the relevant quantum amplitudes reside. We have never seen a physical law valid to over a dozen decimals. Typically, every few new decimal places require major rethinking of most basic concepts. Are quantum amplitudes still complex numbers to such accuracies or do they become quaternions, colored graphs, or sick-humored gremlins?

‒ L. Levin. Polynomial time and extravagant models, in The tale of one-way functions. Problems of Information Transmission, 39(1):92–103, 2003. cs.CR/0012023

In my view, the central weakness in the arguments of quantum computing skeptics is their failure to suggest any answer to the following question: Exactly what property separates the quantum states we are sure we can create, from the states that suffice for Shor’s factoring algorithm?

‒ Scott Joel Aaronson. Limits on Efficient Computation in the Physical World, 2005

Scott Aaronson, an expert in computational complexity theory and the founder of the Complexity Zoo (ComplexityZoo.com), an online encyclopedia of complexity classes, delivered Carnegie Mellon University’s 2011 Buhl Lecture.

In his lecture titled “Quantum Computing and the Limits of the Efficiently Computable,” Aaronson discusses what quantum computers are, whether they can be built on a large scale, and what’s known today about their capabilities and limitations. He goes beyond quantum computers to touch on speculative models of computation, including closed timelike curves and nonlinearities in the Schrödinger equation — the equation that describes how the quantum state of a physical system changes in time.


Breaking Codes and Finding Patterns

Talk by Susan Holmes at Stanford University

… toutes les choses de ce monde ne sont qu’un vray chiffre, … toute la nature n’est qu’un chiffre et secrete escriture du grand nom et essence de Dieu, et de ses merveilles …

‒ Blaise de Vigenère

… all the things of this world are but a true cipher, … all nature is nothing but a cipher and secret writing of the great name and essence of God, and of his wonders …

I do not want it perfect, I want it Thursday.

Bitcoin depends very much on having very strong cryptography, but most of us believe that nothing is truly secure: the things we think are secure in banks and in transmission are actually not as secure as we assume. There is always a weak link. A weak link can be something very naïve, or, if somebody is really trying to break something and has a lot of information, they can certainly listen in on a lot more than you think. So you should always consider that anything you do, any email you send, is open; no cryptography protects against that.

‒ I’d like to ask: could you comment on the difficulty of breaking, for example, the Apple encryption, compared to the difficulty of breaking the Enigma?

‒ Well, the NSA knows how to do it right but the FBI doesn’t, so you have a lower bound and an upper bound.

In her Mathematics Research Center Public Lecture, “Breaking Codes and Finding Patterns,” Professor Susan Holmes will discuss what we can learn from the master codebreakers who solved the intricacies of the Enigma encryption machine during World War II and how to leverage patterns using mathematics and statistics.

We can learn from the master codebreakers who solved the intricacies of the Enigma encryption machine during World War II how to leverage patterns using mathematics and statistics. In the same way that the Poles mined hexagram patterns to discover the Enigma machine’s wiring, today’s scientists mine DNA patterns to unravel the cancer genome.

Turing and his fellow codebreakers used graphs and alignments to prepare their data for the use of the Bombe machine, and his use of Bayes factors and diversity indices are still part of today’s modern data analytic toolbox. I will show how these are even relevant to the study of complex biological systems such as the human microbiome or the immune system.
