So, I would like to see if there’s some other way out, and I want to emphasize, or bring the question here, because the discovery of computers and the thinking about computers has turned out to be extremely useful in many branches of human reasoning. For instance, we never really understood how lousy our understanding of languages was, the theory of grammar and all that stuff, until we tried to make a computer which would be able to understand language. We tried to learn a great deal about psychology by trying to understand how computers work. There are interesting philosophical questions about reasoning, and relationship, observation, and measurement and so on, which computers have stimulated us to think about anew, with new types of thinking. And all I was doing was hoping that the computer-type of thinking would give us some new ideas, if any are really needed. I don’t know, maybe physics is absolutely OK the way it is. The program that Fredkin is always pushing, about trying to find a computer simulation of physics, seems to me to be an excellent program to follow out. He and I have had wonderful, intense, and interminable arguments, and my argument is always that the real use of it would be with quantum mechanics, and therefore full attention and acceptance of the quantum mechanical phenomena – the challenge of explaining quantum mechanical phenomena – has to be put into the argument, and therefore these phenomena have to be understood very well in analyzing the situation. And I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy. Thank you.
– ‘Simulating Physics with Computers’ by Richard P. Feynman, 1981
Recap of the talk by Mark Mattingley-Scott from IBM explaining the three main parameters that must improve to make quantum computers a reality. These are: 1) the number of qubits, 2) the quantum error rate, which is driven largely by the temperature to which quantum computers are cooled down, and 3) the decoherence time, or how long the qubits remain stable enough to be useful. See more below and watch his talk.
In this video from the 2018 Swiss HPC Conference, Mark Mattingley-Scott from IBM presents: Quantum Computing, Its Principles, Capabilities and Challenges.
“Since the invention of the transistor in 1947 we have seen a number of incremental innovations – feature size, speed, architectural scalability, to name the most important. Quantum Computing represents a completely new approach to computing though, as we find ourselves in the world of qubits, superposition, entanglement, complex vector algebra and complex probabilities – and never mind the philosophical question of what it means to observe. Quantum Computing is here, right now – and we are at the start of a new way of computing, which will impact us the way the revolution started by Shockley, Bardeen and Brattain did in 1947.
In this talk I will introduce Quantum Computing, its principles, capabilities and challenges and provide you with the insight you need to decide how you should engage with this revolutionary technology.”
Dr. Mark Mattingley-Scott is an IBM Q Ambassador. IBM Q is an industry-first initiative to build commercially available universal quantum computers for business and science.
If you’re talking about quantum computing then you also have to talk about refrigeration, because quantum computers typically operate at very low temperatures. You’re not using hot and cold water but helium-3 and helium-4, and one of the interesting things about helium is that it can be used for cooling.
Helium-4 is the stuff that’s in the atmosphere; there’s not a lot of it, but there’s some of it there. Helium-3 is really, really rare. There’s a lot of it on the moon, but otherwise you can only produce it when you make nuclear weapons.
Caffeine has 95 electrons. To simulate a single energy state of one caffeine molecule we would require 10^48 classical computer bits. The number of atoms in our planet is estimated to be somewhere between 10^49 and 10^50, so we would need as much memory as between 1 and 10 percent of the Earth’s atoms. We could actually represent this using 160 qubits: 160 because log to the base 2 of 10^48 is about 160. So 160 qubits represent 10^48 classical bits. That’s the theory.
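The arithmetic behind that claim is easy to check. A minimal sketch in pure Python (nothing here is IBM-specific):

```python
import math

# The talk's claim: ~10^48 classical bits of state can be represented
# by roughly log2(10^48) ~= 160 qubits.
classical_bits = 10 ** 48
qubits_needed = math.ceil(math.log2(classical_bits))
print(qubits_needed)  # 160
```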
Five qubits give us 2^5, or 32, possible simultaneous states. We put those individual qubits into superposition. Each qubit is initialized as a 0 or 1 and then, mathematically speaking, we apply a transformation called a Hadamard transformation. A Hadamard transformation basically rotates a little sphere called the Bloch sphere through 90 degrees. It takes that 1 or 0 and puts it in a state of superposition, and if you do that with all 5 qubits you then have a system which is simultaneously in all 32 states. You can also entangle qubits to make two different qubits dependent on each other, which interestingly forbids you from looking inside those two qubits. You can’t look inside any of the qubits anyway, but in an entangled pair the state of the pair of qubits is, even within the quantum system, indeterminate, so the qubits cannot be described independently of each other. Now you’ve got a system in 32 states simultaneously, and the trick is to work out a way of mapping your problem onto a specific set of transformations which operate on those 32 states, or on 2^n states depending on the number n of qubits.
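The five-qubit Hadamard example can be simulated classically for a register this small. A minimal pure-Python sketch (the state-vector layout and the helper function are mine for illustration, not IBM code):

```python
import math

def apply_hadamard(state, qubit):
    """Apply a Hadamard gate to one qubit of a state vector of amplitudes."""
    h = 1 / math.sqrt(2)
    out = state[:]
    mask = 1 << qubit
    for i in range(len(state)):
        if i & mask == 0:            # visit each |..0..> / |..1..> pair once
            a, b = state[i], state[i | mask]
            out[i] = h * (a + b)
            out[i | mask] = h * (a - b)
    return out

n = 5
state = [0.0] * (1 << n)
state[0] = 1.0                       # all five qubits start in |0>
for q in range(n):                   # one Hadamard per qubit
    state = apply_hadamard(state, q)

# The register is now an equal superposition of all 2^5 = 32 basis states,
# each with amplitude 1/sqrt(32).
print(all(abs(a - 1 / math.sqrt(32)) < 1e-12 for a in state))  # True
```

Of course, simulating n qubits this way needs 2^n amplitudes of classical memory, which is exactly why real quantum hardware is interesting.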
There’s an algorithm called Shor’s algorithm which is designed to take a number and determine its prime factors. For small numbers it’s extremely inefficient. You can go on IBM’s quantum computing cloud service, play with the Shor factoring algorithm with five qubits or with 16 qubits, and watch it underperform your laptop. But if at some point in the future we have maybe a few thousand qubits, then traditional cryptography, which is based on factoring large numbers, will start to be a little bit crackable. I was asked last night when that will happen. I couldn’t really answer, because it might be twenty years, maybe fifteen. Not that RSA will be broken in fifteen or twenty years, but that’s the point where we’re going to have to start to really have some viable alternatives to traditional cryptography.
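The quantum speed-up in Shor’s algorithm lives entirely in one step: finding the period of a^x mod N. Everything around it is classical number theory. A sketch of that classical skeleton, with the period found by brute force (function names are mine for illustration):

```python
from math import gcd

def period(a, n):
    """Multiplicative order of a mod n, found by brute force here;
    this is the one step a quantum computer speeds up exponentially."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical skeleton of Shor's algorithm for a chosen base a.
    Returns a nontrivial factor of n, or None if this base fails."""
    g = gcd(a, n)
    if g != 1:
        return g                     # lucky: a already shares a factor with n
    r = period(a, n)
    if r % 2:                        # the period must be even to continue
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                   # a^(r/2) = -1 (mod n) yields no factor
        return None
    return gcd(y - 1, n)

print(shor_factor(15, 7))  # 3: a prime factor of 15
```

For RSA-sized numbers the `period` loop above is hopeless, which is exactly the gap the quantum period-finding subroutine is meant to close.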
If you go on the IBM quantum experience website it tells you what temperature each particular quantum computer is at; they’re typically somewhere between 10 and 20 millikelvin. That’s where the helium-3 and helium-4 come in. Incidentally, in order to boot a quantum computer the bootstrapping process, which is basically cooling, takes about 24 hours, so running a quantum computer is not trivial, but it’s not impossible, and we have several of these devices in Yorktown. For comparison, the temperature of intergalactic and interstellar space, where there is essentially nothing out there, is 2.7 kelvin due to the cosmic microwave background radiation. Even intergalactic space, which is as cold as it gets, is significantly warmer than the business end of a quantum computer.
When we talk about quantum computers we have three important quality measures which determine whether a quantum computer is good or not so good.
- The first one, which everybody talks about, is the number of qubits. That’s the one you see in press announcements: we’ve now got 50 qubits, or we’ve now got 55 qubits. It’s important, but it’s not the most important.
- Just as important is the error rate. At 10, 12 or 14 millikelvin there’s still a finite temperature. That means that in the microwave resonator, the atom, or the waveguide that contains a microwave standing wave, there’s a certain amount of thermal noise, and that thermal noise leads to errors. Managing that error rate is of critical importance; in fact it’s probably more important than adding more qubits. If you can lower your error rate by 50 percent then that’s like adding another 50% more qubits, so a 70-qubit system with an error rate of x is, in terms of its ability to perform algorithms, more or less equivalent to a 50-qubit computer with an error rate of x/2.
- The other important measure of quality is the so-called decoherence time. Decoherence means that even for the best qubit (maybe one day we can get down to 1 millikelvin or even microkelvins), at some point the thermal noise will destroy the superposition. The qubit will have a determined state at that point; there’s no way around it. For the traditional programmers among you, the decoherence time is kind of like saying every quantum program you write has a finite maximum execution time: you cannot run a program which is longer than, say, 10 milliseconds. If you link that to your clock rate, that gives you the maximum number of quantum operations you can perform.
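The last two measures can be put into rough numbers. A back-of-the-envelope sketch; all the hardware figures and the (1 − eps)^n fidelity model below are illustrative assumptions of mine, not IBM specifications:

```python
import math

# Assumed illustrative figures (not real hardware specs):
decoherence_time_s = 100e-6          # 100 microseconds of coherence
gate_time_s = 50e-9                  # 50 nanoseconds per gate operation

# Decoherence caps program length, as the talk says: coherence window
# divided by gate time bounds the number of operations.
max_ops = int(decoherence_time_s / gate_time_s)
print(max_ops)                       # 2000 gate operations at most

# Crude fidelity model: a circuit of n gates finishes error-free with
# probability (1 - eps) ** n, so halving eps roughly doubles the number
# of gates you can afford at a fixed target fidelity.
def ops_at_fidelity(eps, target=0.5):
    return int(math.log(target) / math.log(1 - eps))

print(ops_at_fidelity(0.01))         # 68 gates at a 1% error rate
print(ops_at_fidelity(0.005))        # 138 gates at 0.5%: roughly double
```

Under this toy model, halving the error rate buys about as much usable circuit as a substantial increase in qubit count, which is the trade-off the talk describes.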
Those three values – number of qubits, quantum error rate and decoherence time – are all of key importance, and all the money that IBM and Google and Intel and Microsoft and all the rest of us are pumping into quantum computing is basically about improving those three numbers, because improving those three numbers gets you closer to the goal, the holy grail of a universal quantum computer.
Where we are right now: there are companies doing quantum annealing, which is kind of a cheat, a shortcut way of doing quantum computing. Basically we have approximate quantum computers at several tens of qubits, but we’re pretty clear about how to get to hundreds or even thousands of qubits, and then at some point we’re going to have a universal fault-tolerant quantum computer.
HPC Advisory Council
Simulating Physics with Computers
by Richard P. Feynman, 1981
The Principles of Quantum Mechanics (International Series of Monographs on Physics)
by Paul A. M. Dirac
Quantum Computer Science: An Introduction
by N. David Mermin, Cornell University, New York
Mutual Causality in Buddhism and General Systems Theory: The Dharma of Natural Systems
by Joanna Macy
What is Life?: With Mind and Matter and Autobiographical Sketches
by Erwin Schrödinger