By Charles Edge

A Brief History Of Quantum Computing


As with much of what we work with today, the basic science or mechanics (or both) go back much, much further. To set the stage for the emergence of quantum computing, let's go back in time to when we first started to uncover theories about the tiniest of things.

German physicist Max Planck originated quantum theory in 1900 with his work on black-body radiation, work that later won him the Nobel Prize in Physics. Here, we see the beginning of breaking everything in nature down to atoms and particles. He built on work done by John Dalton on the law of multiple proportions and by Robert Brown, who noticed that grains of pollen in water moved erratically, a phenomenon we now call Brownian motion.


Albert Einstein explained that erratic motion mathematically in 1905, and Jean Perrin built on that work, measuring the mass of those molecules and providing the experimental proof that atoms and molecules exist. They were able to build on top of the work of British chemist John Newlands, who had started arranging the elements into a periodic table back in the mid-1800s. And our very idea of what atoms were made of was emerging as fellow Brit J.J. Thomson realized in 1897 that cathode rays weren't electromagnetic waves but streams of particles far, far lighter than the lightest atom, hydrogen. And thus the electron became the first subatomic particle to be discovered, and was in fact the particle that carried electrical current over a wire. Einstein introduced the special theory of relativity in 1905, and over time we began to realize that one definition of a particle is that it is a quantum excitation of a field. Another is that a particle is something that can be measured in a detector.


Many of these discoveries were made possible by the fact that our tools for measuring objects were getting smaller and more precise. After Planck, Schrödinger, Bohr, Heisenberg, and others continued the development of quantum mechanics, giving us matrix mechanics and wave mechanics. John von Neumann, Hilbert, and others then formalized and unified the theories, and in some ways anticipated the use of electron flows in semiconductors.


Then World War II came along, many of the researchers involved up to this point helped build a nuclear bomb, and the war was put to rest. Max Planck died in 1947, after his son was hanged for his part in a plot to assassinate Adolf Hitler, and after his home was destroyed in an air raid and all of his papers were lost to history.


Again we saw more precision. Our ability to measure got smaller and our ability to predict the movement and state of particles improved and hastened. We saw Boolean logic lead to the emergence of computing after first Turing and then others broke down every problem known to humanity into simple Turing machines. We saw the standardization of computing in the von Neumann architecture. And the rise of solid-state electronics: transistors, then semiconductors, and finally microchips, with the spaces we shoot electrons across shrinking to 10-nanometer processes, then to the 5-nanometer technology of the latest ARM chips, including Apple's M1, and even the recent demonstration of 2-nanometer chips.


But while computing using tinier and tinier wires in solid-state semiconducting materials was getting smaller and faster, and the languages were evolving, there were people hard at work on the intersection of quantum mechanics and computing, or quantum computing. That began with understanding quantum entanglement, initially written about by Einstein, Boris Podolsky, and Nathan Rosen (or EPR) in 1935 and then explored much more thoroughly by Erwin Schrödinger and his cat. Here, we see a group of particles generated or interacting in close proximity in such a way that the quantum state of each particle can no longer be described independently of the others.


Others worked on that, such as David Bohm, who argued that the EPR ideas couldn't yet be formalized, as we hadn't yet worked out how to prove them with physical precision. The issue wasn't that we couldn't measure entanglement but that there were what John Stewart Bell described as loopholes: flaws in an experiment's design that leave room for a non-quantum explanation of its findings. After all, when he was working on them in the early 1960s, experiments were more, well, experimental than practical. Thus came the idea of entanglement witnesses, or functionals that can tell entangled states apart from separable ones.
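To make the idea of a witness concrete, here is a minimal sketch of my own (not Bell's original construction) in Python with NumPy: a witness operator W whose expectation value Tr(W ρ) dips below zero for the maximally entangled state (|00⟩ + |11⟩)/√2 but never does for a separable state.

import numpy as np

# The maximally entangled two-qubit state (|00> + |11>) / sqrt(2), as a column vector.
phi_plus = np.array([[1, 0, 0, 1]]).T / np.sqrt(2)

# A standard witness built from that state: W = I/2 - |phi+><phi+|.
W = np.eye(4) / 2 - phi_plus @ phi_plus.T

def witness_value(rho):
    # Tr(W rho): a negative value certifies that rho is entangled.
    return np.trace(W @ rho).real

entangled = phi_plus @ phi_plus.T       # density matrix of the entangled state
separable = np.diag([1.0, 0, 0, 0])     # density matrix of the product state |00>

print(witness_value(entangled))   # -0.5 -> flagged as entangled
print(witness_value(separable))   #  0.0 -> separable states never go negative

A single witness only catches some entangled states; telling all of them apart in general takes a whole family of such functionals, which is part of why this took decades to turn into experiments.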


Classical computing has us wiring up logic gates and flip-flop circuits to route Boolean logic. So here's where things can get a bit harder to understand in quantum computing. There are basically an infinite number of possible quantum logic gates. They are unitary operators acting on a Hilbert space, and are represented by unitary matrices. This was all incredibly mathematically intensive, but quantum circuits give us the qubit, a two-state system analogous to the classical computational bit, and entangled pairs of qubits known as Bell states. And this is where quantum information theory becomes possible.
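To make that concrete, here is a small sketch of my own in Python with NumPy: the Hadamard and CNOT gates written out as unitary matrices and applied to two qubits that start in |00⟩, producing a Bell state.

import numpy as np

# Two unitary gates: the single-qubit Hadamard and the two-qubit CNOT.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Unitary means U†U = I, the quantum analog of a reversible logic gate.
assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))

# Start both qubits in |0>, so the joint state is |00>.
state = np.array([1, 0, 0, 0], dtype=complex)

# Hadamard on the first qubit, then CNOT with the first qubit as control.
state = CNOT @ np.kron(H, I2) @ state

print(state.round(3))   # [0.707, 0, 0, 0.707] = (|00> + |11>) / sqrt(2), a Bell state

Measuring that state gives 00 half the time and 11 half the time, never 01 or 10, which is exactly the kind of correlation the EPR paper and Bell's work were arguing about.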


Stephen Wiesner was the son of Jerome Wiesner, who chaired John F. Kennedy's Science Advisory Committee and was one of the scientists who worked on microwave radar during World War II. The younger Wiesner was a graduate student at Columbia in the 1960s, where he studied quantum mechanics, and that research is where quantum information theory was born.


Wiesner wrote a paper in 1968 introducing the world to conjugate coding, the first application of quantum coding to cryptography (the ACM wouldn't actually publish it until 1983). Here, Wiesner introduced quantum multiplexing, where polarized photons act as qubits to carry two messages in such a way that reading one destroys the other. And so quantum cryptography was born. If only in theory. And yet theory can easily become reality with a few decades of engineering work.
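Here's a toy illustration of my own of the idea behind conjugate coding (not Wiesner's actual protocol): a bit encoded in one polarization basis reads back correctly in that basis, but measuring it in the conjugate basis returns a random answer and destroys the original information.

import random

def measure(encoded_bit, encoding_basis, measuring_basis):
    # Same basis: the bit comes back intact. Conjugate basis: a coin flip,
    # and the originally encoded value is gone for good.
    if encoding_basis == measuring_basis:
        return encoded_bit
    return random.choice([0, 1])

bit, basis = 1, "x"                 # encode a 1 in the diagonal ("x") basis
print(measure(bit, basis, "x"))     # always 1
print(measure(bit, basis, "+"))     # 0 or 1 at random

That one-way destruction of information is what made the idea useful for money that can't be counterfeited and, later, for the BB84 key exchange protocol that built directly on Wiesner's work.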


So that was 1968, three years before the Intel 4004 put the microprocessor on the map. James Park gave us the no-cloning theorem in 1970. Charles Bennett showed that computation could be done reversibly in 1973, and in 1976 the Polish physicist Roman Ingarden wrote a paper called Quantum Information Theory that looked at Claude Shannon's classical branch of information theory and attempted to generalize it in a way that worked with quantum mechanics.



1980 comes around and Paul Benioff establishes how a quantum computer could work as a Turing machine. This is when the quantum computer was born. It's as though he was the von Neumann of quantum computing, even though von Neumann himself plays a role in this story. And he wasn't alone. Yuri Manin in Russia was looking at the same thing, and Tommaso Toffoli described what we now call the Toffoli gate, adding the concept of reversible gates. Quantum mechanics requires reversibility. The next year Richard Feynman joined Benioff at a conference on the physics of computation, where they looked at classical computing models and basically said "let's start over" because of how different quantum mechanics is from Boolean logic. Benioff then spent the next few years developing his theories, and William Wootters, Dennis Dieks, and others joined him. By 1985 Richard Feynman said "... it seems that the laws of physics present no barrier to reducing the size of computers until bits are the size of atoms, and quantum behavior holds sway."


Benjamin Schumacher coined the term qubit in 1995. It turns out that the quantum analog of a binary bit, the qubit, is much more valuable than just processing faster. A qubit can be in an on state or an off state, but also in a superposition of both at once, which means that, with completely rewritten algorithms, certain problems can be solved dramatically faster than on a classical machine. By 1998 the initial predictions of quantum teleportation had been verified, and by 2020 quantum teleportation had been done over a 27-mile distance with better than 90% fidelity. While these are simple operations, such as a basic CNOT gate, more and more people and organizations are experimenting with quantum computing every day. In fact, over 100 organizations now have quantum computers or are working on them, in countries ranging from Iran to Russia to the US to Germany. In 2017, Chinese researchers published a paper demonstrating quantum teleportation from a ground station in Tibet to a satellite with around 80% fidelity. And the concept of fidelity here is important, as an arbitrary unknown quantum state still cannot be, as that paper put it, "measured precisely or replicated perfectly."


But back to the Hip to Be Square quantum physics of the 1980s. Gerard Milburn proposed an optical version of a logic gate using quantum optics, and as the research was rethought and peer reviewed in the 1990s, we got closer and closer to demonstrating a true quantum computer. David Deutsch, Richard Jozsa, Dan Simon, and Peter Shor all contributed algorithms, and by 1994 NIST had organized the first government-sponsored workshop on quantum computing. Since much of quantum computing involves classical computing analogs, think of this as the quantum equivalent of the Moore School Lectures.


Information needed to be set free, and the next year we saw the US Department of Defense looking into quantum cryptography and the first quantum logic gate from NIST's Chris Monroe and David Wineland. And given that we were getting closer and realizing the fidelity wasn't going to be 100%, the basic math was worked out for quantum error correction. New algorithms proving that quantum processing could work faster continued to come along, and in 1998 Jonathan Jones and Michele Mosca from Oxford University demonstrated the first quantum algorithm in an experiment, using a 2-qubit quantum computer to solve Deutsch's problem. That same year we went to 3 qubits, and Grover's algorithm, the first quantum search algorithm, was successfully run.
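To give a sense of what a quantum search buys you, here's a short sketch of my own that simulates Grover's algorithm on 2 qubits in Python with NumPy; the marked item |11⟩ and the single iteration are just choices for this toy example (with four items, one iteration already finds the answer with certainty).

import numpy as np

n_qubits = 2
N = 2 ** n_qubits                       # four items to search through

# Start in a uniform superposition over all four basis states.
state = np.full(N, 1 / np.sqrt(N), dtype=complex)

# Oracle: flip the sign of the amplitude of the marked item (index 3, i.e. |11>).
oracle = np.eye(N)
oracle[3, 3] = -1

# Diffusion operator: reflect every amplitude about the average, 2|s><s| - I.
s = np.full((N, 1), 1 / np.sqrt(N))
diffusion = 2 * (s @ s.T) - np.eye(N)

# One Grover iteration: oracle, then diffusion.
state = diffusion @ (oracle @ state)

print(np.abs(state) ** 2)               # [0, 0, 0, 1]: the marked item every time

Classically you'd expect to check about half of the items on average; Grover's algorithm needs on the order of the square root of that number of queries, which is the speedup those early experiments were demonstrating.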


But all of those experiments used NMR quantum computers. These used hydrogen and carbon-13 nuclei and what amounts to an MRI machine to observe the nuclear magnetic resonance, or NMR. The spin states of the nuclei in molecules then become qubits, measured as an ensemble of molecules rather than as individually entangled particles. So in 1999, Yasunobu Nakamura and Jaw-Shen Tsai in Japan successfully experimented with a superconducting circuit as a qubit.


As we moved into the 2000s, we moved to 5-qubit and then 7-qubit NMR computers, and IBM Research, along with Stanford, was able to get order-finding algorithms working. With all of these advances, some started to think quantum computing was ready for commercial workloads. D-Wave was spun out of the University of British Columbia in 1999. They demonstrated a 16-qubit annealing processor in 2007 and finally shipped a 128-qubit chipset in 2011 in the D-Wave One, marketed at the time as the first commercially available quantum computer.


2000 is also when the Technical University of Munich and Los Alamos National Laboratory demonstrated 5-qubit and 7-qubit NMR computing, respectively. By 2002 the field had picked up enough steam to establish a quantum computing roadmap. The next year, DARPA brought a quantum network online, connecting the various research systems - much as those old mainframes were connected when ARPAnet came online in 1969. The same year we saw experiments spread to Johns Hopkins, the University of Innsbruck, the University of Queensland, and Vienna. And the next year researchers at the University of Science and Technology of China demonstrated the first five-photon entanglement. Their lead scientist, Pan Jianwei, would later lead the project to deploy the first quantum satellite in 2016.


Harvard, Georgia Tech, the University of Illinois Urbana-Champaign, the University of Leeds, the University of York, the University of Bonn, MIT, Cambridge, Camerino in Italy, Copenhagen, the University of Arkansas, Utah, Sheffield, Delft in the Netherlands, Bristol, and many others (even Google in the private sector) came up with new uses, theories, and practical implications over the next few years. We could teleport quantum entangled particles, perform nearly every type of basic computational operation found in a modern silicon-based microchip using quantum entanglement, extend the life of a qubit, and draw up a blueprint for quantum random access memory. By 2009 we had the first quantum computer scaled down to the size of a chip.



While the majority of the work being done is at research institutions, it's not just research teams at universities. IBM released the first circuit-based commercial quantum computer in 2019. Developed at IBM Research, the system can now be accessed by organizations on a time-sharing basis. And ion-trap quantum computing has been commercialized by IonQ, spun out of research done at the University of Maryland by Chris Monroe - the same Chris Monroe who helped demonstrate the first quantum logic gate all the way back in 1995. For the record, he looks a little like Val Kilmer did in Real Genius and in fact began his career cooling atoms with lasers. Guessing he eventually used his wily genius to destroy a house with popcorn and lasers. And with nearly half a billion in funding they seem primed to be the Control Data Corporation or Digital Equipment or Amdahl of quantum computing. Which is to say solid revenues for a decade. Unless of course they become the next thing. And with new predictive algorithms to tell us what the next thing is, who's to say they aren't already on that. Just as Val Kilmer transitioned from Real Genius to Top Gun and The Doors and Willow and Batman? Unless you're thinking what I'm thinking? Tombstone. I'm your huckleberry.

Yes, as we approach single-nanometer or smaller devices, quantum computing seems to be the huckleberry that keeps Moore's Law moving in the direction it is supposed to go. Sure, there's no ARPA grant like there was with RISC computing, which effectively gave us the ARM chip. But ironically DARPA just signed an agreement with ARM. The only problem with that is that this ARM stands for Autonomous Robotic Manipulation in the eyes of DARPA.mil, rather than the organization putting the fastest chips ever mass produced into our phones. But hey, DARPA is way more about changing the world than mass production. That comes when the huckleberry knuckleheads are all gone and the practical uses show up on social coding sites.



TensorFlow has a hybrid quantum-classical machine learning branch, TensorFlow Quantum. Microsoft has the QuantumKatas for learning its Q# language and a full Quantum Development Kit. Qiskit is IBM's Python SDK for quantum workloads (with OpenQASM as the assembly-like language underneath), and there's code for IBM's Quantum Experience. QuTiP is a fantastic Python library, as is Cirq from Google. In fact, GitHub has nearly 1,500 projects tagged with quantum computing. And so while the uses are minimal today, we do see actual workloads shipping. JP Morgan has published work on using quantum computing to price option contracts and do financial modeling. Daimler AG is looking to harness it for better battery simulation, and Volkswagen worked with D-Wave on the traveling salesman problem. Goldman Sachs is writing algorithms; Honeywell is writing algorithms.
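To show what that code actually looks like, here's a minimal sketch using Google's Cirq (exact API details can shift between versions): it builds the same Bell-state circuit described earlier, runs it on the built-in simulator, and counts the measurement outcomes.

import cirq

q0, q1 = cirq.LineQubit.range(2)

circuit = cirq.Circuit(
    cirq.H(q0),                          # put the first qubit into superposition
    cirq.CNOT(q0, q1),                   # entangle the second qubit with the first
    cirq.measure(q0, q1, key="result"),  # measure both qubits
)

result = cirq.Simulator().run(circuit, repetitions=1000)
print(result.histogram(key="result"))    # counts pile up on 0 (|00>) and 3 (|11>)

Swap in Qiskit and the shape of the program is much the same: build a circuit, hand it to a simulator (or real hardware behind an API), and read back the counts.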


And yet most of the actual outcomes still fall into the "we need a sufficient qubit count" or "we need to stabilize the noise" category, and so most shipping game changers are still a decade away. And yet in 2019 Google had a quantum computer perform a calculation in a few minutes that they estimated would have taken a classical transistorized supercomputer 10,000 years to complete. These machines still have names like Sycamore, as they're one-offs. But there are plans and tests and even results. 2014 saw the leak of information about the Penetrating Hard Targets project, where the US National Security Agency was working on quantum computing to break cryptography. Going back to their work with MIT to bring forward new types of memory and cooling in the 60s, and with the development of a network of machines and miniaturization, we can actually see a future where some day - some day…


But then, that's what I said when I used the word Quantum in my first avatar back in the 90s. Yes, quantum computing still isn't outpacing classical computing. Sometimes "some day" spreads and becomes mainstream faster than we think; it just might take longer to get here than we'd have thought. Good science takes time. Qubits are still incredibly fragile. They require a massive amount of cooling. We are still deciding which type of qubit is the right one for more standardized computing at commercial scale. And they aren't entirely accurate.


But solid-state electronics, which we rely on in our current devices, took time as well. The first transistor arrived in 1947, the integrated circuit in 1958, and the microprocessor in 1971. That's 24 years, not including the decades of science that led up to it. And yet if quantum computing fulfills its promise, the impact could be even greater than the impact the microchip had on the world.
