
Title

Quantum Computing

Description

If you've been looking for an introduction to Quantum Computing and how it surpasses our current binary computing, the article <a href="http://arstechnica.com/science/guides/2010/01/a-tale-of-two-qubits-how-quantum-computers-work.ars/" source="Ars Technica" author="Joseph B. Altepeter">A tale of two qubits: how quantum computers work</a> is a great place to start. The language is about as accessible as it's going to get and there are helpful diagrams sprinkled throughout. For example, the engine of a quantum computer---entanglement, and its result, "action at a distance"---is analogized thusly: <bq>Imagine if someone showed you a pair of coins, claiming that when both were flipped at the same time, one would always come up heads and one would always come up tails, but that which was which would be totally random. What if they claimed that this trick would work instantly, even if the coins were on opposite sides of the Universe.</bq> <img attachment="qc-7.png" align="left" class="frame" caption="Partial and Full Decoherence">The final two pages delve into the quantum physics and present some of the main concepts---and equations---which is where things get a good deal hairier, especially if it's been a long time since you've seen notation of this sort. However, any discussion of quantum physics soon blurs the line between hard, measurable physics and philosophy. At some point, the equations abandon us: it becomes very difficult to know what's going on, to know what we know about what's going on, or to trust what we observe or what our carefully planned experiments observe, because even our most careful selves are part of, and constrained by, the very physical system in which we enjoy our degrees of freedom.
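The coin analogy maps directly onto the simplest entangled state. As a rough illustration (my own sketch in Python with NumPy, not code from the article), here is a simulation of repeatedly measuring a singlet Bell state: each side's result is individually random, yet the two sides always disagree, just like the coins:

```python
import numpy as np

# A minimal sketch, assuming the singlet Bell state (|01> - |10>)/sqrt(2)
# as the stand-in for the "pair of coins": every joint measurement yields
# a random bit on each side, but the two bits never match.

rng = np.random.default_rng(seed=42)

# Two-qubit state vector over the computational basis |00>, |01>, |10>, |11>
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# Born rule: the probability of each joint outcome is |amplitude|^2
probs = np.abs(singlet) ** 2  # [0, 0.5, 0.5, 0]

def flip_both(n_trials):
    """Sample n_trials joint measurements; return (bit_a, bit_b) pairs."""
    outcomes = rng.choice(4, size=n_trials, p=probs)
    # Decode the basis index into the two individual qubit results
    return [((o >> 1) & 1, o & 1) for o in outcomes]

pairs = flip_both(10)
print(pairs)  # each pair is (0, 1) or (1, 0); the sides never agree
```

Of course, this classical sampling only reproduces the measurement statistics; it says nothing about *how* the correlation survives the coins being on opposite sides of the Universe, which is exactly the mystery the article is pointing at.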
Scientists have already reached the point where they are presented with an <iq>equation [that] means that every part of the experiment, <i>even the experimenter</i>, are all part of a single quantum superposition.</iq> (Emphasis in original.) Heisenberg showed long ago that an observation influences that which it observes; in the world of quantum computing, the systems---or superpositions of states---being observed are so delicate and involve such minuscule energies that the measuring instrument exerts an even greater influence, because the energy introduced into the system by the act of measurement is disproportionate to the energy of the system itself. The really strange thing is that if, as stated above, the experimenter is part of the superposition, then all attempts to follow the chain of superposition to find an end where there is a so-called collapse of the waveform and things are decided one way or the other---à la Schrödinger's cat---have failed to find one. The article concludes: <bq>Maybe, at some point, it all gets too big, and new physics happens. In other words, something beyond quantum mechanics stops the chain of larger and larger entangled states, and this new physics gives rise to our largely classical world. Many physicists much smarter than myself think that this happens. Many physicists much smarter than myself think it doesn't, and instead imagine the universe as an unfathomably complex, inescapably beautiful symphony of possibility, each superposed reality endlessly pulsing in time to its own energy. To be honest, we just don't know yet. But as far as we've looked, it's turtles all the way down.<fn></bq> How the hell are you supposed to build a computer based on that? We know how to build 2- and 3-qubit computers, but the 100-qubit computer will likely have to wait until we can answer some of the seemingly unanswerable questions outlined above.
As with every generation that looks up toward the next, just before a quantum leap of intuition and reasoning, it seems impossible. But imagine how impossible all that we take for granted today would seem to someone from just a century ago. Maybe humans in just a few generations will take "acting outside of the superposition of reality" for granted and be able to perform the most breathtaking calculations in no time at all. More likely, though, they'll be taking quantum computing for granted and most will be using it without even knowing it---perhaps to make the Genius mode on their iPods seek out much cooler playlists. <hr> <ft>A phrase from the book <i>Yertle the Turtle</i> by <i>Theodor Geisel, aka Dr. Seuss</i>.</ft>