I have recently been web-researching quantum computing.
Will we see these in our lifetimes (ever)? The error-correction issue, for example, seems intractable to me.
Just looking at the results from one website, I'd say it doesn't look impossible:
http://arstechnica.com/journals/science.ars/2008/03/28/encoding-more-than-one-bit-in-a-photon
http://arstechnica.com/journals/science.ars/2008/10/28/scalable-quantum-computing-in-the-next-5-years
http://arstechnica.com/news.ars/post/20080729-finding-lost-qubits.html
http://arstechnica.com/news.ars/post/20080509-new-quantum-dot-logic-gates-a-step-towards-quantum-computers.html
http://arstechnica.com/news.ars/post/20080626-three-dimensional-qubits-on-the-way.html
http://arstechnica.com/news.ars/post/20080527-molecular-magnets-in-soap-bubbles-could-lead-to-quantum-ram.html
For a more technical overview of why it's not as hard as it used to be, there's a four-part series on self-correcting quantum computers:
http://scienceblogs.com/pontiff/2008/08/selfcorrecting_quantum_compute.php
Error correction and loss of coherence are the big problems in quantum computing, as I understand it. Lots of smart people are hard at work on them, but last I read, it was looking like error-correction requirements might grow exponentially in the number of qubits, which really detracts from the "we'll solve NP problems in an instant!" attraction of quantum computation.
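To make that scaling worry concrete, here's a rough back-of-the-envelope sketch in Python. It's my own illustration with made-up numbers, assuming a concatenated 7-qubit code (Steane-style) and the textbook threshold-theorem behavior: each extra level of concatenation multiplies the physical-qubit count by 7 while squaring the error suppression. Nothing here is real hardware data; it just shows the shape of the overhead.

    # Back-of-the-envelope sketch of error-correction overhead (illustrative only).
    # Assumes a concatenated 7-qubit code: each level of concatenation multiplies
    # the physical-qubit count by 7, and (per the threshold theorem) squares the
    # suppression of the error rate. All numbers below are made up.

    def physical_qubits(logical_qubits: int, levels: int, code_size: int = 7) -> int:
        """Physical qubits needed for `logical_qubits` at `levels` of concatenation."""
        return logical_qubits * code_size ** levels

    def effective_error(p: float, threshold: float, levels: int) -> float:
        """Logical error rate after `levels` of concatenation, for p below threshold."""
        eps = p / threshold
        return threshold * eps ** (2 ** levels)

    if __name__ == "__main__":
        p, threshold = 1e-3, 1e-2  # assumed physical error rate and code threshold
        for levels in range(5):
            print(f"levels={levels}: "
                  f"{physical_qubits(1000, levels):>12,} physical qubits, "
                  f"logical error ~ {effective_error(p, threshold, levels):.2e}")

The error rate drops doubly exponentially in the number of levels, which is the optimistic reading; the pessimistic reading, as above, is the factor-of-7-per-level qubit bill.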
I vote: Hype.
...but hope I'm wrong.
Randy
Quantum computing isn't much past the "idea" stage. Sure, they can multiply two 2-bit integers, but it takes a dozen grad students a week to set up for the run, and another week to validate the results.
Long-term it's probably got a lot of potential, though it may never be stable enough for use outside of a highly controlled lab-based "supercomputer" environment.
At this point I'd classify it more as Physics than Computer Science. In a way, it's as if Charles Babbage got his hands on one of Michael Faraday's papers and started thinking about maybe, possibly, someday, being able to use electromagnetism as a basis for calculation.
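If you want a feel for just how small today's experiments are, here's a toy sketch (mine, not from any of the sources in this thread) that classically simulates the kind of two-qubit operation those lab rigs perform physically: a Hadamard followed by a CNOT, producing an entangled Bell state. The gate matrices are standard textbook definitions; the point is that two qubits is a 4-element state vector, trivial on a laptop, and yet a multi-week physics experiment in hardware.

    import numpy as np

    # Toy state-vector simulation of two qubits (purely illustrative).
    # A Hadamard on qubit 0 followed by a CNOT entangles the pair into a
    # Bell state -- the kind of operation the lab rigs do physically.

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                 # basis order: |00>,|01>,|10>,|11>
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
    state = np.kron(H, I) @ state                  # Hadamard on qubit 0
    state = CNOT @ state                           # qubit 0 controls qubit 1

    # Measurement probabilities: 50% |00>, 50% |11>, nothing in between.
    for basis, amp in zip(["00", "01", "10", "11"], state):
        print(f"|{basis}>: {abs(amp)**2:.2f}")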
There's been a fair amount written about Quantum Computing over the last couple of years in Scientific American, much of it by the primary researchers themselves: http://www.sciam.com
Quantum computing is a tool; it's just too raw at the moment to have any sort of useful application, but who knows.
Nice, I get to re-use my answer from another SO question word for word. :)
A few answers mention quantum computers as if they're still far in the future, but I beg to differ.
There were vague mentions of the possibility of quantum computers in the 1970s and 1980s (see the timeline on Wikipedia), but the first "working" 3-qubit NMR quantum computer was built in 1998. The field is still in its infancy, and almost all progress remains theoretical and confined to academia, but in 2007 a company called D-Wave Systems presented a prototype of a working 16-qubit adiabatic quantum computer, and later that year a 28-qubit one. Their effort is notable because they claim their technology is commercially viable and scalable. As of 2010, they have 7 rigs, and the current generation of their chips has 128 qubits. They appear to have partnered with Google to find interesting problems to test their hardware on.
I recommend this short 24-minute video and the Wikipedia article on D-Wave for a quick overview, and there are a lot more resources on this blog written by D-Wave's founder and CTO.
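For a sense of the problem class an adiabatic machine like D-Wave's targets, here's a hedged sketch of my own (the fields and couplings are arbitrary toy values, not anything from D-Wave): the hardware is meant to settle into low-energy states of an Ising model, which classically amounts to a minimization over spin assignments.

    from itertools import product

    # Illustrative sketch of the problem an adiabatic/annealing machine targets:
    # find the spin assignment minimizing an Ising energy
    #   E(s) = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j,   with s_i in {-1, +1}.
    # The fields and couplings below are made-up toy values.

    h = {0: 0.5, 1: -1.0, 2: 0.25}                 # on-site fields (made up)
    J = {(0, 1): -1.0, (1, 2): 0.75, (0, 2): 0.5}  # pairwise couplings (made up)

    def energy(spins):
        e = sum(h[i] * s for i, s in enumerate(spins))
        e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
        return e

    # Brute force over all 2^n assignments -- exactly the exhaustive search
    # the quantum hardware is hoped to shortcut for large n.
    best = min(product([-1, 1], repeat=len(h)), key=energy)
    print("ground state:", best, "energy:", energy(best))

For n spins the brute force costs 2^n energy evaluations; the whole pitch of adiabatic hardware is doing better than that on large, structured instances. Whether it actually does is, of course, the open question.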