More progress in quantum computing?

I just came across an article claiming that researchers from the University of Virginia and the University of Tokyo have demonstrated interesting progress toward creating a quantum computer. The researchers recently published their findings in Physical Review Letters, but you can get a preprint of their article here. Here's the abstract from this paper:

Scalability and coherence are two essential requirements for the experimental implementation of quantum information and quantum computing. Here, we report a breakthrough toward scalability: the simultaneous generation of a record 15 quadripartite entangled cluster states over 60 consecutive cavity modes (Qmodes), in the optical frequency comb of a single optical parametric oscillator. The amount of observed entanglement was constant over the 60 Qmodes, thereby proving the intrinsic scalability of this system. The number of observable Qmodes was restricted by technical limitations, and we conservatively estimate the actual number of similar clusters to be at least three times larger. This result paves the way to the realization of large entangled states for scalable quantum information and quantum computing.

I haven't thought about physics in a serious way for over 15 years, so my understanding of this may not be perfect, but it looks to me like making a useful quantum computer requires lots of fully-entangled qubits. Lots of them. To use Shor's algorithm to factor a big integer, for example, you need roughly 2n qubits to factor an n-bit integer.
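The 2n rule of thumb makes the scaling easy to eyeball. A minimal sketch (using the rough 2n figure from the text; real implementations need additional ancilla qubits on top of this):

```python
# Back-of-the-envelope estimate of the logical qubits Shor's
# algorithm needs for an n-bit modulus, using the rough 2n figure
# mentioned above. Real circuits carry extra ancilla overhead.

def shor_qubit_estimate(key_bits: int) -> int:
    """Roughly 2n logical qubits to factor an n-bit integer."""
    return 2 * key_bits

for bits in (512, 1024, 2048):
    print(f"{bits}-bit key -> ~{shor_qubit_estimate(bits)} logical qubits")
```

Even ignoring error correction, which multiplies these numbers considerably, the totals dwarf anything demonstrated so far.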

And because it's very hard to keep lots of fully-entangled qubits from interacting with the outside world at all for the duration of a calculation, there hasn't been much progress in this area since IBM managed (PDF) to factor 15 on a quantum computer back in 2001.
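For context, the hard part of that 2001 experiment, finding the period of a^x mod 15, can be sketched classically in a few lines. This is a brute-force stand-in for the quantum order-finding step, not how the hardware actually works:

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n) -- the step Shor's
    algorithm speeds up with a quantum Fourier transform."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int) -> tuple:
    """Classical mock-up of Shor's post-processing for base a."""
    assert gcd(a, n) == 1
    r = find_order(a, n)
    assert r % 2 == 0, "need an even order; try another base"
    y = pow(a, r // 2, n)
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(15, 7))  # the N = 15 case factored in 2001
```

With base 7 the order is 4, and gcd(7² ± 1, 15) yields the factors 3 and 5. Classically this brute-force search scales exponentially with the bit length; the quantum Fourier transform is what makes it efficient, which is the whole reason the qubit counts matter.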

Does this new research really change things much?

It looks to me like this paper describes creating 15 groups of qubits with 4 qubits in each, for a total of 60 qubits. That sounds very impressive, but it doesn't look like the sort of thing you could use to implement the quantum algorithms that might one day make lots of existing public-key algorithms obsolete. A single set of 60 mutually entangled qubits would be a better step in that direction.

So while some of the research in quantum computing is producing physics that looks interesting (and that I wish I had the time to actually understand), it doesn't look like it's going to produce a quantum computer capable of cracking the keys that today's public-key algorithms use any time soon. The 2,048-bit keys that most standards now call for would take a quantum computer with roughly 4,096 qubits to crack, and it looks like building one of those will be infeasible for many years to come. Perhaps even forever.
