Good topic.
It's interesting how fast tech and science are evolving.
Recently I read this NSA-related article: http://www.hpcwire.com/hpcwire/2012-03-19/nsa_employs_cutting-edge_supercomputing_for_domestic_surveillance.html (from March 19, 2012, less than 3 years after the original question in this post).
Much of the data is encrypted though, and that’s where the
supercomputing comes in. To extract the information, the NSA had to
employ brute force algorithms, and that required a lot of computing
power. Bamford reports that the Multiprogram Research Facility was
built at Oak Ridge National Laboratory to house a supercomputer for
such work. That facility, known as Building 5300, spanned 214,000
square feet and cost $41 million to build back in 2006. While the
unclassified “Jaguar” supercomputer was being deployed on the other
side of the Oak Ridge campus, the NSA was installing an even more
powerful system in Building 5300. Writes Bamford:
The NSA’s machine was likely similar to the unclassified Jaguar, but
it was much faster out of the gate, modified specifically for
cryptanalysis and targeted against one or more specific algorithms,
like the AES. In other words, they were moving from the research and
development phase to actually attacking extremely difficult encryption
systems. The code-breaking effort was up and running.
According to Binney, a lot of foreign government data the agency was
never able to break (128-bit encryption) might now be decipherable.
How efficiently the NSA is doing this will, I guess, be quite difficult for normal mortals like us to know (hey, it's the NSA! :-) )
Still, we should consider that they are not planning to break just one key but a huge number of them... So breaking a 128-bit AES message no longer seems like pure science fiction or theoretical math.
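To put the brute-force numbers in perspective, here is a rough back-of-envelope sketch. The keys-per-second rate and the number of targeted keys below are purely hypothetical figures I made up for illustration; this says nothing about what any real system can actually do.

```python
# Back-of-envelope: how long would pure exhaustive search of a 128-bit
# keyspace take? All rates below are hypothetical, not claims about any
# real hardware.
keyspace = 2 ** 128                 # total number of AES-128 keys
keys_per_second = 10 ** 18          # hypothetical exascale guessing rate
seconds_per_year = 365 * 24 * 3600

# On average you find the key after searching half the keyspace.
expected_years = (keyspace / 2) / keys_per_second / seconds_per_year
print(f"Expected time to recover one specific key: {expected_years:.2e} years")

# Attacking many keys at once changes the picture: with N independent
# target keys (and known plaintext for each), each candidate key can be
# checked against all N targets, so the expected work to recover *some*
# key drops roughly by a factor of N.
n_targets = 10 ** 9                 # hypothetical number of targeted keys
print(f"Expected time to recover any one of {n_targets} keys: "
      f"{expected_years / n_targets:.2e} years")
```

Even with these (invented) optimistic numbers, raw exhaustive search of a single key stays astronomically expensive, which fits Bamford's remark that the classified machine was "modified specifically for cryptanalysis and targeted against one or more specific algorithms": the leverage would have to come from algorithmic shortcuts plus the multi-key effect, not brute force on one message in isolation.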