On Fri, 3 Oct 2003 mcintosh@servidor.unam.mx wrote:
Quoting "David P. Moulton" <moulton@idaccr.org>:
On Thu, 2 Oct 2003, Henry Baker wrote:
Does anyone know any more details about this announcement?
You mean, besides the fact that the author's understanding of even the basics of the factoring problem is appalling? I factored his "32-minute" number in about 30 seconds in my head!
David Moulton
Not so long ago the astounding news was that someone had used a five bit quantum computer to factor 15!! Much longer ago, I recall people demonstrating a little plastic cube which had a !transistor! embedded in it along with a little speaker and a thermocouple. It squealed when you held it between your fingers. Totally, absolutely amazing!
Don't forget Moore's Law.
- hvm
I think some people were missing the point of my comment. I'm not saying anything bad about the actual research, which may be quite interesting; I'm saying that the writer was botching things to an astounding degree. And remember, the "32-minute" number was supposed to take 32 minutes to factor "using conventional computer algorithms". Even naive trial division on such a number takes a tiny fraction of a second (see the sketch at the end of this message). Richard Schroeppel says:
I'm inclined to give the science writer the benefit of the doubt on explaining factoring time. It's so oversimplified it's wrong, but for a target audience that needs an explanation of what factoring is at all, what can you do? Any article containing the subexpression O(e^(1.923 * cbrt(log N * (log log N)^2))) isn't going to be read by this audience.
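For reference (the quote doesn't spell this out), the 1.923 there is the heuristic constant (64/9)^(1/3) from the general number field sieve; in the usual L-notation the conjectured running time reads

L_N[1/3, c] = \exp\!\left( (c + o(1))\,(\ln N)^{1/3} (\ln\ln N)^{2/3} \right), \qquad c = (64/9)^{1/3} \approx 1.923.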
"It's so oversimplified it's wrong" is quite an understatement! I would complain less if the author had used "microseconds", say, rather than "minutes" in his estimates. And I would complain less if he had used a number to be factored that didn't have so many small factors, or even was odd! And while he shouldn't necessarily give the correct running times, he states the ones he uses, which are totally ludicrous, with such authority and definitiveness. While it's good that a general audience can read a story about factoring and quantum computers, I think it's a mistake to give such an incorrect picture of the truth that it might interfere with other understanding that could result from reading some other article. David Moulton