[math-fun] Can we make fast divisibility tests for binary computers?
After all that about how we humans working in base 10 can quickly test divisibility by some small primes by some ingenious tricks... who cares about base 10? Think binary so you can be cool.

* N divisible by 2 <==> has a zero at the end <==> (N&1)==0.

* Note that 63 = 3*3*7. If you add the consecutive 6-bit blocks of N together, the result is congruent to N mod 63. Similarly for other 2^k-1 such as 3, 7, 15=5*3, 31=prime, 63=3*3*7, 127=prime, 255=5*3*17, 511=7*73, etc.

* Note that 65=5*13. If you add and subtract alternately the 6-bit blocks of N then you get something congruent to N mod 65. Similarly for other 2^k+1 such as 33=3*11, 129=3*43, 257=prime, 513=3*3*3*19, etc.

The numbers 2^k +- 2^j +- 1 would seem to be particularly important for this kind of game. The numbers 2^(2^k)-1 are especially important since computer wordlengths are powers of 2, and they have a lot of nice factors.

Related question to "Can we make fast divisibility tests for binary computers?" is "...and should anybody really want to?"
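A minimal sketch of the two block-sum tests above in Python (the 6-bit block size and the moduli 63, 65 are from the examples; function names are mine):

```python
def mod63(n):
    """n mod 63 via summing 6-bit blocks, since 2**6 == 1 (mod 63)."""
    while n >= 64:
        s = 0
        while n:
            s += n & 63   # add the low 6-bit block
            n >>= 6       # move to the next block
        n = s             # block sum is congruent to n mod 63
    return 0 if n == 63 else n

def mod65(n):
    """n mod 65 via alternating 6-bit block sums, since 2**6 == -1 (mod 65)."""
    s, sign = 0, 1
    while n:
        s += sign * (n & 63)  # add/subtract blocks alternately
        sign = -sign
        n >>= 6
    return s % 65
```

Divisibility of N by any factor of 63 (3, 7, 9, 21) or of 65 (5, 13) then reduces to checking the corresponding residue.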
="Warren Smith" <warren.wds@gmail.com>
If you add and subtract alternately the 6-bit blocks
Of course this isn't strictly binary, but rather uses "fat digits" encoded in base 2. Does anything interesting happen if the blocks aren't the same size? If they overlap?

Adding digits and/or alternating can be generalized by accumulating products scaled by some sequence of small constants besides +1 and -1. Different sequences can be used to produce a suite of divisibility tests (and IIRC this has been discussed on this list anon).

One way I've used to test quickly for divisibility by a bunch of small divisors: multiply enough of them together to get P just shy of the word size, then use binary gcd(N,P). You can of course use a table of P's.

Another hack I've pulled in microcode on awkward architectures is to multiply by approximate fixed-point reciprocals "normalized" to lie in 0.5..1.0. Sometimes the result can be shown to be close enough to detect by an easy test, or whacked with Newton's method (like the famous square root bum).
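A sketch of the gcd trick, with a binary (Stein's) gcd as Marc mentions; the particular prime product P (odd primes 3..53, about 1.63e19, just shy of 2^64) is my reconstruction:

```python
# Odd primes whose product just fits under a 64-bit word -- my choice
# of packing, not necessarily Marc's.
SMALL_PRIMES = (3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53)
P = 1
for p in SMALL_PRIMES:
    P *= p   # about 1.63e19 < 2**64

def binary_gcd(a, b):
    """Stein's algorithm: gcd using only shifts, parity tests, subtraction."""
    if a == 0:
        return b
    if b == 0:
        return a
    shift = 0
    while (a | b) & 1 == 0:   # factor out common powers of 2
        a >>= 1
        b >>= 1
        shift += 1
    while a & 1 == 0:
        a >>= 1
    while b:
        while b & 1 == 0:
            b >>= 1
        if a > b:
            a, b = b, a
        b -= a
    return a << shift

def small_odd_factor(n):
    """gcd(n, P) > 1 exactly when n shares a factor with the packed primes."""
    return binary_gcd(n, P)
```

One gcd thus tests N against fifteen primes at once; a table of such P's extends the reach.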
Related question to "Can we make fast divisibility tests for binary computers?" is "...and should anybody really want to?"
I dunno, but on this list that's a bit like responding to G. H. Hardy's famous riposte by asking "And should anyone really want to make babies?"
Marc's suggestion re GCD is really a bin-packing problem. That is, we have a set of primes. Their "sizes" are their logs to base 2. We wish to pack the primes into as few bins as possible (size of bin is machine wordlength in bits). We then can test N for divisibility by taking the gcd(N, bin-products). One could try the "first fit decreasing" bin packing heuristic... or others...
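A sketch of that first-fit decreasing heuristic over prime "sizes" (word length and the treatment of the bin as a running product are my illustrative choices):

```python
def pack_primes(primes, wordbits=64):
    """First-fit decreasing: take primes largest-first and put each into
    the first bin whose running product still fits in a machine word.
    Equivalent to packing by log2 sizes into bins of wordbits bits."""
    limit = 1 << wordbits
    bins = []
    for p in sorted(primes, reverse=True):
        for i, product in enumerate(bins):
            if product * p < limit:
                bins[i] = product * p
                break
        else:
            bins.append(p)   # no bin fits; open a new one
    return bins
```

Each returned bin-product fits in a word, and gcd(N, bin-product) then tests N against every prime packed into that bin.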
="Warren Smith" <warren.wds@gmail.com> ... We wish to pack the primes into as few bins as possible ... One could try the "first fit decreasing" bin packing heuristic... or others...
I think I eventually decided not to worry about this and just pack them in increasing order, for several (perhaps muddle-headed) reasons:

* The smaller primes divide more N, so "greedily" trying 3*5*7*11... first seemed more likely to find a factor sooner. (Your mileage may vary, depending on what your input distribution is like.)

* Once you get out into the "long tail" of larger primes that are about 1/K of the word size, I didn't see how, even with a lot of packing effort, one could have much hope of improving significantly over just packing most of them K to a word.

* Moreover, since the distribution of sizes is relatively smooth, it didn't seem like the difference between the greedy order and some "optimal" packing would be more than a handful of bins anyway.

I now recall, however, that my goal then was actually to test for primality, rather than to find divisors. I think I ultimately wound up doing a small number of GCDs and then siccing Little Fermat on the recalcitrants. I also think I read somewhere in Knuth that Carmichael numbers that fool Fermat have a factor at most cube-root N, so it wasn't necessary to GCD for divisors greater than a third of a word. Back in the day that was ~32 bits, so quite tractable. Today at ~64 bits it might be an interesting exercise for a cloud.
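A sketch of that combined strategy, a gcd screen against a packed prime product, then Little Fermat base 2 on the survivors (the prime list and names are mine; Fermat liars coprime to the product can still slip through, so "True" only means "probably prime"):

```python
from math import gcd

# Odd primes packed into one word-sized product -- an illustrative choice.
SMALL_PRIMES = (3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53)
P = 1
for p in SMALL_PRIMES:
    P *= p   # about 1.63e19, just under 2**64

def likely_prime(n):
    """GCD screen for small factors, then a base-2 Fermat test.
    Carmichael numbers with all factors > 53 can fool this."""
    if n < 2:
        return False
    if n == 2 or n in SMALL_PRIMES:
        return True
    if n % 2 == 0 or gcd(n, P) > 1:
        return False          # shares a factor with the packed primes
    return pow(2, n - 1, n) == 1   # Little Fermat, base 2
```

Note the Carmichael number 561 = 3*11*17 never reaches the Fermat step; the gcd catches its small factors first, which is the point of doing the GCDs beforehand.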
participants (2)
- Marc LeBrun
- Warren Smith