[*begin discursiveness warning*] I was trying to mentally factor a 4-digit number (7989, if you must know). After dividing out the obvious 3, the next step was to find the least prime factor of 7989/3 = 2663. I tried the only factoring algorithm I know: trial division, naively testing each prime in increasing order up to the square root of the number (in this case < 52). I soon realized (what everyone else probably knows well) that if 2663 were not prime, then the more small primes were eliminated, the more likely it became that the next test prime <= sqrt(2663) would divide it. But of course there was always the chance that 2663 is in fact prime.

So: suppose each of the primes 2, 3, 5, 7, ..., p_k < sqrt(n) does not divide n, while there remain primes p_r in the range p_k < p_r <= sqrt(n). The larger k is, the greater the chance that some such p_r | n. But also, the larger k is, the greater the chance that n is prime. How do these two possibilities relate to each other? (I.e., what should I expect, or bet on, as k increases -- that n is prime, or that some larger test prime <= sqrt(n) divides n???) (For starters, take p_r to be the largest prime <= sqrt(n).) [*end discursiveness warning*]

--J.R. "Dan" Dobbs
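
For concreteness, here is a minimal sketch (in Python, my own illustration rather than anything from the question itself) of the trial division described above. For simplicity it tests 2 and then every odd candidate rather than only primes; that gives the same answer, since the first divisor found is necessarily prime.

    def least_prime_factor(n: int) -> int:
        """Return the smallest prime factor of n, or n itself if n is prime."""
        if n < 2:
            raise ValueError("n must be at least 2")
        d = 2
        while d * d <= n:            # only candidates d <= sqrt(n) need testing
            if n % d == 0:
                return d             # the first divisor found is necessarily prime
            d += 1 if d == 2 else 2  # after 2, test only odd candidates
        return n                     # no divisor <= sqrt(n), so n is prime

    # Worked example from the post: strip the factor 3 from 7989, then test 2663.
    print(least_prime_factor(7989))  # 3
    print(least_prime_factor(2663))  # least prime factor of 2663 (2663 itself if prime)
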