The random-gen paper LeBrun mentioned is here: http://www.thesalmons.org/john/random123/papers/random123sc11.pdf

Nothing is inherently wrong with their philosophy of generating F[0], F[1], F[2], etc (a counter-based generator) rather than F[0], F[F[0]], F[F[F[0]]], etc (an iterated generator), but it is just moronic to go as far as they do in literally adding 1 to the counter, then re-encrypting. Adding a better constant than 1, such as 0x93db2dffa70bac57 mod 2^64, which has the property that 0x93db2dffa70bac57 / 2^64 has all partial quotients 1 or 2 in its continued-fraction expansion, would be equally fast and more random. And the source code Salmon et al distribute is very long and hairy.

Presumably best is to use a mildly decent randgen of the F[F[F[...]]] type combined with a mildly decent post-encrypt of their type, to get the best of both worlds. Which, actually, is exactly what Melissa O'Neill does in PCG.

Part of the problem is the incredible asininity of most (all?) programming languages in refusing to give you mullo(a,b) and mulhi(a,b) primitives for the double=single*single multiply. Seriously. This is the year 2016. Why haven't programming-language designers figured out yet that computer hardware can multiply? Will it happen before I die?