Dan Asimov <dasimov@earthlink.net> wrote:
Since useful quanputers might be a long ways off, maybe some other stunt could speed up current computation as we know it.
If trits could be stored, retrieved, and copied as easily as bits, there would be less *space* needed by a factor of log_c(2)/log_c(3) = .6309+, where c = zeta(3).
It's hard to tell with text whether someone is joking. I assume you know that log(a)/log(b) always has the same value regardless of the base of the log. For instance, I was recently debating with myself whether to express a number as log2(10), 1/log10(2), or as log(10)/log(2). In the last case, I don't have to worry about the base.

I'd think you'd also know that changing the radix of a computer may change its speed by some constant factor, but wouldn't change the big-O category of any algorithm. Replacing it with a quantum computer would, at least for a few special algorithms, such as Shor's. I've read that the Russians once had a computer that was based on balanced ternary.

As for space, that's true if a trit takes the same space to store as a bit, but why should that be?
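To make the base-independence point above concrete, here is a minimal Python sketch checking that log2(10), 1/log10(2), and log(10)/log(2) are the same number, and that the trit-per-bit ratio log(2)/log(3) is likewise base-independent:

```python
import math

# The base of the logarithm cancels in a ratio: log_c(a)/log_c(b)
# is the same for every base c.
a = math.log2(10)                    # base-2 logarithm
b = 1 / math.log10(2)                # reciprocal of base-10 logarithm
c = math.log(10) / math.log(2)       # ratio of natural logarithms

assert math.isclose(a, b) and math.isclose(b, c)

# The same cancellation applies to the trit/bit ratio:
print(math.log(2) / math.log(3))     # ~0.6309, whatever base you use
```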
Perhaps someone knowledgeable can estimate how much trits could actually speed up computation. Surely this has been much studied by now.
I can't see why it would speed computation up at all, or increase the precision, so long as the word size in bit-equivalents remained the same. Computation is generally done one word at a time. A decimal computer would be slightly faster if you want the results in decimal. (I assume few people want results in ternary.) It also avoids some rounding errors that a binary one doesn't, since it can exactly represent more fractions; but to maximize that, you'd want base 30, base 210, or some other primorial.

Of course a binary (or ternary, etc.) computer can be programmed to emulate any other radix, at the cost of some speed and some space. Or to do exact rational calculations by storing everything as integer numerators and denominators.
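The last two points above can be illustrated in a few lines of Python: binary floating point cannot represent 1/10 exactly, while storing numerator and denominator as integers (as the standard `fractions` module does) makes the same computation exact.

```python
from fractions import Fraction

# Repeated addition of 0.1 drifts in IEEE-754 binary floating point:
float_total = sum(0.1 for _ in range(10))
print(float_total == 1.0)   # False: 0.1 has no exact binary expansion

# Exact rational arithmetic, stored as integer numerator/denominator:
total = sum(Fraction(1, 10) for _ in range(10))
print(total == 1)           # True: no rounding error at all
```

This exactness costs speed and space, as the text says: the integers in a `Fraction` can grow without bound as a computation proceeds.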
But the practical question is: is there a feasible way to build trit-based processors?
Certainly. But fewer states are always easier, and two is the fewest you can have, since one state isn't good for anything. Similarly, life anywhere in the universe ought to evolve to have two sexes, not more, despite that novel your uncle wrote where they had three.
Would it be easier if they could be as large as a room, or larger?
Easier for what? Easier for hand assembly of individual transistors? Sure, for that you'd want components that are a comfortable size to handle. And it would result in a computer with the power of an iPhone filling, not just a room, but a large building. And the speed-of-light delay means that it would run very slowly by today's standards.

Tom Knight <tk@mit.edu> wrote:
Binary is optimal for information storage and transmission. This is because for a fixed amount of noise (from either environment or fabrication issues) binary requires only a single distance between states, whereas radix-n storage requires n-1 of them.
That doesn't sound right to me. Only the most primitive modems used on-off keying or two-tone frequency shift keying. Information theory says that the most efficient way to use bandwidth involves having a vast number of subtly different states. Of course each state can encode multiple bits, which can be unpacked at the receiving end.
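As a sketch of the unpacking mentioned above: a symbol drawn from an M-point constellation carries log2(M) bits, so e.g. 64-QAM carries 6 bits per symbol. The packing below is illustrative only; real modems add Gray coding, pulse shaping, and error correction.

```python
import math

M = 64                                  # constellation size (64-QAM)
bits_per_symbol = int(math.log2(M))     # 6 bits per transmitted symbol
print(bits_per_symbol)                  # 6

# Pack a bit string into symbol indices, bits_per_symbol at a time:
bits = "101101001110"
symbols = [int(bits[i:i + bits_per_symbol], 2)
           for i in range(0, len(bits), bits_per_symbol)]
print(symbols)                          # [45, 14]
```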
Energy dissipation is now the main issue for most computation. The way forward is not higher radices, but reversible low-dissipation computing, which for reasons I don't understand, no one is taking seriously.
Maybe because we're still many orders of magnitude away from the kT log(2) limit. Reversible computation is only useful for breaking that limit. Also, it generally involves computing very slowly. By the time we've approached that limit, it might be practical to do most of our computation far from the sun, where T is much lower. (Running a computer in an air-conditioned room doesn't really gain you anything.)

Dyson claims that we, or rather our cybernetic descendants or uploads, ought to be able to live forever in an expanding universe by running slower and slower, colder and colder, doing infinite computation in infinite time using finite total energy. Critics claim that that won't be possible since clocks can't be made arbitrarily slow due to quantum limits, or because the background temperature of space isn't asymptotic to absolute zero due to Unruh radiation or other effects.
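The kT log(2) limit mentioned above is easy to put numbers on. A minimal sketch (the CODATA value of Boltzmann's constant is exact in the 2019 SI; 2.7 K is roughly the cosmic microwave background temperature):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(T):
    """Minimum energy in joules to erase one bit at temperature T (kelvin)."""
    return k_B * T * math.log(2)

print(landauer_limit(300))   # ~2.9e-21 J at room temperature
print(landauer_limit(2.7))   # ~2.6e-23 J in deep space: ~100x cheaper
```

This is why computing far from the sun helps: the bound scales linearly with T, so cooling from 300 K to 2.7 K buys about two orders of magnitude per bit erased.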