Some thoughts from the physics side of the house.

1) If the multiple levels are represented by physical quantities like charge, voltage, etc., the energy tends to go as the square, so stacking bits increases energy faster than it increases information. (The way to beat this is to use multiple independent degrees of freedom, but if you thought ternary was complicated, try building a computer with multiple independent degrees of freedom.)

2) Low temperature sounds great, but energy tends to go down with temperature much more slowly than heat transfer does. Low-temperature computers are really hard to cool.

3) Practical implementation of adiabatic computation is limited by the fact that there are no perfect switches at finite temperature. To work, adiabatic schemes have to slosh energy from one reservoir to another through some kind of switch. At finite temperature, these switches have "soft thresholds" that depend on T -- think of a diode where I = Io*exp(qV/kT). The result is that the switches have an effective finite voltage drop when they are supposed to be in the "on" state. Practically, this is 0.3 to 0.6 V at room temperature -- tough to win big when CMOS can operate at voltages just above this with far less complexity.

--R

On Sun, Jan 6, 2019 at 6:56 PM Tom Knight <tk@mit.edu> wrote:
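[A quick numerical check of point 3 above, using the ideal-diode law I = Io*exp(qV/kT). The load current and saturation current below are made-up illustrative values, not measurements of any real device:]

```python
import math

def diode_drop(current, i_sat, temp_k):
    """Voltage across an ideal diode carrying `current`, from inverting
    I = Io * exp(qV/kT) (valid for I >> Io): V = (kT/q) * ln(I/Io)."""
    k_over_q = 8.617e-5  # Boltzmann constant over electron charge, V/K
    return k_over_q * temp_k * math.log(current / i_sat)

# Hypothetical switch: 1 mA through a junction with 1 fA saturation current.
v300 = diode_drop(1e-3, 1e-15, 300.0)  # room temperature
v77 = diode_drop(1e-3, 1e-15, 77.0)    # liquid-nitrogen temperature
print(f"drop at 300 K: {v300:.2f} V")  # roughly 0.7 V
print(f"drop at 77 K:  {v77:.2f} V")
```

Note the drop only falls linearly in T while shrinking it logarithmically in current, which is why the "soft threshold" is so stubborn at room temperature.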
Dan Asimov <dasimov@earthlink.net> wrote: Tom Knight <tk@mit.edu> wrote:
Binary is optimal for information storage and transmission. This is because for a fixed amount of noise (from either the environment or fabrication issues) binary requires only a single distance between states, whereas radix-n storage requires n-1 of them.
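[The trade-off is easy to see with a little arithmetic. A minimal sketch, assuming all n levels share one fixed signal swing:]

```python
import math

def noise_margin(levels, swing=1.0):
    """Separation between adjacent levels when `levels` states share a
    fixed signal swing: binary gets the whole swing, radix-n gets
    swing/(n-1)."""
    return swing / (levels - 1)

for n in (2, 3, 4, 8):
    bits = math.log2(n)      # information per symbol grows like log n
    margin = noise_margin(n)  # but the margin shrinks like 1/(n-1)
    print(f"radix {n}: {bits:.2f} bits/symbol, margin {margin:.3f} of swing")
```

Information grows logarithmically while the margin against a fixed noise floor shrinks almost linearly, which is the asymmetry being argued here.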
That doesn't sound right to me. Only the most primitive modems used on-off keying or two-tone frequency shift keying. Information theory says that the most efficient way to use bandwidth involves having a vast number of subtly different states. Of course each state can encode multiple bits, which can be unpacked at the receiving end.
There is a huge difference between signaling over a telephone line with a modem and signaling between two computer components. The phone line is a high signal-to-noise environment with very limited bandwidth. The computer chip is a low signal-to-noise environment with very high bandwidth. You could employ modem-like techniques, but the cost in latency and complexity would be very high, and the benefit quite low. It’s far easier to improve the channel than to optimize coded bit translation across it. And with those constraints, binary is still best.
Energy dissipation is now the main issue for most computation. The way forward is not higher radices, but reversible low-dissipation computing, which for reasons I don’t understand, no one is taking seriously.
Maybe because we're still many orders of magnitude away from the kT log(2) limit. Reversible computation is only useful for breaking that limit. Also it generally involves computing very slowly.
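[For concreteness, the kT log(2) limit is easy to evaluate. A sketch; the CMOS switching-energy figure in the comment is a rough order-of-magnitude assumption, not a measurement:]

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_k):
    """Minimum energy to erase one bit: kT * ln 2."""
    return K_B * temp_k * math.log(2)

e_room = landauer_limit(300.0)  # about 2.9e-21 J
e_space = landauer_limit(3.0)   # cold deep space: 100x lower, linearly in T
# A very rough figure for a switching event in a contemporary CMOS gate is
# on the order of 1e-16 to 1e-15 J (an assumption for scale, not a measured
# value), putting practice several orders of magnitude above the limit.
print(f"kT ln2 at 300 K: {e_room:.2e} J")
print(f"kT ln2 at 3 K:   {e_space:.2e} J")
```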
This has little to do with the kT log(2) limit. Energy can be recovered from macroscopic signaling with inductive power recovery. Watts of it. And the speed limits apply only to purely capacitive circuits, not to inductive/capacitive ones.
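[Setting inductive recovery aside, the simplest textbook illustration of why slow, ramped charging dissipates less than abrupt switching is the first-order adiabatic-charging estimate. A sketch with made-up component values:]

```python
def conventional_dissipation(c, v):
    """Energy burned charging a capacitance c to voltage v abruptly through
    a switch: C*V^2/2, independent of the switch resistance."""
    return 0.5 * c * v * v

def adiabatic_dissipation(c, v, r, ramp_time):
    """Standard first-order estimate for charging the same node with a slow
    linear ramp of duration ramp_time: (RC/T) * C*V^2, which falls as the
    ramp is stretched out."""
    return (r * c / ramp_time) * c * v * v

# Hypothetical node: 1 pF charged to 1 V through a 1 kOhm switch.
c, v, r = 1e-12, 1.0, 1e3
print(conventional_dissipation(c, v))        # 5e-13 J, no matter how fast
print(adiabatic_dissipation(c, v, r, 1e-6))  # 1 ns RC stretched to a 1 us ramp
```

The estimate also shows the catch Knight concedes: the saving comes from making the ramp slow relative to RC, which is where the speed objection bites unless the circuit is inductive/capacitive.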
By the time we've approached that limit, it might be practical to do most of our computation far from the sun, where T is much lower. (Running a computer in an air-conditioned room doesn't really gain you anything.) Dyson claims that we, or rather our cybernetic descendants or uploads, ought to be able to live forever in an expanding universe by running slower and slower, colder and colder, doing infinite computation in infinite time using finite total energy. Critics claim that that won't be possible since clocks can't be made arbitrarily slow due to quantum limits, or because the background temperature of space isn't asymptotic to absolute zero due to Unruh radiation or other effects.
I’ll let others worry about arbitrarily slow clocks. I like ones running at multiple GHz.
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun