When reading Knuth a long time ago, I never cared much for his MIX code; I just skipped over it. Why should I waste my time trying to understand some assembly code for a non-existent computer?
--I was kind of annoyed by MIX too. One of the reasons the C language was originally successful was that it was a high-level language which also enjoyed the benefits of, and unashamedly tried to connect to, assembler (as opposed to pretending assembler did not exist). So Knuth would have been better off using C than MIX in many ways. However, C fell short of its design goals, and now with C++ it has become a horrible mess. But anyway, with MMIX, Knuth apparently is trying to be a bit of a proselytizer for what instruction sets ought to be, rather than just trying to approximate what they are. A bit of a bizarre move for a textbook, but I hope it'll influence people a bit to overcome the energy barrier.
I used to like elegant instruction sets: the PDP-10 and the Motorola 68000. National Semiconductor came out with an even better instruction set in their 32000 microprocessor. We seriously thought about using the 32000 in our medical lasers at Coherent, but when we talked to the National Semiconductor rep, it turned out they had totally inferior in-circuit emulators and other development hardware. We went with the 68000.
As hardware technology advanced, we had prefetching, simultaneous instruction execution, pipelining, and so on. The nail in the coffin for me (or should it be the stake through the heart?) was when I was told by someone I trusted that modern compilers can produce faster-executing code than all but the most expert hand-coding of assembly. At that point, I changed over completely, and now I just couldn't care less about instruction sets.
--Umm. Well, it depends: do you want performance? If you do, then you should care about instruction sets and architecture. But here is the thing: 90% of computer customers have never heard of the POPCOUNT instruction, much less know that they want to use it. So as a business matter, Intel does not care about making good instruction sets. What a heck of a lot of computer buyers DO care about is: can I run some piece of software for shooting monsters with great graphics, or watching porno movies, that was originally written in assembler for performance? And I want that software right now; I do not want to wait a few years for somebody to re-code it. So the fact that better instruction sets and architectures would give us better-performing software in 5 years is not a matter of business importance. As a total guess, I'd think we could get a 2X speedup and/or electrical power reduction with better architectures, but businesswise nobody cares, because that speedup is for FUTURE software, not today's. Almost nobody buys a computer because of hoped-for future software.
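For anyone who hasn't run into it: POPCOUNT counts the 1-bits in a word, and it's a classic example of an instruction most buyers never ask for. Here is a little illustrative C sketch of my own (not from anything above), assuming GCC or Clang: the __builtin_popcountll builtin compiles down to a single POPCNT instruction when the target CPU has it (e.g. with -mpopcnt), versus the loop it otherwise replaces:

    #include <stdint.h>
    #include <stdio.h>

    /* Portable fallback: clear the lowest set bit until none remain. */
    static int popcount_loop(uint64_t x) {
        int n = 0;
        while (x) {
            x &= x - 1;   /* clears the lowest set bit */
            n++;
        }
        return n;
    }

    int main(void) {
        uint64_t x = 0x0123456789ABCDEFULL;
        /* With GCC/Clang and -mpopcnt (or -march=native on a CPU that
           has it), this builtin becomes one POPCNT instruction. */
        printf("loop: %d  builtin: %d\n",
               popcount_loop(x), __builtin_popcountll(x));
        return 0;
    }

The loop burns an iteration per set bit; the instruction does the whole word at once. That is exactly the kind of win that matters for performance and that almost no customer ever shops for.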
Warren, what is this better architecture that Intel ran up a flagpole? What should we search for to look it up?
--Itanium. And by the way, looking into this, it seems ARM and Itanium have made some business progress, BUT the way they did so was by going into entirely new markets, like cell phones and servers. In those markets the software was not already there, so the buyers really *DID* care about "future software" as opposed to "old software." So they really did care about performance, and that allowed these new, better architectures to gain some business success.
Why did Apple switch from the 68000 to the PowerPC, and later to the x86? Steve Jobs may have been an asshole, but he was a smart one.
Yes, backward compatibility is very important. Nobody wants to be endlessly rewriting their code; we want to move on to newer and better things, not rehashing the old.
What makes a computer language "better"? That seems to be a matter of personal opinion. I did my 68000 coding at Coherent Medical in C, and I liked the language. Later, Coherent switched over to Windows-based controllers and C++, but by then I had moved from software engineering to optics and laser engineering. I never learned C++, but I can appreciate its usefulness. I'm still fond of Fortran. I tend to detest new languages, and the only one I've embraced is Python. Once at a computer show, I asked a Forth guy what's so good about Forth. He told me astronomers like it, but otherwise to go read about it. When I asked a real astronomer, he said "Nah, we don't use Forth." Then there was Smalltalk; it was supposed to solve all the problems of the universe. I said that when it takes off, there will be plenty of books to read about it; I'm still waiting.
--Quite. But if you look at some of those languages, they have nice ideas. And I have ideas of my own, too, for what should be in languages. But all of that usually goes nowhere, due to the same energy barrier as Esperanto, only with some new bad incentives added on top (like the fear that the language will go away and you'll be left twisting in the wind).
So, why are we not speaking Esperanto? Maybe because its only use is to converse with fellow fanatics.
--It is because, even though it is a better language, and humanity would be better off if we all spoke it -- we'd outperform old-style humanity by a lot -- the average Joe does not find it to his individual benefit to learn Esperanto right now. It only works if nearly everybody does it; if one person does it, it's the opposite of useful. It's a barrier to human progress caused by the difference between individual and society-wide incentives. And similar remarks could be made about, e.g., switching to better energy sources, or switching to better voting systems. It is a fundamental, common problem-structure that comes up in many guises.
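To make that incentive gap concrete, here is a toy sketch of my own (purely illustrative numbers, nothing more): treat learning the language as a coordination game, where an individual's payoff is proportional to the fraction of people who already speak it, minus a fixed learning cost.

    #include <stdio.h>

    /* Toy coordination-game sketch (illustrative numbers, not data):
       an individual's payoff from learning the language is
       benefit * f - cost, where f is the fraction of others who
       already speak it. */
    int main(void) {
        const double benefit = 10.0;  /* value if everyone speaks it */
        const double cost    = 4.0;   /* one-time cost of learning   */
        for (int pct = 0; pct <= 100; pct += 10) {
            double f = pct / 100.0;
            printf("adoption %3d%% -> individual payoff %+.1f\n",
                   pct, benefit * f - cost);
        }
        /* Payoff stays negative until adoption passes cost/benefit
           (40% here), so no individual wants to move first, and
           society stays stuck at 0% even though 100% beats it. */
        return 0;
    }

Below the cost/benefit threshold, every individual is better off not learning, so the society-wide optimum at 100% adoption is never reached from 0%. That is the same energy barrier as with instruction sets and languages above.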