If you've ever looked at the VAX instruction set, you may well be surprised to find an essentially-Greenspun microcoded CPU (there are definitely list-walking instructions in there, and all sorts of BIZARRE things).
One of the few CPUs I know of where "optimize for lowest instruction count" tends towards generating slower and larger code than either "optimize for smallest code size" or "optimize for fastest execution" (also the only CPU I know of to have had an "optimize for instruction count" compiler, but that may be by-the-by).
If you've ever looked at the VAX instruction set,...
...and lived to tell the tale....
Yeah. The polynomial evaluation instruction on early Vaxen was slower and larger than DIY evaluation, as I recall. (This may have been microcode version and/or VAX model dependent). It made me laugh many a time.
(But then, I actually liked the nasty complexity of early DCL quoting, so...)
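For anyone who hasn't run into it, the "DIY evaluation" in question is just Horner's rule -- one multiply-add per coefficient -- which is roughly the loop POLY packed into a single microcoded instruction. A rough sketch (in Haskell, purely for illustration; no claims about the real instruction's operand order):

-- Horner's rule: p(x) = c0 + x*(c1 + x*(c2 + ...)),
-- with the coefficient list given from the constant term up.
horner :: Num a => a -> [a] -> a
horner x = foldr (\c acc -> c + x * acc) 0

-- e.g. horner 2 [1, 0, 3] == 1 + 0*2 + 3*2^2 == 13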
For a long time I thought the early RISC proponents just had a sour grapes attitude because they weren't smart enough to write a compiler that could make use of the VAX instruction set. :)
"optimize for instruction count" makes sense in a universe where a megabyte is a huge (and hugely expensive) amount of RAM. Especially on a machine with virtual memory* and poor page swapping algorithms. A program that exceeded it's available RAM by even a few bytes could take a horrible hit in performance due to disk swapping. A less than speed optimal program may well outperform a speed optimized program if stays resident in RAM.
*) Virtual memory in it's original sense, not today's definition which is mostly just memory management.
"optimize for smallest code size" makes more sense in that context, and the comment you're replying to asserts that "optimize for instruction count" tended to produce bigger code than "optimize for smallest code size", which is unsurprising. (The surprising part is that "optimize for smallest code size" existed.)
It would've made sense if it weren't for the fact that the really powerful instructions (as in "did lots of stuff") usually ended up being MASSIVELY huge, memory-wise. We're talking machine instructions in the multiple tens of bytes here.
Greenspun's tenth rule is a trap: to admit you follow it, you must evince dangerous lispish tendencies. But to deny you follow it, you must point to all your code and say, "Inquisitors! I defy you to find the tiniest spot of lisp in my unblemished codebase!"
Really, though, "Lisp" in this context means "Lisp at its best." Take a few months off your job, stay away from romance and fun, and just read the Common Lisp manuals and The Art of the Metaobject Protocol, and you too will understand Lisp At Its Best. And you will realize why List At Its Best probably can't be yours.
And if you start hassling other languages for not being Lisp At Its Best... why, is that really Lisp's fault? really?
Really, though, "Lisp" in this context means "Lisp at its best."
I'm not sure it does, so much as an ideal Lisp which Common Lisp and historical Lisp practice only imperfectly embody. For example, I think Clojure only recently discovered how Lisp macros are really supposed to be done -- I suspect the use of namespaces for hygiene represents almost as much of a watershed moment as the switch to predominantly lexical scoping. Now that we've got about a century of art in terms of the design of Turing-complete languages (counting the programs written for the Analytical Engine), it's become fairly evident that ideal-Lisp is a very significant attractor in language design. Probably not the only one -- ML/Prolog hint at another attractor out there. But in any case we can get closer to it than "Lisp at its best".
I was going to say no, but I'm not entirely sure. The two are weak attractors individually themselves. However, type specifications in ML dialects like Haskell are getting rather Prolog-ish. There seems to be a certain synergy there. That might be my Haskell bias showing, though.
Well, what they have in common is that you can write a function in terms of pattern-matching on cases, as long as you sacrifice data representation abstraction; and they share an allergy to side effects. But Prolog is dynamically-typed and constraint-oriented/bidirectional, while ML (and Haskell) are statically-typed and functional. I think they're different enough that they belong to different families.
Think about the type domain in ML dialects. Hindley-Milner type inference -- particularly with the introduction of type class constraints as in Haskell -- starts to look a LOT like value-domain Prolog.
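To make that concrete, the classic toy example (a sketch in Haskell, using the usual handful of GHC extensions) is type-level Peano addition: the two Add instances read almost word-for-word like Prolog's two clauses for add/3, and instance resolution plays the part of Prolog's clause search.

{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
             FlexibleInstances, UndecidableInstances #-}

data Z   = Z           -- zero
data S n = S n         -- successor

-- Prolog:  add(z, N, N).
--          add(s(M), N, s(R)) :- add(M, N, R).
class Add a b c | a b -> c where
  add :: a -> b -> c

instance Add Z n n where
  add Z n = n

instance Add m n r => Add (S m) n (S r) where
  add (S m) n = S (add m n)

-- The inferred type of  add (S (S Z)) (S (S Z))  is  S (S (S (S Z))):
-- the type checker effectively "ran" the Prolog program.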
Say what? You can do a lot of interesting things with decidable languages. Sure, you can't do everything, but there's a lot you can express with linear-bounded automata.
When I posted that comment I thought someone might respond this way.
The extent to which HM type inference falls short of Turing-completeness is a bigger difference between it and Prolog than almost any other conceivable language feature. I'm pretty sure you can't emulate an LBA with HM type inference. You can't emulate a PDA or even a DPDA, as far as I know. You don't even have definite iteration (for i = 1 to 10).
I'd be on pretty shaky ground saying you can't emulate an FSM with HM type inference, but I also don't see how to do it.
This is the part where my knowledge of type systems falls down, and where I'd go dig out my copy of Pierce's Types and Programming Languages if I knew which box it was in, because I know Pierce gets into the decidability of the simply typed lambda calculus and what you can do with it, but I don't remember much of that chapter. It's one of those books I need to read five or six times before I really understand it.
Or rather, the two are near weak attractors individually. I don't think SML or the various flavors of Prolog approach ideal-ML or ideal-Prolog much more closely than Common Lisp does ideal-Lisp.
Lua very intentionally omits 95% of the feature set of Common Lisp, so that you can use it in environments where you only have room for the half of Common Lisp you're using. There are off-the-shelf libraries for a lot of those features, though, so you don't have to use an informally-specified, bug-ridden version.
I grok that, though it's mind-boggling to me that a language with first-class functions wouldn't have either list slicing a la Python or cons/car/cdr. Presumably this is because the built-in listlike object is the associative array (which gives the whole thing an uncomfortably PHP-like feel), but it still leaves me wondering where the tools that fit in my hands so well have gone.
And the whole business of indexing from 1 is just ridiculous. This is computer science, not biology.
Strangely, I never learned Pascal. I would have, if I'd taken the Computer Math II class that my high school offered (Computer Math I, aka "BASIC programming", was a required course for honors students), but the year I planned to take it (junior year), only four students signed up -- and one, my then-boyfriend, moved to Utah just before the school year started, so there weren't enough students to justify paying a teacher for it. I could have tried again my senior year, but I wanted to take both second-year chemistry and second-year physics, so there was no room in my schedule.
Computer Math I was also the course where I kept getting in trouble for finishing the assignments within the first few minutes of class and spending the rest of the hour either playing Wing Commander, exploring the network, or writing BASIC programs that annoyed other people. Halfway through the semester I got pulled aside and asked "okay, since you obviously already know everything on the syllabus, what do you really want to be doing in here?" I said, "I hear there's this language called C..." and they sat me down in another room with an ancient copy of the Borland compiler and a softcover book that someone had probably gotten off the shelf at Barnes and Noble (it was not K&R). My high school wasn't much to write home about when it came to CS; it was really only by an improbable string of coincidences that I ended up in software engineering at all.
it was really only by an improbable string of coincidences that I ended up in software engineering at all.
So natural talent had nothing to do with it? Are you going to tell us that the reason you finished each day's assignment in a few minutes was that you were raised in a monastery of kernel hackers who taught you BASIC before you could ride a bicycle?
Point. Though it's really only been recently that this has become evident, since the people I was hanging out with in high school had not only done the same thing with respect to BASIC, but also understood modem init strings and spoke enough assembler to be dangerous and had assorted other skills that had me thinking I was just average. After high school, I didn't touch an interpreter again until grad school, a good ten years later.
Pascal as a professional programming tool was pretty much dead by the mid-80s, although Wirth's book on algorithms and data structures was probably still in use and had a Pascal-style syntax for its examples. Anyone with a reasonable knowledge of C could pick up Pascal in a couple of days.
What I was alluding to is that in Pascal, one defines the size of an array by the valid indices for that array. For example,
someString: array [0..99] of char;
defines a C-style array of 100 bytes, indexed from 0 to 99.
But to get a bit weird...
fooArray: array [-100..100] of foo;
defines an array of type foo containing 201 elements that can be indexed from -100 to 100. And yes, any arbitrary values such as 17..77 can be used.
This may be useful for some programming problems, but I've never needed anything like it.
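One place those arbitrary index ranges do earn their keep is tallying over values that are naturally negative -- say, a histogram of temperature readings. (Sketch in Haskell rather than Pascal, since Data.Array keeps the same Pascal-style explicit bounds; the names here are made up for illustration.)

import Data.Array

-- One bucket per whole degree from -100 to 100; the bounds say
-- exactly which indices are legal, as Pascal's [-100..100] does.
histogram :: [Int] -> Array Int Int
histogram temps = accumArray (+) 0 (-100, 100)
                             [ (t, 1) | t <- temps, t >= -100, t <= 100 ]

-- histogram [-3, -3, 20] ! (-3)  ==  2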
"optimize for instruction count" makes sense in a universe where a megabyte is a huge (and hugely expensive) amount of RAM. Especially on a machine with virtual memory* and poor page swapping algorithms. A program that exceeded it's available RAM by even a few bytes could take a horrible hit in performance due to disk swapping. A less than speed optimal program may well outperform a speed optimized program if stays resident in RAM.
*) Virtual memory in it's original sense, not today's definition which is mostly just memory management.
no subject
no subject
Indeed. I have disabled the cache on PCs just to see how much the performance drops. Tip: don't do this with an OS that boots into a GUI.
I've also done a fair amount of embedded kernel development, so I've benchmarked the differences between cache on and cache off at boot time.
...the comment you're replying to asserts that "optimize for instruction count" tended to produce bigger code than "optimize for smallest code size"...
You're right, I mis-read the comment. My comment above should have been in regards to "optimize for smallest code size".
And the whole business of indexing from 1 is just ridiculous....
There's always Pascal, where you can index an array from wherever the heck you want. :)
...raised in a monastery of kernel hackers who taught you BASIC before you could ride a bicycle?
"What's that, Daddy?" I asked.
"It's a computer," he said.
"How does it work?" I asked.
He handed me the manual and said, "Find out!"
I learned how to ride a bike when I was nine.
...any number of other eight-year-olds, though, had that same opportunity and didn't make it through enough of the manual to do anything interesting.