
[personal profile] vatine 2009-12-20 04:49 pm (UTC)(link)
Lua is (almost) Greenspun by design. Although Greenspunning C takes a bit of effort.

[identity profile] maradydd.livejournal.com 2009-12-20 05:13 pm (UTC)(link)
Postgres actually did a halfway decent job of it, though some of their list-walking functions work backward.

[personal profile] vatine 2009-12-20 05:38 pm (UTC)(link)
If you've ever looked at the VAX instruction set, you'll possibly be surprised to see an essentially-Greenspun microcoded CPU (there are definitely list-walking instructions in there, and all sorts of BIZARRE things).

One of the few CPUs I know of where "optimize for lowest instruction count" tends towards generating slower and larger code than either "optimize for smallest code size" or "optimize for fastest execution" (also the only CPU I know of to have had an "optimize for instruction count" compiler, but that may be by-the-by).

[identity profile] joedecker.livejournal.com 2009-12-20 09:02 pm (UTC)(link)
If you've ever looked at the VAX instruction set,...

...and lived to tell the tale....

Yeah. The polynomial evaluation instruction on early Vaxen was slower and larger than DIY evaluation, as I recall. (This may have been microcode version and/or VAX model dependent). It made me laugh many a time.

( But then, I actually liked the nasty complexity of early DCL quoting, so,... )

[identity profile] lightning-rose.livejournal.com 2009-12-20 09:46 pm (UTC)(link)

For a long time I thought the early RISC proponents just had a sour grapes attitude because they weren't smart enough to write a compiler that could make use of the VAX instruction set. :)

[identity profile] lightning-rose.livejournal.com 2009-12-20 09:59 pm (UTC)(link)

"optimize for instruction count" makes sense in a universe where a megabyte is a huge (and hugely expensive) amount of RAM. Especially on a machine with virtual memory* and poor page swapping algorithms. A program that exceeded it's available RAM by even a few bytes could take a horrible hit in performance due to disk swapping. A less than speed optimal program may well outperform a speed optimized program if stays resident in RAM.


*) Virtual memory in its original sense, not today's definition, which is mostly just memory management.

[identity profile] mentalguy.livejournal.com 2009-12-22 01:08 am (UTC)(link)
These days, replace "RAM" with "instruction cache". Cache misses really hurt.

[identity profile] lightning-rose.livejournal.com 2009-12-27 05:30 pm (UTC)(link)

Indeed. I have disabled cache on PCs just to see how much the performance drops. Tip: don't do this with an OS that boots into a GUI.

I've also done a fair amount of embedded kernel development so I've benchmarked the differences between cache on and cache off at boot time.

[identity profile] kragen.livejournal.com 2009-12-22 07:54 pm (UTC)(link)
"optimize for smallest code size" makes more sense in that context, and the comment you're replying to asserts that "optimize for instruction count" tended to produce bigger code than "optimize for smallest code size", which is unsurprising. (The surprising part is that "optimize for smallest code size" existed.)

[identity profile] lightning-rose.livejournal.com 2009-12-27 05:39 pm (UTC)(link)

You're right, I misread the comment. My comment above should have been in regard to "optimize for smallest code size".


[personal profile] vatine 2009-12-29 11:30 am (UTC)(link)
It would've made sense, if it wasn't for the fact that the really powerful instructions (as in "did lots of stuff") usually ended up being MASSIVELY huge, memory-wise. We're talking machine instructions in the multiple tens of bytes, here.

[identity profile] allonymist.livejournal.com 2009-12-21 10:29 am (UTC)(link)
Greenspun's tenth rule is a trap: to admit you follow it, you must evince dangerous lispish tendencies. But to deny you follow it, you must point to all your code and say, "Inquisitors! I defy you to find the tiniest spot of lisp in my unblemished codebase!"


Really, though, "Lisp" in this context means "Lisp at its best." Take a few months off your job, stay away from romance and fun, and just read the Common Lisp manuals and The Art of the Metaobject Protocol, and you too will understand Lisp At Its Best. And you will realize why List At Its Best probably can't be yours.

And if you start hassling other languages for not being Lisp At Its Best... why, is that really Lisp's fault? really?

[identity profile] maradydd.livejournal.com 2009-12-21 02:19 pm (UTC)(link)
What I did to Lua is decidedly not Lisp at its best. But it got the job done.

[identity profile] mentalguy.livejournal.com 2009-12-22 01:06 am (UTC)(link)

Really, though, "Lisp" in this context means "Lisp at its best."

I'm not sure it does, so much as an ideal Lisp which Common Lisp and historical Lisp practice only imperfectly embody. For example, I think Clojure only recently discovered how Lisp macros are really supposed to be done -- I suspect the use of namespaces for hygiene represents almost as much of a watershed moment as the switch to predominantly lexical scoping. Now that we've got about a century of art in terms of the design of Turing-complete languages (counting the programs written for the Analytical Engine), it's become fairly evident that ideal-Lisp is a very significant attractor in language design. Probably not the only one -- ML/Prolog hint at another attractor out there. But in any case we can get closer to it than "Lisp at its best".

[identity profile] kragen.livejournal.com 2009-12-22 07:55 pm (UTC)(link)
Do you think ML and Prolog are the same attractor?

[identity profile] mentalguy.livejournal.com 2009-12-22 11:19 pm (UTC)(link)
I was going to say no, but I'm not entirely sure. The two are weak attractors individually. However, type specifications in ML dialects like Haskell are getting rather Prolog-ish. There seems to be a certain synergy there. That might be my Haskell bias showing, though.

[identity profile] kragen.livejournal.com 2009-12-22 11:23 pm (UTC)(link)
Well, what they have in common is that you can write a function in terms of pattern-matching on cases, as long as you sacrifice data representation abstraction; and they share an allergy to side effects. But Prolog is dynamically-typed and constraint-oriented/bidirectional, while ML (and Haskell) are statically-typed and functional. I think they're different enough that they belong to different families.
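
A tiny illustration of the shared style (made-up names; the Prolog version is only sketched in comments):

-- Haskell: length by case analysis on the list constructors.
len :: [a] -> Int
len []       = 0
len (_ : xs) = 1 + len xs

-- The Prolog counterpart reads almost clause-for-clause, though being a
-- relation it can in principle be run in other modes as well:
--   len([], 0).
--   len([_|Xs], N) :- len(Xs, M), N is M + 1.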

[identity profile] mentalguy.livejournal.com 2009-12-22 11:33 pm (UTC)(link)
Think about the type domain in ML dialects. Hindley-Milner type inference -- particularly with the introduction of type class constraints as in Haskell -- starts to look a LOT like value-domain Prolog.
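
A minimal sketch of what I mean (made-up names, standard GHC extensions): instance declarations play the role of Prolog clauses, and instance resolution does the backward chaining, just over types instead of terms.

{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
             FlexibleInstances, UndecidableInstances #-}

-- Type-level Peano naturals.
data Z = Z
data S n = S n

-- Read "Add x y z" as the Prolog predicate add(X, Y, Z):
--   add(z, Y, Y).
--   add(s(X), Y, s(Z)) :- add(X, Y, Z).
class Add x y z | x y -> z where
  add :: x -> y -> z

instance Add Z y y where
  add Z y = y

instance Add x y z => Add (S x) y (S z) where
  add (S x) y = S (add x y)

-- The compiler "proves" Add (S Z) (S (S Z)) z and infers z = S (S (S Z)),
-- i.e. 1 + 2 = 3, entirely during type checking.
three :: S (S (S Z))
three = add (S Z) (S (S Z))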

[identity profile] kragen.livejournal.com 2009-12-23 06:11 am (UTC)(link)
Hindley-Milner type inference is decidable and therefore can't do anything interesting.

[identity profile] maradydd.livejournal.com 2009-12-23 01:52 pm (UTC)(link)
Say what? You can do a lot of interesting things with decidable languages. Sure, you can't do everything, but there's a lot you can express with linear-bounded automata.

[identity profile] mentalguy.livejournal.com 2009-12-27 06:13 am (UTC)(link)
For that matter, with common Haskell extensions, type inference isn't strictly decidable either.
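
For instance (hypothetical class, but this is the classic shape): with UndecidableInstances the instance context can grow at every step, so context reduction never bottoms out.

{-# LANGUAGE FlexibleContexts, UndecidableInstances #-}

class C a where
  c :: a -> ()

-- Resolving C [a] demands C [[a]], which demands C [[[a]]], and so on.
instance C [[a]] => C [a] where
  c _ = ()

oops :: ()
oops = c "hello"  -- needs C [Char], then C [[Char]], then C [[[Char]]], ...

(In practice GHC gives up at a fixed reduction depth instead of looping forever, but the point stands: with the common extensions, instance resolution is no longer guaranteed to terminate.)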

[identity profile] kragen.livejournal.com 2009-12-27 06:07 pm (UTC)(link)
When I posted that comment I thought someone might respond this way.

The extent to which HM type inference falls short of Turing-completeness is a bigger difference between it and Prolog than almost any other conceivable language feature. I'm pretty sure you can't emulate an LBA with HM type inference. You can't emulate a PDA or even a DPDA, as far as I know. You don't even have definite iteration (for i = 1 to 10).

I'd be on pretty shaky ground saying you can't emulate a FSM with HM type inference, but I also don't see how to do it.

[identity profile] maradydd.livejournal.com 2009-12-27 06:15 pm (UTC)(link)
This is the part where my knowledge of type systems falls down, and where I'd go dig out my copy of Pierce's Types and Programming Languages if I knew which box it was in, because I know Pierce gets into the decidability of the simply typed lambda calculus and what you can do with it, but I don't remember much of that chapter. It's one of those books I need to read five or six times before I really understand it.

[identity profile] mentalguy.livejournal.com 2009-12-22 11:23 pm (UTC)(link)
Or rather, the two are near weak attractors individually. I don't think SML or the various flavors of Prolog approach ideal-ML or ideal-Prolog much more closely than Common Lisp does ideal-Lisp.

[identity profile] kragen.livejournal.com 2009-12-22 07:59 pm (UTC)(link)
Lua very intentionally omits 95% of the feature set of Common Lisp, so that you can use it in environments where you only have room for the half of Common Lisp you're using. There are off-the-shelf libraries for a lot of those features, though, so you don't have to use an informally-specified, bug-ridden version.

[identity profile] maradydd.livejournal.com 2009-12-22 08:17 pm (UTC)(link)
I grok that, though it's mind-boggling to me that a language with first-class functions wouldn't have either list slicing a la Python or cons/car/cdr. Presumably this is because the built-in listlike object is the associative array (which gives the whole thing an uncomfortably PHP-like feel), but it still leaves me wondering where the tools that fit in my hands so well have gone.

And the whole business of indexing from 1 is just ridiculous. This is computer science, not biology.

[identity profile] lightning-rose.livejournal.com 2009-12-27 05:45 pm (UTC)(link)

There's always Pascal, where you can index an array from wherever the heck you want. :)

[identity profile] maradydd.livejournal.com 2009-12-27 05:59 pm (UTC)(link)
Strangely, I never learned Pascal. I would have, if I'd taken the Computer Math II class that my high school offered (Computer Math I, aka "BASIC programming", was a required course for honors students), but the year I planned to take it (junior year), only four students signed up -- and one, my then-boyfriend, moved to Utah just before the school year started, so there weren't enough students to justify paying a teacher for it. I could have tried again my senior year, but I wanted to take both second-year chemistry and second-year physics, so there was no room in my schedule.

Computer Math I was also the course where I kept getting in trouble for finishing the assignments within the first few minutes of class and spending the rest of the hour either playing Wing Commander, exploring the network, or writing BASIC programs that annoyed other people. Halfway through the semester I got pulled aside and asked "okay, since you obviously already know everything on the syllabus, what do you really want to be doing in here?" I said, "I hear there's this language called C..." and they sat me down in another room with an ancient copy of the Borland compiler and a softcover book that someone had probably gotten off the shelf at Barnes and Noble (it was not K&R). My high school wasn't much to write home about when it came to CS; it was really only by an improbable string of coincidences that I ended up in software engineering at all.

[identity profile] kragen.livejournal.com 2009-12-27 06:11 pm (UTC)(link)
it was really only by an improbable string of coincidences that I ended up in software engineering at all.


So natural talent had nothing to do with it? Are you going to tell us that the reason you finished each day's assignment in a few minutes was that you were raised in a monastery of kernel hackers who taught you BASIC before you could ride a bicycle?

[identity profile] maradydd.livejournal.com 2009-12-27 06:16 pm (UTC)(link)
No, it was because when I was eight my dad brought home a PCjr in a box.

"What's that, Daddy?" I asked.

"It's a computer," he said.

"How does it work?" I asked.

He handed me the manual and said, "Find out!"

I learned how to ride a bike when I was nine.
Edited 2009-12-27 18:17 (UTC)

[identity profile] kragen.livejournal.com 2009-12-28 07:53 am (UTC)(link)
Haha! Very similar to my own story.

...any number of other eight-year-olds, though, had that same opportunity and didn't make it through enough of the manual to do anything interesting.

[identity profile] maradydd.livejournal.com 2009-12-28 11:01 am (UTC)(link)
Point. Though it's really only been recently that this has become evident, since the people I was hanging out with in high school had not only done the same thing with respect to BASIC, but also understood modem init strings and spoke enough assembler to be dangerous and had assorted other skills that had me thinking I was just average. After high school, I didn't touch an interpreter again until grad school, a good ten years later.

[identity profile] lightning-rose.livejournal.com 2009-12-28 04:43 pm (UTC)(link)

Pascal as a professional programming tool was pretty much dead by the mid-'80s, although Wirth's book on algorithms and data structures was probably still in use and used Pascal-style syntax for its examples. Anyone with a reasonable knowledge of C could pick up Pascal in a couple of days.

What I was alluding to is that in Pascal, one defines the size of an array by the valid indices for that array. For example,

someString: array [0..99] of char;

defines a C-style array of 100 bytes, indexed from 0 to 99.

But to get a bit weird...

fooArray: array [-100..100] of foo;

defines an array of type foo containing 201 elements that can be indexed from -100 to 100. And yes, any arbitrary values such as 17..77 can be used.

This may be useful for some programming problems, but I've never needed anything like it.