e40 2 days ago

The strange thing about Erik is that he was sweet, gentle and helpful in person. I spent a fair amount of time with him, and never once saw any of the behavior he was famous for on Usenet.

He was highly intelligent, but he had an odd childhood and was somewhat isolated socially in the real world, at least outside of tech. It’s what ultimately killed him. Near the end he was almost a cult-like figure, especially at conferences. He would accumulate a horde of fans. I’m unsure if he liked that.

He really loved coming to the Bay Area.

I definitely miss him irl.

Sorry for the ramblings. I’m on mobile and editing is hard.

  • crystal_revenge 2 days ago

    I never got the chance to interact with, much less meet, Erik, but I think the thing people misunderstand about even his more vitriolic messages (his XML rant warms my heart to this day) is that he wasn't being an asshole; he was pursuing values people had increasingly abandoned. He was an impassioned idealist.

    There was a time when technologists aggressively cared about the technically correct solution as opposed to the "business correct" or "socially correct" answer. Erik's rants (which I'm more familiar with) didn't criticize the uninitiated; they attacked people who didn't care to know better.

    He's the fundamental opposite of today's Twitter/X "shitposter", those who promote rage for clicks and engagement above understanding and thinking. He inspired you (or at least young me) to be good enough, to understand well enough, that when you spoke you wouldn't incite the rage of people like him. He was a warning to neophytes not to get too cocky before you learned the ropes. He was an object lesson that ego was far less important than understanding. Something lost on the most recent Twitter/X generation.

    I'm certain anyone resembling Erik out there in the world would not have survived the tech industry of the last 10 years, and for me that is a tragedy. I hope Erik's name forever lives on in internet lore, and that at least some new people setting out secretly hope to understand their field well enough not to invoke the rage of his lurking spirit.

    • specialist a day ago

      I was vocally apostate during the XML mania. Discovering Naggum validated me. He articulated what I merely intuited. Gratitude.

      I too am sorry Naggum's gone.

      XML did teach me one thing: This too shall pass. Eventually. So I should just chill and focus on my own thing. And sometimes I do.

agambrahma 3 days ago

Ah a Naggum re-discovery.

See a yearly index here: https://www.xach.com/naggum/articles/

I went through the whole thing a few years ago; there are some gems hidden in there.

mst 2 days ago

Anybody interested in efficient list (and other data structures, but mostly lists) implementation will probably find the VList approach interesting (relevant paper archived at https://trout.me.uk/lisp/vlist.pdf).

  • deepnet 2 days ago

    [post] tl;dr

    A ( linked ) list is a collection of CONS ( construct ) cells.

    From a CONS cell you can extract, via the commands:

    ‘car’ ( Contents of the Address part of Register ) - the payload ( the data item in the list );

    ‘cdr’ ( Contents of the Decrement part of Register ) - the link ( to the next list item, i.e. the next CONS cell )

    The names are for historical reasons ( the two halves of a single IBM 704 machine word ) otherwise lost in the beards of yore.

    The linked list appears simple; this belies its exquisite perfection and the great thought that went into inventing, perfecting and implementing it.

    “[The ‘list’ command]… creates a series of conses and sets the CAR pointers to point to each argument which you have provided. (LIST (LIST 1 2) (LIST 3 4)) => ((1 2) (3 4))”
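For readers without a Lisp handy, the cons/car/cdr machinery described above can be sketched in Python. This is a toy model for illustration only: cells are modeled as 2-tuples and nil as None, which is not how any real Lisp implements them.

```python
# A cons cell is a pair: (payload, link). NIL, the empty list, is None here.
def cons(a, d):
    return (a, d)

def car(cell):
    # 'Contents of the Address part': the payload of the cell
    return cell[0]

def cdr(cell):
    # 'Contents of the Decrement part': the link to the rest of the list
    return cell[1]

def lisp_list(*items):
    # (list 1 2 3) == (cons 1 (cons 2 (cons 3 nil)))
    result = None
    for item in reversed(items):
        result = cons(item, result)
    return result

lst = lisp_list(1, 2, 3)
print(car(lst))       # -> 1
print(car(cdr(lst)))  # -> 2
```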

susam 2 days ago

The original discussion is here: https://groups.google.com/g/comp.lang.lisp/c/T_Tozq2P53I/m/N...

I found it intriguing that the original post there claims that

  (cons (list 1 2) (list 3 4))
intuitively seems to have the length 2 whereas the correct length is 3. I find 3 more intuitive though. It could be because right from my early days of learning various Lisps, I learnt about lists in terms of cons cells.

There are many ways to explain this, such as by drawing diagrams of the cons cells from first principles, printing out each cons and list on a REPL and confirming the equivalences, etc. Here's one that happens to be my take on it. I'll be (ab)using the equals sign (=) to mean that the expression on the LHS and the one on the RHS are equivalent.

First let us build a list of three items using cons cells:

  (cons 4 nil) = (list 4)

  (cons 3 (cons 4 nil)) = (list 3 4)                [1]

  (cons 'x (cons 3 (cons 4 nil))) = (list 'x 3 4)   [2]
Now let us build a list of two items using cons cells:

  (cons 1 nil) = (list 1)

  (cons 1 (cons 2 nil)) = (list 1 2)                [3]
Now, substituting the LHS and RHS of [3] for 'x in the LHS and RHS of [2], respectively, we get

  (cons (cons 1 (cons 2 nil)) (cons 3 (cons 4 nil))) = (list (list 1 2) 3 4)
Simplifying the LHS above using [3] and [1], we get

  (cons (list 1 2) (list 3 4)) = (list (list 1 2) 3 4)
Clearly, the length of RHS is 3.

From the REPL:

  CL-USER> (cons (list 1 2) (list 3 4))
  ((1 2) 3 4)
  CL-USER> (list (list 1 2) 3 4)
  ((1 2) 3 4)
  CL-USER> (length (cons (list 1 2) (list 3 4)))
  3
  CL-USER> (length (list (list 1 2) 3 4))
  3
  CL-USER>
Also, on Emacs:

  ELISP> (cons (list 1 2) (list 3 4))
  ((1 2)
   3 4)

  ELISP> (list (list 1 2) 3 4)
  ((1 2)
   3 4)

  ELISP> (length (cons (list 1 2) (list 3 4)))
  3 (#o3, #x3, ?\C-c)
  ELISP> (length (list (list 1 2) 3 4))
  3 (#o3, #x3, ?\C-c)
  ELISP>
See also: https://gigamonkeys.com/book/they-called-it-lisp-for-a-reaso...

  • zahlman 2 days ago

    > intuitively seems to have the length 2 whereas the correct length is 3. I find 3 more intuitive though. It could be because right from my early days of learning various Lisps, I learnt about lists in terms of cons cells.

    That's just how cons is defined: the first argument is an arbitrary object and the second is (typically) a list, and the result has that object followed by the elements of the list. It doesn't take a bunch of algebraic manipulation to understand this definition. It just takes a recognition that an "arbitrary object" could itself be a list, as well as a recognition that the other definition you imply ("make a new list with both arguments as elements") would have to either be variadic (which in turn would obviate `list`) or would only ever be able to make lists of exactly 2 elements. (The point is that `cons` can serve as a simpler primitive, and `list` can be implemented in terms of it.)
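That reading of cons can be checked with a small Python model (cons cells as 2-tuples, nil as None; illustrative only): consing the list (1 2) onto the list (3 4) puts one object in front of two elements, giving length 3.

```python
def cons(a, d):
    # an arbitrary object a in front of d (here, d is always a list)
    return (a, d)

def length(lst):
    # walk the cdr chain; every cell contributes exactly one element
    n = 0
    while lst is not None:
        n += 1
        lst = lst[1]
    return n

one_two = cons(1, cons(2, None))        # the list (1 2)
three_four = cons(3, cons(4, None))     # the list (3 4)
combined = cons(one_two, three_four)    # ((1 2) 3 4)
print(length(combined))  # -> 3
```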

  • abecedarius 2 days ago

    That original post started with

    > (list 1 2 3) == (cons 1 (cons 2 (cons 3 nil)))

    and I'd have answered that cons could be a variable-arity function such that

    > (list 1 2 3) == (cons 1 2 3 nil)

    and maybe that'd help this newbie's intuition. (Common Lisp does have this variable-arity cons, but under a different name, list*, for some reason.)
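A rough Python sketch of that variable-arity cons, i.e. the semantics of Common Lisp's list* (cons cells modeled as 2-tuples, nil as None; the name cons_star is invented for this illustration):

```python
def cons_star(*args):
    # (list* 1 2 3 nil): the last argument becomes the tail, and each
    # earlier argument is consed onto it, right to left.
    *heads, tail = args
    for h in reversed(heads):
        tail = (h, tail)
    return tail

# (cons 1 2 3 nil) in the proposed notation == (list 1 2 3):
print(cons_star(1, 2, 3, None))  # -> (1, (2, (3, None)))
```

With two arguments it degenerates to an ordinary cons, i.e. a dotted pair, just as (list* 1 2) is (1 . 2).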

  • kagevf 2 days ago

    To me, the form is not intuitive, but the result is:

      ((1 2) 3 4)
    
    Someone else in that thread mentioned that CONS is basically adding (1 2) to the front of the (3 4) list, and that helped me understand it better.
  • kazinator 2 days ago

    Something you know because of what you learned cannot be called "intuitive".

    Intuitive means that, having studied nothing, you somehow make the right guesses about something.

    A device is intuitive if you can chuck the user manual aside as you unbox it, and proceed to use it without difficulty or puzzlement.

kagevf 3 days ago

This doesn't come across as "caustic" as it did when I previously read it. And the "bile" isn't directed so much at Lee as it is at something he wrote: "All in all, the pairs notion is redundant."

Naggum genuinely seems to be hoping that his long and thorough explanation will convince Lee of his point of view, and is not meant as a put-down.

> I hope you understand and appreciate what I have written above so the following does not apply to you anymore.

The "following" where he goes off on what Lee wrote, but not on Lee himself.

It might be worth it for me to re-read Naggum's posts, now that I have a better understanding of where he was coming from.

  • Joker_vD 2 days ago

    Not caustic? "Short-sighted users", "stupid language designers who believe too strongly in strong typing", "certainly can't be bothered to invent a useful concept for their short-sighted programmers", "dumb people who implement something the simple way, so it becomes complex and generally useless to everybody else", "unlike today's braindamaged CPU's", "just look at what C programmers do when they need lists! shudder".

    Sure, you may believe that LISP is the greatest invention since sliced bread, and that everything went downhill after that, but even if you are correct, that's still just, like, your opinion, man.

    • nescioquid 2 days ago

      Consider that you may be eavesdropping on the sort of self-talk the author engages in. The clear reasoning mixed with vituperation makes me suspect he probably beats himself up. There is a lot of worry over being mistaken and stupid.

      I'm unfamiliar with the author, so I could be way off base.

    • kagevf 2 days ago

      I wrote "not *as* caustic" ... in particular, he wasn't being caustic toward Lee, but rather to the idea of cons pairs being redundant.

      It's enough of distinction to make me think I should re-examine what Naggum had to say - in this and in other posts - and not get caught up in any pre-conceived notions I might have had about him.

    • abecedarius 2 days ago

      It was pretty mild for Naggum. Like the GP I was expecting a higher density of that sort of thing, but maybe my memory is too slanted by his worst flames.

  • p_l 2 days ago

    Unfortunately, in my experience, Xah Lee is someone who will ignore information from others if it would require him to correct something he wrote or said, especially when he makes authoritative claims about subjects on which he has little or no in-depth knowledge.

    • kagevf 2 days ago

      He does have a tendency to write off people as “fanatics”.

agency 3 days ago

They're also the original and simplest persistent data structures https://en.wikipedia.org/wiki/Persistent_data_structure

  • guerrilla 3 days ago

    What criteria do you use to judge those two properties?

    • Nevermark 2 days ago

      Not sure about original.

      But simplest: Address pairs are extremely simple, obviously. But they can implement anything.

      Lists, structures, arrays, trees, arbitrarily complex multidimensional and irregular data types.

      Using symbols as representational "links" they can even implement general (multi-, etc.) graphs. Which is everything.

      But remove one of those addresses, and all you have left are linked series of addresses, ending with one element, or an end-of-series marker. That isn't general or very useful.

      So address-pairs get the prize.

      • guerrilla a day ago

        Thanks for the legit answer.

        What do you mean about representational "links"?

        • Nevermark a day ago

          Direct links are done with addresses.

          When you use lists to define a data structure, you can linearize the whole structure. I.e. (... (...) ... ( ... ) ... ... )

          What you can't do is have some leaf of the structure refer back to a higher point in the structure. Which is what you need to represent a graph. I.e. any point can refer to any other point.

          So instead, you linearize but use symbols to define a referencable point, and then to refer to it.

          I.e. ( ... (....) (define a ( ... )) ... ( ... ( ... a .... ) ... a ... ) ... )

          In this case, "define a" indicates that the next list isn't just accessible at that location, but wherever "a" appears later. And then we reference it in more then one place later.

          This allows not only tree-like structures to be linearized, but graphs.

          Everything is still lists, and lists-of-lists, but just as symbol or number elements can appear in more than one place, now lists in the structure can be referenced in more than one place.

          All done with just pairs of addresses and symbols. Even numbers can be built out of pairs of addresses and symbols.

          I.e. a binary number: (one one zero one zero zero one zero)

          Of course, a friendly language is going to add lots of notation on top of address-pairs and symbols, for numbers, looping, functions, decimal numbers, etc., to make common code and data forms much easier to type and read than a jungle of parentheses.

          --

          This is very much the reason we have defined and referenced symbols in all code (and complete data languages): so that linear text can describe a graph. And the same process occurs: first the text is parsed into a tree (parsing), then at some point symbol references are replaced with the addresses of their definitions (linking).

          The result is code can be a graph. We can have two functions that call each other, etc. Not just tree expressions.
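A minimal sketch of this define-and-reference idea in Python. The "define" tag and the resolve function are invented for this example; nested tuples stand in for lists. This immutable version captures sharing (DAGs); true cycles, a leaf pointing back at an ancestor, would need mutable nodes.

```python
def resolve(tree, table=None):
    # "Linking" pass: turn symbolic references into shared objects.
    if table is None:
        table = {}
    if isinstance(tree, str):                # a symbol: look up its definition
        return table[tree]
    if isinstance(tree, tuple):
        if tree and tree[0] == "define":     # ("define", name, subtree)
            _, name, sub = tree
            table[name] = resolve(sub, table)
            return table[name]
        return tuple(resolve(t, table) for t in tree)
    return tree                              # numbers pass through unchanged

# ( 1 (define a (2 3)) (a a) ): one sublist, referenced twice later
linear = (1, ("define", "a", (2, 3)), ("a", "a"))
result = resolve(linear)
# both later references are the very same object, not copies:
assert result[2][0] is result[1]
```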

          • guerrilla a day ago

            I see, interesting. I never really thought about LISP as low-level. It was always just something that I implement in C however I want to. You gave me a bit of a new perspective on it.

kerkeslager 2 days ago

Erik Naggum's writings were pretty influential on me as a young programmer, and there's still some value in this piece, but nowadays I find a lot of this insufferable. Yes, it's not actually easy to implement lists, and there's semantic value to Common Lisp's implementation. But...

1. You can't tout the benefits of constant time appending to the beginning of a list while ignoring the benefits of constant time lookup and modification within the list. And before anyone starts talking about "but most programs don't need random access because you just traverse the list"--yes, and vectors do that better too because of cache locality. Cache-aware implementations of Lisps are still an order of magnitude slower than vectors for traversals because dereferencing the cdr pointer is obligatory. If you start talking about speed being a benefit of Lisp in comparison to C (a comparison which Naggum introduces later here), you've lost the plot.

2. Nostalgia for an architecture that no longer existed even in 1998 when this was written is poor reasoning for opaque names. "car" and "cdr" aren't good names. It kind of doesn't matter much: once you learn them it's fine, but it is a slight barrier to Lisp's success, as it does present yet another small challenge for beginning Lisp programmers.

3. In general, the ragging on Scheme is rather pointless--I found my way to Lisp via Scheme. And a large part of why I ended up back in Scheme land (well, Racket by that point) was that it became clear to me that the Scheme community benefits from ideas learned in Common Lisp, while Common Lisp often (not always) rejects NIH ideas.

  • pfdietz 2 days ago

    It would be trivial to replace car and cdr in your code with any other symbol names you like. Even the Common Lisp standard supplies alternatives (first, rest). If this was not done, it was because people using lisp didn't see any great value in doing it.

    • Majromax 2 days ago

      Your point and the grandparent's point are not contradictory. 'car' and 'cdr' can be opaque pieces of jargon that make the language slightly but noticeably more difficult for new learners, and they also can be standard pieces of jargon that current users would find no value in changing.

      In that way, it's a bit like indentation in Python.

      • zahlman 2 days ago

        I would say that it's more like the `def` keyword in Python. People learn that it's short for "define" - but that doesn't make much sense given that other kinds of things can perfectly well be "defined". It also leads to beginners expressing themselves weirdly - I've read things like "so I have a defined function like so:" countless times.

        • theonemind 2 days ago

          What do you find strange about someone saying they defined a function?

          Just curious. I can't find anything strange about the wording or conceptual understanding likely behind such a statement.

          • zahlman a day ago

            Not "I have defined a function", but "I have a defined function". As if they think that "defined" is part of the terminology.

      • pfdietz 2 days ago

        I think this illustrates that perceived shallow blemishes that upset newbies have nothing to do with success or failure of a programming language. Another example here is "all those parentheses", and perhaps even lack of static typing.

        • kerkeslager 2 days ago

          It's a shallow blemish, yes. I wouldn't say it has nothing to do with the success or failure--but as I said in my post, it kind of doesn't matter much.

          The problem I have here is this total unwillingness to admit that it is a blemish. Like because Common Lisp did it, it can't be wrong, so we have to come up with bizarre justifications for this like pretending some ancient hardware architecture is better than today's "braindamaged" CPUs.

          • pfdietz a day ago

            I don't think anyone says it isn't a blemish. The point being made is that it's a trivial, irrelevant, and unimportant blemish.

            • kerkeslager 21 hours ago

              The way Naggum presents it, it's as if it is not only not a blemish, but a work of genius.

              • pfdietz 9 hours ago

                If I wanted to nitpick, I'll point out that I used "says", not "said". In the present tense, Naggum can no longer say anything at all. :/

  • AnimalMuppet 2 days ago

    > And before anyone starts talking about "but most programs don't need random access because you just traverse the list"--yes, and vectors do that better too because of cache locality.

    Valid, though at the time (even 1998!) caches were less critical for performance than they are now, because the gap between processing speed and main memory speed was smaller. In fact, on the machines where Lisp originated, there was no cache.

    • lispm 2 days ago

      Lisp programs often needed large amounts of virtual memory. A factor of ten (VM size vs. RAM size) was not unheard of. RAM was basically a cache for that.

    • kerkeslager 2 days ago

      I am not sure of the architecture of the machines where Lisp originated, but for all the computers I had access to in 1998, the gap between memory speed and virtual memory speed was relevant. So perhaps I should have just said "locality" and not "cache locality".

      It's hard for me to imagine any sane architecture where using more memory and having it fragmented rather than contiguous is going to perform as well as using a contiguous block of memory.

      And to be clear, I'm not saying there are no upsides. Cons cells are very effective as a minimal yet highly expressive data structure. All I'm saying is that this post by Naggum, and a lot of writing about Common Lisp and other Lisps, pick out the upsides and present them as if they're all that exists, when the reality is that the choices described here are tradeoffs. Sometimes (often, even!) the tradeoffs are worth it. Sometimes they aren't. That doesn't fit the narrative that Lisp is god's gift to McCarthy who then gifted it to humanity, but it is reality.

      • lispm 2 days ago

        > It's hard for me to imagine any sane architecture where using more memory and having it fragmented rather than contiguous is going to perform as well as using a contiguous block of memory.

        Yet many of the language runtimes are actually following the Lisp model. Just that there it's not the cons cell with two slots, but arrays and objects with slots. A Java program is a multitude of dynamically allocated objects with one or more slots, where many of the slots contain pointers to other objects. The memory management is done by the runtime (and, usually, its garbage collector).

        > Sometimes they aren't. That doesn't fit the narrative that Lisp is god's gift to McCarthy who then gifted it to humanity, but it is reality.

        Contrary to what you might think (and set up as a strawman), that's well known, and the evolution of Lisp also shows how the implementations had to deal with this problem.

        The LISP 1 implementation introduced mark&sweep garbage collection (-> McCarthy). As a reaction to it Collins proposed "reference counting" as another way to manage&reclaim allocated memory.

        Over time early Lisp programs (Macsyma is an example often mentioned) were problematic on multi-user time-shared machines. One reaction to that was to develop single user workstations for Lisp, where all the memory was available to a single user. There one then used large virtual memory, which still made memory access (for example during garbage collection) time consuming. A GC run of a large heap in VM could run for 30 minutes making the computer mostly unusable during that time. Vectors/Arrays and OOP objects then also used the model of dynamic allocation with many referenced objects (unless they were of some primitive types). Generational GCs, compacting GCs, hardware-supported GCs, etc. were invented and implemented. Full GCs through virtual memory heaps were rare then. The availability of cheaper RAM made it possible that more of the large heaps fit into RAM.

        Lisp implementations early on added vectors and other data structures. But they still often need pointers. A vector of records (called structures in Common Lisp) is usually a vector with pointers to records. For example Common Lisp implementations usually will not provide vectors of inline records. A vector of fixnums OTOH will be a vector without pointers -> the fixnums will be stored directly in the vector.

        In the end a Lisp heap is a huge graph of objects referencing other objects. This does not only affect CONS cells, but also vectors, arrays, records, ... -> they also reference other, non-primitive, objects via pointers.

        That's not very different from any other language runtime which uses dynamic memory allocation and a heap of linked objects (-> Java, JavaScript, ...). The problem of dealing/avoiding non-locality and pointer-chasing is there, too.

        • kerkeslager 20 hours ago

          This is moving the goalposts. My critique was about list implementation, not about implementing the entire runtime.

          > Yet many of the language runtimes are actually following the Lisp model.

          This is frankly not true in any way relevant to this conversation. Using garbage collection is not equivalent to using the entire Lisp model. Java, JavaScript, Python, Lua, Ruby, C#, etc. do not make extensive use of linked lists in their standard libraries, instead preferring--you guessed it--vectors. Hashmaps in modern languages are implemented as... vectors again. And if you dig into how structs are implemented in say, C#, they are implemented as contiguous blocks of memory.

          Yes, there are some things contiguous blocks of memory can't do, so everyone has to support nested data structures, but by default, most languages push you toward contiguous memory in their standard libraries and idioms whenever possible, because it's just so obviously more performant for the vast majority of cases.

          > That's not very different from any other language runtime which uses dynamic memory allocation and a heap of linked objects (-> Java, JavaScript, ...). The problem of dealing/avoiding non-locality and pointer-chasing is there, too.

          Yes, and my point is that having cons cells as a ubiquitous data structure in your language greatly exacerbates this problem. The extremely common case of lists does not need to entail pointer chasing, and in the vast majority of cases it should not entail pointer chasing, and contrary to your claims here, in most languages it does not entail pointer chasing. Lisp-family languages are fairly unique in this problem. Ironically, higher order functions like map/reduce/etc. push you toward operating on lists as a whole which is exactly the case where vectors most outshine linked lists.

          • lispm 13 hours ago

            > My critique was about list implementation, not about implementing the entire runtime.

            Linked lists are actually managed by the runtime.

            Example on an esoteric platform, the Lisp Machine, a real, but outdated, computer. But we still have Virtual Lisp Machines, which have a virtualized implementation (not a microcoded CPU, as the real ones were).

            If I cons a list via CONS, I get a linked list made of cons cells. But over the lifetime of the list, this may change. At some point in time the runtime may decide to copy the list (for example because it has a copying garbage collector). The effect then is that the list is allocated in contiguous memory. The program itself does not see a difference.

            First the list (1 2 3) is allocated as [1 | -> [2 | -> [3 | NIL]]], written in Lisp as (1 . (2 . (3 . NIL))) . At some point in time the runtime changes this transparently to [1][2][3], where the cells are tagged, such that the successor is in the next memory word and where the last element is tagged that it has no successor. That's called cdr-coding.

            Now, if I copy a list with COPY-LIST, I always get a cdr-coded list as a result. Several list functions return cdr-coded lists instead of linked lists.

            This cdr-coding method is not much used anymore, since there is not much of an advantage: either one uses linked lists (because they are easy to use and have useful properties, like that CONS does not need to copy its arguments) or other available data structures (vectors, structures, ...).
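A toy model of the cdr-coding layout described above, in Python. The tags and layout are simplified: real Lisp Machines also had a cdr-normal tag for cells whose cdr was a full pointer, and NIL is handled separately.

```python
CDR_NEXT, CDR_NIL = "cdr-next", "cdr-nil"

def cdr_code(items):
    # Lay the list out as consecutive tagged words in one contiguous block.
    words = [(CDR_NEXT, x) for x in items]
    if words:
        words[-1] = (CDR_NIL, items[-1])  # last element: no successor
    return words

def elements(words, start=0):
    # Walk a cdr-coded list: no pointer chasing, just "the next word".
    out, i = [], start
    while True:
        tag, value = words[i]
        out.append(value)
        if tag == CDR_NIL:
            return out
        i += 1

block = cdr_code([1, 2, 3])   # [1][2][3] in one block
print(elements(block))        # -> [1, 2, 3]
```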

            What is still used are locality improving garbage collectors.

            > Using garbage collection is not equivalent to using the entire Lisp model. Java, JavaScript, Python, Lua, Ruby, C#, etc. do not make extensive use of linked lists in their standard libraries, instead preferring--you guessed it--vectors.

            Lisp also makes use of vectors, where necessary. These languages with managed memory like Java have the same object/reference model as Lisp and the same calling mechanisms. linked lists are only a special case (-> java.util.LinkedList).

            Say we have N cities. Each city is an object of class CITY. Each city will have attributes, for example NAME and PEOPLE-NUMBER. Now we want to keep two "lists" of them: one sorted by NAME and another one sorted by PEOPLE-NUMBER.

            We need two vectors and N city objects. In Lisp the city objects are not stored in the vectors. They are stored somewhere in the heap -> there goes your "contiguous memory". The vectors contain pointers into the heap. Every access to the nth city object through one of the nicely contiguous vectors references an object which is stored somewhere else.

            That model is used in many languages implementations. It's the Lisp model of memory management, introduced with Lisp 1 -> dynamic memory management plus a garbage collector to reclaim unused space.

            > And if you dig into how structs are implemented in say, C#, they are implemented as contiguous blocks of memory.

            That's also the case in Lisp. But a struct (or list or vector) of structs usually points to those being somewhere on the heap. A struct/list/vector of structs is not a single contiguous block of memory. Structs in slots are typically not inlined. Some languages inline non-primitive data structures; Lisp usually does not.

            > contiguous blocks of memory

            That can be an illusion. In a virtual memory system, the memory is made of pages of a fixed size. Random pages are cached in RAM, influenced by their usage pattern over time.

            > in most languages it does not entail pointer chasing

            It does, see above. Just not tail pointers in the list.

            > Ironically, higher order functions like map/reduce/etc. push you toward operating on lists as a whole which is exactly the case where vectors most outshine linked lists.

            That's why in Common Lisp the functions MAP and REDUCE work over vectors, too.

            > Lisp-family languages are fairly unique in this problem.

            see Prolog and various functional languages, ... some even have basic strings implemented as linked lists of characters (which Common Lisp does not do).

            If we look at data structures, one tries to address quite a bit more than "contiguous memory", for example persistence, runtime updates and runtime lookups, ... see, for example: https://cstheory.stackexchange.com/a/1550

  • rjsw 2 days ago

    Lisp has had vectors as well as lists for a long time. If you want a vector then use a vector.

    • Jach 2 days ago

      Yup, though usual caveats on if the item stored is itself a fat object requiring a pointer chase... SIMD exists as well, SBCL has it available out of the box through sb-simd. SBCL also tries to be a bit smart with its linked lists: when first making a list, if you make it sequentially (make the cons cells sequentially), it'll all be allocated sequentially. And regardless of that, when the garbage collector moves a list, its new location will be sequential.

      The OCaml people at Jane Street noted that if you just focus on cache lines, the overhead these days between list traversal vs. vector traversal can be closer to a factor of 2. Like 16 cycles vs. 30 cycles for a new cache line on an old Skylake architecture, thanks to prefetching -- explicit prefetching instructions are hard to get right, but CPUs have also gotten better in the last 10 years at automatic prefetching. (It might be even better on Apple's architecture which I believe has more line fill buffers?) A miss is still around 300 cycles though, and a cache line of 64 bytes very typically represents more than one item (e.g. 8 longs).

      • kerkeslager 2 days ago

        > Yup, though usual caveats on if the item stored is itself a fat object requiring a pointer chase... SIMD exists as well, SBCL has it available out of the box through sb-simd. SBCL also tries to be a bit smart with its linked lists: when first making a list, if you make it sequentially (make the cons cells sequentially), it'll all be allocated sequentially. And regardless of that, when the garbage collector moves a list, its new location will be sequential.

        This is the sort of thing that I was referring to when I referred to "cache-aware implementations of Lisps".

    • kerkeslager 2 days ago

      This is obvious and irrelevant.

kragen 3 days ago

Well, shit. Educational, but filled with bile. This post is an example of how to go wrong and make yourself and everyone around you unhappy, while still being right.

  • pavlov 3 days ago

    Brings back unhappy memories of the conversation style in Usenet. I didn't dare to use newsgroups after I asked some wrong thing as a 15-year-old in some computing-related group and got dumped on by angry adult men.

    I get it that they were frustrated by Eternal September and whatever. But thinking back to those times makes me appreciate the moderation culture that dang has created here, as well as the non-linear structure that moves less popular comments out of the way of attention.

  • stackghost 3 days ago

    He's responding to Xah Lee, well known to be one of the most abrasive and unpleasant people in the lisp community. The hostility might not be desired but it was certainly earned.

    • lispm 3 days ago

      Xah Lee wasn't a part of the Lisp community. He was a Usenet troll, active in several programming related newsgroups, not just comp.lang.lisp. He had never written a line of Lisp (or anything more challenging) and never contributed anything, during his Usenet times. His effect on these newsgroups was extremely negative. Discussing with him was useless. Erik Naggum tried and failed (as others did). Erik was a part of the Lisp community, but unfortunately also with negative behavior, especially on comp.lang.lisp. He could write long&useful explanations, but discussions with him were often not pleasant.

    • QuesnayJr 3 days ago

      It was Erik Naggum replying to Xah Lee, so you had two of the most abrasive and unpleasant people in the lisp community interacting.

    • dokyun 3 days ago

      It doesn't look like Xah was really trolling here, most of his responses are pretty polite. And he seems to indicate at this point in time he wasn't really familiar with lisp, so I'd venture to guess he wasn't nearly as prominent as he later became.

      • stackghost 3 days ago

        You may be correct; I generally enjoy not thinking about Xah Lee, so I might not have the correct context for TFA's point in time.

      • p_l 3 days ago

        Xah has a talent for speaking out of his ass in a polite and knowledgeable-sounding way

        • Jach 2 days ago

          Most of the stuff I've seen has not been so polite, he's such a notorious hater. Still, it can be entertaining in a way, in small doses (I imagine I'd feel a bit different if I was forced to share a newsgroup commons for years), and such a critic can provide a good perspective once in a while. Even though he's still around I think of him as a character as much as Naggum was, even if they were very different in important ways. The world is richer from having more characters.

          The other day I ran into a "gem" from Xah that just made me laugh at its expression. Willing to tank the karma/flag to reproduce:

              ① I fucking hate this motherfucking tech geeking hacker fucks who
              leave dead links on the web and don't fucking care. What the fuck?
              Aren't these elite, elegant, coding experts who always take perfection
              in algorithm, code, style, language, etc?
              
              I don't understand, how could they, have their home page dead?
          
          I laughed again when some followup links to his site 404'd. It's not at all a fair attack, but man, sometimes I really hate dead links too...

          • dokyun 2 days ago

            I think he's hilarious, and at a time when a lot of programming pundits only pretend to be subversive and shameless, I welcome Xah's unfiltered presentation of his opinions. He often makes a lot of good points, and I tend to find his stances on things thought-provoking even if I disagree with him.

        • NikkiA 2 days ago

          He's my least favourite emacs user

        • exe34 2 days ago

          the original LLM

  • dreamcompiler 3 days ago

    That was Naggum's trademark: Extremely good explanations coupled with zero tolerance for fools. As long as you asked him a question in good faith, he treated you with courtesy. But I confess I relished the entertainment of reading his deconstructions of bad-faith actors.

    Naggum died much too young. I hope he has found peace.

    https://en.m.wikipedia.org/wiki/Erik_Naggum

    • moron4hire 2 days ago

      I can't stand this style of writing. I especially dislike it when I catch myself doing it.

      It's not necessarily because of the negativity. It's mostly just because people who write like this take forever to get to their point.

      • kazinator 2 days ago

        A certain Tim R., a present day comp.lang.c poster, takes multiple posts spanning weeks to get to the point.

        • moron4hire 2 days ago

          And these people always say they're "just being direct".

          It's a good litmus test for me in my own writing. Am I actually being direct or just indulging in anger? What happens if I just get to the point and say what I want? I have found I get more of what I want that way.

    • e40 2 days ago

      It’s been a while since I’ve seen those photos on the wikipedia page, which I took. He was very photogenic. He came to Berkeley for a Lisp conference. Brings back some really good memories. Thanks.

      • dreamcompiler 2 days ago

        I met him (and McCarthy too) at that conference. Can't believe it's been 25 years.

    • kragen 2 days ago

      Naggum destroyed the Lisp community by driving away everyone who didn't relish it. (Like you, Xah Lee did relish it, even when the bad-faith actor being "deconstructed" was a character he was playing—perhaps especially then!) He also had, as you can see in this post, zero tolerance for people preferring different tradeoffs from his own (often because they held different values), which fragmented it.

      • lispm 2 days ago

        German has this great word for your comment: "unterkomplex". The English "simplistic" might come near. comp.lang.lisp was not "the Lisp community", Naggum did not single handedly destroy it, you ignore the other participants, the general Usenet dynamics and its overall fate...

        • kragen 2 days ago

          Usenet was important for the development of many other language communities, and Lisp was denied that opportunity. Naggum didn't act alone, certainly, and bad blood had existed in the Lisp community since before Usenet (Stallman is a difficult person, and we can all be grateful he never actually bombed the Symbolics offices), but he was the pivotal figure in how things played out in the late 01990s. There wasn't a Naggum-free alternative general Lisp venue where new users (or Lispers who merely disagreed with Naggum, such as Schemers) could post without fear of public humiliation, so mostly they didn't. It's a rare undergraduate who doesn't count public humiliation among their worst fears, and for people like Steele, Weinreb, Gabriel, Friedman, and Stallman, participation on Usenet was all downside with no upside.

          Lisp started to recover immediately when Naggum died. Or, more accurately, when he got too sick to post.

          I could make the same criticism of HN, but to a much smaller degree.

          Kaz is of course correct that in Lisp, as in any language, there's a silent majority of programmers who don't post in public forums like Usenet. But they depend on the efforts of those who do for things like FAQs and free software.

          • lispm 2 days ago

            You are, sadly, digging yourself deeper into a hole. I'm not interested in following you there.

            • kragen 2 days ago

              None of this has to do with me; I can't have posted to comp.lang.lisp more than a few times, and I don't recall ever interacting with Naggum.

              Most of what I've said is self-evidently correct, and almost all of the rest is relatively easy to verify by reading the archives of old mailing lists and Usenet.

              The only exception is my insinuation that Stallman was tempted to bomb the Symbolics offices, which I don't have a reliable source for; maybe you could find out by asking Stallman, or Stallman's former friends and former coworkers.

              • lispm 2 days ago

                "self-evidently correct" is not a good word if you want to discuss things with people who were actually active in the wider Lisp communities, including comp.lang.lisp, other newsgroups, various mailing lists, Lisp conferences, etc.

                • kragen 2 days ago

                  I'm interested to hear your alternative perspective, but as long as I lack it, I'm going to have to agree with dang: https://news.ycombinator.com/item?id=26855767

                  My objective here is not to attack you for your partisan support of Naggum; rather, I want to understand what alternative perspective you're coming from in which my observations perhaps aren't self-evidently correct—though I note that you've carefully avoided actually saying they aren't, leaving open the interpretation that it's just that you wish I'd phrase it differently.

                  • lispm 2 days ago

                    > My objective here is not to attack you for your partisan support of Naggum

                    Look, I'm not supporting Naggum.

                    Of the actual Lisp users I've met in real life, few were reading comp.lang.lisp, and almost none were posting there. Very few would even know who the people there were.

                    There are many people in the Lisp community whom I admire - I could list 100 - and they had a much greater impact on what was happening. Most of them never used comp.lang.lisp. Among women the participation rate was near zero!

                    Most of the interesting communication was in specific communities over mailing lists. I fondly remember SLUG (the Symbolics Lisp Users Group), the MCL mailing list, and several others.

                    comp.lang.lisp suffered its most negative impact from trolls, like the one Naggum was answering, who caused endless bullshit discussions and whose behavior was very destructive. We never found a way to deal with that. What killed it in the end was that the time for Usenet was over: Google Groups, spam; it became mostly irrelevant, and a much more toxic Internet took over. See 4chan, various Reddit groups (though the Lisp subreddits on Reddit do better) and X for much worse communication systems. Plus, Lisp was in a general decline in the 90s. Usenet did not cause the demise of Lisp; the demise of Lisp caused a general lack of interest. Cause and effect were the opposite of what you claim.

                    For you, contact with Lisp may have been mostly net-related(?). When I was studying at university, roughly 20 professors were involved in one way or another in Lisp projects. These were people I saw in real life, often daily. There were many people with deep Lisp knowledge (and this was not even in the US). You can bet that none of them used or cared about comp.lang.lisp. Same for the people in those projects, and the students.

                    What you think was "the Lisp community" was just a popular online community of mostly random people, with less impact on what actually happened in the Lisp communities than you believe.

                    • kragen 2 days ago

                      It seems like our points of view are closer together than I had thought.

                      I first learned about Lisp because my uncle lent me a Lisp textbook; later, an executive at the company I was working at lent me SICP and exploded my brain. Most actual Lisp users I've met in real life weren't reading comp.lang.lisp either. But it's true that most of my contact with Lisp has been over the internet.

                      That's because most of my contact with any technical discipline has been over the internet, because almost everyone's has been, for 30 years now. It would have been inconceivable to say, 25 years ago, "Most actual Perl users I've met don't read comp.lang.perl.misc," or, "Most actual Python users I've met don't read comp.lang.python," but it was and is true of Lisp. Sharing code and discussions over the internet has been a superpower for the programming-language communities that have managed it, including cross-language communities; I first got to know pg on the Lightweight Languages mailing list, for example. Meanwhile, Lisp users were isolated in the kinds of tiny single-implementation ghettoes you describe, or were collaborating only with other researchers at the same institution.

                      What I'm pointing to is precisely the fact that comp.lang.lisp was just mostly random people with little impact on what actually happened in Lisp. Every person who might have taken up Lisp, who instead focused on Perl or PHP or Haskell, was a missed opportunity for Lisp to thrive. As with the undiscovered antibiotics resulting from our current pharmaceuticals regulatory regime, the costs are invisible, but staggering. They are the Lispers that weren't born during the 01990s, the Lisps that didn't get finished or even started, the libraries that got written in Perl or JavaScript because Quicklisp and Leiningen didn't exist yet (as far as I can tell, there's still no equivalent for Scheme.)

                      You're right that trolls like Xah were what made comp.lang.lisp unusable. However, despite the almost 800 lines of code he contributed to Emacs, Naggum was one of them—the worst of all. Those 800 lines are apparently the totality of his public code, and in total they're smaller than some of his individual hate-filled posts to comp.lang.lisp. The period of Naggum's unquestioned domination of the newsgroup was precisely the period during which Usenet was the most important groupware medium. And the practice of establishing dominance by public ridicule didn't end at the boundaries of Usenet; as dang points out above, to this day we still have to combat it in virtually every public discussion of Lisp. That's Naggum's lasting legacy.

                      Since he became inactive, though, the situation has improved dramatically. We have Arc, Clojure (and Leiningen!), SBCL (which started while Naggum was still active but took years to become popular), the entire Racket empire (now built on the redoubtable Chez compiler), several new "Little" books, miniKanren, Hy, Quicklisp, R7RS-small, and, as you point out, reasonably functional subreddits. #Lisp on Libera is a pleasant place, and Emacs Lisp and Guile now have native-code compilers. ACL2 and PVS are significant minorities in formal methods. You can even run PDP-1 LISP, or modify and reassemble it with MIDAS.

                      • kazinator 2 days ago

                        I would guess that Perl programmers lurking in comp.lang.perl.* 25-30 years ago would have been a tiny minority among all Perl programmers.

                        • kragen 39 minutes ago

                          I don't think that's true, although maybe you could define "Perl programmers" to include people who downloaded broken, insecure CGI scripts from Matt's Script Archive and made random changes to them until they seemed to work. Even a lot of those people relied on clpm for guidance—often asking questions rather than just lurking. As you can remember, we didn't have Stack Overflow at the time, or even PerlMonks, so clpm was pretty much the main public forum for people to ask questions or announce libraries. Corporate and academic institutions to support Perl basically didn't exist; perl.com was running off Tom Christiansen's home internet connection for several years.

                      • lispm 2 days ago

                        > since he became inactive, though

                        Again you are digging yourself deeper into this hole. Now I'm really out.

                        • kragen 44 minutes ago

                          I don't know what "hole" you're talking about. The hole of correctly describing how the Lisp community destroyed itself by permitting Naggum to dominate its public communications, and to some extent has recovered since then?

                          Perhaps you're implicitly criticizing me for violating some taboo so dreadful you don't dare even to name it; for example, a taboo on speaking ill of the dead. Unfortunately, any truthful accounting of almost any tragic historical events requires willingness to speak ill of the dead, so I'm more than willing to "dig" myself into that "hole". I'd like to invite you to join me here; cowering in fear like that is unworthy of you.

      • kazinator 2 days ago

        comp.lang.lisp is not the same thing as "the Lisp community".

        Even in its heyday, Usenet newsgroups represented a minority. For any language, OS or tech stack, there were many more programmers off Usenet than on.

        A programming language newsgroup represents the intersection between interest in that language and interest in Usenet.

    • eadmund 2 days ago

      I think the issue is distinguishing between bad faith and simple ignorance.

      You catch more flies with honey than vinegar!

      • kazinator 2 days ago

        Fruit vinegar (e.g. apple cider) is the tool of choice for trapping fruit flies.

  • agumonkey 3 days ago

    He made it a kind of game or culture. Also, iiuc, he's answering a certain person who was known for regularly walking the line of trolling. That might have added more fuel to the fire.

  • bsder 3 days ago

    Erik Naggum (RIP) was particularly noted for having lots of bile if you got on the wrong side of him.

    Xah Lee is noted for being especially abrasive and pretty much gets on the wrong side of everybody.

    Putting the two together was always a spectacle worth a bucket or two of popcorn.

    And the real answer to "Why cons cells?" is the same as the answer to "What is up with the names car and cdr?". The only two data structures cheap enough (in both RAM and CPU) for the machines of the time (the IBM 704, from 1954!) were arrays (via index registers) and pairs (via the address and decrement parts of a machine word, which give us car and cdr).

  • anonzzzies 3 days ago

    It is possible for your brain to parse information from people you don't have a personal connection with for what the message says, and to skip the how. Not sure why people get lit up by the how; who cares about that from people you don't give a toss about? These days you have AI for it; Claude can give you a lovely summary of the facts without the how. My brain has done that automatically since I was in high school. Comments from anyone but close friends or close relatives cannot make me 'unhappy', no matter what they are.

    • johnisgood 3 days ago

      This "feelings over facts" thing is getting out of hand these days; people are unable to deal with petty "insults", etc.

  • LAC-Tech 3 days ago

    I'd rather an Erik Naggum than milquetoast passive aggressiveness. If you're going to be a dick, at least be up front with it.

    • NikkiA 2 days ago

      The thing that always amused me about Erik was that he was often right, but dismissed because of his tone or reputation.

    • biorach 2 days ago

      Or, just don't be a dick, full stop.

pfdietz 2 days ago

comp.lang.lisp was one of the meatier parts of Usenet I read.