Hacker News

Clojure adds syntax, which breaks some of the inherent symmetry of Lisp slightly[0]. This isn't something you're likely to notice unless you do a lot of heavy metaprogramming.

Clojure cannot guarantee elimination of tail calls, because of the limitations of the underlying JVM. This also means that it can't properly handle corecursion in the same way that pretty much every single other Lisp can.

(Both of these have been discussed many times before on HN, so I can pre-empt the next comment in this thread, which will be someone pointing out that Clojure provides "loop" and the "recur" special form - to which I'd respond: yes, they're logically equivalent in the end, but having to force the transformation into a loop explicitly breaks the paradigm, which for me is the whole point of using a Lisp. This is one of those topics that can be discussed ad nauseam with no "conclusion", so it's not worth spending too much time on it.)
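To make the objection concrete, here is a hedged sketch in Scala (another JVM language) of what "forcing the transformation explicitly" looks like: the compiler only turns the call into a loop when you put it in tail position yourself and annotate it, much as Clojure's loop/recur requires. The function names are illustrative, not from the thread.

```scala
import scala.annotation.tailrec

// Factorial rewritten with an explicit accumulator so the
// recursive call sits in tail position. @tailrec asks the
// compiler to verify the call can be compiled as a loop,
// and to reject the code otherwise.
def factorial(n: Long): Long = {
  @tailrec
  def loop(n: Long, acc: Long): Long =
    if (n <= 1) acc else loop(n - 1, n * acc)
  loop(n, 1L)
}
```

If you drop the accumulator (so the multiplication happens after the recursive call returns), the `@tailrec` annotation fails to compile - which is exactly the explicitness the parent comment is complaining about.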

Lastly, anytime you're dealing with binary compatibility between various JVM languages, the abstraction is inherently a bit leaky. I haven't used Clojure myself, so I can't comment specifically there, but from my experience with using Java libraries in Scala, I can testify that some of the warts of Java end up leaking into Scala code. Nothing debilitating, just a bit frustrating[1].

[0] Racket does too, but the "syntax" added (square brackets) has the same semantics, so it's really just an equivalent token - an alias, if you will.

[1] One particular example I remember has to do with how Foo.class in Java works, and how a library dependent on this particular pattern has to be used in Scala - it just gets a bit messy.
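For readers who haven't hit this: the mismatch is that Java's `Foo.class` literal is written `classOf[Foo]` in Scala, and a Java API built around passing `Class` tokens has to be called through that translation. A minimal sketch - the names `Foo` and `describe` are invented for illustration; the specific library the commenter had in mind is not named in the thread:

```scala
class Foo

// Scala's equivalent of Java's Foo.class literal:
val fooClass: Class[Foo] = classOf[Foo]

// A Java-style API that takes a raw Class token. Because of
// erasure, only the runtime class is available here, not any
// type arguments it may have had at the call site.
def describe(cls: Class[_]): String = cls.getSimpleName
```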



> Clojure adds syntax, which breaks some of the inherent symmetry of Lisp slightly[0].

In what way does this break the symmetry?

Lisp source code is a Lisp data structure (or becomes one when read by the reader). In Common Lisp and Scheme, that structure is either an atom (an integer, a string, a symbol, etc.), or a list consisting of cons cells. In Clojure, that structure can also be a vector or a map. This is enabled by the fact that vectors and maps have their own literal syntax.

I was uneasy about this in the beginning as well, but then I came to the conclusion that this is not unlispy at all.


Symmetry may not have been the correct word, since it does preserve referential transparency, but it certainly adds an extra layer of complexity to the parsing, even if it's only one extra step. Furthermore, I would argue that the additional syntax isn't necessary, which is my biggest beef with it - since you can convey the semantics of a vector or a map without altering the syntax at all, there's no reason to complicate the syntax any more than needed.

> In Common Lisp and Scheme, that structure is either an atom (an integer, a string, a symbol, etc.), or a list consisting of cons cells.

Let me fix that for you: in other Lisps, an item is either an atom or a cons cell. There's "no such thing" as a list in Lisp.

There's a huge difference between having an option with two outcomes (S -> atom | cons) and an option with three outcomes. In computer science, we count "zero, one, many" - booleans are an example of this. By adding a third option, we've stepped out of the realm of the binary into the "many", and that's a much messier world to deal with.
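The two-outcomes-vs-three point can be sketched as an algebraic data type (Scala here; the type names are made up): with only atoms and conses, every traversal of the reader's output is a binary case split, and each additional reader type forces an extra case into every such traversal.

```scala
// A minimal model of "S -> atom | cons":
sealed trait SExpr
final case class Atom(value: String) extends SExpr
final case class Cons(car: SExpr, cdr: SExpr) extends SExpr

// With two constructors, every walk over the tree is a
// two-way decision:
def countAtoms(e: SExpr): Int = e match {
  case Atom(_)    => 1
  case Cons(a, d) => countAtoms(a) + countAtoms(d)
}
// Adding a third constructor (say, Vec(items: List[SExpr]))
// would force a third case into every match like this one -
// that is the extra complexity being described.
```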


> I would argue that the additional syntax isn't necessary, which is my biggest beef with it - since you can convey the semantics of a vector or a map without altering the syntax at all, there's no reason to complicate the syntax any more than needed.

You could say the same about the quote, backquote, unquote and unquote-splicing syntactic sugar being built into the reader. It is redundant, and yet it's there in most Lisps -- because it helps readability/maintainability at the cost of the little complexity it adds.

> Let me fix that for you: in other Lisps, an item is either an atom or a cons cell.

In Common Lisp, that is only correct insofar as the language defines "atom" as "anything that is not a cons cell" [1], contrary to the intuitive understanding that it's an indivisible entity. E.g., CL vectors are atoms, even though they have more in common with lists than, say, symbols. And they do have literal syntax, like #(1 2 3). How is that different from Clojure's [1 2 3], save the different type of parens?

[1]: http://www.ai.mit.edu/projects/iiip/doc/CommonLISP/HyperSpec...


This has been in Lisp since the big flood.


Common Lisp:

    * #(1 2 3)
    #(1 2 3)
    * (type-of #(1 2 3))
    (SIMPLE-VECTOR 3)


Being a bit of a pedant: do you mean "mutual recursion" instead of "corecursion"? The former means multiple functions calling each other, thus being recursive indirectly. The latter is more like an inverse of recursion: instead of starting with some data and reducing to base cases, you start with base cases and produce (co)data.

Defining a lazy list in terms of itself is an example of corecursion, which Clojure can do (albeit a bit awkwardly because you have to make the laziness explicit).
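A corecursion sketch using Scala's LazyList, which makes the laziness explicit in roughly the way Clojure's lazy-seq does (the function name is illustrative):

```scala
// Corecursion: build an infinite structure outward from its
// seed, rather than reducing data down to base cases. Each
// tail of the LazyList is only forced on demand.
def from(n: Int): LazyList[Int] = n #:: from(n + 1)

val naturals = from(0)
```

Only the demanded prefix is ever computed, so `naturals.take(5)` terminates even though the definition is self-referential and unbounded.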


I wonder if, in the future, this limitation of Clojure will go away. I know there has been talk of the JVM folk adding things like support for tail recursion. That said, I'm not smart enough to know whether talking about adding support for tail recursion translates to "Once we're agreed, let us go ahead and solve this tractable problem," or if it translates to "Gee this would be nice, but it's wicked hard."


> support for tail recursion

Let's be clear - it "supports" tail recursion; it just consumes stack space proportional to the call depth rather than a constant amount - in other words, it doesn't transform the recursive call into a loop.

Eliminating tail calls is by no means a hard problem in the general case - it's on the order of something you might be expected to do in an introductory compilers class.

The problem is particular to the JVM. As I understand it, it's due to the fact that the JVM was architected in a way that doesn't let it guarantee that a tail call can be transformed into a loop. These techniques certainly existed in the early 90s (they've existed since, what, the 60s?), but at the time, I guess people didn't think it was a priority.

I have no further knowledge of the JVM, so I can't really comment on whether or not they'll be able (or willing) to add the support, but the problem is dealing with legacy designs/systems, not a difficult problem of CS theory.
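For what it's worth, one workaround that runs on today's JVM is a trampoline: reify each tail call as a value and drive the calls from a loop, so the stack stays bounded even across mutually recursive functions. Scala ships this as `scala.util.control.TailCalls`; a sketch:

```scala
import scala.util.control.TailCalls._

// Mutual recursion that would blow the JVM stack if written
// as plain method calls. Each step returns a TailRec value;
// .result runs them iteratively in constant stack space.
def isEven(n: Int): TailRec[Boolean] =
  if (n == 0) done(true) else tailcall(isOdd(n - 1))

def isOdd(n: Int): TailRec[Boolean] =
  if (n == 0) done(false) else tailcall(isEven(n - 1))
```

`isEven(1000000).result` completes without a StackOverflowError, at the cost of allocating a small object per call - a workaround, not a substitute for real tail-call elimination in the VM.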


> Eliminating tail calls is by no means a hard problem in the general case - it's on the order of something you might be expected to do in an introductory compilers class.

Actually it is problematic, especially the interaction with dynamically scoped constructs: a call that looks syntactically like a tail call isn't really one if it happens inside, say, a special-variable binding or an unwind-protect, because the dynamic environment has to be restored after the call returns.



