Dialects of Lisp (programmers.stackexchange.com)
94 points by mwillmott on Feb 22, 2013 | 90 comments



I've seen Common Lisp start to make a great comeback in the past few years. I started programming in it seriously about a year and a half ago, and since then the packaging system (quicklisp) and libraries for it are maturing very rapidly. A lot of people are making useful libraries for practical purposes.

A lot of people say the language's timing was off. I couldn't agree more. However, it's starting to get a second chance. There are several open-source implementations that compile to machine code and/or support threading/networking/etc. Deployment is essentially free. For every feature that an implementation has that's not in the standard, there's a library that creates a standard, cross-implementation interface for it (for instance threading and the "bordeaux-threads" library).
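To make the threading example concrete, here's a minimal sketch using bordeaux-threads (assuming it's been loaded, e.g. via (ql:quickload "bordeaux-threads")); the same code runs unchanged on SBCL, CCL, ECL, and so on:

    ;; spawn a worker thread, wait for it, then read its result;
    ;; BT papers over each implementation's native thread API
    (defvar *answer* nil)

    (let ((worker (bt:make-thread
                   (lambda () (setf *answer* (reduce #'+ '(1 2 3)))))))
      (bt:join-thread worker)  ; block until the worker finishes
      *answer*)                ; => 6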

The language is screaming fast, extremely expressive and powerful (insert mandatory macro hype here), and becoming "standardized" all over again via its libraries. I use it (Clozure CL) in production and haven't ever had an issue.


> The language is screaming fast

I don't disagree, but that's an interesting change in the CW.


(CW == conventional wisdom?)

There are probably a few reasons for that. Many more programmers these days probably get their start with "scripting" languages like Python, Ruby, Perl, PHP, and JavaScript. Compared to these other dynamic, high-level languages, you probably could call Common Lisp "screaming fast". (Of course, I'm aware that there is very active implementation work going on for several of those languages that is probably doing a good deal to close the gap.) On average, though, CL is probably in the same ballpark as Java, and can sometimes be made as fast as C or C++ with a lot of effort.
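To give a flavor of the "with a lot of effort" part, here's a sketch (assuming SBCL) of the kind of type and optimization declarations involved; without the declarations the same function still runs, just with boxed floats and runtime checks:

    ;; a dot product with full declarations; a good native compiler
    ;; can turn this into a tight loop over unboxed double-floats
    (defun dot (a b)
      (declare (optimize (speed 3) (safety 0))
               (type (simple-array double-float (*)) a b))
      (loop for x of-type double-float across a
            for y of-type double-float across b
            sum (* x y) of-type double-float))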

Also, note that Common Lisp is very much a many-implementation language (in contrast to languages like the ones above that, for most of their history, were defined by their single implementation), and different implementations make different performance trade-offs. For much of recent history, the very high performance CL implementations have been somewhat out of reach of the typical student, hobbyist, or other weekend hacker that probably makes up most of the Lisp community. Commercial CLs are quite expensive, and high-performance free CLs like CMUCL have historically not been too friendly to beginners. When I came to Lisp, it seemed much easier to get started with CLISP (an extremely portable bytecode-compiled implementation) than with CMUCL. More recently, SBCL (an offshoot of CMUCL) and Clozure CL have made a lot of progress towards providing a go-to open source CL implementation with very good performance.


If I were to bet, I'd place my money on Clojure as the 100 year language. It is as unbound by hardware in regard to cross platform development as any other likely candidate, it has good access to libraries meeting contemporary expectations, and is likely to continue to have access to new libraries written to meet future expectations - all due to its interaction with the JVM (and .NET).

Of course I am adopting PG's premise that the 100 year language is a LISP while ignoring his premise that it is not a bit kludged up by practical considerations (i.e. the ugliness of its dependence on the JVM).

(/edit) Clojure also has an evangelist in Rich Hickey whom people seem to respect independently of Clojure.


Common Lisp is 29 years old (if we count from the publication of CLTL1) or 19 years old (if we count from the publication of the ANSI standard); Clojure is 5-6 years old. Scheme is even older. I love Clojure, but it has some mighty big shoes to fill; CL was built for longevity from the beginning. Also, both Scheme and CL run on the JVM. CL also survived the death of many operating systems, architectures and platforms in those 29 years; Clojure is yet to survive one.


To me, Common Lisp appears to suffer from bad timing. It was designed on the cusp of paradigm shifts from timesharing to networking; from text interfaces to graphic interfaces; and from computers being rare to being ubiquitous. Hitting its peak in popular consciousness in the midst of the Black Friday and S&L recession didn't help either.

My view is premised upon the idea that a major deficit of Common Lisp is poor access to contemporary independent user interfaces. The solutions that exist are not competitive with the leverage Clojure enjoys from the JVM (and .NET).


You are somewhat correct, but one of the reasons I love Common Lisp is exactly because it sort of overcame this "bad timing", which is very unforgiving to lesser languages. The fact that today I can write Lisp apps in a language whose primary platform (the Lisp machine) died 20 years ago, and still enjoy a relatively active ecosystem, is quite impressive.

Do you think Clojure has the potential to be useful in 20 years if, hypothetically, the JVM dies today? I don't know; I think yes, because it has a bunch of good ideas, but it isn't at all clear. Take JS as an opposite example: if the web were to disappear, JS would disappear as well, no matter how many other platforms it is ported to. It just isn't good enough to stand on its own.

I could be talking out of my ass here, but my point was that CL has a good track record for survival, so it has the potential to be a 100 year language, while Clojure has yet to prove itself for such a task. These things are extremely hard to predict; tech history has a habit of throwing out such wild-cards as personal computing and the web. We'll see.


The primary platform of CL did not die. CL was ALWAYS developed on multiple platforms. From day -1 there were implementations on non-Lisp-Machine platforms. People actually ran companies to do so: Lucid (Unix), Franz (Unix, PC), Coral (Mac), Goldworks (PC), Procyon (PC, Mac), Harlequin (Unix), ... Carnegie Mellon University developed CMUCL for Unix. Japanese developers created KCL (Unix, ...), CLISP came from Germany, ...


You are correct, I think, on the bad timing. Also factor in that Common Lisp used to be expensive to use and deploy. In the early 1980s, I 'went cheap' and bought a Xerox 1108 (for $25K, which was a lot of money in those days). Deployment costs? Buy a 1108 for each customer. Yuck! I used my 1108 for prototyping and demos (and made money selling an AI product written for it), but no deployed systems.

Today with free SBCL and Clozure (among many others), deployment is free - what I would have given in the 1980s for something like SBCL!


We deployed Common Lisp on Unix for nothing using KCL/AKCL/GCL. AKCL appeared in the mid 80s.

In the 80s SBCL was called CMUCL. Free.


That is true. I don't remember why I didn't use CMUCL. We were mostly a DEC VAX shop; perhaps VAX was not supported?


If you were using DEC Common Lisp, you might have used bits of CMUCL.


My bet is that when Clojure gets to be 29 years old it will, also, suffer from several problems. I don't believe it is possible to design a future proof programming language. Who knows what will come next and how it will affect programming tasks?


To clarify, I am making a reference to this PG essay:

http://www.paulgraham.com/hundred.html


Hitting its peak in popular consciousness in the midst of the Black Friday and S&L recession didn't help either.

Is there any real connection here with CL? Did these events somehow precipitate or deepen the AI winter?


Well, Rich used to use Common Lisp (I know because I used to use his very neat LispWorks to Java bridge stuff).

Clojure really does hit the sweet spot combining a concise agile language with good practical decisions, similar I think to some of the decisions that went into the design of Scala (like a generalized seq API for lots of data structures).

Except for some personal serious side projects, I am now retired, and I chose Clojure over other good possibilities (e.g., Common Lisp, Racket, Gambit-C, Scala, Smalltalk, or Ruby) because it is a practical language and still lots of fun to code in.

Now that I don't have to worry about staying up to speed on all of the languages my customers used to like using, I find that just using a single language really saves me a lot of time. Some advice for people still working: perhaps choosing a single practical and extensible language like Scala or Clojure is a good strategy; I used to like being a polyglot programmer but that requires a lot of overhead!


I wouldn't put my money on the JVM platform. It's funny, but Clojure's biggest advantage is also its biggest disadvantage.


Given its adoption by enterprise, the JVM at worst has a COBOL future. At best, its future is worse-is-better.

http://www.dreamsongs.com/WorseIsBetter.html


> At best, its future is worse-is-better

If we're taking the worse-is-better angle, my suspicion is that ClojureScript, which compiles to JavaScript, is a much more likely candidate to continue, based on worse-is-better.


Clojure is so closely tied to the JVM that it's hard to imagine it being the "100 year language" unless the JVM turns out to be the "100 year platform".


The original Clojure implementation was built on the JVM, but it also targets the CLR and JavaScript. I don't think it's irrevocably tied to the JVM.


Some people would argue that once you remove the JVM there isn't much there (that can't be found elsewhere).


Exactly! The reason why languages like Clojure and Scala are becoming popular is not only their cool features (although in that sense Scala has an advantage over Clojure, which is just 'another Lisp') but the ready availability of libraries to do pretty much anything you need to do. Take that away from Clojure, and you have something that can barely compete with CL. Take it away from Scala and you end up with another academically interesting, yet obscure language (like Haskell).


Clojure is currently my favorite language, partly for the pragmatic choices it made, but these pragmatic choices will limit its continued use.

Hygienic macros often prevent one from shooting oneself in the foot, but they also prevent some truly elegant code. Having lists and vectors with a common interface is cool, but that the semantics of conj is different for both is annoying. Easy Java interop is worth a lot, but it prevents call/cc.

It's probable that I will gladly use Clojure for the next 10, maybe even 20 years, but it won't be the programming language my grandchildren will ask me to shut up about.


Hygienic macros prevent accidental symbol capture. They don't prevent intentional symbol capture. You may be thinking of the limitations of pattern based macros, but many Scheme systems provide hygienic procedural macros.


I am curious, what do you think hygienic macros break? I'm a fan of Racket, and I haven't yet found anything that its macros can't do.


I'm still too new to Lisp to answer this question, but I have seen the book Let Over Lambda recommended as having answers to that and similar questions.
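For a concrete taste of what that book dwells on: intentional symbol capture, which pattern-based hygienic macros rule out by default. The classic anaphoric if in plain (unhygienic) CL:

    ;; AIF deliberately captures the symbol IT so the branches can use it;
    ;; a purely pattern-based hygienic macro system would rename IT away
    (defmacro aif (test then &optional else)
      `(let ((it ,test))
         (if it ,then ,else)))

    ;; (aif (gethash :key table) (print it) (print "missing"))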


Not bounded by hardware, but still bounded by its VM implementation.


With Clojure being available for the JVM, the CLR through ClojureCLR, and JavaScript through ClojureScript, I don't believe that this will be much of an issue.


Except that ClojureScript is dependent on JVM Clojure, and if ClojureCLR were to take advantage of the CLR's capabilities (TCO and stack allocation, for example) it also wouldn't be Clojure.

So we're really talking about dialects of Clojure (and perhaps might as well be talking about dialects of Lisp in general).


We're not that far from officially having a bootstrappable ClojureScript compiler. And some people have already done it - http://github.com/kanaka/clojurescript


>It is as unbound by hardware in regard to cross platform development as any other likely candidate

Gambit-C runs on more hardware.


I'm not familiar with a lot of other dialects, but the information on Racket is terrible. I wouldn't trust any of the other responses.

Racket is not an R6RS scheme, it's... Racket. R6RS is an available language/library in Racket, but it's not what you usually use[1].

Racket has also always been a language laboratory, not just a scheme. I'd also wager that it's the most popular scheme used today, maybe even the most popular lisp.

[1] The Racket primary language is specified with #lang racket, R6RS is specified with #lang r6rs, and R5RS is specified with #lang r5rs.


This answer references Chicken Scheme, which compiles to C - I'm curious, is there any work on a Scheme that compiles to LLVM? (A brief search didn't reveal anything that's been updated in the last 4-5 years.)


Pretty much all the major Schemes predate LLVM, and modifying a compiler to use LLVM instead of its existing tools is not necessarily trivial and not necessarily worth it.


Gambit is getting backends that include x86 and JavaScript Real Soon Now. There has been LLVM work, but I don't think it progressed very far.


+1 for mentioning Gambit-C. I don't use Gambit-C anymore, but I used to use it a fair amount for writing small utility programs. Gambit-C can be used to build very small and efficient executables. Gambit-C doesn't have the huge library that Racket or Chicken have but it is great for building small command line utilities, etc.


You can compile the C code to LLVM IR, but why would you do that? LLVM is not a very good VM: it's not platform agnostic, and some things like the pointer size are set at compile time. C seems to be a better platform-agnostic assembler.


It's not really a VM. It's more like a tool set for writing compilers.


I wonder if Arc is finished or dead. I like it, but it seems the ecosystem is missing.


No mention of NewLisp?


NewLisp is to Lisp what ramen noodles are to Italian cuisine.


Urban Dictionary about Ramen Noodles: "Quite possibly one of the most all time best foods ever to be found in the United States of America."


Why don't people do things like this for Dialects of Algol? Is it because most of them got new names, like Java, C, and so on?


I was actually thinking about this same question earlier. There are dialects of Lisp and dialects of BASIC, but most other languages are not considered dialects of anything similar. How different do two programming languages have to be before they're no longer considered dialects?

It seems like Scheme and Common Lisp, for instance, have more differences than Java and C# have from each other. Why wouldn't C# be considered a Java dialect? Visual Basic is very different from BASICA, yet they are both considered BASIC dialects. Why shouldn't Pascal and Ada be considered ALGOL dialects?

The line-drawing process seems completely arbitrary.


"The line-drawing process seems completely arbitrary."

The real world cannot be exhaustively described with a single taxonomy.


Because natural language's type hierarchy is not as sound as we programmers would like it to be.


The way is Scheme -> Common Lisp -> Arc.

Scheme comes first because it is more refined and there are excellent courses and books.

CS61A is probably the best introductory course. After that it is good to watch the original SICP lectures by the two magicians and read the book. Then it is necessary to take a look at what the HtDP team is teaching.

After this there will be no trouble with Common Lisp. Classic books are those by pg and Norvig. CLtL is just a reference.

Then, with all the knowledge so far, one could understand arc.arc and appreciate what has been done there.)

With this background you will understand what Clojure really is and why, or what is Haskell, and why there is nothing special in it.

btw, there is a brilliant course by Dan Grossman on Coursera covering FP. He is very clever and consistent. After this you will smile at the Haskell guys.)

Update:

I forgot Emacs Lisp. An Introduction to Programming in Emacs Lisp is an excellent book that covers all the basic principles and gives a perspective on how to use lisp in a real-world project.)


The idea that there is nothing special about Haskell, or that Haskell is all that closely related to lisp, is patently absurd. Haskell comes from a different, more mathematical background with a much more extensive underlying theory than lisp.

I actually came from the background you advocate--61A (which really was an awesome class) was my first CS course in college. (Unfortunately, they have since ruined it.) I certainly appreciate lisp and have actually used Racket in a practical setting. That said, I've found Haskell and OCaml to stand out both from purely pragmatic and theoretical considerations.

Haskell is the only moderately popular language that lets you control side-effects globally. That by itself is extremely special. This makes writing maintainable code easier and gives the compiler considerable freedom in optimizing code.

Haskell has an incredible but also surprisingly simple type system. You can't just ignore that. There is nothing in lisp--not even typed Racket--that even comes close to Haskell's type system.

Even aside from language qualities that make Haskell special--which I haven't even come close to covering--it also stands out because of its community. Not only is the community particularly nice and welcoming, it's also special in not being afraid of a bit of math. This doesn't mean you as a programmer have to know much math, but it does mean libraries tend to be simpler and more self-consistent because they rely on very well defined mathematical abstractions with simple algebraic rules. These abstractions also allow for significantly more general code.

Haskell comes from a different theoretical background than lisp, with different practical advantages and a very different philosophy. So claiming that it's nothing special and that you will understand all of it just coming from lisp is completely wrong.


I agree with much of what you wrote there but you should remember that the single most distinguishing feature of Haskell (aside from its predecessor Miranda) is laziness. Almost everything unique you described about Haskell was not accidental, but necessary to build a wholly lazy language.


From reading what Haskell related articles I've seen around HN and other places over the last few months, the always-on laziness seems to actually be a problem a chunk of the time, correct? Well, not so much a problem, as not always a good thing.


Laziness (or, properly, non-strictness) enables a radically different style of programming, and combined with functional purity, also enables kinds of code transformation that are not valid in strict, effectful languages [1]. Almost any dynamic programming algorithm is trivial in Haskell, cf. packrat parsing, which is embarrassingly easy to implement using a table of lazy values, and you can also trivially write control structures and infinite data structures [2].

The big problem is that reasoning about non-strict evaluation is quite non-intuitive (although not impossible.) It's quite easy to write code which seems inefficient that runs in no time at all, and equally easy to write code that seems efficient that takes a long time to run. It requires a rough understanding of how your compiler is optimizing your program and what the resulting code is doing, which can be tricky. Laziness also (not surprisingly) interacts with side-effects in a complicated way, so much work has been done to alleviate the problems with lazy IO, e.g. iteratees [3].

So, yes, sometimes it's a great thing, and sometimes it's a problem. The position of the Haskell community is that the exploration of how to properly leverage laziness is worth the tradeoff, but there is indeed a tradeoff.

[1]: http://augustss.blogspot.com/2011/05/more-points-for-lazy-ev...

[2]: Infinite data structures (i.e. codata) can be written/created in a strict language, but the ability to write them falls out of laziness quite naturally, and a lot of people experience them for the first time in a lazy context.

[3]: http://www.haskell.org/haskellwiki/Iteratee_IO


What kind of purity and laziness cannot be done in Scheme?)


Unrestricted beta reduction. You really should know that if you want to take a position of authority on the differences between programming languages.


I have no such intention.


Many people in the Haskell community like laziness. It can be a problem for performance reasons, sometimes. On the other hand, it's a boon to making the language more flexible and expressive. It helps write better, more modular code.

Laziness also makes the language more foreign and harder for many people to learn initially, but I think that fundamentally does not matter. Learning Haskell is a constant cost where the benefits are at least O(n) (and plausibly more) to how much you program with it, so any learning costs are quickly dominated as you use the language.

Unless you're trying to use Haskell to compete with C (and some people are), laziness is a net gain. Take a look at the "Why Functional Programming Matters" paper for a good case in favor of laziness.

Now, laziness does have some downsides over strict evaluation; I just believe the upsides easily outweigh them.

The reason you read about problems with laziness is because the problems are encountered before many of the benefits by people learning the language and because the problems are easier to articulate than the advantages. Also, Haskell is the only common lazy language, so that particular aspect stands out.


Yeah. Personally, I think it was an extraordinarily interesting experiment that didn't pan out. However, the corralling of side-effects (which was necessary for a lazy language) turned out to be one of the best innovations in the space, so that reinforces that we learn as much from mistakes as from successes.

Simon concedes the troubles with laziness, and his biggest item of support for laziness is that it keeps you honest in terms of side-effects. I don't buy it. We can control side effects with the type system -- we don't need complete laziness (some laziness is actually quite easy to add to otherwise eager languages).

P.S. The other major contribution Haskell made to PL was type classes, can't believe I forgot it.


As far as I know, Haskell is just an ML-family language. It is a dialect of ML, we could say.

The decision of its designers to create a so-called pure-functional language is nothing but a fancy term, and it deceived lots of people. The machine code it produces isn't pure or somehow different. And, of course, I can do the same functional programming in, say, Lisp. Or ML.

Laziness is also not a silver bullet, you could have lazy evaluation in Scheme or even CL on demand if you wish.
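For what it's worth, here's roughly what that on-demand laziness looks like in CL (a minimal memoizing sketch; Scheme has delay/force built in):

    ;; DELAY wraps a form in a memoizing thunk; FORCE runs it (at most once)
    (defmacro delay (form)
      `(let ((done nil) (val nil))
         (lambda ()
           (unless done
             (setf val ,form
                   done t))
           val)))

    (defun force (thunk) (funcall thunk))

    ;; (force (delay (expensive-computation)))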

So, Haskell is nothing special because it is just another ML-family language, over-hyped and over-sold.


I like having access to mathematical semantics, simple algebraic rules, and all the other things you mentioned but combining these features with a non-homoiconic syntax spoils the whole thing for me.

> Not only is the community particularly nice and welcoming, it's also special in not being afraid of a bit of math.

The Lisp community has never been afraid of math. John McCarthy had a PhD in mathematics and the earliest computer algebra systems were developed in Lisp. The Macsyma computer algebra system (now developed as Maxima) predates Haskell by decades.

> Haskell has an incredible but also surprisingly simple type system. You can't just ignore that. There is nothing in lisp--not even typed Racket--that even comes close to Haskell's type system.

The most important thing for me is having the ability to describe cardinality (especially in terms of machine bits such as 2^8, 2^16, etc) and enumerations (like the ASCII character encoding). What aspect of Haskell's type system is it that Lisps need to "come close to"?
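For what it's worth, CL types can express exactly this kind of machine-bit cardinality; a quick sketch (the type names are made up for the example):

    ;; subtypes carved out by cardinality
    (deftype octet () '(unsigned-byte 8))     ; 2^8 values
    (deftype word16 () '(unsigned-byte 16))   ; 2^16 values
    (deftype ascii-code () '(integer 0 127))  ; the 128 ASCII codes

    (typep 200 'octet)       ; => T
    (typep 200 'ascii-code)  ; => NIL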


My point with math is that Haskell libraries and code use mathematical abstractions (like various sorts of algebraic structures) far more than lisp; I was not talking about what people do with Haskell but rather how they do it.

The type system does quite a bit more than just describe cardinality. The single most important feature is typeclasses; you simply can't replicate some of what typeclasses can easily do without a similar type system.

Apart from that, you also want the type system to let you control "effects"--not just state and IO but also things like error management and non-determinism. You also want a good way to reuse the type system to enforce your own domain-specific invariants; this is what GADTs are for.

The only serious effort I know of for adding a type system to lisp is typed Racket, and I don't think it does anywhere near as much as Haskell. It certainly does not have typeclasses, and I think it ends up having union types everywhere, which are more awkward and bug-prone than the usual sum types languages have. (This is a necessary compromise to integrate well with normal Racket, but it's still a compromise.)

Beyond that, most people don't use typed Racket at all, so I suppose the main way lisp can come close to Haskell's type system is in having one at all.

As far as homoiconic syntax goes, I sometimes miss it, but not too often. Having a flexible syntax that can look like math is usually far more important, and Haskell gets most of the way there without being too complicated. I personally think that something like Agda's mixfix syntax is the best option overall.


> The type system does quite a bit more than just describe cardinality. The single most important feature is typeclasses; you simply can't replicate some of what typeclasses can easily do without a similar type system.

In the C programming language types are mainly used to describe cardinalities. For example, char is the cardinality of the smallest addressable unit of memory of the machine (often 256). We don't need more than machine cardinalities, as entire operating systems have been written in assembly and C. What advantages do type classes and GADTs have that make them worth adopting? Is there any evidence that they make programs more safe and reliable?

> My point with math is that Haskell libraries and code use mathematical abstractions (like various sorts of algebraic structures) far more than lisp; I was not talking about what people do with Haskell but rather how they do it.

I am building a computer algebra system with Lisp in my free time. There are many Lisp computer algebra systems that I have learned a lot from like Maxima, Reduce, and Axiom. One of my favorite computer algebra systems is GAP and it is written in C and not Haskell. I am not convinced that Haskell uses more mathematical abstractions and algebraic structures and I am not familiar with any well maintained computer algebra system written in Haskell.

> Having a flexible syntax that can look like math is far more important usually, and Haskell gets most of the way there without being too complicated.

Math doesn't look like anything. Mathematics is about abstract concepts (especially the natural numbers), not representations. You mean that Haskell is meant to look like math does on paper, but what's so important about using a paper syntax when we have keyboards?


Most Lisps are untyped. Typed Racket and Qi seem to be the standouts. Other than that, I would agree that there's no "more mathematical" background.


Type theory and accompanying first-order logic is the foundation upon which mathematics is built. This gives strongly typed languages a much firmer foundation, which has the potential to be exploited by theorem-proving programs reasoning about and modifying other programs (e.g., compilers).

I assumed that's what the OP was talking about by saying Haskell was "more mathematical". The situation is analogous to ad-hoc databases vs. those firmly rooted in the relational model.


> Type theory and accompanying first-order logic is the foundation upon which mathematics is built.

Since when? Mathematics was originally founded on the study of the natural numbers. The mathematics I find most valuable (enumerative combinatorics) still uses the natural numbers as its foundation, not type theory.


Number theory/abstract algebra, which is the study of natural numbers among other things, is a form of set theory. And set theory is what mathematics calls type theory.


> Number theory/abstract algebra, which is the study of natural numbers among other things, is a form of set theory.

Let S be the set {a,b} and let T be the set {c,d}. There are two bijections S -> T between these two sets: {(a,c),(b,d)} and {(a,d),(b,c)}. These two sets are isomorphic to one another because they have the same cardinality.

The isomorphism classes of sets are cardinalities, which are themselves the natural numbers 0,1,2,3,... and infinity. In combinatorics we can just focus on the study of these isomorphism classes rather than sets themselves.

We can enumerate binary relations: ([],[0],[1],[[0 0],[0 0]],[[0 0],[0 1]],[[0 0],[1 0]],[[0 0],[1 1]],[[0 1],[0 0]],[[0 1],[0 1]],[[0 1],[1 0]],[[0 1],[1 1]]). Logical disjunction [[0 1],[1 1]] is equal to 10 in this enumeration.

Using the enumeration of binary relations the entire field of study of graph theory can be described in terms of natural numbers without any need for sets. As a result, graph theory is considered to be a branch of combinatorics. We only need to have sets mathematically when dealing with cardinalities like aleph one that emerge from uncountable sets.

Since analytic number theory deals with the uncountable set of real numbers and abstract algebra deals with operations on sets including the real numbers, these two fields are indeed founded on set theory. Algebraic combinatorics and combinatorial number theory are the branches of these two fields that focus on countable sets.

> And set theory is what mathematics calls type theory.

No it isn't. Just look up differences of type theory from set theory on wikipedia [1]. You almost never hear the word "type" spoken in math courses. According to John Shutt "I took some tolerably advanced math courses in college and graduate school. One thing I never encountered in any of those courses, nor that-I-recall even in the THUG talks, was a type. Sets aplenty, but not types" [2].

[1] Type theory differences from set theory http://en.wikipedia.org/wiki/Type_theory#Difference_from_set...

[2] Where do types come from http://fexpr.blogspot.com/2011/11/where-do-types-come-from.h...


> No it isn't ... You almost never hear the word "type" spoken in math courses.

You misread me. What I said was that "set theory" is the general mathematical framework exactly equal to what we computer scientists call "type theory". We have completely different ontologies for what are essentially the same thing (the differences listed on Wikipedia are superficial/conventional and IMHO of no relevance here or in the article).

Mathematicians deal with sets, countable or otherwise, algebraic manipulations of them, bijective mappings, etc. Computer scientists operate on domains (types), their operators, functions, etc. These are different terminology for the exact same thing.


> You misread me. What I said was that "set theory" is the general mathematical framework exactly equal to what we computer scientists call "type theory".

That makes more sense. I am coming from a mathematical background and not a computer science background myself.

> Mathematicians deal with sets, countable or otherwise, algebraic manipulations of them, bijective mappings, etc.

That is true. As a combinatorist I am one of those people who mainly focuses on countable sets. The isomorphism classes of sets (the counting numbers) are at the foundation of everything I do in combinatorics but other areas of mathematics (e.g topology) have very different approaches.


I have no idea what you're talking about. Grossman's course is barely covering an introduction to programming languages (literally, the course my sophomore year of college covering the same material was called "introduction to programming languages").

I guess you could say that there's nothing "special" about Haskell, but the same would be true of Arc. I think Racket is the real champion here -- a truly flexible programming language laboratory.


His course covers the ideas behind the so-called functional programming paradigm - higher-order functions, environments, lexical scoping, closures, and the standard idioms, including currying and laziness. The course is about the concepts behind programming languages, not the languages themselves.


Yeah. That's an introduction. You basically didn't mention anything that wasn't covered in the untyped lambda calculus at the beginning of the last century.


Yeah, but that is enough for computers to work.)

Inside a computer everything is a number - sometimes a number represents itself, sometimes it is a pointer (the number of a byte, an offset) and sometimes it is a code, the number of a Unicode character, etc.

The key idea there is that a pointer can point to anything; the value has a type, not the pointer. There are no pointers-to-integers along with pointers-to-booleans and pointers-to-strings. There are only numbers here.

Having one kind of pointer for everything is the semantics of the untyped lambda calculus. We could cons everything with everything; we could have heterogeneous lists, tuples, whatever.
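A one-liner in CL illustrates the point (a made-up example, of course):

    ;; one pointer representation, any mix of values behind it
    (list 42 "forty-two" #\A 'some-symbol (lambda (x) x))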

Well, it seems that the untyped lambda calculus is good enough.) Perfection is achieved when there is nothing to remove, and not adding features is sometimes more wise.


Yeah, Grossman's course covers SML, Racket, and Ruby. Haskell is not even part of the course at all.


Can you expand on "or what is Haskell, and why there is nothing special in it" ?


When you say there is nothing special in it, do you mean Clojure or Haskell?


The later.


<pedantic class='educational'> So, you mean Clojure? :-)

I guess English is not your native language. http://grammar.about.com/od/words/a/latergloss.htm: Use later when referring to time. Use latter when referring to the second of two persons or things mentioned previously.

So, grammatically, 'the later' would refer to the language that was created later, and that's Clojure. </pedantic>


Ok, thanks!


Why do you think it would be bad to just start with Clojure as a "first Lisp"? (yeah, there's the JVM, but leiningen makes things bearable and lighttable is a one-click repl to "just start coding")


Because, in my opinion, Clojure isn't a Lisp. It is a Lisp-like language, or Java with parentheses and immutable collections, if you wish. It violates a few basic principles of Lisp - one common underlying list-structure for code and data (Clojure's code is a mix of different containers). It violates a general evaluation rule - one for every expression with a very few special forms, etc.

I cannot explain it in a few sentences, I have tried a few times, but I'm sure that it is better to learn the classic Lisps first, so you could be able to appreciate their elegance and simplicity (a few selected ideas put together) and then compare them with what you have in Clojure.

It is also important to learn how Lisp programmers used to build almost everything they wished out of conses (except arrays and hash-tables, of course), how they reason about access patterns for list structures, and the simple and elegant classic functions, such as map, filter, reduce, etc. in their natural environment.)

In short, compared even with CL, Clojure is a mess.


> It violates a few basic principles of Lisp - one common underlying list-structure for code and data (Clojure's code is a mix of different containers)

I can't see how this is true. IIRC Common Lisp has more standard data types than core Clojure does — it's just that Clojure has reader support for more of them and uses them a bit more freely. It is still homoiconic.

> It violates a general evaluation rule - one for every expression with a very few special forms, etc.

I assume you meant to write something like "one list for every expression". Clojure has essentially the same rule. In general, the things you are probably thinking of that omit lists where most Lisps use them (like Clojure's let and cond using juxtaposition rather than lists to group things) are macros, which could be written in any other Lisp just as well. These are done for convenience, much like Common Lisp's loop.
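To make that concrete, here's a sketch of a Clojure-style flat-binding let written as an ordinary CL macro (CLET is a made-up name):

    ;; pair up a flat binding list, Clojure-style, then expand to LET*
    (defmacro clet (bindings &body body)
      `(let* ,(loop for (name val) on bindings by #'cddr
                    collect (list name val))
         ,@body))

    ;; (clet (x 1 y (+ x 1)) (* x y))  ; => 2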


I upvoted this because I agree. I just spent about a week writing an app in Clojure, and it doesn't feel entirely like something in the Lisp Way[1]. That doesn't make it wrong, bad, immoral, etc; it is different. It has some interesting ideas which I think should be fed back into other Lispy languages via libraries (easy parallelism, immutability).

[1] This is the hard to explain part.


The idea that Lisp 1, Typed Racket, Dylan and Common Lisp are all Lisp but Clojure is not seems really weird to me. It feels like up till the '80s, Lisp was free to make progress and differences were embraced (even if that embracing was of the form "My Lisp is superior because of its differences from yours!"), but now any progress makes you "not Lisp." From people's consistent inability to articulate a meaningful distinction, I'm almost convinced that the idea of Clojure as "not really Lisp" is mostly a distaste for having reader support for vectors instead of requiring you to write (vector thing1 thing2) like most Lisps would.

Yes, Clojure isn't exactly like older Lisps. Why should it be? Those Lisps are all different from each other as well. If new Lisps can't introduce new ideas, what is the point?


Personally, I don't think Clojure should exist as a production language, bluntly. I think it should have been a skin on ABCL. But that is water long under the bridge.

But I would describe Clojure as "not Lispy" because its behavior seems to fall into the compilation idea more than the interpretive idea. I would execute `lein run` and a failure would drop me out into the Linux shell. An equivalent command in CL would throw a condition and drop me into an interactive mode to patch it up. I fully acknowledge that my tool use may not have been ideal, of course. But my Clojure was giving me a compile-run-test cycle as opposed to the interactive editor/repl cycle of CL. Errors in the source would cause exceptions and crashing in the compile phase; errors in the runtime would cause a crash.

I want to home in on the second point you made, in terms of new ideas. New ideas are good (cf. Kernel). New ideas should be aggressively pursued. I draw the line when someone has a slightly varied idea and wants to fork out something & get a community going just for that slight variance. That's just annoying and forks effort. If the new idea is really radical and can't be reconciled with extant tools, of course it deserves to be a new thing. Lisps are an insanely awesome tool for exploring semantics and I hope they keep getting created and churned on for decades to come. I just hope that the ideas which are reconcilable to production systems are reconciled so that we don't fragment further.


You can fire up Emacs and go to town with Clojure using SLIME, Clojure isn't really any different in that respect.


Oh, that's good to know. I had tried to get SLIME playing well with Clojure but there was some issue where it didn't play nice with Common Lisp-configured SLIME and I bailed out.


The idea is that the core Lisp languages are Lisp 1, Lisp 1.5, Maclisp and Common Lisp. Then we can add to this: Franz Lisp, Standard Lisp, ISLisp and Emacs Lisp.

Porting code between these is relatively easy or even trivial when only the core is used.


If you want immutable collections in Common Lisp, try FSet: http://www.cliki.net/FSet
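A quick sketch of what using it looks like (after something like (ql:quickload "fset")):

    ;; FSet collections are persistent: "adding" returns a new set
    (let* ((s1 (fset:set 1 2 3))
           (s2 (fset:with s1 4)))
      (values (fset:contains? s1 4)    ; => NIL, s1 is untouched
              (fset:contains? s2 4)))  ; => T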


Yeah, there are a couple immutable collections out there! I hope to get using them soonish.


* sidenote: CS61A is Python-based from 2011 on



