Dylan: the harsh realities of the market (logicaltypes.blogspot.com)
172 points by kryptiskt on Aug 25, 2014 | 118 comments



I'm the Bruce that he mentioned in the post.

For better or worse, I've been pushing Dylan forward heavily over the last few years and am effectively the primary maintainer.

Over the last couple of years, we've made a lot of progress. We've completely revived the documentation from 1990s-era FrameMaker files and have it published via a pretty modern system. We've converted from SVN to Git and moved to GitHub. We've done 4 actual releases. We've improved our platform portability. We've provided some basic debugging integration with LLDB. We've fixed some long-standing issues in the compiler and toolchain. We've improved the GC integration on all platforms.

But there's a lot to do. We need to fix our Unicode support. We need to improve the type system in at least minor ways if not major ways. We need to improve how parse failures are handled as the errors are not always friendly. We need more libraries. Some of this is really easy, some isn't. But for pretty much everything, there are bite-sized pieces of work that could be done in a couple of hours/week that would lead to significant gains.

I've wanted to just flat-out use Dylan for something, and I have built some small prototypes with it. While they worked out well enough, the actual projects themselves didn't go anywhere (unrelated to the use of Dylan).

I think this blog post was triggered by a comment that I'd made publicly yesterday that I'm feeling rather discouraged at this point. There was also a private email that I sent to 19 people who have been involved with Dylan recently, but the author of this post didn't get that email.

I view Dylan not as a language from the past, but as a stepping stone towards building a better language for the future. We don't have to get bogged down in a lot of the minutiae involved in creating a new language, as a lot of the work has been done. We get to focus on things at a different level, and those things are just as important. People often bring up Goo when Dylan comes up. Goo is interesting, but the implementation is nothing close to being industrial enough to survive an encounter with the real world.

I came to Dylan because I saw the mess that Scala and other languages were. I didn't like where they were going, and following some people on Twitter like https://twitter.com/milessabin suggests that I'm not alone.

And that's why I'll probably keep at it with Dylan. I want a better future and I'm going to keep trying to build it.


Thank you for your work and please, please keep working on Dylan.

A couple of years back I wanted to learn Dylan, but it really looked rather old and unfriendly. I read about Dylan's module system and object system and thought they were really interesting. My first impression was that they're somewhat similar to Racket's unit system and to CLOS.

I hoped for Dylan to be modernized enough to make it easier for me to learn it. I forgot about it as I learned a dozen or so new (or old) exciting languages in the meantime, but seeing improvements in Dylan - especially in the docs - makes me interested in it once again.

> there are bite-sized pieces of work that could be done in a couple of hours/week that would lead to significant gains

How does one get involved? Are there issues on GitHub? Is there a mailing list? Free time is scarce, but if I finally decide to learn Dylan, I can help with its development as well.

Again, please don't stop working on Dylan. It's only natural for such a language to have a small user base, it's nothing to be worried about. Just keep working on it and sooner or later it will become (more) popular. That's my experience with various niche languages to date, at least.


Bruce, not sure if you are interested in commercial success, but assuming you are, one suggestion: whatever you may feel about Scala, I believe Martin Odersky had the right strategy when he noted that object-oriented programming did well despite its flaws because people could build real, working things with OO languages. This probably holds even more for PHP.

So my suggestion is to do the same for your language: build a cool new MP3 player that reminds folks of the days when WinAmp was so awesome. Or build a scalable server-side framework that makes WhatsApp-scale chat easy as pie. Make it easy for folks to achieve commercial success, and your language will thrive, too.


It sounds like you care not only about Dylan but also about advancing programming language construction in general. Have you considered writing, e.g., blog posts about design decisions you make as you work on Dylan? Maybe it would help other people learn from your work.


I do ... and I post on http://dylanfoundry.org/.

I don't usually bother posting them on HN, as I don't have the time to campaign for the front page (and without that, they get no attention). I usually do post them on r/lisp or lobste.rs, though.

I've got a couple of posts in draft stage now that I hope to publish this week or next.


Since you seem well qualified to critique a programming language, could you elaborate on the problems you see with Scala?

I recently had to evaluate starting a real project with this language, but after looking at it, it didn't feel like an elegant and well-thought-out language. It felt more like a monster language that people tried to stuff as many features as they could into.

But that was just a first-glance impression, so I'm really wondering what someone like you thinks of it.


If you're going to rant, you might want to actually have a clear point to make.

> college kids on comp.lang.lisp asking for the answers for problem-set 3 on last night's homework

Surely not during the Naggum days. CLL was a hostile wasteland.

> That is the lesson of perl and python and all these other languages. They're not good for anything. They suck. And they suck in libraries and syntax and semantics and weirdness-factor and everything.

What? How have you not heard of CPAN? There is not a single language in the world that can touch Perl's libraries. I'm not sure why you feel the need to toss either Perl or Python under the bus to make some petty point about Dylan's lack of popularity. Python replaced Scheme at MIT. It's time to move on. I know I have.

You have to have your head pretty far up your own ass to not see how much Common Lisp sucks. It's a language designed by committee, and it looks like it.

I've used Erlang too. For everything Erlang does well, there are countless areas that make you want to bang your head against the desk.

Languages don't matter. Platforms matter. APIs matter. Playing nicely with the rest of the world fucking matters. Common Lisp wouldn't.


> What? How have you not heard of CPAN? There is not a single language in the world that can touch Perl's libraries.

That's a really outdated meme. CPAN is small; pretty much every language you hear about regularly has a larger package space than CPAN. Even Go's package space is bigger than CPAN's. JS and Java each have package spaces roughly 3x CPAN's size.

http://www.modulecounts.com/


CPAN is much higher quality, though. For every Perl package that works and has unit tests and documentation, there are 10 node packages someone tried for a month to write, then gave up and left at 0.01 with no docs. Perl isn't sexy, but Perl diehards have written and published modules to do everything you can imagine.


The http://www.modulecounts.com/ site counts the number of distributions on CPAN, which probably is the closest measure for comparisons with number of projects given on PyPI, RubyGems, etc.

However, GoDoc seems to be showing the number of package namespaces. If so, this would be more comparable to the module count on CPAN (see http://www.cpan.org/).

So at this point in time we have:

  GoDoc:         36,497
  CPAN dists:    30,216 
  CPAN modules: 137,603


https://metacpan.org/recent

There are a couple of CPAN module installers, and every one of them, by default, will not install the target module if there are any test failures.

Honest question: are there any other languages that do that? I last looked a few years ago, and it didn't seem to be the case.


Maven will not, by default, allow you to perform a release if there are any test failures. That seems like a better model, at least for a VM language - if something works on the release machine and not on the user's machine, you have bigger problems.


That's great to hear!

I'm not too strong on Java; does what you're saying imply that some/all/most of the freely available Java 'modules' or 'packages' that are built with Maven will end up running and passing associated test suites in most all of the organizations that end up using the code in question?


No, the tests are run as part of the deploy. So before deploying a library, if the tests don't pass, the deploy fails.

However, the artifact that actually gets deployed is a .jar file containing .class files. Users of libraries don't rebuild the libraries.


Just to clarify, "deploy" in this sense means "upload to your organization's maven repository (or maven central)".


Ok cool, I understand.

So would you guess there'd be widespread 'deploy-time' test coverage across the Java ecosystem?

Thanks again!


> Languages don't matter.

Bullshit.

Languages DO matter. Language features matter, too. Even syntax matters, although not that much.

It's easy to get trapped in a "languages don't matter" attitude if you only know a couple of languages. The perspective changes drastically with mastery - being able to use a language to its fullest, as opposed to just using it - and with dozens more learned languages.

A language is your baseline, a starting point for building things. How you build APIs, how you interact with the outside world, and how you accomplish common tasks all depend on the language and its features. And it gets even more important as you climb the abstraction ladder. For example, at the language level Scheme offers call/cc. You may say it doesn't matter, as it's just a language feature. But take a look at Racket's continuation-based web server: it's an impressive piece of work which solves a certain problem really well and is almost impossible to implement in some other languages (you don't need continuations specifically to implement it; there are other features which would enable it). At the language level Lua offers coroutines, which in itself is not very impressive. But take a look at OpenResty: you can write code which looks perfectly normal and synchronous (no callbacks, etc.) while still fitting inside Nginx's async-by-nature event loop.
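To make the coroutine point concrete, here's a toy illustration of my own (not from OpenResty or Lua) using Python generators: each task yields at its "blocking" points, so the task body reads as plain top-to-bottom synchronous code even though a scheduler interleaves the tasks.

```python
from collections import deque

def scheduler(tasks):
    """Toy round-robin scheduler: runs each generator-based task
    until its next yield, then moves on to the next task."""
    queue = deque(tasks)
    trace = []
    while queue:
        task = queue.popleft()
        try:
            trace.append(next(task))  # run until the next "blocking" yield
            queue.append(task)        # still alive: re-queue it
        except StopIteration:
            pass                      # task finished
    return trace

def worker(name, steps):
    # Reads like ordinary synchronous code, yet interleaves with
    # other workers at every yield point.
    for i in range(steps):
        yield f"{name}:{i}"

print(scheduler([worker("a", 2), worker("b", 2)]))
# → ['a:0', 'b:0', 'a:1', 'b:1']
```

The same idea, with real I/O at the yield points, is what Lua coroutines give OpenResty: callback-free code that still cooperates with an async event loop.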

Anyway, the things you mention, like APIs, do matter too, but language features are equally important. Don't dismiss them because you superficially know a few languages; rather, learn some of them in depth and build real-world things with them, and I'm 97% sure you'll see how language features matter.


I think you're missing George's point. Languages don't matter in isolation. It's the language plus the libraries plus the rest of the ecosystem, which he's calling the "platform".

If you have this language with these great features, but it has lousy libraries and therefore you have to write a bunch more stuff yourself, you need to weigh that against the great features when deciding whether to use that language. Just having the great features isn't enough. (In fact, this is a big part of how Java conquered a big chunk of the world. As a language, it's kind of a yawn. But the library is like Barbie - it has everything.)


I completely agree. As I wrote, libraries and other features of a platform do matter and of course you need to take those into account when deciding which language to use. It's always very specific to a given situation though, for example lack of "batteries included" libraries may not be a concern when choosing language to embed in your app (like Lua). But in general great language features and great platform features (for example working package manager) are both very important factors that you need to take into account when deciding on language(s) for your project.


> You have to have your head pretty far up your own ass to not see how much Common Lisp sucks. It's a language designed by committee, and it looks like it.

Er, no. You don't seem to have any historical insight into how Common Lisp came to be. (Or maybe you do have historical insight and are just being willfully ignorant.)

It is a language designed by a set of companies and institutions reaching a compromise 25 years ago - companies and institutions that had radically different operating systems and hardware. It wasn't a matter of writing software for a machine that ran either Windows or Unix; you had to support the Lisp machine vendors and a variety of mainframes and minicomputers. Emphasis on 'Common.' The theoretical possibility was that you could write a program on a MacIvory and then run it on your Symbolics machine, Thinking Machines machine, Unix machine, Windows machine, PDP-11, Dandelion, DEC Alpha, etc.

Go write a C (or Python, for that matter) program that lets you access the file system portably on a huge variety of operating systems using only the C standard. I suspect it will be difficult.

Anyway, you couldn't be any more wrong. You've limited the 'entire world' to essentially two similar platforms, one of which (Windows) isn't really that well supported.

The whole point is that it tried really, really hard to satisfy the entire Lisp world, which included a number of the larger players who were put out of business by Moore's law and cheap generic hardware rapidly outpacing expensive special-purpose hardware.

All this said, Common Lisp (and Scheme, and its implementations) have features that the general programming community is still rediscovering 30 years later. Dismissing it as something that sucks because the people involved didn't try hard enough is incredibly myopic.

tl;dr: Nu-uh.


Sure, 25 years ago there were good reasons for the design compromises of Common Lisp. But that doesn't mean they make sense today, where "Windows or Unix" is pretty much the world, and there are any number of consistent, elegant languages that work on both.

> All this said, Common Lisp (and Scheme, and it's implementations) have features that the general programming community is still rediscovering 30 years later.

A language is more than an accumulation of features. It doesn't matter how advanced an implementation is if it's not actually pleasant to program in.

> Dismissing it as something that sucks because the people involved didn't try hard enough is incredibly myopic.

I don't think anyone's saying they didn't try hard enough. But interop between CL and the rest of the world is undeniably poor - and this largely seems to be a deliberate choice by the CL community - and it's not much of a stretch to say that its unpopularity is a result of this.


So we say the language is terrible because the world around it changed? Using Common Lisp may be inconvenient because the standard is out of date, but that doesn't mean the design (or the language) was a failure in a historical sense.

I just think this whole argument is basically a straw man. Comparing something well specified that hasn't changed in 30 years to an entire class of hand-wavy 'better languages' isn't an apples-to-apples comparison, and is, frankly, a dumb waste of time.


I'm less interested in questions like "how much credit do the original designers of CL deserve?" and more interested in "what languages should I consider for my new project?". Is Dylan a viable option? If not, why not? "It's not because it inherits common lisp's non-interoperability" is then an interesting point.


> inherits common lisp's non-interoperability

Yeah, but that would be bullshit to assume. I use for example LispWorks (commercial, proprietary) and Clozure CL (free, open source) on my Mac. Both have excellent interoperability with C and Unix. Both have a native Cocoa interface to the Mac.


> between CL and the rest of the world is undeniably poor

Because CL is a language and not an implementation.

Many CL implementations have excellent interoperability capabilities.


Your comment is a bit confusing because you're basically explaining how Common Lisp was indeed a language designed by committee containing all sorts of compromises between competing vendors. That's what the OP was getting at.


I wasn't trying to contradict that. I was trying to indicate that its failures are due to how much the terrain has changed. By and large the standardization process was successful; the failure is that there haven't been any revisions since (and likely won't be).


> It's a language designed by committee, and it looks like it.

This means nothing. When a design succeeds, we say it was designed by the community; when it fails, by committee. You can produce examples for any language: for Common Lisp, the committee that designed it and the community of previous Lisp users that backed it; for Python, the community of users and Guido's mailing-list pals. There is nothing inherently superior or inferior about design by community or committee, respectively.


I think a good reference for what people mean by "designed by committee" is to look at the reference spec for a language.

I remember picking up the Common Lisp reference book in my college library. It was one of the thickest books in the entire CS section. Obviously much thicker than any books on Scheme, but even on the same level as "The C++ Programming Language". And the latter spends a lot more time explaining things than the Lisp book did.

CL is a huge and complex language, and I'm still not sure why there are 80 different ways to loop. It's a confusing language, with not nearly as much orthogonality as it could have.


Haskell was designed by committee, and it's exactly the opposite of the stereotype: it's a small language with orthogonal features and a clean syntax. The few "features" the language has, like list comprehensions and do notation, are defined by simple translations into the rest of the language.

http://www.haskell.org/onlinereport/


And conversely, Ada was designed by one man, and if you weren't aware of that historical fact and just looked at the language spec, you'd swear it was the classic example of committee design. Fred Brooks' dictum that design should be done by a single mind, or at most two, is a guideline not a rule.


>Ada was designed by one man

http://cs.fit.edu/~ryan/ada/ada-hist.html

"The Ada design team was led by Jean D. Ichbiah and has included Bernd Krieg-Brückner, Brian A. Wichmann, Henry F. Ledgard, Jean-Claude Heliard, Jean-Loup Gailly, Jean-Raymond Abrial, John G. P. Barnes, Mike Woodger, Olivier Roubine, Paul N. Hilfinger, and Robert Firth."


Right, but wasn't it structured as 'Ichbiah assisted by the others' rather than 'Ichbiah leading a committee by consensus'? In particular, iirc, there were some points at which the others almost unanimously disagreed with Ichbiah on some design decision and he overruled them and went ahead anyway.


Scheme had a very shallow standard. It's written in a dense style and lacked basic stuff like error handling or an object system.

> I'm still not sure why there are 80 different ways to loop

There are many ways to loop, because users found those useful. Lisp by tradition is not telling the developers what they have to use. It's the opposite: it gives the developers various degrees of freedom to shape the language (from macros, read macros, ... to the MOP).

You could have easily removed something like DOLIST from the standard. But why? Software before and after the standard will still use it. How about MAP ? Remove it! But wait, people have used it already and will still use it...

Lisp languages are ball-of-mud languages where users can add their own control structures.

If you look at Scheme, they have added all this and more with SRFIs. Now you get a language whose implementations support a minimal core, and each implements some subset or superset of the SRFIs...


Dylan was originally designed by committee back in the days when it was designed by people @ Apple, Harlequin and CMU.

And many of the people involved with that were also involved with the Common Lisp standardization (like David Moon, Scott Fahlman, etc).

Common Lisp is a pretty interesting example, but due to the politics of the various companies involved, the vast amounts of code in each of the various Lisps, and so on, I don't think it is a fair reflection on "designed by committee". It is just what that committee was able to design given the constraints imposed upon them.

In many ways, Dylan was a stripped down and much more minimal Common Lisp, but with aspects of Scheme as well. But Dylan was designed from a green field, while Common Lisp was designed with a number of existing Lisps in mind that each had a stake.


Common Lisp had different goals from Dylan. Common Lisp was designed as a powerful Lisp dialect, incorporating ideas from 20+ years back plus some new stuff. It was designed to be backwards-compatible with Maclisp and its dialects.

Dylan was designed as a new language (compatible with nothing) for application development and delivery for small machines.


Alternatively, look at R6RS[1]. 90 pages, including the index. That language is also designed by committee, and looks like it, too. An effective committee, that is.

[1]: http://www.r6rs.org/final/r6rs.pdf


> I'm still not sure why there are 80 different ways to loop

Please ignore if you're asking rhetorically. But there are reasons why it was put together in kind of a wild political dash. See: (https://groups.google.com/d/msg/comp.lang.lisp/Llmnxk2SYUk/y...) and (http://www.dreamsongs.com/Files/HOPL2-Uncut.pdf)

An evolving, used platform which was (suddenly?) pressured to freeze.


I am vaguely aware of how CL came to be, but was unaware of this report. I'll give it a read, thanks.


> When a design succeeds, we say it was designed by the community; when it fails, by committee.

Extremely well said.


> You have to have your head pretty far up your own ass to not see how much Common Lisp sucks. It's a language designed by committee, and it looks like it.

The original design was by four people. Common Lisp does not look worse or better than the Lisps it was designed to improve on (Maclisp, Lisp Machine Lisp, NIL, ...) and the Lisps it made mostly obsolete (Portable Standard Lisp, Interlisp, ...).


> Languages don't matter. Platforms matter. APIs matter. Playing nicely with the rest of the world fucking matters. Common Lisp wouldn't.

This. Playing with CL inside CL is fun and nice. As soon as you try to interface with the outside world and actually do almost anything, you run into trouble. I mean, a filesystem API that was designed before we settled on hierarchical filesystems? Lisp does not play nicely, well, or easily with the outside environment it is running on (in my experience).

A language can be ugly, but if it has all the libraries, then you can get so much more done so much quicker, with so much of the work done for you, that people will go there.

Again, a little progress has been made with things like Quicklisp finally providing a better (or first proper) package management system for CL, like most newer languages have had all along. But still: too little, too late. Given time and work, maybe it's resuscitable. We'll see.


Hmm, where is the ISO C++ standard to access the Windows file system?

If you were using say, Allegro CL on Windows, I doubt you had any difficulties accessing the file system.


http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n394...

Not finalized yet, but soon: https://isocpp.org/std/status

(It is basically a standardization of boost::filesystem)

Here are the MSDN docs: http://msdn.microsoft.com/en-CA/library/hh874694.aspx


That's great. But it's POSIX, not native Windows.


The libraries abstract over most filesystem warts; see Practical Common Lisp for an example.


"Languages don't matter. Platforms matter. APIs matter. Playing nicely with the rest of the world fucking matters."

Nice, I like that.

Reminds me of the Discourse folks, who rejected e.g. PHP and wrote it in their pet language, ignoring that no one cares about the language, and then found that their platform is so difficult to install and use that no one uses their software.


> no-one cares about language

Yeah, why should users ever care about language? Users generally don't know what the heck this "language" thing is. At least if it's not French, English, Chinese or something like that.

Now, for developers, not caring about a language they use is very short-sighted, to say the least.


> the Discourse peeps, who rejected eg PHP, and wrote it in their pet language, ignoring that no-one cares about language

Are you talking about Ruby/Rails?


The OP is being overly dismissive, but has a point -- there's a market for Ruby, but in terms of most of the use cases for online bulletin board software, both Rails and Postgres are much rarer than PHP and MySQL.


Read the whole post; everything sucks. It is Salinger + P. K. Dick + Vonnegut + Alan Watts.


Clojure is pragmatic in using the JVM as its platform.


The article hits rather close to home, as someone who, for better or worse, is committed to improving Common Lisp's ecosystem. I often feel this existential dread of "Is it worth it? Whose lives will it change? Will I spend years labouring in obscurity for nothing?". And while the answer to all those questions is probably not what I'd like to hear, I still do it.

This is why ecosystems are, for most people, more valuable than intrinsic language features: Tribalism along the lines of "We have X thousand people backing us up"/"X thousand devs can't go wrong". People don't care about monads or macros, they care about feeling like they're part of a large community.


As someone also invested in Common Lisp, it... hit really close to home.


It's even worse for the PL designer inventing something new without even a small ecosystem in place; I ask myself every day, "is it worth it?"


The important question isn't necessarily "Is it worth it, without the ecosystem?" The more interesting question is: what does this language add?

For example, I need to deal with some Unix process management tasks. I'm not crazy enough to do them in straight C, so I'm using Python for its strong C bindings, so I can use C system calls with Python-level control flow. However, there's a lot to be desired, because dealing with all this concurrency and the inherent raciness of the system in a language like Python is a royal pain in the rearside.
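As a minimal sketch of the kind of syscall-level plumbing described above (my own illustration, assuming a Unix host), Python's os module exposes the raw calls like fork and waitpid directly, so the process logic stays in Python:

```python
import os

def run_child(exit_code):
    """Fork a child, let it exit with exit_code, and reap it."""
    pid = os.fork()
    if pid == 0:
        os._exit(exit_code)          # child: exit immediately, no cleanup
    _, status = os.waitpid(pid, 0)   # parent: block until the child exits
    return os.WEXITSTATUS(status)

print(run_child(7))
# → 7
```

This works for one child at a time; the pain the comment describes shows up once several children, signals, and timeouts interact, since those races exist below any language.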

It's a similar thing with Java and performant programming. Some of my major programs at work don't use the Java ecosystem outside of log4j, JUnit, Apache Commons Lang, and Apache Commons Collections. They use Java because Java is darned fast without being as brittle as C or as nuts as C++. So Java adds a lot of really cheap development security.

Designing yet another language that puts the existing features together with just a tiny difference doesn't add anything. I've enjoyed looking at experimental languages, and I enjoyed tinkering around with ideas myself some time ago. But by now, I've accepted that I'm not smart or talented enough to make something worthwhile, and it will be some time until an interesting new breakthrough occurs.


My languages tend to be very innovative or even inventive; e.g. I recently posted this to HN: http://research.microsoft.com/en-us/people/smcdirm/managedti...

My problem is quite the opposite: when the languages are so different from what already exists, people (and even myself) have trouble thinking about how adoption would even occur, at least in the short term.


A lot of languages have made one major mistake: It's not easy to get a development environment going, and some even wanted me to not use vim.

The first step to some kind of adoption is that I can get it running with an "aptitude install crazy-foo-language-dk", maybe after adding a repository. After that, crazy-foo or crazy-fooc should be interpreter and/or compiler, and that's that.


Ya, well, my languages include their own development environments (I see no difference between language and IDE, actually). Smalltalk was great in this regard also, to the chagrin of many developers who wanted to use vim.


Have to figure it takes 10-20 years to grow an industrial-strength language. It takes a lot of faith in an idea, or stubbornness (probably both), to go through that struggle. I admire people with that kind of dedication, and I wish exploring the language design space weren't such an enormous investment.


It is worth it. And in your case, it isn't the language per se, but spreading the ideas. The language is an ephemeral vehicle.


Hi eudox, I would love to use CL for my projects - I've read (most of) Norvig's and Graham's books, written small programs etc. I really like most of the language, especially the 'break into debugger on error'.

But when I came to write CL for a project that needed a GUI, web scraping, using APIs, etc. (i.e. I wanted to connect it to the outside world), I was stumped. Unlike in e.g. Python, there was no nice GUI library that I could just use. Now maybe people will reply telling me that something exists, but compared to Python, the difference in ecosystem is massive. (Possibly some commercial CL environment has this, but I am unwilling to pay for something that is closed.)

I really want to use CL for more than just toy programs, but I don't/can't.

So, when you are asking yourself "Is it worth it? Whose lives will it change?" - the answer is mine, but only if you think you are able to fix this.

Good luck.


I've had good luck using Haskell for similar things that you are describing. Hackage has a good variety of packages that do similar things to beautifulsoup, numpy, pandas, etc. Generally, I've found them to be less built up, but they still provide a decent level of abstraction. Cabal hell is real though.


Regarding CL, with some quick googling (I searched "common lisp html parser" and "common lisp http client") I found:

Html Parser: http://www.cliki.net/cl-html-parse

Http client: http://weitz.de/drakma/

I'm not sure why you are/were stumped. I could search for Qt bindings for you, but I'm sure they exist and are pretty mature.


Thanks for that. Maybe I will try again some time.

I'm not saying it is impossible, rather that it is hard work compared to e.g. Python. I recall looking through a few promising Google results and finding e.g. that the library is no longer maintained, has bits missing, etc. With Python, when I use e.g. Beautiful Soup, it just works; I lose almost no time getting set up. When doing a project that needs many external libraries, that makes a big difference.

e.g. the first item on http://www.cliki.net/GUI has its last news item from 2007, and the mailing list gives 'not found'. But there is a new(er) github page with the comment "I cleaned up the library just in case I needed GUI in Lisp, but it turned out that I did not. Hence, the primary extent of my testing is running test-gtk:gtk-demo application. Bug reports and/or patches are welcome."

Since the parent was talking about ecosystems etc, I thought my experiences were worth mentioning - "Python novice uses python to get stuff done" but "CL novice gets lost in old/unmaintained libraries, gives up and sticks with c++ & python".

I mean no disrespect to the people who write these libraries etc., I am just describing what I found as a novice in CL.


Many people who actually write GUI software in Lisp use one of the commercial offerings, typically either LispWorks or Allegro CL. That's one of their advantages: they offer maintained, cross-platform GUI libraries.

One can for example also use Clozure CL for advanced GUI stuff, but that is best on the Mac.

For example http://opusmodus.com is a composition software using Clozure CL.

Other composition environments are written with LispWorks: PWGL, OpenMusic, Symbolic Composer, ScoreCloud.

http://www2.siba.fi/PWGL/

http://repmus.ircam.fr/openmusic/home

http://www.symboliccomposer.com

http://scorecloud.com


Thanks for the info. I just had a quick look at Clozure CL - as you say, the main GUI library is for Apple. The other GUI library is labelled 'under development' but also 'last modified: 6 years ago' :(

I wonder whether the (I'm told) good commercial lisp offerings have hampered lisp adoption, in that people who use lisp a lot, pay for commercial offerings, leaving the free stuff less used and maintained? Then newcomers, who don't want to pay to do hobby projects in a new language, are discouraged by the less maintained free stuff.



> These languages are defined, right there in the dictionary.

Erlang: see 'career wrecker.'

Please. Someone, wreck my career some more.

Unlike Dylan, Erlang was created by a company for a purpose, with very clear goals, and it did and still excels at meeting those goals; nothing out there gets close to the qualities it has. Not everyone needs those qualities, but sometimes nothing else will do. Erlang is at the core of many solid industrial applications -- mobile-to-internet gateways, message queues, trading systems, large databases, handling millions of concurrent connections and billions of messages per day for WhatsApp.

What does Dylan do? This is only the second time I've heard about Dylan. I've played with Mercury, Prolog, Nimrod, Curry (Haskell + logic programming) and other rather obscure languages, but I haven't heard much about Dylan.

Some languages just don't make it; sometimes it is just luck. However, I don't like the disparaging and angry remarks thrown around at other languages and ecosystems. That does nothing to promote Dylan; it only pushes people away.


> What does Dylan do?

I'm totally unfair here, because I really don't know much about Dylan, but my view maybe explains a bit why Dylan does not even "win" with people like me, who really like those strange languages like Lisp or ML.

To me, Dylan was never about creating something new and great. It was taking Common Lisp and "fixing" its syntax.

Unfortunately, I can't stand Lisp without prefix notation and lots of parentheses. And people who hate prefix notation and lots of parentheses aren't interested too much in Common Lisp.

The resulting set of people interested in Dylan was... well, I've heard of Bruce. Andreas Bogk was doing some heavy advocacy in Germany, to the point where people just didn't want to listen anymore.

Let's say about ten people in the world cared. And I just don't see how that set could grow by orders of magnitude.


Dylan started out with a prefix syntax; the infix syntax was added on late in its development at Apple, because it was believed the market wasn't willing to adopt a prefix-syntaxed language. Turns out it didn't matter very much.

Dylan also standardized and simplified a lot of the dark, twisty corners of Common Lisp. Generic functions became a core part of the language, rather than something bolted on by ad-hoc macro packages. The collections API was unified to use generic functions (so no more mapc/mapcar/mapcan/mapl/maplist/mapcon mess), and the language syntax's infix operators were all defined in terms of generic functions as well, so you could eg. define a matrix type and have + and * work on it as expected. The superclass linearization was fixed to be monotonic. Macros are hygienic. It's a Lisp-1, and everything in the language is an object bound to a name.
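The "generic functions all the way down" idea has a rough single-dispatch analogue in Python's standard library; a sketch of the flavor (Python and single dispatch only, whereas Dylan dispatches on all arguments; the `describe` and `Vec` names are mine, purely for illustration):

```python
from functools import singledispatch

# A generic function: a base implementation plus per-type methods,
# loosely analogous to Dylan's `define generic` / `define method`.
@singledispatch
def describe(x):
    return "object"

@describe.register
def _(x: int):
    return "integer"

@describe.register
def _(x: list):
    return "collection"

# Operators dispatching on a user-defined type, as Dylan's + does via
# generic functions (Python routes operators through dunder methods).
class Vec:
    def __init__(self, xs):
        self.xs = list(xs)
    def __add__(self, other):
        return Vec(a + b for a, b in zip(self.xs, other.xs))
    def __eq__(self, other):
        return self.xs == other.xs

print(describe(3))                      # integer
print((Vec([1, 2]) + Vec([3, 4])).xs)   # [4, 6]
```

In Dylan the collection protocol and the operators are the same mechanism, so there is no split between "methods" and "functions" as in the example above.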

Honestly, I think its lack of success came down to a Worse-is-Better problem. Dylan is a really nice language, designed by a committee with a combined total of over 2 centuries of Common Lisp experience. But it doesn't excel in any one domain. It's a large language, fairly complicated, which tried to mainstream a number of features with complex interactions and limited applicability. To an outsider, who doesn't understand those features or how they might be useful, it's too hard to pick up in a weekend. And so simpler languages like Scheme, Go, Python, or Erlang steal its thunder - they lack the total power of Dylan, but you can point to Scheme and say "Small composable building blocks", or Go and "Fast concurrency with quick, simple deployment", or Python and "Prototype anything, with batteries included", or Erlang and "Reliable message-passing concurrency." If you point to Dylan and try to describe it in one sentence, you get "Generic functions all the way down", which means nothing to the average programmer.

It's much the same problem that Scala faces, except that Scala features JVM interop and so Scala advocates can say "Tomorrow's language features with yesterday's Java code."


Dylan was not designed for Lisp, Scheme or Smalltalk users. It took ideas from there, but it was developed as a replacement for C and C++ for application development. If Apple had actually used it, stuff like Keynote, Logic, iPhoto, iTunes, Xcode, ... would have been written in Dylan.

Apple used Objective-C for that then.


Dylan mostly died before practical applications were written in it. I doubt there were more than ten serious applications written in it (besides its own IDE and compilers).


Dylan was my favorite language back in college. I remember following PG's essays to Lisp, and then Lisp to all the newer dialects like Dylan or Goo. Dylan had it all: a metaobject protocol, generic functions, optional static typing, infix macros. I even got started working on an Eclipse plugin for it, which I ended up shelving after like 3 weeks when I lost interest.

Unfortunately, there are large network effects to programming languages, and the stuff that really makes you productive - libraries and tooling - Dylan just lacked. It wasn't practical to write anything larger than an ICFP contest entry in it. So I went from Dylan to Python, which lacks many of the really cool language features and is a lot slower, but at least comes with so many batteries included that you can whip up a prototype for anything really quickly.


I don't really think it's purely tooling that's the issue. If it were so easy to use Dylan, then the tooling would be easy to write.

I think it's performance.

"Scripting languages" tear compiled languages apart when it comes to iterating. Write a "hello world" web app page in Python and Scala, then see which one takes 30 seconds and about a gig of RAM to actually display in the browser.

On the other hand, if you needed performance scaled across millions of users, you needed C or C++.

The big languages are C and C++ and Fortran (if you need performance), a long laundry list of "scripting" languages, and Java. Java is the only one that actually gets a spot because of its tooling, most other languages only got tooling because people loved the languages.

There's a lot of compiled languages coming out (or re-emerging) these days (Scala, Haskell, Swift, Julia) with funky features that require compilation that don't get you closer to the metal. I suspect that's largely because computers now have the RAM and CPU power to actually make them fun to use.


The combo of C(++) and Python is very strong. A friend of mine works in a research HPC lab (think around 16000 cores); they use Python for the IO and interface glue and C for the heavily parallel science code. It's the best of both worlds.


Dylan is compiled. It was done by the same folks who did CMUCL, which was the original optimizing compiler for dynamic languages, built in the 80s, well-before Strongtalk begat Self begat Hotspot begat V8.


Typo: Self begat Strongtalk.


Back when Dylan was being developed, C and C++ compilers produced bloated slow code on home computers. Assembly was the only path for real performance.

We were blessed with choice of having compilers for systems programing for Pascal dialects, Modula-2, Oberon, Basic dialects, Forth...

Somehow along the way the UNIXification of the enterprise brought the wrath of C into these other ecosystems.


You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture ... So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.

A Conversation with Alan Kay, ACM Queue, 2004, https://queue.acm.org/detail.cfm?id=1039523


The history of mass computing involves numerous 'bottleneck' events where an increase in usage was bought with a curtailment of hardware and software quality. You had the first blast of cheap minis available outside military and academic environments. Then you had cheap home computers with limited BASIC implementations. That was followed by web apps scripted with half-assed '90s JavaScript and served from cramped mass hosting servers. Then you had smartphones -- iOS using a somewhat spruced-up but constrained version of the decades-old NeXTSTEP, and Android using an outdated and pared-down version of Java.

This is pretty much what "Worse Is Better" is about. Cheap, readily-available software that runs on cheap, readily-available hardware is always going to have a huge head start.


Unix is another example of a worse reimplementation of an earlier system to fit the hardware constraints of the day.

The thing is, though, the security model of Multics would be a much better fit for today's security needs, but we don't have it because the hardware that could run it was too expensive 40 years ago, which seems crazy when I think about it. Sometimes it feels like the industry as a whole is no longer ambitious, that building fundamentally better systems is no longer considered important. It's nice that you can run Unix on your phone, but I would like to run something better than Unix on my desktop. Where are the OSes ambitious enough to eventually turn into Scarlett Johansson in the movie 'Her'?


> Algebraic types? Dependent types? You'll never see them. They're too ... research-y. They stink of academe, which is: they stink of uselessness-to-industry.

One may think that because closures are finally entering the mainstream (after what, 5 decades?), we have hope for those things to come as well.

But then I saw Swift. Built-in support for an Option type, so one can avoid null pointer exceptions. At the same time, this language manages to recognize the extreme usefulness of algebraic data types without using them in their full generality. Like, why bother with a generic feature when we can settle for an ad-hoc one?

I'd give much to know what went so deeply wrong in our industry that we keep making such basic mistakes.


> But then I saw Swift. Built-in support for an Option type, so one can avoid null pointer exceptions. At the same time, this languages manages to recognize the extreme usefulness of algebraic data types, without using them in their full generality. Like, why bother with a generic feature when we can settle for an ad-hoc one?

Swift has the syntax to define arbitrary algebraic datatypes, even if it doesn't yet work in the beta versions of the compiler.


It seems likely that 1.0 will hit before this really holds true. In Beta 6 I'm told you can easily crash the compiler with recursive generic ADTs, and most have to travel "through" some heap type to compile at all.

Furthermore, Swift doesn't support enough laziness/deferral/coalgebraic formulation to have, say, an infinite stream type without breaking GCD. These will probably be fixed in time, but Swift's ADT support is still pretty experimental to say the least.


Wow, where did you get Beta 6? I only have 5.


This is based off comments someone else had on trying to compile some of my experimental Swift modules.


> I'd give much to know what went so deeply wrong in our industry that we keep making such basic mistakes.

I don't think we can all agree on what counts as progress. Some saw exceptions as the advance in error handling we needed, while Go reverts to error codes. People still think Go is superior, for different reasons. I think both suck and prefer conditions and restarts as in Common Lisp.

It is rather difficult to build a programming language from a set of axioms we can all agree on.


>Some saw Exceptions as the advance in error handling we need while Go reverts back to error codes.

Just a note that Go generally uses strings for error handling, not error codes. This avoids the need to look up the meaning of each error code in a table somewhere.
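The errors-as-values style being contrasted with exceptions can be sketched in Python (a toy `parse_port` of my own, not actual Go code; Go returns a separate `error` value, mimicked here with a tuple):

```python
# Go-style error handling: return (result, error) and check explicitly,
# rather than raising an exception. The error carries a message string.
def parse_port(s):
    if not s.isdigit():
        return None, f"invalid port {s!r}"
    n = int(s)
    if not 0 < n < 65536:
        return None, f"port {n} out of range"
    return n, None

port, err = parse_port("8080")
if err is None:
    print(port)        # 8080
port, err = parse_port("http")
print(err)             # invalid port 'http'
```

The trade-off the thread is circling: the error path is explicit at every call site, but the "meaning" of the error lives in a human-readable string rather than a code or a typed condition.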


Testing strings for errors is another example of a left turn in Go's design, especially in a time of localized applications, or OSes that change messages between versions.


How are errors typically localised?


Not.


> I'd give much to know what went so deeply wrong in our industry that we keep making such basic mistakes.

The vast bulk of the industry is much more practical than theoretical. They don't care about your theory about how languages ought to be built. They care about solving the problems that are actually hindering programmers who are trying to write programs.

"But", I hear you say, "null pointer exceptions are one of those problems!" True. "And algebraic types can fix that!" Also true. But here's the thing: (Almost) Nobody cares. Nobody thinks that algebraic types are a price worth paying to fix null pointers.

Do not automatically assume that you are right, and that everybody else is too stupid to see it. Instead, try to expand your mind far enough to see that they may have a better grasp of the trade-offs that confront them than you do. They think your solution doesn't work in their world. Bemoaning their stupidity is the lazy way out. Instead, try to find out why they think that.

I think the problem is not with the industry. The problem is with your expectations of the industry.


In other words, the industry is thinking short term. Often very short term. They only see what's in front of them. The next problem to solve, the next developer to hire, the next library to use…

So, the industry makes this analysis: yes, I could spend a few days learning about sum types, but it won't save me nearly as much time in avoided null pointer exceptions over the next month. So, no, it costs too much.

I guess we just have to live with this systemic irrationality. I guess long term thinking is just too much to expect. I guess decades old scientific results are too bleeding edge to risk employing them.

I can only think: not fast enough!


No, you either don't understand what I said, or you're trying to make it say what you want. You're not at all saying what I said in other words.

You think the industry is too short-sighted to know what's good for it. I'm saying that you're too narrow-minded to know what's good for the industry.

You think you know better than the industry what the industry ought to be like. I think you're wrong. I think the people in the trenches know better than you how to solve the problems they face. A choice can be different than yours without being stupid or short-sighted.


Swift has discriminated unions, and IIRC Optional is one example. It's defined as something like:

  enum Optional<T> {
    case None
    case Some(T)
  }
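For readers outside Swift, the same sum-type idea can be mimicked with a tiny hand-rolled tagged union; a Python analogue (the `Some`/`Nothing`/`find_even` names are illustrative, not from any library):

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class Some(Generic[T]):
    value: T          # the "has a value" case, like Swift's Some(T)

@dataclass
class Nothing:
    pass              # the empty case, like Swift's None

def find_even(xs):
    """Return Some(first even element) or Nothing() -- never a bare null."""
    for x in xs:
        if x % 2 == 0:
            return Some(x)
    return Nothing()

# The caller must check which case it got before touching .value,
# which is exactly what spares you the null pointer exception.
hit = find_even([1, 3, 4])
if isinstance(hit, Some):
    print(hit.value)   # 4
```

Swift's compiler enforces that check; in a dynamic language the discipline is only by convention.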


> closures are finally entering the mainstream (after what, 5 decades?)

8 decades, to be precise. They predate electronic computing - Church published his original paper in 1932.

It still amazes me that it's only in the past few years that widely-used languages have begun adopting them.


Elements of the idea were in Church's work, but I think it wasn't until the 1960s that anyone thought of closing a lambda by using the lexical environment to bind its free variables. And it wasn't until the 1970s that Sussman and Steele really drove the idea home with Scheme.

Before then even LISP didn't have closures. It was based on Church's ideas, and it did need to deal with figuring out how to bind a lambda's variables in order to evaluate it. But it was a dynamically scoped language, so capturing free variables from an anonymous function's lexical scope (which is what a closure does) didn't really make sense for it. Instead it just used the execution context to bind variables.
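The lexical capture described above is easy to demonstrate in any modern language; a small Python sketch (a hypothetical `make_counter`, chosen only to show each closure carrying its own environment):

```python
def make_counter(start):
    count = start          # free variable captured from the lexical scope
    def step():
        nonlocal count     # rebind the captured variable, not a global
        count += 1
        return count
    return step

c1 = make_counter(0)
c2 = make_counter(100)
print(c1(), c1(), c2())    # 1 2 101 -- each closure has its own `count`
```

Under dynamic scoping, as in early LISP, `count` would instead be looked up in whatever environment happened to be active at call time, so this pattern simply couldn't work.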


I don't think everyone knew how to implement lambda properly, but the calculus was clear enough. Hell, the original calculus was both lexical and substructural!


Dylan is also, unfortunately, an example of "worse is better" in action. The extant Dylan implementations were incredibly ambitious. CMU's d2c built on the experience with CMUCL. Harlequin Dylan (AKA OpenDylan) was "Dylan all the way down" with a sophisticated native code compiler and an IDE written in Dylan. Multithreading, generational GC, etc.

And what filled the dynamic language niche? Interpreted languages like Ruby and Python that have yet to achieve 1970's levels of implementation sophistication. But simplicity made them agile and portable and allowed resources to be spent on libraries.


Dylan wasn't really a 'dynamic language' in the classical sense. It was meant to bring the power of dynamic languages to application developers.

Dylan was made obsolete by Objective-C at Apple, not Ruby or Python. Outside Apple Dylan was made obsolete by Java.

Some of the early Dylan users complained that too much time was wasted on the IDEs, while the language implementation and libraries were lacking.


I appreciate this article by someone who is serious about contributing useful solutions to the world--not just the social aspect of programming--and appreciates a language that empowers him to develop those useful solutions as readily as possible.

When I'm not programming I like to get some distance from my work and hang out with people who have diverse interests. When I'm serious about programming I use Common Lisp. When I'm serious about connecting with other people I use English. Many people seem to confound these pursuits and end up with languages that compromise weakly between talking to people and talking to computers.

For me, programming is about solving business problems ASAP in a manner that is amenable to a long series of minor improvements over many years. Having a stable language standard with language improvements happening as add-on libraries is a huge win. My old code keeps working, so I can stay focused on improvements instead of bailing water.

Also Lisp has the seemingly magical property of being one of the easiest languages to read, understand, and reason about by programmers who have the aptitude to learn it--and it scares the pants off of people who don't. With all of the "expert programmer" pretenders out there it's helpful as an employer to have something that separates the serious programmers from the pretenders.


I fully agree. Maybe it's just me, but that sounds much more hands-on to me than the typical HN comment.


If you want to gauge a language, look at one thing and one thing only: who is using it, and what are they using it for? The answer to that question is more important than anything else on the list after that. Amen.


Good one. Used this heuristic to select Erlang and very happy since that time :-)


There are a lot of things that strike me as being wrong about this essay. Most importantly, to understand the success or failure, you have to think about it, and its economics, like any other product on the market. Granted, programming languages have different characteristics as products than lighthouses or tv shows or telephones, but they do share things with them that we can learn from. This is a somewhat dated attempt of my own to share a bit of that thinking:

http://www.welton.it/articles/programming_language_economics

A few other things:

* Like rdtsc says, Erlang does not fit the mold in a lot of ways. It had a large corporate sponsor from the get-go, which was good for it in some ways (money for developers), and perhaps bad in others: lots of production code early on means it's not possible to change stuff that is less than optimal.

* As per my article, new/small/unpopular languages need a niche, a beachhead if they are to gain traction. You can't create a new language and platform from scratch with an ecosystem as big as Java's (indeed, piggybacking on the JVM is a popular strategy because of this), so you'd better have one thing where you absolutely kick ass. Erlang has this in spades, for instance. Ruby had Rails. Tcl had Tk and a few other "killer apps". PHP was way easier to get started with than mod_perl, back in the day. I don't see this for Dylan, particularly, but then I don't know much about it, so maybe it's there somewhere, and BruceM will figure it out and the language will gain a following.


I don't know that much about Dylan, but isn't there some consensus that Julia is the "new Dylan"?


I really like Dylan, but it has a huge problem that never gets discussed - verbosity. The syntax is entirely too verbose for a modern language. My suggestion would be to first complete the Intellij plugin and then actually change the syntax so it can compete with the scripting languages.


Very interesting read.

I have a soft spot for Dylan from magazine articles back in the Newton days.

Here we have a Lisp-like language with a more approachable syntax for the average Joe/Jane developer, AOT compilation - and Apple killed it.

I think it is also important to bring out the paper from Erik Meijer about using Visual Basic to hook the typical enterprise developer into FP (via LINQ).

http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.118....


You must find the corporate developer who cares ;-) http://steve-yegge.blogspot.se/2010/12/haskell-researchers-a...

And then maybe try to get more like him/her.


Something I never quite understood about languages was why they are restricted to one or another domain. For example the author writes:

Algebraic types? Dependent types? You'll never see them. They're too ... research-y

Why can't those features be baked into C++ or Java?


I think C++ is full now.


People keep saying that yet the standards committee keeps nailing new tentacles on.

http://c2.com/cgi/wiki?ExtraLegsOntoaDog


Well as far as I know neither language is clever enough to allow arbitrary proofs to be expressed at compile-time. (I hear that C++'s template system is Turing-complete, which may or may not be true, but even if it is, that doesn't mean that dependent types could be expressed in a way that's easy to write/read.) So you'd have to essentially rewrite a big part of the compiler if you were to "bake dependent types" into C++ or Java, which is of course easier said than done. Not to mention that dependent types are extraordinarily complicated and have only been implemented in research systems.


Java has a fixed assembly-language level syntax. Look to Scala or F#


Well, you can add ADT's to C/C++ with a macro system: https://github.com/eudoxia0/cmacro/blob/master/t/good-macros...


I love Dylan. We should all be using it instead of C++ and Java. Particularly C++. Each time I type "friend" in C++ or have to use the STL, I miss Dylan.


just say fuck! we're all thinking it!



