
Same for FORTH. I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room; I can't imagine the discussion terminating.

Over the years, as a left-handed person, I've come to see this as evidence of underlying diversity in thought patterns that can't simply be learned/trained around. Certain people find certain metaphors and modes of computing much easier to work with than others. Few people on either side can see this. The result is that you have people who, having felt like they were fighting scissors all their lives, suddenly discover that a different sort of scissors exists that gives better results, and start evangelizing it to everyone. The rest of the world tries it, finds it impossible to use, and concludes the weird-scissors advocates are mad.

Neither the LH people holding LH scissors nor the RH people holding RH scissors are "wrong". It's just the RH people trying to use LH scissors who are having a bad time.

(The limit case of this is a few people who are using languages which are uniquely tailored to themselves and absolutely nobody else; colorforth, the late author of TempleOS, Urbit, etc)



I actually think it's a lot more about initial training and corporate sponsorship.

People would rather die/retire than change how they program. The assembly guys died/retired rather than learn structured languages. The imperative people died/retired rather than learn OOP. The OOP people are currently dying/retiring as languages shift to be much more functional.

ALL of these things have existed for 50+ years, so why did they take so long to be adopted (now that people almost universally agree they are better)? It's all about what people are taught and use early in their careers. With the exception of 5-10% of outliers, everyone takes what they learned in school, adds a little more as they are learning to code, then cements their synapses into that pattern for the rest of their career.

A conflating factor here is corporate backing. If you look at the popular programming languages, only a couple became successful without a large corporate sponsor. Corporations tend to be conservative in their investments too. This creates a secondary effect where even when you encounter a superior language pattern, you cannot use it because your company isn't likely to pay you to rewrite complex libraries in a new language.

JS pushed the world so far toward functional programming because even though it was different, it was the only standard. OOP devs in the 90s and 00s would rather die/retire than actually learn JS and more functional programming paradigms. But because it was the standard, big companies like Google and MS were forced to pour in resources and make it accessible. In turn, that has led to a glut of functional features being bolted on to other languages now that a generation of programmers has adopted them.
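To make the "functional features bolted on" point concrete, here's a minimal sketch in Python (picked just as one mainstream example; any of the languages mentioned would do) of the staples that a 90s OOP curriculum largely ignored and that are now taken for granted:

```python
from functools import reduce

# First-class functions and closures: the core of what JS normalized.
def make_adder(n):
    return lambda x: x + n

add_ten = make_adder(10)

# map/filter/reduce: the functional staples now grafted onto most languages.
evens_squared = [x * x for x in filter(lambda x: x % 2 == 0, range(10))]
total = reduce(lambda acc, x: acc + x, evens_squared, 0)
```

None of this requires an FP language, which is exactly the point above: the features spread without the paradigm shift.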

I'd hypothesize that if Eich had made a Scheme instead of JS (as he originally intended), we wouldn't be having this debate today. It would have forced corporate backing of a lisp and of a functional language. The current crop of devs would have been introduced to both of these ways of thinking and nobody would question the merits of a lisp-like language compared to ALGOL/C derivatives that have been foisted into mainstream education and corporate support for the last 50+ years.


> The OOP people are currently dying/retiring as languages shift to be much more functional.

I don't know what programming language landscape you're looking at, but to me it doesn't look like that at all. I grant you that more and more languages are getting functional bits glued on. That doesn't make them FP languages, though. And my perception is that they aren't being used as FP languages. They're being used as other-paradigm languages that can do a bit of FP when desired. If people actually wanted an FP language, they'd switch to one. And they aren't.


> People would rather die/retire than change how they program

As the saying goes, academia advances one funeral at a time.


More strongly (and accurately, I think): The world advances one funeral at a time.

Good saying, in any case, however strongly one wishes to phrase it. :)


> This creates a secondary effect where even when you encounter a superior language pattern, you cannot use it because your company isn't likely to pay you to rewrite complex libraries in a new language

I think the software dying or retiring is perhaps a more important factor than the developers, who can be flexible if the money/kudos is there for them. Rust has been picked up in various places to "forcibly retire" C/C++ programs which have had too many CVEs in their lifetimes; the only reason they were not retired earlier is the lack of a language with the required properties. If there were a bug-free, highly performant implementation of SSL in Lisp, that might encourage people to take a look at adopting it despite their lack of familiarity.

> ALL of these things have existed for 50+ years

Especially Lisp. Lisp has been around for a long time. At this point I think it's suffering from a "true communism has never been tried" argument which overlooks that people have tried it and takeoff has not been achieved, and its advocates continue to blame everyone else (as in your post) rather than engage in a little introspection or reflection.

> I'd hypothesize that if Eich had made a Scheme instead of JS (as he originally intended)

As the world turns, WASM now has a canonical format in S-expressions.


I’ve seen the programmer battles over programming paradigms in years past. They’ll pick inferior tech simply because it’s what they know.

I’d argue that alternatives to C/C++ exist, but they aren’t so C-Like, so nobody wanted to try them.

Unlike communism, Lisp has been successful at pretty much every place it’s been tried until someone showed up and demanded everyone change to their preferred language.

Ironically, WASM is the least usable “lisp” ever created because of its stack design.


> Neither the LH people holding LH scissors nor the RH people holding RH scissors are "wrong". It's just the RH people trying to use LH scissors who are having a bad time.

True, and good insight, but it doesn't mean that all scissors, or all languages, are equal. They can all perform computation, but a person with a particular language might create incredible things that another person with another language would have a hard time with.

Let's appreciate the difference in tastes, which reflects our differences in thought processes and approaches to the world; but that does not mean that every language is the same, nor that every person is. A Lisp virtuoso will build tools which are completely alien in design and operation compared to tools built by a Java virtuoso.

I really don't want to take this off-topic, but this fallacy is very present in our modern approach to human diversity, where instead of celebrating our differences, we simply reduce them to a one-size-fits-all approach. There is enough space for everybody, and the result is effectively the same, but you simply can't replicate a Lisp-wielding Paul Graham in any other language.


>, but you simply can't replicate a Lisp-wielding Paul Graham in any other language.

But anyone who wants to really dig deeper will wonder about replicating which particular aspect of PG's productivity.

If the aspect they care about is the ViaWeb ecommerce site that he sold to Yahoo for a few million, then all of PG's essays[1] about (paraphrasing) "Lisp syntax being their secret weapon to agility and flexibility compared to others using inferior blub languages" ... isn't going to be convincing to others who notice that Amazon ecommerce was built on Perl/Java/C/C++ instead of Lisp. And later, Shopify was built with Ruby on Rails and PHP instead of Lisp.

In other words, maybe the secret weapon to building ViaWeb wasn't "Lisp" but instead, "PG's brain". (https://en.wikipedia.org/wiki/Confounding)

This is why language discussions focusing on "superior syntax" don't really move the needle when it comes to mass adoption. Yes, Lisp's "code-as-data" is powerful, etc., but there may be other factors that make that syntax feature not so important in the bigger picture.

[1] "Beating the Averages" : http://www.paulgraham.com/avg.html


Wild guess here (not a Lisp programmer). When I hear people talking about the practical benefits of Lisp, not just the nice theoretical stuff like homoiconicity and macros, I hear about interactivity. In Lisp it sounds like you can be pretty successful with a more experimental, on-the-fly development process. That might help a crack team push out features faster, but they might produce code that is proven to work more by experiment than by logic, which might mean it's harder to understand and keep building on. This leads to small Lisp shops that beat their competition in a niche but have trouble scaling it to Amazon size.


That is remarkably accurate for a wild guess.

Lisp code tends to be built from the inside. Working in a REPL, modifying the system state as you go, storing code into a buffer then recompiling and loading it into the image with a keystroke. Recompiling functions as you modify them, without recompiling the rest of the file, updating the code behind some classes without taking the system down, and the interactivity you get from the tooling for debugging and error handling. It all adds up.


> In Lisp it sounds like you can be pretty successful with a more experimental, on-the-fly development process.

That’s a side effect of having a REPL.


By itself this doesn't seem like a distinctive advantage of Lisp anymore. Lots of languages have REPLs. But I'm told that Lisp is more advanced in some way. Better debugging, breakloops, everything can be modified in-flight, etc.


Interactive interfaces (command line interfaces, etc.) are long available: BASIC, Smalltalk, Prolog, APL, UNIX shells, etc. Some of them also used source level interpreters.

One simple difference is that the READ EVAL PRINT LOOP of Lisp works with code and data in similar ways. It makes it easy to process lists which are data and also to process lists which are programs. READ reads lists, EVAL evaluates lists & other values (and can be rewritten in itself), and PRINT prints lists. Code is lists, and data is lists, too.
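A toy illustration of that idea, in Python rather than Lisp (a hypothetical mini-evaluator, just to show "programs are lists"): the same list structure is both the data you manipulate and the program you evaluate.

```python
# A Lisp expression like (+ 1 (* 2 3)) as a plain nested list.
program = ["+", 1, ["*", 2, 3]]

OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(expr):
    """EVAL for the toy language: lists are calls, atoms are values."""
    if isinstance(expr, list):
        op, *args = expr
        return OPS[op](*[evaluate(a) for a in args])
    return expr

# Because the program is just data, we can transform it before evaluating it,
# which is the essence of what Lisp macros exploit.
doubled = ["*", 2, program]
```

In real Lisp, READ produces this structure directly from source text and PRINT writes it back out, closing the loop.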

When the REPL uses a source-level interpreter, debugging can go very deep, including modifying the running source code. That's kind of second nature when one is interactively developing Lisp code: every error stays in its error context and provides a break loop to explore/change/repair the current state, and each additional error in a break loop just gets us into another break-loop level, one level deeper.


> A Lisp virtuoso will build tools which are completely alien in design and operation than tools built by a Java virtuoso

Crossing with the "build what, exactly?" thread: https://news.ycombinator.com/item?id=36195055 - does the special alien-ness translate into either greater user satisfaction or greater commercial success?

It is possible that there are Lisp virtuosos. It is also possible that you have to be a virtuoso to write Lisp, as is definitely the case for J. Every J program is a tiny work of art. That's why there aren't very many of them.

> you simply can't replicate a Lisp-wielding Paul Graham in any other language

You can't replicate Tolstoy in any language other than Russian, either. I'm not sure that proves anything other than the existence of unique talents that are also bound to their circumstances?


Perhaps it has something to do with winner takes all, or maybe it has no connection at all


FORTH is (was?) absolutely fantastic in its niche. As an alternative to assembly when you were developing for, and probably more importantly WITH, underpowered computers, it was great. Even better, you could roll the tools together yourself with minimal effort.

Today you're rarely in a position where you have to bootstrap your development environment from almost nothing, so there are better alternatives.

But if you're stuck on a desert island with a computer and a CPU databook, bootstrap a FORTH compiler and then use that to make your LISP. :)


As someone who has been learning forth over the last like... 2 weeks... yeah. Somehow my brain was like 'this', and it just sorta clicks really nicely in a way that no other language has so far. I also really like Lisp fwiw, so if you see me in a room, you've seen a Lisper and Forther in the same room ;)

What's interesting to me is that I see Lisp and Forth as extremely similar-in-spirit languages, though FORTH is definitely "lower level" (pointer arithmetic everywhere) than Lisp. Depending on your implementation, I bet you could squeeze out nearly every bit of performance to be had on a given system with forth (given enough time), but I'd be really surprised if you could with any lisp.
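For anyone curious what "similar in spirit" means concretely, here is a minimal sketch (in Python, purely illustrative) of a Forth-style stack machine; a data stack plus a handful of words really is the whole core of the model:

```python
def run(source):
    """Toy Forth-style interpreter: numbers push onto the stack,
    words pop their operands and push their result."""
    stack = []
    for word in source.split():
        if word == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif word == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif word == "dup":
            stack.append(stack[-1])
        else:
            stack.append(int(word))
    return stack

# "3 4 + dup *" computes (3 + 4) squared, the Forth way.
```

Where Lisp builds everything out of lists and EVAL, Forth builds everything out of the stack and a word dictionary; both are tiny kernels you can extend into a whole system.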


I came to a similar conclusion about how some people strive to reduce what they call "visual noise" on their programming languages, by removing things like semicolons, parentheses, curly braces, etc. and others, like myself, who like the punctuation.

I think I read code visually. I understand its literal 2D-visual shape first, and use punctuation as visual markers to help me along. Then I backfill the gaps in the structure with the actual "content" of the code.

For the longest time I was baffled by the other camp's attempts to get rid of punctuation. What are they thinking? I now believe what they're thinking is of code as linear text. If you read the code linearly first, as text, and build the structure in your head as you go then yeah, all; of(that) seems {kinda} pointless && annoying.

Now guess which camp is more likely to write blog posts about how theirs is the One True Way? Ah, the humanity.


I've actually gone one step further and I'm pretty sure I can adapt to anything now.

I coded in Java for 20+ years and then switched to Kotlin. Semicolons were gone, it was a breath of fresh air.

And then I learned Rust and I faced what I thought were two absolutely insurmountable obstacles: semicolons AND snake_case. I considered Rust ugly for these reasons and really dragged my feet. But I was curious about the language, so I persevered.

One week later, I wasn't noticing any of these any more. I still do think that semicolons are a manifestation of a language developer who favors their time over mine, but it's become more of a pet peeve than a deal breaker.

The human mind is wonderfully flexible.


If I ever do my "pet crank opinions" programming language, it will use "." as the statement terminator as it is in English. And COBOL.


> Certain people find certain metaphors and modes of computing much easier to work with than others.

Absolutely.

I took a couple of Lisp classes in my computer science program, and I saw this almost instantly. Some people who had a very hard time catching up in most other languages suddenly shined. However, some students who were otherwise very successful, had a harder time with functional languages.

I always wondered how this worked. Is it that we're just wired differently?

> ...concludes the weird scissor advocates are mad.

I just wanted to clarify I don't think this. I appreciate when people are passionate about something like Lisp. At least, I'm happy for them!


> Is it that we're just wired differently?

very much so. Richard Feynman once said that even the simple act of counting is perhaps completely different between different people. Some people count by visually seeing a count, some use a voice, and maybe there are others who have a different system.


I like this explanation.

Every year, or so, I'll design a programming language on paper to work through my "latest thoughts" as impacted by languages and ideas I've picked-up since. Each time the design is different, with different priorities etc.

I think what I'm doing is exactly what you describe: clarifying my mental model of programming languages -- so that I arrive at something which "feels right".

I can then come back to actually-existing languages and express my "prelinguistic" ideas using whatever syntax they make available.

Absent this activity, I think I gradually end up too conceptually confused -- blending a mixture of whatever languages I'm working in.

The power of a single radical paradigm solves this problem for people, like me, who require a theory to feel comfortable; but without all the effort I go to.

(Though for me, of course, it's a hobby --- I like to see how close I can get to designing a language which, at that moment, is the one I'd wish to program in).


> people, like me, who require a theory to feel comfortable

Reminds me of the paper "Programming as Theory Building" https://pages.cs.wisc.edu/~remzi/Naur.pdf


Naur is probably the closest computer scientist to my world view -- on many fronts.

I'm reminded of his genuine attempt to go into neurobiology, and via William James, take seriously biological adaption and plasticity.

I think over the last 20 years, engineers and mathematicians seem to have "taken over" computer science -- against the tradition of "scientific philosophy" which Naur represents.


> Every year, or so, I'll design a programming language on paper ...

Where do I subscribe?

> Each time the design is different, with different priorities etc.

Classic trilemmas. Nice.

Scott McCloud's triangle for style of illustrations was a eureka moment for me. The tips are ideographic, textual, and realism (IIRC). All comics lie somewhere within that triangle (solution space). Mind blown.

There are many either-or tradeoffs: closer to the metal vs abstractions, explicit memory management vs garbage collectors, etc.

But there's probably also a few trilemmas, like functional vs imperative vs declarative.

Anywho. Just a casual notion. I'd love to see a big list of design decisions for programming languages.

Today's embarrassment of riches has reduced the enthusiasm for language jihads. But it'd still be nice to have something more than esthetics and vibes to guide our choices.


Your description of your work process sounds like descriptions I’ve read of language driven development.

Could you expand on why this is so powerful for you?

Not doubting it’s power, but I’m having a hard time understanding _why_ it’s so powerful for some people.


This morning I was sketching how I'd do syntax for affine and linear types, which are (very basically) where variables can be used "at most once" or "exactly once".

In sketching I iterated various designs; my notepad is below. It began with trying to think about how to express scopes visually, or using tags/labels -- then moved into how that integrates with other language features etc.

By doing this I understand much more about what really a language feature is trying to do -- I understand the tradeoffs, etc.

  program a:

  let x:a = 10

  if 10 < 5 b:
    let x:b = x
    print(x)



  program UseyTypes:
    x : once  = new Resource() # at most once -- rust's default
    y : never = ...
    z : once! = ... # only once
    q : many = ...  



  program MoveSemantics:
    a = new 10
    x = 10
    y = move x 
    z = copy y
    
    i0 = new(auto) 10
    i1 = new(memo.alloc.heap) 10 # default
    g1 = new(heap) Graph.init({...})
    g2 = copy(Graph) g1 # uses Graph's (deep)copy method
    g2 = copy(Graph.copy) g1 # eqv to above

  program : 
    global heap = Allocator.tmp(...)

    if True:
      local xs = Vec.new(heap) { 1, 2, 3, 4 }

      memo.lifetimeOf(x)      # eg., local scope a = lineno 10 - 20
      memo.lifetimeOf(x.data) # eg., global via heap allocator

      repeat x <= xs:
          x = 10 # error
          print(x)
          
      repeat i <= int.range(10):
          x[i] += 1
          
      repeat:
          const answer = input("What's 2 + 2?").to(int)

    print(compiler.scope)
    print(compiler.lineno)


    const int z = 10
    const ref(int) y = ref(z)

    print(y, deref(y))

    const x : global  = 10

    if 10 < 5:
      int x : local = 10
      int y : parent = 10



  # polymorphic .to

  fn str.to(int):
      parseInt(.)    



  program EffectSystem:
    fn[IO] println(const ...data: ...any):
      repeat d <- data:
          IO.writeline(d.to(str))

    pure fn calc(): # pure = no effects
      return 10 + 20 

    # by default, all fns are impure, ie., have IO auto-passed ?
    

    println("Hello", "World")

    with IO = open("output.txt", "w"):
      println("Into some file!")

    println[open("output.txt")]("hello world")


Thanks!

Out of curiosity, have you ever tried developing DSLs in Racket? One of its explicit reasons for existence is to enable fast development of custom DSLs.


The art of designing a language is expressing semantics in intuitive syntax -- it's an art because "intuitive" is essentially a psycho-social cultural condition. (ie., I reject Lisp)

C was "intuitively mind-expanding" to assembly developers and PDP-machine programmers -- and so on.

My aim is always to express a semantic concept in syntax so that it is so obvious that its originating language developers will be shocked.

You can do that both with, eg.,

    map fn over collection
and

   xs/fn

and

   repeat x from xs: fn(x)

and

   { fn(x) st. x <- xs & st. x > 0 }

etc.

In that each syntax resonates with a certain programming culture.

For novices, I suppose the following might be "consciousness raising",

    set result :=
      repeat:
        set x := next xs:
          save fn(x)


Chuck Moore was a student of John McCarthy, so the Lisp inventor and Forth inventor have been in the same room many times.


I did have to use Lisp, C and PostScript in the same project once...


I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room; I can't imagine the discussion terminating.

The ultimate halting problem?


I'm a Lisper. One of my good friends at Apple was a hardcore FORTH guy. I had other friends there that were Lisp or Smalltalk enthusiasts.

We got along great.

The early meetings at Apple for the design of the Dylan language definitely had both Lisp and Smalltalk folks participating. I wouldn't be surprised if some of the participants were FORTH folks, too.


the discussion would indeed recurse over whether to reimplement lisp over forth or forth on top of lisp


I'm a FORTH programmer and a LISP programmer, and of course a PostScript programmer, which is actually a lot more like LISP than FORTH.

https://www.donhopkins.com/home/catalog/text/interactive-pro...

https://donhopkins.com/home/archive/forth/forth-postscript.t...

David Singer at Schlumberger developed a Lisp-to-PostScript compiler in 1987 called "LispScript".

https://news.ycombinator.com/item?id=21968842

https://donhopkins.com/home/archive/NeWS/NeScheme.txt

Arthur van Hoff at the Turing Institute in Glasgow developed an object oriented C to PostScript compiler called "PdB" around 1990-1993:

https://compilers.iecc.com/comparch/article/93-01-152

We used PdB to develop HyperLook for NeWS, integrate The NeWS Toolkit components into HyperLook, and implement the SimCity user interface on SunOS/SVR4/X11/NeWS.

https://news.ycombinator.com/item?id=22456471

ChatGPT Summary of the above thread:

This discussion thread revolves around the concept of implementing Lisp-like macros in PostScript for creating more efficient drawing functions. The user "DonHopkins" highlights their work on the Open Look Sliders for The NeWS Toolkit 2.0, where they leveraged a Lisp "backquote" like technique to optimize special effects drawings. The user explains that this approach accelerates drawing at the expense of increased space utilization. They also propose a potential solution to space conservation by only expanding and promoting macros during tracking, then demoting them upon tracking completion.

DonHopkins shares several resources on NeWS, LispScript, and the PostScript compiler, and also refers to window management tasks in Forth and PostScript for comparison. Additionally, they discuss a paper on syntactic extensions to support TNT Classing Mechanisms and share a demonstration of the Pie Menu Tab Window Manager for The NeWS Toolkit 2.0.

Another user, "gnufx", appreciates the shared resources and brings up the metacircular evaluator in HyperNeWS or HyperLook as a potential speed bottleneck in the system.

DonHopkins responds by explaining the use of a metacircular evaluator (ps.ps) they wrote for debugging. They clarify that speed was not a concern as the evaluator was not used in any speed-critical situations. DonHopkins also discusses the technique of "PostScript capture," likening it to partial evaluation of arbitrary PostScript code with respect to the imaging model. They relate this concept to Adobe Acrobat's "Distiller" and Glenn Reid's "PostScript Distillery".


I honestly love posts about Forth (or Factor) and Common Lisp (or Lisps in general). I love both languages. On top of that, I use C a lot, along with OCaml, Lua, and Erlang (and rarely Ada). I find each one of them beautiful. :)


I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room

I have! There was a knife fight and they both stabbed themselves, horrific.


The Forth programmer walked into the room backwards, like a moron. The Lisp programmer already in there had a perfect opportunity to get him in the back. Stupidly, his weapon was lying in a heap of stuff, and was boxed; he couldn't get it out in time. Moreover, it blew up in his face because he imported it, he had falsely declared it to be of toy type.


I suddenly stopped worrying about both LISP and FORTH when my CS professor mentioned (around 1995) that it would be trivial to write a translator between LISP and FORTH.



