The Janet Language (janet-lang.org)
299 points by xrd on Feb 18, 2023 | hide | past | favorite | 195 comments


Oh hey! Nice to see this on the front page here. I love Janet -- I've been using it for about a year and a half, and it's now my go-to scripting language when I need something more powerful than bash, or when I want to hack on a goofy little side project (some examples in my profile).

Parsing expression grammars (think, like, declarative parser combinators?) are a really great feature for ad-hoc text parsing -- and nicer than regular expressions, in my opinion, for anything longer than about ten characters.
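For readers who haven't met PEGs: here's a rough Python sketch of the parser-combinator idea they formalize (not Janet's actual `peg` module, just the general shape):

```python
# Each parser takes (text, pos) and returns (value, new_pos), or None on failure.

def literal(s):
    def parse(text, pos):
        if text.startswith(s, pos):
            return s, pos + len(s)
        return None
    return parse

def digits(text, pos):
    end = pos
    while end < len(text) and text[end].isdigit():
        end += 1
    return (int(text[pos:end]), end) if end > pos else None

def sequence(*parsers):
    def parse(text, pos):
        values = []
        for p in parsers:
            result = p(text, pos)
            if result is None:      # any failure fails the whole sequence
                return None
            value, pos = result
            values.append(value)
        return values, pos
    return parse

# A declarative "grammar" for an ISO-ish date: digits "-" digits "-" digits
date = sequence(digits, literal("-"), digits, literal("-"), digits)
print(date("2023-02-18", 0))  # ([2023, '-', 2, '-', 18], 10)
print(date("not a date", 0))  # None
```

The grammar reads as a declaration of the shape of the input, which is the property being praised here.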

The language itself is also a great introductory language-with-lots-of-parentheses, so you can explore compile-time metaprogramming with familiar runtime semantics -- think JavaScript plus value types minus all the wats. The embeddability and easy distribution (compiles to a static binary on any platform) is a huge plus as well.

Honestly I like Janet so much that I'm writing a book about it, because I think it's a shame that the language isn't more well-known. It's great! You should check it out right now!


Is there somewhere I can keep up to date with the book's progress?


I'll announce it on my RSS feed/newsletter/twitter once it's done, if you use any of those. I'd estimate that it'll be out by the end of March; I won't write any public updates until it's ready for people to read.


Look forward to it.


I went there, saw the Lisp syntax, and noped back out.


Agreed.

This is why I don't do any math I can't do on my fingers.

Parentheses are just too scary, and there's no way that parenthesis math junk actually has any useful ideas.


I understand the reaction, but some people feel just as negatively about parentheses as lispers feel positively. If I can feel positively about a language purely because of the parens, it's no sillier to feel negatively for the same reason.


I actually agree with that entirely, and the point of my jest was just to say that we should look deeper for the important ideas, not refuse to read the book because we don't like the color of the cover.

Lispers don't feel positive about Lisp because of parentheses; change them to curly braces or brackets or ^ and $—that's really not what matters. Lisps with brackets go all the way back to the beginning (https://en.wikipedia.org/wiki/M-expression). Indentation-based Lisps have been done too (https://readable.sourceforge.io/).

The point is an expression-based syntax that directly models the code tree, is written in the data structures of the language, and is convenient for meta-programming. It's a fundamentally different approach that yields massive benefits (see my other comment in the thread if you want to hear that spelled out in more detail).

But we don't see that when we just stop at unfamiliar syntax.

Lispers have been structurally-editing code as a matter of course since at least 1970. Most of the rest of the world only got a taste of that when tree-sitter came out circa 2018 (I know I'm rounding the edges here, but the point stands). Half a century later! Why is that? It's not just curly braces vs parens—something deeper is happening here.

I do apologize if I came off rude. I'm just so frustrated at hearing this same line year after year after year from people who are missing out on some of the most powerful ideas in programming because they prefer this ASCII glyph over that one. It's nothing more than parochialism.

It just makes me want to scream (perhaps uncharitably) "surely you're not a serious engineer who works on serious problems if your biggest concern while coding is which character is used to group code?!" I want tools to help me think more clearly, ways to operate at higher levels of abstraction, better concurrency semantics—surface characteristics be damned. Sure, I have my preferences about orthography, but the tail doesn't wag the dog.

Look deeper! Learn what each language has to teach you! Then keep the parts that move our craft forward and use whatever glyphs you want. But don't reject the automobile because it doesn't have handlebars.

Moreover, the things that look familiar probably have the least to teach you.

I believe we have the ability to do so much better as an industry, but it's not going to happen if we reject the unfamiliar just for being so.


Lisp syntax is a huge plus for me, and if a language doesn't have it I know it's not for me.


Same reaction here... wow that is a rough looking language. Then again I always disliked those kinds of languages like Clojure. The syntax is just too much for me. I feel like if I used it, it would atrophy my skills in other more traditional languages.


The funny thing is that it's _not_ more parentheses, roughly, than one would expect to find in any Algol/C-inspired language, like C# or JavaScript.

The opening paren is simply relocated to the other side of the function name or keyword.

What we're experiencing is just a cognitive bias which causes us to prefer the more familiar over the less familiar ([the mere-exposure effect, also called the familiarity principle](https://en.wikipedia.org/wiki/Mere-exposure_effect)).

I've never used a Lisp-family language, but I find the reaction against parentheses to be overblown.

The reading order of the code is also _consistent,_ rather than the frequent switching between infix notation, prefix notation, and postfix notation which we have to learn and parse in most languages outside the Lisp family. This is a benefit which deceptively looks like it is _more_ complicated, despite being simpler to parse visually (and otherwise). Another example of the familiarity principle at work.


> The funny thing is that it's _not_ more parentheses, roughly, than one would expect to find in any Algol/C-inspired language, like C# or JavaScript.

This tired trope needs to die. Yes, there are more parentheses, because lisp also uses parentheses where other languages use [] or {} or ;.

The real thing is that people coming from C-like languages are used to seeing different block markers for different constructs. It takes effort to read Lisp coming from other languages, because those other languages have a richer symbol vocabulary. Learning to read code without those symbols is like reading English where all punctuation has been replaced with a single space. Sure, you'll get there eventually, but it's a very cheap straw-man argument to pretend that the only complaint people have about Lisps is the positioning of the parentheses.


Not a lisp apologist haha, but I think Clojure tries to use fewer smooth parens “()” in general. Square parens “[]” and curly parens “{}” are used to help with things like variable declarations and stuff like data structures.

I recently started using Clojure and I’ve used languages like C#, JavaScript, and Python a lot. My two cents is that a Clojure-like language should try to embrace the aesthetics of a white-space language like Python, but use the parens as clues for scopes or blocks. So much could be done with formatting rules that just make parens easier to scan without some extra IDE highlighting or something.

The best part of parens is that you can try to pick a consistent format, but ya know that sometimes doesn’t happen because everybody likes to use parens differently lol.


There are strictly more parentheses since other languages also use [] and {} (and <>) instead for different things. Or even ; or whitespace. Ignoring the majority of available brace types is just stupid if you ask me. The parentheses being at the wrong positions is something I can deal with (still no fan of RPN, but w/e), but intentionally crippling syntax that could help is just not a good idea. Syntax is your friend, use it.


Janet also uses [] and {}, and there is a reason people like the really consistent syntax. For me that reason is blind aesthetics, but I swear to you I sincerely find lisp code easier to read than non-lisp code. It's not "intentionally crippling syntax" it's a genuine difference in taste.


Oh, I guess Markdown doesn't work. Huh. Well, shoot.


HN comment formatting options can be found here: https://news.ycombinator.com/formatdoc


Quite the opposite. Learning a Lisp or an Array language will build your skills and make you a better programmer. Syntax should fade as a criterion as you become proficient in various PLs.

For example, the sum of a list (not a running or cumulative sum, but the total sum: it should equal 10, not 1, 3, 6, 10):

Lisp:

  (+ 1 2 3 4)
J/APL[0]:

  +/1 2 3 4

Python:

  print(sum([1, 2, 3, 4]))
They all do the same. I prefer the conciseness of J/APL and Lisp in this case, and the application of a function over the list or vector. The beauty of the REPL is that you see the result without a 'print' statement.

Solving problems in these other languages will influence how you program in your base language as well, and usually for the better. I am also guilty of syntax bias. I prefer LFE (Lisp Flavoured Erlang)[3] over Elixir and Gleam. Gleam[1], another language that runs on the Erlang VM (BEAM), had a more ML-like syntax, but then it chose to join the syntax popularity contest and moved to a more Algol/C/Rust syntax. I prefer vanilla Erlang over it too.

[0] https://www.jsoftware.com/#/

[1] https://gleam.run/

[3] https://lfe.io/


I've been using Lisp for hobby projects for a few years. Yes the syntax takes some time, but

> I feel like if I used it, it would atrophy my skills in other more traditional languages.

was not the case for me at all. If you go into a text editor and remove all the parentheses, that's roughly how Lisp programmers tend to see Lisp; (function argument) isn't that far from function(argument).

Learning Lisp has only improved my skills as a programmer, after getting ideas like code as data, macros, let over lambda, CLOS and the metaobject protocol. It's a simple model that to me shows how other languages have picked an abstraction and stuck with it, but Lisp has all the tools to implement those abstractions and more.

More mainstream languages are great at focusing the developer, and that makes them very practical. It is amusing, though, to watch many "new features" arrive in those languages even though Lisp had them years ago.


Reading Peter Norvig's PAIP (https://github.com/norvig/paip-lisp) in 1998 totally blew my mind. It completely changed how I think about programming in every other language I use(d). I love it still, and always will. And yes, my experience is the same as yours: learning lisp made me a better programmer in every other language I use (especially -- but not only -- Python).

The simplicity and symmetry of the syntax is a big part of that love for me. Being able to manipulate lisp code as lisp data, using the full power of the language to do so, is just brilliant.

Janet looks lovely! Looking forward to the book.


> I feel like if I used it, it would atrophy my skills in other more traditional languages.

If your skills in other languages are tied to a syntax then you never had any skills to begin with. I've used pretty much every syntax (and too many languages) out there and the only difference I've ever found is that ML syntax is nicer for automatically curried functions and LISP syntax is much nicer for meta-programming. The rest is effectively all down to paradigms, runtimes and libraries.


I certainly don't fault you for having that reaction, but I would encourage you to look deeper and not dismiss one of the most interesting families of programming languages on the basis that it looks weird. Far from atrophying your skills in other languages, many people say that learning Lisp makes them better programmers even when they're coding in other languages.

Lisps don't arbitrarily look weird—there's a deep, principled, elegant reason for it; Lisp code represents how the code will be evaluated in the most direct way, without relying on (some would say needlessly) complex parsing / precedence rules. There are no surprises and no arbitrary rules to learn. There are no useless semicolons to forget, and you'll never have to wonder if `+=` returns the RHS or the result of the operation (or does it even have a return value?).

You don't have this meaningless distinction where you can't directly reduce with `+` because—ugh—it's not a function, it's an operator. You just say `(reduce + [1 2 3])`.
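For comparison, Python has the same operator/function split, worked around with the `operator` module:

```python
from functools import reduce
import operator

# reduce(+, [1, 2, 3]) is a syntax error in Python; `+` is not a value.
# The standard workaround is the operator module:
print(reduce(operator.add, [1, 2, 3]))  # 6
```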

You never have to do this ugly Ruby stuff...

  words.map &:length
  # or
  words.map { |w| w.length }
...because methods are really just polymorphic functions, but language designers chose syntax that doesn't compose elegantly.

You don't have this useless distinction between statements and expressions that limits how you can compose code. You never have to drop down to some ugly, limited ternary expression form (`COND ? X : Y`) of `if` because—whoops—`if` is a "statement". You just write:

  (println (if me? "me" "you"))
Because, duh, we wanted an `if`.

What do we gain by adding all of this noise?:

  if (is_me) {
    println("me");
  } else {
    println("you");
  }
Absolutely nothing. The parens on the conditional, the curly braces, the semicolons, the `else` keyword—they're essentially meaningless incantations to appease the compiler. And we've introduced an undesirable opportunity for the two branches to accidentally diverge over time.

But most importantly, our code is written in the data structures of our language. Code as data means we can manipulate code as easily as we manipulate data, which means we can trivially (re-)write code with code (i.e. macros). And not shitty string-generating macros, or macros that can only do a handful of sanctioned things—we can write our own control structures in a couple lines of code. We can add new abstractions to the programming language from user space.

Wish the language had an `if-not` construct? You can add it with, like, 3 lines of code. Wish functions could have optional parameters? Add it. Wish it had pattern matching for functions like SML or Erlang? Cool. Java-style annotations? Logging that is fully removed when running in high-performance mode? A different OO model? Multi-line string literals? String interpolation? A graph of dependent calculations that only get run when used? A more convenient way to load dependencies? It's all easily doable.

I've coded in Lisps (and a dozen other languages) for at least 20 years, and every time I have to use a non-Lisp syntax I just think "wow, these people really missed the boat". It's like having to write math in Roman numerals (would you rather calculate "D + L + IX" or "500 + 50 + 9"?); there's a better way, and that better way has elegant, recursive, underlying design principles that make the ergonomics way better.

But, yeah, it doesn't look like C code. And people seem to be really attached to their C syntax.


That example is quite unfair. You could just write

  println(is_me ? "me" : "you")
which has exactly the same number of symbols.


I specifically called out the ternary operator in my text so as to be fair, but tried to keep things simple in my example. Perhaps too simple.

Of course you can do that simple case with a ternary operator, but:

1. It's a construct that really has no reason to exist (I argue) as distinct from `if`.

2. It doesn't compose with statements.

This duality is primarily what I'm arguing against.

A better example would have been a case statement inside of the `println`:

  (println
   "Log in by"
   (case user-id
     0 "root"
     1 "local admin"
     (format "regular user (id: %d)" user-id)))
In C, you have to introduce a variable for no good reason (or use some non-idiomatic, ugly, nested ternary operators that get uglier the more cases we have).

And even then, you can't just say

  user_name = switch { ... }
Because switch is a statement.
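Python has the same limitation, since `match` is also a statement. A rough sketch of the usual expression-shaped workaround, with a hypothetical `login_name` helper for illustration:

```python
# Because the multi-way branch is a statement, the value has to be built
# with an expression instead; a dict lookup with a default is one common way.
def login_name(user_id):
    return {0: "root", 1: "local admin"}.get(
        user_id, f"regular user (id: {user_id})")

print("Log in by", login_name(1))   # Log in by local admin
print("Log in by", login_name(42))  # Log in by regular user (id: 42)
```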


I mean sure, more things being expressions instead of statements is pretty nice (e.g. in Rust you can do that). But you don't need to introduce different syntax for that.


Your loss.


Nope as a verb is new to me - just heard it for the second time this week. First time was from my child.


It's somewhat youthful internet slang. You'd have heard it at the American public high school I attended 12 years ago.


Me too


I love Lisp (particularly Scheme), and heard that Janet was strongly inspired by Lisp and is "really Lisp underneath", but is really fast and strong at math, so I thought I'd give it a try, and after learning it I started wondering to myself, "why am I not simply using Lisp?"

Though arguably lispy, Janet just wasn't lispy enough for me. It was missing the simple, elegant sexp syntax I dearly loved, and I started to wonder what huge win I was getting from using it instead of just using Lisp or Scheme? Having not found a good answer, I did just that.

That was the last time I bothered trying to learn a "Lisp-like" language that wasn't actually a Lisp, and decided to just stick to Lisp/Scheme. They do everything I need, are good at math, and are plenty fast enough for me.


>It was missing the simple, elegant sexp syntax I dearly love

In what sense does Janet not have sexp syntax? Seems plenty sexpy to me. Purists seem to say it's not a lisp because its underlying data structure is not (cons-based) lists as in classical lisp, but I don't see what syntactic difference there is.


Op was understandably confusing Janet with Julia


Uhm, sorry for asking, but are you sure you're not mistaking Janet and Julia?


Oops.. I think you're right.. It was Julia, not Janet... now I feel like an idiot, and apologize for my uncalled-for, unfair, and undeserved mischaracterization of Janet, a language I never tried.

I wish I could take it back, but HN won't let me delete my post. I apologize. Mea culpa.


You’re confusing your exes, it’s understandable. Now give Janet a fair chance if you’re single


I think it's a pretty nice language.

Those extra bits of syntax that make it "not a lisp" are mostly around defining "not list" kinds of data structures. I find that practical.


Lisp usually has a lot of non-list data structures like arrays, records, strings, classes/objects, hashtables, ... For some data structures there is built-in syntax, and with an extensible reader (the extensible parser for s-expressions) the user can add additional syntax. Janet uses a non-extensible parser for the data syntax.

What makes Janet 'not really a Lisp' is that "LISP" stands for "List Processor". Janet isn't exactly that, since it does not use linked lists as a core data structure - unlike Lisp, where its list-processing features are built on top of linked lists made of cons cells.

(1 2 3) is called a "tuple" in Janet and represents something like an immutable array.


An array is just a different way to implement a list data structure.


Lisp especially is built upon linked lists, which have different costs for (more or less) primitive operations (add to the front, get the front item, get the rest of the list, get a random element, add to the end, ...) compared to a primitive vector. There are also other features not easily replicated by vectors.

  CL-USER 40 > (rest '(1 2 3))
  (2 3)
Above REST operation does not allocate any memory.

  CL-USER 41 > (subseq '#(1 2 3) 1)
  #(2 3)
Above SUBSEQ returns a new vector. Alternatively it would need a more clever implementation underneath.
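The allocation difference can be sketched in Python with a toy Cons class (illustrative only, not how CL implements it):

```python
# Toy cons cells: taking the "rest" of a linked list shares structure,
# while slicing a vector copies its elements into new storage.
class Cons:
    __slots__ = ("car", "cdr")
    def __init__(self, car, cdr):
        self.car, self.cdr = car, cdr

def linked(items):
    head = None
    for item in reversed(items):
        head = Cons(item, head)
    return head

xs = linked([1, 2, 3])
rest = xs.cdr                  # O(1), no allocation: the tail is shared with xs
print(rest.car, rest.cdr.car)  # 2 3

vec = (1, 2, 3)
sub = vec[1:]                  # O(n): allocates a new tuple holding (2, 3)
print(sub)                     # (2, 3)
```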

OTOH getting a random element has a different complexity in a linked list vs. a vector.


What data structure isn't just a list with extra steps?


An array in C is a list with fewer steps: just a contiguous chunk of memory of sizeof(type) * length.

You could fill that with linked list nodes, but it would be pointless.


…anything that requires hashing as a fundamental part of the guarantee. Which happens to be quite a lot of the structures most of us use every day. Sets, maps, etc.


Sets and maps do not require hashing. Specifically, std::map and std::set in the C++ standard library are based on ordered trees.


They’re often called “hash sets” or “hash maps” in other languages - they are called this for a reason, and certainly not because they could be implemented as a list.

std::map is not a good example anyway; you want to consider std::unordered_map for a more appropriate comparison. C++ is weird that way. (What C++ calls a map is not what most languages call a map. std::map doesn't even offer O(1) lookup. You'd be surprised how many working C++ developers don't realize the unnecessary performance cost they take when they decide to use std::map, because it's not a proper hash map.)

But this thread is not about the finer differences in implementation of maps but rather whether or not they are basically just lists. They are not.


Agree broadly that this thread has gotten too long off what was basically a joke, but!

Please measure before you make changes to your maps for perf reasons. Yeah this forum all knows their big-O, but B-tree maps like std::map often perform better than hash maps on real-world architectures.


You said "requires" hashing. Sets and maps do not require hashing. Though it is correct to observe the unfortunate naming convention in the C++ standard library.

std::set and std::map should be std::ordered_set and std::ordered_map; std::unordered_set and std::unordered_map should be std::hash_set and std::hash_map.

If this were so, it might make the incorrect usage of these two options less prevalent. But they are both map and set structures; each version just comes with other guarantees that may be more or less useful in some scenarios.


You interpreted my reply as “all sets and maps require hashing“ instead of “sets and maps are examples of data structures that can require hashing”… which they are.

Imagine someone says, every fruit in the world is sour, and someone answers, no there are plenty of fruits that are sweet, such as apples, and then an entire thread gets launched in an irrelevant direction pointing out that actually some apples are sour, which has no bearing on the original point.


> But they are both map and set structures just each version has other guarantees that in some scenarios may be more or less useful.

And similarly, linked lists and arrays are just different "versions” of the same data structure: a list.


Let's simplify it even more - everything is just ones and zeroes.


Almost anything but a list


Any examples to add, or do you just want to leave it at a pithy comment?


Sure, my thinking is that a cons cell can build singly linked lists and node-based binary trees. Some data structures are based only on those, but most involve an array of some kind. In scheme, for example, it's the combination of cons (i.e. lists and trees) and vector (i.e. arrays) that allows for arbitrary data structures. It's very constraining to have only the lists.

- array: not a singly-linked list.

- hash table: often an array of singly-linked lists, so not a list.

- red-black or AVL tree: can be built with cons cells.

- doubly-linked list: not a singly-linked list

- double-ended queue: usually an array of fixed-size arrays, so not a list. Could also be implemented as a doubly-linked list.
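The "array of singly-linked lists" case above can be sketched as a toy separate-chaining hash table (illustrative only; real implementations resize and tune their hash functions):

```python
# A toy separate-chaining hash table: an array whose slots hold
# singly-linked lists of [key, value, next] nodes.
class ChainedTable:
    def __init__(self, nbuckets=8):
        self.buckets = [None] * nbuckets  # the array part

    def put(self, key, value):
        i = hash(key) % len(self.buckets)
        node = self.buckets[i]
        while node is not None:           # update in place if key exists
            if node[0] == key:
                node[1] = value
                return
            node = node[2]
        # otherwise push a new node onto the front of this bucket's chain
        self.buckets[i] = [key, value, self.buckets[i]]

    def get(self, key, default=None):
        node = self.buckets[hash(key) % len(self.buckets)]
        while node is not None:           # walk the chain for this bucket
            if node[0] == key:
                return node[1]
            node = node[2]
        return default

t = ChainedTable()
t.put("a", 1)
t.put("b", 2)
t.put("a", 3)                             # overwrites the earlier "a"
print(t.get("a"), t.get("b"), t.get("missing"))  # 3 2 None
```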


So what about streams? Functions? Closures? Call/cc structures?

I understand your point if you are speaking in literal terms about just simple cons cells, but in practical Lisp/Scheme code, you don’t really rely on just the basics to do things.

I think there’s a hyperfocus sometimes on the simplicity of the core of lisp, the apply/eval balance, but it’s quite possible and often easy and convenient to perform normal programming tasks with these languages as well.


> I understand your point if you are speaking in literal terms about just simple cons cells, but in practical Lisp/Scheme code, you don’t really rely on just the basics to do things.

Agreed.

> So what about streams? Functions? Closures? Call/cc structures?

Those are interesting examples. They are all data structures in a sense (especially streams and closures), but to me they are more like functions than data (yes, yes, functions are values, blah, blah). Call/cc is a reification of execution control; thinking of it in terms of data stretches my brain.


If by "list" you mean "linked list", then those are never used in practice. There is no place for this data structure in a modern CS toolbox.


That's a ridiculous statement. Linked lists have real performance benefits in some applications. A good "modern CS toolbox" includes the ability to make the right choice. Which, if you believe they are fundamentally useless, clearly you lack.


> Linked lists have real performance benefits in some applications.

Maybe in 0.01% of the cases. In reality they just ruin your cache and memory allocator performance for no real good reason.


In many cases you can get around these issues by being a little more clever in how you allocate nodes. In some cases you don't have the luxury of allocating all elements next to each other anyway, in which case an intrusive linked list is often the best option to minimise copying. You might say use a vector of pointers or a circular buffer, but if you're in a timing-sensitive context you might be unable to realloc.

Hell, memory allocators themselves are often implemented using some form of linked list. You tend to see them quite a bit at very low levels like in kernels.


The people behind the Elixir programming language certainly disagree.


Maybe that's why Elixir never gained real traction.


Op was thinking of Julia


The list of features available out of the box is pretty impressive, especially if you are scripting something concurrent. I would tolerate the syntax quirks in exchange for that.

Or maybe your particular Scheme has all of that out of the box, too. Which one do you normally use, if you don't mind?


Op was thinking of Julia


One thing that bothers me about languages that compose functions like this f(g(x)) is that when programming interactively, f is typically an afterthought. So, you start out with g(x) and then need to go all the way back and add f. Similarly, when x turns out to need elaboration, you either have to go all the way back, or edit the expression in place, making it longer and inevitably running into problems with line length and terminal input being submitted when you press "Enter".

I like Lisp languages, and would take Scheme over Python for my job in a heartbeat, if I was allowed to. But, I think, that if we want interactive Shell-like programming, we really need to address the issues above in the language. Shell pipes are a good start, but they are quite restricted in what they can do and require xargs "extension". Some languages also have the "where" form, where arguments to a function can be elaborated after the function call.

If I was ever to design a language for interactive programming, I'd certainly try to have these two features in it.
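One way to sketch that feature in Python is to overload `|` on a small wrapper class (purely illustrative; `Pipe` is a made-up name):

```python
# A wrapper whose | sends the value leftward into the next function,
# so elaborating the pipeline never means jumping back to the start.
class Pipe:
    def __init__(self, value):
        self.value = value

    def __or__(self, fn):
        return Pipe(fn(self.value))

def double(x):
    return x * 2

def inc(x):
    return x + 1

result = (Pipe(3) | double | inc).value  # reads in execution order: 3, double, inc
print(result)  # 7
```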


> So, you start out with g(x) and then need to go all the way back and add f.

That's one thing I will say after coming from Perl/PHP to Java: despite its verbosity and the uselessness of having to write .stream(), I much prefer Java's stream.map(...).filter(...) syntax over the more functional-style filter(map(list, ..), ..) syntax. The Java syntax reads left-to-right, which is the order you want when you're thinking about code, and also, as you say, when writing it. I think if I were creating a programming language I too would try to make stuff read left-to-right as much as possible.


It's notable that raku (previously known as perl6) lets you write things in either direction, if I remember correctly something like

  @source >>> map { ... } >>> grep { ... } >>> my @sink;
though note I'm typing from memory on my second coffee so I may have got that slightly wrong.

Plus of course there's many languages with a |> operator so you can do

  g(x) |> f
I also (the example is specialised for I/O but the implementation technique could trivially be borrowed for something that wasn't) implemented something sort of in this direction for perl once: http://p3rl.org/IO::Pipeline

I'm not convinced that left-to-right is -always- the best option and prefer having the choice of both, but I wouldn't be at all surprised if a survey of developers found that if they could only pick one they'd pick left-to-right, and while I'd find it hard to choose for myself alone I'd probably pick left-to-right on the basis that it'd likely make it easier to onboard people to any given codebase.


Raku also has map and grep methods for collections, so it could be written as...

  my @sink = @source.map({...}).grep({...}).sort;
Which also makes multithreading the operation easy:

  my @sink = @source.hyper.map({...}).grep({...}).sort.list;
That said, I think the operator you're looking for is the feed operator, ==>:

  my @sink;
  @source ==> map {...} ==> grep {...} ==> sort ==> @sink;
It also has the corresponding reverse, <==, because why not. The docs* mention that ==> may at some point do automatic parallelization, but I don't know what the status of that feature is.

* https://docs.raku.org/routine/==%3E


> That said, I think the operator you're looking for is the feed operator, ==>

Yes, yes it was.

Your clarifications, corrections and elaborations are much appreciated.


Aligning reading order with flow of actions is so important!

That's why I use and maybe tend to abuse the `->` and `->>` macros in Clojure and the pipe operator `|>` in Elixir.

Hopefully soon in JS as well, if I've read correctly.


Not sure about other Lisp languages, but Clojure has thread operators that allow you to compose function calls this way, allowing you to visualize code in that preferred left-to-right (or top-to-bottom) order.


In addition to the examples other people have given, I think Ruby does this very nicely by making map, filter, and reduce methods on collection data structures.

Of course, Ruby is a bit of a mixed bag, but for the applications where it fits, it can be very nice.


Thank you for making an actually relevant point about syntax. I agree with this 100%, and I love Janet, and was recently doing a lot of interactive Janet programming for a generative art playground.

So I added postfix function application. So instead of (f (g x)), you can write (g x | f).

I liked the syntax a lot, but it looked really weird with operators: (calculate x | + 1). So I made operators automatically infix: (calculate x + 1).

I also didn't like that the transformation from foo to (foo :something) (record field access) required going back and adding parentheses before the value, so I added foo.something syntax that means the same thing.

The result is something that's very easy for me to type and read:

    (def eyes
      (eye-shapes
      | color (c + 0.5)
      | color (c - (dot normal (normalize eye-target) - 0.72
              | clamp -1 0
              | step 0))
      | color [c.b c.g c.r]))
(Excerpt from the logo of https://toodle.studio -- https://gist.github.com/ianthehenry/612c980f0db04ea3c2ccab27...)

Is this even Janet anymore? I dunno. It's a Janet dialect, and it's implemented as regular old Janet macros. But it's much easier for me to type like this. I recognize that it makes my code necessarily single-player, but that's fine for the sorts of dumb projects that I do for fun.

I think a lot of lisp programmers use paredit exactly so that they can write (f (g x)) in the order g x f, but with their editor re-positioning their cursor and taking care of paren-wrapping automatically. But I don't use paredit, and I don't want to wed myself to a particular mode of editing anyway. So I like a syntax that lets me type in the order that I think.


The 'terminal' restriction is long gone. Most Lisp read-eval-print-loops run inside an editor or another specialized tool.

In Common Lisp:

  CL-USER 37 > (sin 3)
  0.14112

  CL-USER 38 > (cos *)
  0.9900591
Above really is (cos (sin 3)). The variable * is bound to the last result.


How'd you go about multiple-value-bind? Same problem with handler-case.

When used interactively, Python also has _ to store the previous value (but Python only ever really returns a single value, which is sometimes a tuple or a list that can be destructured into variables; IIRC in CL, if you don't request the other return values, they are gone.)

More generally, you'd want more of an xargs-like functionality (e.g. split the result into chunks and feed them to the next function in chunks, perhaps in parallel). Or maybe you'd want something like a tee, to split the results of the previous function into multiple streams and have them processed by different functions? Java-like languages don't immediately support something like that, but Shell-like ones do with redirection syntax, tee, xargs.


> Java-like languages don't immediately support something like that, but Shell-like do with redirection syntax, tee, xargs.

But in Lisp you are not bound to the language syntax of Java. You can inside the language write tools to process forms. That's one of the main differences between Java and Lisp. Lisp has reader macros (to change the surface syntax of s-expressions) and macros to change the expression syntax. That would allow you to write tools to process code and results in arbitrary ways.

Multiple-value results in a REPL are handled by the variable /. It is bound to the list of the last values.

  CL-USER 4 > (values 1 2 3)
  1
  2
  3

  CL-USER 5 > (multiple-value-bind (a b c) (values-list /) (list a b c))
  (1 2 3)
But anyway, I would not write Common Lisp in a pure terminal without editing support.

Something like GNU Emacs can use tools like SLIME or has a SHELL mode. That one works fine in a terminal.

SLIME is one of the Common Lisp IDE extensions for GNU Emacs and works fine in a terminal. Writing even the most complex code inside a GNU Emacs & SLIME terminal session is no problem at all. If I have a function call (foo a) and I want to wrap code around the front, I would just move the cursor one s-expression back and type. I'm sure that's easier with one of the other editor extensions which make editing s-expressions kind of structural.

Even then, many people write the actual Lisp code in an editor buffer (say, GNU Emacs in a terminal connected to a Lisp process via SLIME) and send the expressions for evaluation to a connected or underlying interactive Lisp session. The editing of s-expressions in a terminal to move forward, backward, upward, etc. is really a non-issue.

Interactive read-eval-print-loops do not mean one is forced to use a terminal without editing support.


I've been using CL and SLIME for over 10 years and had no idea about special meaning of "/"... TIL.

Anyways, the problem with handler-case still stands, as well as the other aspects such as chunked output, tee, and redirects. It's something that would have to be programmed on top of the existing stuff, which was my point originally.


You're totally right that it's often easier to read in the order the functions are being applied. That's why just about every Lisp/Scheme-family language has thread-first and thread-last macros, -> and ->>, so f(g(h(x))) can be written as (-> x h g f). Janet, Clojure, and Racket (and probably others) have them built in; Emacs Lisp has it in dash.el; Common Lisp has it in cl-arrows and other libraries.
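For readers coming from other languages, the unary case can be sketched as a plain left fold (this is a simplified analogue; the real macros splice the accumulated form into the first or last argument position of each step, not just into a one-argument call):

```python
from functools import reduce

def thread(value, *fns):
    """Rough analogue of ->: pipe value through fns left to right.

    thread(x, h, g, f) computes f(g(h(x))), but only for
    one-argument functions, unlike the real macros."""
    return reduce(lambda acc, fn: fn(acc), fns, value)
```

So `thread(-3, abs, str)` plays the role of `(-> -3 abs str)`, reading in application order rather than inside-out.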

It's really just a readability preference, though, not ease of editing. Lisp-like languages have paredit/sexp-editing shortcuts in your editor, so when you're on (f x), you press one key and it's turned into (<cursor here> (f x))


Emacs Lisp actually has threading macros natively (seemingly since at least 2016), but they're (IMHO) obnoxiously named `thread-first` and `thread-last`.


I'm not familiar with Janet, but I know it's Clojure-inspired, so it probably has threading macros, which are like shell pipes on steroids. In practice, when doing REPL-based development, it's very common to use these as opposed to (g(f x)).


Janet does indeed have those.


Maybe what you are looking for is threading https://stuartsierra.com/2018/07/06/threading-with-style



Thanks! Macroexpanded:

Show HN: Make 3D art in your browser using Lisp and math - https://news.ycombinator.com/item?id=32738654 - Sept 2022 (38 comments)

Janet – a Lisp-like functional, imperative programming language - https://news.ycombinator.com/item?id=28850861 - Oct 2021 (135 comments)

Janet Programming Language - https://news.ycombinator.com/item?id=28255116 - Aug 2021 (114 comments)

Janet: a lightweight, expressive and modern Lisp - https://news.ycombinator.com/item?id=23164614 - May 2020 (269 comments)

Janet – A dynamic language and bytecode VM - https://news.ycombinator.com/item?id=19179963 - Feb 2019 (50 comments)

Janet, a Clojure inspired language for scripting, or embedding in other programs - https://news.ycombinator.com/item?id=19172510 - Feb 2019 (1 comment)

Janet, a bytecode Lisp vm - https://news.ycombinator.com/item?id=18759277 - Dec 2018 (1 comment)


Janet is really powerful. Here's a small TUI text editor I'm writing in it, if you want to see a non-trivial example: https://www.github.com/CFiggers/joule-editor


Ooh, nice. I've been messing around with a weird text editor that started with that same tutorial, but in Nim. Janet is intriguing, I'll be digging into your code.

How was the debugging experience?


> I'll be digging into your code.

My apologies in advance, then! (Ha.) The "main" function is at the very bottom of src/joule.janet. So I'd recommend starting there.

As for debugging, Janet embraces REPL-driven development, so the debugging experience is all about setting up, interactively querying, and step-wise updating your program's state, live in memory, using the REPL. It's quite a bit different from a lot of languages—not better, intrinsically, just different. But I like it a lot.

Is your Nim editor public anywhere? I've heard a lot of good things about Nim and wouldn't mind exploring a real-world example.


Thanks :) Mine's not public yet, so far it barely does anything. I'm using a piece table and some other stuff that ended up being tricky to get right (hence my debugging question!), then I ended up taking a break. I haven't done much REPL-based work, maybe that would have sped things up.

Maybe I should start looking at that again.


Thought the link was broken, but GH looks broken now?

Edit: Back up now


If you like things like Janet, you might also like s7 Scheme. It is also a minimal Scheme built entirely in C and dead easy to embed. I used it to make Scheme for Max and Scheme for Pd, extensions to the Max and Pd computer music platform to allow scripting them in Scheme. (https://github.com/iainctduncan/scheme-for-max) Janet was one of the options I looked pretty closely at before choosing s7.

The author (Bill Schottstaedt, Stanford CCRMA) is not too interested in making pretty web pages, ha, but the language is great! https://ccrma.stanford.edu/software/snd/snd/s7.html


So this is kind of like a lightweight non-JVM Clojure? I like it. Looks like it could be a nice Swiss Army knife for data munging.


You mean Python, not Clojure. A number of key underpinnings, such as (almost) exclusively immutable data structures, are missing from this language.

It is more of a Python with a LISP syntax.


If anything, its major inspirations seem to be Clojure and Lua.

It has some of the affordances of the former in terms of semantics, but is stripped down and friendly like Lua.

The use cases also seem to overlap with Lua, as a fast, embeddable scripting language that you can easily keep in your head (see docs).

It seems to be simpler than Lua because it doesn't complect arrays and dicts into tables; arrays are a separate construct. And it affords you immutable versions of those.

To me it looks like a Clojure-like for Lua use cases.


The Lua similarity is not exactly a coincidence: the author of Janet also previously created Fennel[0], a Lisp for Lua. [0]: https://fennel-lang.org/


I'm pretty sure GP meant Closure, considering the language has matching immutable data structures for mutable ones [^1] and even its own (limited) version of EDN called JDN. It’s definitely also inspired by it.

[^1]: https://janet-lang.org/docs/data_structures/index.html


> I'm pretty sure GP meant Closure

Closure is not a language so hopefully not?

> the language has matching immutable data structures for mutable ones [^1]

Looking at the lingo being used, the extremely limited breadth of ABI, and the complexity they assert around "immutable" data structures, they're clearly imperative data structures you can't mutate rather than the persistent data structures you'd expect from a strong clj inspiration.

This page looks a lot more like a description of python datatypes than clojure, the only bit that Python lacks is an official frozendict.


> Closure is not a language so hopefully not?

Very true, the typo is strong in this one.

> rather than the persistent data structures you'd expect from a strong clj inspiration

What do you mean by persistent here? I assume some kind of shared memory instead of copying? Or something with more practical implications like deep immutability?

> looks a lot more like a description of Python datatypes

I am surprised where this sentiment comes from. Because of the initial comment? Basically any language nowadays has maps, sequences and byte strings. If I had to draw a similarity to Python it would be something along the lines of first-class generators.


> What do you mean by persistent here?

I mean the class of data structures called persistent: https://en.wikipedia.org/wiki/Persistent_data_structure

> I assume some kind of shared memory instead of copying?

Sure.

> I am surprised where this sentiment comes from. Because of the initial comment?

Because "tuple" for an immutable sequence is rather specific to Python.

And data structures related to clojure (or functional languages in general) would have some sort of logarithmic component because they'd be tree-based. Pretty much the only O(1) operation in functional data structures is prepending to a linked list.
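To make "persistent" concrete, here's a minimal sketch in Python (hypothetical PList class, not from Janet or Clojure) of the one O(1) case: a linked list where prepending shares the old cells instead of copying them:

```python
from typing import Any, Optional

class PList:
    """Immutable singly linked list with structural sharing.

    prepend is O(1): the new cell points at the existing list,
    which is shared, never copied or modified."""
    __slots__ = ('head', 'tail')

    def __init__(self, head: Any, tail: 'Optional[PList]' = None):
        # bypass our own __setattr__ guard during construction
        object.__setattr__(self, 'head', head)
        object.__setattr__(self, 'tail', tail)

    def __setattr__(self, name, value):
        raise AttributeError('PList is immutable')

    def prepend(self, value: Any) -> 'PList':
        return PList(value, self)

    def to_list(self):
        out, node = [], self
        while node is not None:
            out.append(node.head)
            node = node.tail
        return out
```

After `b = a.prepend(0)`, both versions remain usable and `b.tail is a` holds: that coexistence of old and new versions is what "persistent" means, and tree-based persistent maps/vectors generalize the same sharing trick at a logarithmic cost.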


Thanks for clarifying; I did not find anything decisive searching for "persistent" in combination with Clojure, weird.

> Because "tuple" for an immutable sequence is rather specific to Python.

As someone with a background in Swift, Rust and C#, all of which have the concept of a tuple, I did not make that connection, but thanks again.


> As someone with a background in Swift, Rust and C#, all of which have the concept of a tuple

They all have tuples, but AFAIK in none of them is a tuple a sequence (well, not sure about C#; it might be there, but I'd be surprised). Usually a tuple is a form of lightweight, anonymous structure, so it's addressed by field (even if the fields are positions); you usually can't iterate through tuples, or "index" them using runtime variables (except through reflection).

That it is so in Python is somewhat peculiar, but makes sense within the context of the language (unpacking works on iterables, and thus sequences).


The .NET Tuple types Tuple<T1,…> and ValueTuple<T1,…> are not sequences — though the ITuple interface does enable item access by index without reflection — and ValueTuple<T1,…> instances are mutable, but .NET actually has two different immutable sequence types, ImmutableArray<T> and ImmutableList<T>, which are functionally similar but have different performance characteristics[1].

Along with the rest of the .NET immutable collection types[2], both are persistent data structures in the sense noted above.

In contrast, .NET tuple types are, as you say, lightweight, anonymous structures addressed by field. The ValueTuple<T1,…> types, in particular, are used in the underlying implementation of the C# tuple language feature[3].

AFAIK, Python has no built-in anonymous mutable structure types, though since type names in Python are basically only used for display purposes, you can easily create them at runtime, e.g.,

    def anon(**kwargs):
        class _:
            __slots__ = tuple(kwargs.keys())
    
            def __repr__(self):
                return 'anon(' \
                    + ', '.join((f'{i}={getattr(self, i)!r}' \
                                 for i in self.__slots__)) \
                    + ')'
    
            def __eq__(self, other):
                if not hasattr(other, '__slots__') \
                   or sorted(self.__slots__) \
                   != sorted(other.__slots__):
                    return False
                for i in self.__slots__:
                    if getattr(self, i) != getattr(other, i):
                        return False
                return True
            
        o = _()
        for k,v in kwargs.items():
            setattr(o, k, v)
        return o
    
Note that this differs in two notable ways from C# tuples:

1. Each object created has a unique type, so

    type(anon(x=1, y=2)) != type(anon(x=1, y=2))
More importantly, this means a distinct type object is created and stored for every call to anon, which could have significant performance implications at scale.

This could be easily fixed with a cache of already-created anonymous types (trading off slightly increased object-creation time, of course).

2. Equality in the above implementation is based on the equality of identically-named values rather than identically positioned values; in C#, we have

    (x: 1, y: 2) != (y: 2, x: 1)
and

    (x: 1, y: 2) == (y: 1, x: 2),
while in my implementation,

    anon(x=1, y=2) == anon(y=2, x=1)
and

    anon(x=1, y=2) != anon(y=1, x=2).
This was by choice, as it seems more intuitive; C# behavior is no more difficult to implement.
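The cache fix mentioned in point 1 could be sketched like this (hypothetical anon_cached helper, keyed on the sorted field names; the __repr__/__eq__ methods from above are omitted for brevity):

```python
_type_cache = {}

def anon_cached(**kwargs):
    """Like anon above, but reuses one class per distinct field-name set."""
    key = tuple(sorted(kwargs))
    cls = _type_cache.get(key)
    if cls is None:
        # build the class only once; __slots__ keeps instances lightweight
        cls = type('anon', (), {'__slots__': key})
        _type_cache[key] = cls
    o = cls()
    for k, v in kwargs.items():
        setattr(o, k, v)
    return o
```

With this, `type(anon_cached(x=1, y=2)) is type(anon_cached(x=3, y=4))` holds, trading a dict lookup per call for the per-call type-object allocation.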

For read-only anonymous structure types, the Python standard library has namedtuple[4] (which, incidentally, bases value equality on position, not attribute name, so C#'s behavior is arguably more "Pythonic" than my own).
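namedtuple's positional equality can be seen directly (P and Q are throwaway names for illustration):

```python
from collections import namedtuple

P = namedtuple('P', ['x', 'y'])
Q = namedtuple('Q', ['y', 'x'])

# namedtuple equality is positional; field names don't participate,
# so these compare as the underlying tuples (1, 2) and (1, 2):
assert P(x=1, y=2) == Q(y=1, x=2)
# ...while swapping positions breaks equality: (1, 2) vs (2, 1):
assert P(x=1, y=2) != Q(y=2, x=1)
```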

[1] https://learn.microsoft.com/en-us/dotnet/api/system.collecti...

[2] https://learn.microsoft.com/en-us/dotnet/api/system.collecti...

[3] https://learn.microsoft.com/en-us/dotnet/csharp/language-ref...

[4] https://docs.python.org/3/library/collections.html#collectio...


Or python with clojure syntax?

The author also wrote fennel, a lisp/clojure-ish wrapper for lua, so I think that's the influence.


I meant Clojure. Janet's "more default looking" data structures are the immutable ones which is a pretty strong nudge. I don't see any explicit callout about data sharing for Janet but if that's really all that's missing I still feel good about citing Clojure first.


> I meant Clojure. Janet's "more default looking" data structures are the immutable ones which is a pretty strong nudge.

Except they're not immutable in the Clojure sense (of persistent data structures); they're just frozen / read-only, as can be seen from their complexity bounds (and the lack of transients). And a quick check didn't show any incompatibility between the two worlds either.


This is really cool, surprised it has so few stars and I've never seen it talked about here. I like the LISPs but I've always thought not having a CPython type thing was a bit of a deal breaker for making it a go-to "primary" language that I would use everyday.

I'm definitely going to try with Janet though.


> I've always thought not having a CPython type thing

What cpython type thing? Do you mean a rich(er) set of built-in datatypes?

Because lots of lisps have that, some (e.g. clojure) also have reader macros for pseudo-literals.

Even Scheme has had a built-in hashmap since R6RS (2007), and R5RS implementations usually provided hashmaps even if they were not spec-defined. Common Lisp has had a hashmap more or less all along (at least since before ANSI standardisation)


> surprised it has so few stars and I've never seen it talked about here

Why? It's a niche language in a dated syntax, with use cases already covered by other languages.

You will downvote because you don't like that, but it doesn't make it less true.


How is the syntax dated? I'd argue that S-expressions are timeless.


((Oh,) (I (don't (know.))) (Just (a (random (thought (I (got.))))


How(is(it(different(from(C-style(function(calls)))))))?


There is also the jank language [0] that plays in similar fields.

> jank is a general-purpose programming language which embraces the interactive, value-oriented nature of Clojure as well as the desire for native compilation and minimal runtimes. jank is strongly compatible with Clojure.

[0]: https://jank-lang.org/


I think the language is unreadable. For me, the purpose of a programming language is to let a human talk to a machine. Show me (for instance) the control flow of the example function on the home page. It just isn't making things easier.


I tried to translate it to Python. I translated

    (defn sum3
      "Solve the 3SUM problem in O(n^2) time."
      [s]

      (def tab @{})
      (def solutions @{})
      (def len (length s))

      (for k 0 len
        (put tab (s k) k))

      (for i 0 len
        (for j 0 len

          (def k (get tab (- 0 (s i) (s j))))

          (when (and k (not= k i) (not= k j) (not= i j))
            (put solutions {i true j true k true} true))))

       solutions)
into

    def sum3(s) :
        """Solve the 3SUM problem in O(n^2) time."""
    
        tab = {}
        solutions = {}
        l = len(s)
    
        for k in range(0,l) :
            tab[ s[k] ] = k
            
        for i in range(0,l) :
            for j in range(0,l) :
                
                k = tab.get( -s[i]-s[j] ) 
                
                if k and k != i and k != j and i != j :
                    solutions[ {i:True, j:True, k:True} ] = True
                    
        return solutions 
pretty much the same. The Python version doesn't actually work because dicts can't be hashed, while Janet interprets {1:True, 2:True} the same as {2:True,1:True} when these are keys (I think?).

In the example janet returns

    (map keys (keys solutions))
instead of the "solutions" dict, which converts dicts like {{1:True, 2:True} : True} into [[1,2]], but I don't get how.

But syntactically Janet is not much worse(?).
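For what it's worth, the translation can be made to run by using frozenset keys in place of the unhashable dicts, and by testing `k is not None` (plain `if k` wrongly rejects index 0, which is falsy in Python but truthy in Janet). A hypothetical reworking:

```python
def sum3(s):
    """Solve the 3SUM problem in O(n^2) time, returning index triples."""
    tab = {v: k for k, v in enumerate(s)}  # value -> (last) index
    solutions = set()
    n = len(s)
    for i in range(n):
        for j in range(n):
            k = tab.get(-s[i] - s[j])
            # 'is not None' matters here: index 0 is falsy but valid
            if k is not None and i != j and k != i and k != j:
                # frozenset plays the role of Janet's unordered struct key:
                # {i, j, k} hashes the same regardless of insertion order
                solutions.add(frozenset((i, j, k)))
    # roughly what (map keys (keys solutions)) does:
    # turn each key set back into a list of indices
    return [sorted(fs) for fs in solutions]
```

So `sum3([-1, 0, 1])` yields `[[0, 1, 2]]`, matching the deduplicated `[[i j k]]` output of the Janet version.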


One thing the Python version doesn't suffer from is lines that have to end in something like "))))". Syntax readability may be subjective, and I agree that the business logic in both examples is equally readable, but I think having to read and write a bunch of repeated punctuation at the end of a line, depending on how deeply the final statement is nested, is annoying. I don't know how Janet in particular handles error messaging around unmatched parens, but most languages have trouble localizing errors to the place where missing parens, curlies, etc. actually are. This can be alleviated by rainbow-highlighting paren pairs, or even having the editor auto-insert closing parens to make you less likely to forget one. But I think if you need additional help from the editor to make using the syntax a nice experience, that may be an objective sign that the syntax isn't the best.


You don't HAVE to write lines that way, though it is the convention which as a mostly outsider I've never been a huge fan of. Think about languages like c# where you often have multiple closing curly braces but the convention is separate lines. You could write them like Lisp if you wanted but no one does.


I assume "))))" is idiomatic because of the typical depth of nesting required in languages like this. If you always used newlines and indents to align the closing character, you'd waste a lot of vertical space. That's less of an issue in something like C# where you don't have to nest as deeply.


This is kind of unfair; unless you're in an indent-based language, you cannot get rid of this. The only other solution I've seen was in a Pascal-like language, where you wrote end function NAME and that would close everything inside.


True, but there are two things languages like C have over this one: (1) Enclosing stuff in parens and curlies happens less often than in this one, so the depth of nesting is typically shallower. (2) The idiomatic way to handle closing multi-line blocks is to use newlines and indents to visually align the closing character with the start of the line containing the opening character. But, like a sibling comment points out, nobody does that and "))))" is idiomatic here, so I feel like it's a fair call-out.


Readability can be a subjective property.

For anyone who’s spent even a casual amount of time with S-Expressions, the example code is extremely readable. But if ALGOL-like code is your main source of experience, then Janet will look like executable line noise.


I guess: f x -- obscure and mathematical

(f x) -- too many parentheses

f(x) -- PERFECT


Well hold on, what's wrong with (f) x?


>> (f x) -- too many parentheses

>> f(x) -- PERFECT

They have the same number of parentheses.

f(x) is probably something you have seen for the majority of your life as this is how math is taught.


I was just saying that syntax arguments most of the time are silly. I like LISP and ML languages, so for me it doesn't really matter if I need to write LISPy code or ML code.


I assume you've not done LISP or LISP-like languages before?


How does the example not show control flow?


Interesting! Janet clearly stands on its own with many unique features, that said if you are looking for a "scripting Clojure", also consider Babashka which can tap into many very high quality libraries from the Clojure ecosystem.

https://github.com/babashka/babashka


Syntax looks a whole lot like Clojure. I immediately appreciate the optional immutability built in too.


Note, though, that the immutable counterparts are not persistent, as the Clojure ones are.


Is it a good place Janet or a bad place Janet though?


I decline to use this language unless it's Disco Janet.


This is Lisp, it’s definitely the Good Place (TM).


It's a shame the new Lisps tend not to rip off Common Lisp's type system; being able to declare my types and have SBCL admonish me when I'm doing something stupid is something I would hate to lose.


Isn’t it more accurate to call that unspecified SBCL behavior than “common lisp’s type system”?


The syntax to declare types is specified. What to do with the declared types is then to be specified by a specific implementation.

Typical possibilities:

* ignore

* use for optimizations

* use for type assertions and type checking


It is. SBCL does a lot of really nice things, and it would be nice if someone designing a modern Lisp took inspiration from it.

That said, Common Lisp is hyper advanced alien technology, so it is hard to improve upon.


This is really cool. I love how small it is. With the Windows package, for example, you actually only need "janet.exe", which is only 791 KB.


A bunch of examples on the first page: https://janetdocs.com/


I just can't let go of infix notation.


With macros, someone can write an infix sublanguage for you to use in those mathy bits of code.

Probably better just to get used to prefix notation, though. Just pretend everything is a function call.


> Just pretend everything is a function call

D uses Universal Function Call Syntax, where:

    f(a)
    g(f(a))
can be written as:

    a.f
    a.f.g
It's a very popular feature.


Scheme has a shorthand for (cons a b) that's just (a . b), and (cons a (cons b c)) as (a b . c)

Racket, a scheme-based language, extended the syntax so that any symbol appearing between two periods and not at the end of a list gets moved to the head of the list, so (a . b . c) becomes (b a c), which some people use so that (a . + . b) becomes (+ a b).

The D syntax you describe is what schemes call the "threading macro," or the thrush combinator in non-macro contexts.


I believe the general consensus is that the double dot notation might not have been the best idea.

FWIW it's mostly used for inequalities like `(a . <= . b)` which some people find easier to read than `(<= a b)`.

But `<=` allows more arguments as in `(<= a b c d)` and here the double dot notation can't be used.


Sounds a lot like the space of Lua. How do they compare?


They're similar! The author of Janet previously wrote https://fennel-lang.org/, a compile-to-Lua language.

Janet has more traditional scoping rules than Lua. Tables and arrays are separate types in Janet, and arrays are 0-indexed. Biggest runtime difference is probably that Janet has value types.

I think the compile-time programming is the real differentiator, but it's hard to summarize what that means in a comment.

Performance is pretty similar to the vanilla Lua interpreter in the benchmarks I've seen and run (Janet typically wins slightly), but there's no LuaJIT.


The lisp style syntax is jarring


Honestly, to me it is unreadable. Defining a function inside a function inside yet another function, where you use a for-loop kind of function.

It requires a much different line of thinking to be applicable in the real world, which goes against historic human nature of following specific instructions, one instruction at a time.


Not really, you're looking at this very superficially. It's just an unfamiliar syntax.

If you're really interested, just try to get through that syntax once and you'll find that it's not that alien.


> Honestly, to me it is unreadable. Defining function inside a function inside yet another function where you use a for loop kind of function.

Perhaps you think the same thing about JS where that was (and maybe still is?) a common idiom. Except that people didn't give those functions names.

And for sure, it's possible to do it in all sorts of languages: python, ruby, js, go, and of course lisp

I get that the syntax is "weird", but honestly, the only two real differences are that the parens are "on the wrong side" for functions/expressions, and the indentation is truncated (the trailing side of the paren pyramid is accumulated and stuck to the end of the last functional line).


Doing that stuff in Python would further slow down an already slow language to a crawl.

Calling non-builtin functions in CPython is super expensive.


[dead]


Janet doesn’t have regular expressions built in to the language. Instead, you write parsing expression grammars. https://janet-lang.org/docs/peg.html

Spork has a regex-like syntax for constructing PEGs, but the semantics are slightly different.

And http is also part of spork, which you have to install separately. (jpm install spork)


[flagged]


Please don't take HN threads into programming language flamewar. It's the last thing we need here.

https://news.ycombinator.com/newsguidelines.html


> I feel the need to say it every time I see new lisp-like language. Normal people don't like lisp, it looks weird.

So? Normal people aren’t computing enthusiasts, so...

> Are lisp-like language designers aware of that?

No, lisp-like language designers have never heard the parenphobes that jump into every discussion of lisp-like languages to make their feelings known.

Also, no one involved in Python, YAML, etc., has ever heard anyone complain about indent-sensitive syntax.

> You are throwing away sizeable portion of the possible user base.

Parenphobes are throwing away a sizable portion of their potential toolkit. If they are okay with that, I am sure lisp-like language designers (especially the ones doing it as a labor of love and not getting paid for it) can live with losing the neediest, most entitled segment of their “potential user base”.


> So? Normal people aren’t computing enthusiasts, so...

even many (most?) computing enthusiasts aren’t into lisp. I am. Since this language is lisp-y but not deeply lisp, you’re now finding that not only are anti-lispers put off, but also lispers.


Would you also say that musicians should only create and perform popular music?


absolutely not! people should do whatever they want but if you’re going to make atonal music in a non-traditional scale, you shouldn’t be surprised if you have a limited audience.


So? Nobody is surprised that lisps aren't popular.


vaguely kinda related. I want to like lisp for its features but the syntax is too much for me to pick up without serious time investment, which I am not sure I want to make.

are there any other languages that support for example lisp-like interactive REPL, eg analyzing program state while it is running?


Lisp syntax is a lack of syntax, it takes five minutes to learn. Everything being an expression, expressions being replaceable by their evaluation, quote/unquote/quasiquote, macros is what'll take time to adapt to.

Conveniently analysing program state while it's running is more a feature of languages that run in an image than something lisplikes have in common, like Common Lisp, Factor and Pharo, but not Racket or Guile.


> I want to like lisp for its features but the syntax is too much for me to pick up without serious time investment

There's atoms, and lists, and at the highest level of a program, the head of the list is the thing to be invoked (function, macro, special form) and the tail is a list of args. At lower levels, the higher level context says what a list means, but either a list itself or the same head/tail interpretation are the main things.

And there are some quoting constructs that control if things are bare atoms or references to the values of names, etc., and that's about it. Lisp has very little syntax. I mean, unless your main language is raw assembler or some esoteric toy like brainfuck, lisp almost certainly has much less syntax.

> are there any other languages that support for example lisp-like interactive REPL, eg analyzing program state while it is running?

Erlang.

Ruby (via Pry).

Lots of others.


> the syntax is too much for me to pick up without serious time investment

Umm, do you know any language with simpler syntax than Lisp has?

For me, the main problem with Lisp is that there's so many Lisps. I think I philosophically prefer Scheme over Common Lisp, but then again there are so many Schemes to choose from!


Funnily, I'm the exact opposite. I love the simple and consistent syntax of Lisp, but I don't want to use a dynamically typed language if I can avoid it.


That's why I fell in love with SBCL, it'll warn me about things like "hey, you told me this variable is an integer, but you're calling length on it, that doesn't make sense."


I do not get why they keep writing heavy metal music. "Normal people" (sic) do not like it.

Are metal bands aware of that? They are throwing away a sizeable portion of fans and record sales.

--

I'm not sure you're acting in good faith when you say normal people hate Lisp, thus it's only weirdos that use it. There is no constructive discussion to be had here.


I feel humming Marilyn Manson's s/The beautiful people/The normal people/g (aaah) is now appropriate! (:


There was a time when computerey people prided themselves in not being "normal." You can take your normie rust, I'm quite happy with my parentheses.


Turns out, even computer things are social since we're at our best when apes work together to build complex things.


That is true, unfortunately most apes just build messy things these days. So, select your tribe with care.


> Normal people don't like lisp, it looks weird.

Have you considered that this is perhaps not a problem with lisp? Normal people do not much like writing computer programs at all.


The empirical evidence would be against you, as tens of millions of programmers prefer writing code in those other languages that lack the parens :-)


They may or may not. I prefer lisp, but I mostly program in TypeScript and Go. My theory is that if most programmers tried lisp for a month or two, they’d end up preferring it.


I learned lisp (well, mostly scheme) just enough to be dangerous - and to have a lot of fun.

I don't tend to find myself -writing- lisp very often, but in a lot of cases I end up -thinking- in lisp and then translating across to whatever language I'm implementing in before I start typing.


My claim was that normal people preferred not to write any programs at all in any language. As to programmers, obviously there are smarter ones and less smart ones and to each his own.

Here, a weirdly misplaced smiley back at you :-)


Smugness doesn't make one smart. Lispers have plenty of the former, though ;-)


Well, projects like Wisp* or Readable** exist, but the people who use Lisp tend to prefer S-expressions, so those projects don't gain a lot of mindshare.

It's kinda like M-expressions, a "normal" syntax gets proposed, but Lisp hackers prefer S-expressions, so the new syntax just kind of founders.

* https://srfi.schemers.org/srfi-119/srfi-119.html

** https://readable.sourceforge.io/


Normal people prefer syntax-less, unambiguous lisp-languages over the mess we have now. Just look at Javascript, rust or C++ and tell me normal people prefer that.

Only worse languages win, because people are masochists, but throwing away this possible user base is always a win.


> Normal people prefer syntax-less, unambiguous lisp-languages over the mess we have now.

Normal people prefer REBOL, but timing and licensing choices...


Well, REBOL was based on Logo, which was based on Lisp...


Yes, they do prefer JS over Lisp.

> throwing away this possible user base is always a win.

Ok, then.


var distance = sqrt(x * x + y * y)

That is something very similar to what they may have had in high school.
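For comparison, the same computation in a Lisp like Janet (hypothetical variable names, matching the snippet above):

```janet
# prefix notation: the operator comes first, grouping is explicit
(def distance (math/sqrt (+ (* x x) (* y y))))
```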


Math notation is familiar, yes. Are you also saying math notation is good? Is it an example to emulate?

I always thought math notation was atrocious to the point of discouraging me from the subject. Who knows, had mathematics used saner (simpler, consistent, perhaps slightly more verbose) notation, I might've been better at it.


> Are you also saying math notation is good?

If mathematics notation were as terrible and horrible as you say, wouldn't mathematicians have changed it by now?


Math notation is terrible! I write like 4 or 6 midterms for first-year Algebra and Calculus at the university. We are using LaTeX (for most of them), but we need to write LaTeX assuming the copy machine is not very well maintained.

For example, we can't write $e^{-\frac{x}{2}}$ because the minus sign and the bar of the fraction may not be clearly separated. Also, the x and the 2 will be too small. A solution is to use \dfrac to make them bigger, but then the 2D structure of the formula makes the e and the 2 too close together and confusing. After some years of trying, one of my coworkers insisted on using $e^{-x/2}$, which is ugly but copy-machine-proof.

We also need huge integrals for things like $\int_0^7\dfrac{1}{1+x^2}\ dx$; the standard \int is too small. I tried to write a macro for that, but the 0 and the 7 ended up in weird positions, and there are too many cases because the limits may be missing. Finally, we are using https://ctan.org/tex-archive/macros/latex/contrib/bigints with a tiny macro to rename it to \lint, because I don't like the names the author used.

Math notation is very good for writing on dead-tree paper with a pencil, in 2D; but in 1D, with a keyboard, or for automatic processing, it is horrible.
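A rough sketch of the kind of renaming macro described (assuming the `bigints` package, which provides `\bigint`; `\lint` is the commenter's own chosen name):

```latex
\usepackage{bigints}
% shorter alias for the oversized integral sign
\newcommand{\lint}{\bigint}
% usage: $\lint_0^7 \dfrac{1}{1+x^2}\, dx$ instead of the too-small \int
```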


Math notation also features one-letter variable names, weak typing and lots of ambiguity. Do you want those in code?


> weak typing

Let me disagree! There is only one type: Number (not Int or Float).


No. There are several numeric types: ℕ, ℤ, ℚ, ℝ, ℂ, … However, mathematicians nearly always implicitly convert between them because having explicit conversions all over the place would be extremely annoying and useless.


Literally a one-liner that is way more readable than whatever that was in the OP example.


Your sample of "normal people" is vastly different than mine.


Good so


The parentheses would appear to antagonize the humours.




Dammit Janet.


line noise much? man, that example is not pretty


If you're not used to working with lisps, your reaction is basically just unfamiliarity, and not actual discernment. Lisps are inarguably and objectively syntactically simpler than non-s-expression languages.


It's a Lisp. For anyone acquainted with other Lisp dialects, it's about as readable as it gets.


Pro-tip: move every open-paren to the right of its first enclosed token.


It also helps to move each closing paren to its own line, so instead of

      )))

 it looks like

      )
    )
  )
It's the same number of parens, just a different formatting.


Please don’t do this… There are plugins for working effectively with lisps. You probably just need rainbow parens or better contextual highlighting.



