
I feel like this is a lot less impressive now that first-class functions are common. It might as well be Python:

    dx = .0001
    def deriv(f):
        def f_prime(x):
            return (f(x+dx) - f(x)) / dx
        return f_prime
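For instance, repeating the snippet so it runs standalone (`square` is an illustrative name; the result is approximate, since this is a forward-difference approximation, not a symbolic derivative):

```python
dx = .0001

def deriv(f):
    def f_prime(x):
        return (f(x + dx) - f(x)) / dx
    return f_prime

# The derivative of x^2 at x = 4 is exactly 8; the forward
# difference with dx = .0001 gives roughly 8.0001.
square = lambda x: x * x
print(deriv(square)(4))
```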


Python does not do tail recursion, though!

    def f(n):
      if n <= 0:
        return n
      return f(n-1)
    f(1000)
will blow out your stack. Meanwhile,

    scheme@(guile-user)> (define (f n) (if (<= n 0) n (f (- n 1))))
    scheme@(guile-user)> (f 1000000)
    $1 = 0
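In CPython, the failure shows up as a RecursionError rather than a hard crash; a quick sketch (the limit of roughly 1000 is CPython's default and can be changed with sys.setrecursionlimit):

```python
def f(n):
    if n <= 0:
        return n
    return f(n - 1)

try:
    f(10000)  # far beyond CPython's default recursion limit (~1000)
except RecursionError:
    print("maximum recursion depth exceeded")
```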
Also bear in mind that this was lesson 1 of an intro programming course that culminates in writing a Scheme interpreter in Scheme. There's something to be said about the simplicity / consistency of a language's syntax such that a first-year undergrad can implement it.


It seems like a lot of Lisps don't do tail call optimization either, though, such as Emacs Lisp, Common Lisp, and Clojure.


That was the point of the lecture: it talked about how their system /could/ do this. I get the impression that Scheme was one of the first languages to mandate it, though someone with a better view of history may correct me.

For Common Lisp, I thought it was just not required, but that it was common to do anyway?


True!

Although, a more nuanced point: it may not be required by the Common Lisp spec, but that's not to say that it's not done.

clisp:

    [1]> (defun f (n) (if (<= n 0) n (f (- n 1))))
    F
    [2]> (f 1000000)
    
    *** - Lisp stack overflow. RESET
sbcl:

    * (defun f (n) (if (<= n 0) n (f (- n 1))))
    F
    * (f 1000000)
    0
(Apparently clisp does do some tail call optimization, but I can't figure out what it'd be used for other than a non-terminating loop such as a REPL, if it doesn't even handle this simple case properly.)

Heck, even GCC can do tail call optimization for C, even though that's definitely not in the spec.

    #include <stdio.h>
    
    int f(int n) {
      if (n <= 0) {
        return n;
      }
      return f(n - 1);
    }
    
    int main(int argc, char **argv) {
      printf("%d\n", f(10000000));
    }
Try compiling that with gcc -O0 (wherein you'll get a segfault) vs. -O2 (wherein it'll print 0, as hoped); the relevant pass, -foptimize-sibling-calls, is enabled at -O2 and above.


> Apparently clisp does do some tail call optimization, but I can't figure what it'd be used for other than a non-terminating loop such as a REPL, if it doesn't even handle this simple case properly.

You simply forgot to compile the code!

    [1]> (defun f (n) (if (<= n 0) n (f (- n 1))))
    F
    [2]> (compile 'f)
    F ;
    NIL ;
    NIL
    [3]> (f 1000000)
    0
The limited form of tail call in CLISP is a feature of its compiler only, not of the interpreter.

SBCL compiles everything, so there is no need.

But SBCL requires an existing Common Lisp installation for bootstrapping, whereas CLISP bootstraps itself just from C thanks to its interpreter.


> But SBCL requires an existing Common Lisp installation for bootstrapping, whereas CLISP bootstraps itself just from C thanks to its interpreter.

The SBCL people claim you can bootstrap SBCL with CLISP.

http://www.sbcl.org/getting.html#compile



Aha, thanks. I had found some docs that indicated that clisp supports TCO, but it wasn't clear to me how to make that work.

(I am nominally familiar with Common Lisp, but have never used it in a serious capacity.)


There are Python implementations that avoid that problem as well, e.g. Stackless Python.


Clojure is so close. I never understood why it went for a special form rather than proper TCO.


This is what Rich said about TCO in Clojure in 2010:

> When speaking about general TCO, we are not just talking about recursive self-calls, but also tail calls to other functions. Full TCO in the latter case is not possible on the JVM at present whilst preserving Java calling conventions (i.e. without interpreting or inserting a trampoline etc.).

> While making self tail-calls into jumps would be easy (after all, that's what recur does), doing so implicitly would create the wrong expectations for those coming from, e.g., Scheme, which has full TCO. So, instead we have an explicit recur construct.

> Essentially it boils down to the difference between a mere optimization and a semantic promise. Until I can make it a promise, I'd rather not have partial TCO.

> Some people even prefer 'recur' to the redundant restatement of the function name. In addition, recur can enforce tail-call position.

https://groups.google.com/g/clojure/c/4bSdsbperNE/m/tXdcmbiv...


If I remember correctly it is a restriction of JVM. That's why Scala also needs a special annotation for TCO.


That's not quite right. In Scala, the annotation is there so you get a compiler error if the compiler does not perform the optimization. It doesn't affect whether the compiler actually does it.

As you can imagine, there are cases where the developer is counting on the optimization but the code isn't in a form where it can be applied.


Your post got me curious and I did some googling. I think this explains it,

https://softwareengineering.stackexchange.com/questions/1576...

Looks like it's a scalac optimization that doesn't always work hence the need for the annotation to give the developer feedback.

There is also a link to a Clojure discussion board exploring why Clojure doesn't have the annotation, by design. I didn't read it, as I'm not that interested.


Not true, SBCL definitely has TCO.


The Common Lisp standard doesn't mandate tail call optimization, but it also doesn't even mandate garbage collection. It can be safely assumed that any non-toy implementation will include both.


> Clojure

It's been a while since I've messed with Clojure, but I thought Clojure had a function specifically for tail calls. I think it was recur?


Right, though there are some additional caveats, e.g. that it can't be used to implement TCO for mutual recursion.
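Clojure does ship a trampoline function for the mutual-recursion case. A minimal sketch of the same idea in Python (is_even/is_odd are illustrative names):

```python
def trampoline(f, *args):
    # Keep calling as long as the result is a thunk (a zero-argument
    # callable); each bounce replaces a stack frame with a loop step.
    result = f(*args)
    while callable(result):
        result = result()
    return result

def is_even(n):
    # Instead of calling is_odd directly, return a thunk for it.
    return True if n == 0 else (lambda: is_odd(n - 1))

def is_odd(n):
    return False if n == 0 else (lambda: is_even(n - 1))

print(trampoline(is_even, 100000))  # runs in constant stack depth
```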


Aren't those basically 90% of the Lisps used in production environments, though? :-(


Isn't Project Loom going to solve that problem for Clojure?


> Also bear in mind that this was lesson 1 of an intro programming course

Back in '83, when the author took 6.001, only a minority of students had previously used a computer (note that Hal is quoted as having said, "If you already know how to program..."; emphasis mine). For the majority of students back then this wasn't just "introduction to programming" but "introduction to computers".

Lots of non-EECS students took the class.


The truly mind-blowing part, to me, is that symbolic derivatives come up within the first handful of lessons.


But this is not symbolic differentiation, it's numerical differentiation. A true symbolic differentiator would not give 48.00120000993502 when asked for the value of 3x^2 at 4.

This kind of approximation exercise is given to calculus students all the time ("Use the definition of derivative to approximate the derivative of f(x) at 3 by taking dx=.001," etc.).

edit: I misread! See reply below.


Right, I was referencing later lessons. Section 2.3 of the book. Surprisingly early.

Granted, Knuth's work also did symbolic differentiation rather early. It's a bit harder to work with in his presentation.

Edit: To be fair, I could have worded that first post more clearly! :D


Yann LeCun created a Lisp called Lush, which allowed intermixing C code with Lisp and also allowed interaction with Python. He later made Torch, which used Lua, a language inspired by Scheme. TensorFlow and PyTorch are now basically the two large players in deep learning, and Google's JAX allows for automatic differentiation of Python code.


Oh, I forgot that Lush was LeCun's work. It was such an odd beast...


This is certainly one of the earlier motivations for making procedures "first-class" in a language. And, indeed, it should be equally impressive in any language that supports functions as first-class members.

I have come to feel that "first class" will eventually be expanded to mean "can operate on definitions as easily as with them." (That is, can your language let you modify code with the same kind of code used to execute it?)


It's also pretty rare to write a derivative function. CRUD and ETL--that's what a language needs to be good at.


If there's already a language that's good at CRUD and ETL (or aims to be), and most programs are that, then it's not a problem to have other languages that are good at other things at the expense of being good at CRUD and ETL.

Even if it's true, as you say, that it's rare to write a derivative function, it still does come up. And there should be a good way to express it in at least some languages.


Not all of us write web apps. :)


My life is so much better since I got out of that domain :)



