To a man with a Haskell all the world is a nail...and every nail has a type and requires a special hammer. I.e.:
(define side-effect (cons print "This is not an IO monad"))
(define its-already-generic (cons number->string 4))
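;; cons happily pairs a side-effecting procedure with a string, or a
;; conversion procedure with a number -- no dedicated type machinery needed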
The more I think about Lisp the more I believe that the best reference language is C. [1] Lisp took form as a first step away from assembly, not a first step toward the singularity. What is garbage collection but some lazy programmer automating memory management?
The mistake is to take the ideological fanaticism of Lispheads for ideological purity of the language. There is none. Even the great schism of Lisp1 versus Lisp2 is rooted largely in arguments from precedent...and the precedent for Lisp2 is largely that it can shave a few precious bytes and register loads by allowing a single symbol to point to more things.
Yeah, nobody is ever going to agree about language statics. From each person's POV: "you either get it or you don't".
But totally agree about the pragmatism of Lisp. Its self-interpreter is clever, but not terrifically interesting as a proof of the simplicity of the language; rather, it's a proof of the simplicity of self-implementation.
It's not that I think statically typed code isn't advantageous. It's that it often seems to impede finding better abstractions by locking our thinking around local minima.
'a . 'b -> 'b
Is a great place for a procedure to end up in production code. Yet, it's not necessarily a great place for me to start because I'm lazy and might try to force
'c . 'd -> 'd
into working kludgetastically, simply because, like most people, I find it hard not to love my own ideas, and inertia makes it easy to stick with them even when I might find better alternatives.
Static typing, like any heuristic, does well in some situations and poorly in others. The bad thing isn't that it's not a free lunch. The bad thing is thinking that it is.
The same, of course, is true for dynamic typing and strong typing and duck typing and Haskell and Lisp. Though I'm not sure that Lisp was ever intended to prove anything as a programming language [1] in the way that Haskell was, which might explain why, after 50 years, we can graft enough onto it to argue about how it compares to Haskell.
[1] As distinct from its invention as a mathematical formalism for describing the lambda calculus.
Personally, I think that's the place where genericity and inference live. I never write types as concrete as
'a . 'b -> 'b
but instead write code and let the inferencer tell me how general it might be, then use that information to peel apart layers that don't need to be conflated.
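To make that concrete, here's a hypothetical OCaml-flavored sketch of that workflow (hypothetical names; any ML or Haskell with inference behaves the same). Write the procedure with a concrete use in mind, and the inferred type reports how little of that use it actually depends on:
(* written with "sum a list of ints" in mind *)
let rec fold step acc = function
  | [] -> acc
  | x :: xs -> fold step (step acc x) xs
(* the inferencer reports val fold : ('a -> 'b -> 'a) -> 'a -> 'b list -> 'a;
   nothing above depends on ints, so the summing layer peels off into
   its own definition *)
let sum = fold ( + ) 0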
This isn't tied to this example, of course. I think static types generally get misused when the thought is that they don't let you experiment with code. They don't let you experiment with code in the same way that dynamic types do, but I don't think they're at all impeded for that. It just takes investment in a different kind of experimentation.
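As a hypothetical sketch of what that other kind of experimentation can look like: change a type and let the checker enumerate the fallout, instead of probing values at a REPL.
type shape = Circle of float | Square of float

let area = function
  | Circle r -> 3.14159 *. r *. r
  | Square s -> s *. s
(* the experiment: add a constructor, say Rect of float * float, to shape;
   the compiler then flags this match as non-exhaustive, so every
   consequence of the change is enumerated rather than discovered at
   run time *)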