From what I've seen of OCaml, it's similar but with strict evaluation. Then again, it has no built-in support for Unicode strings, so I never really looked at it again.
But algebraic data types, exhaustive pattern matching, and modules make it the next best thing, I think.
The lack of a well-made string type puts the language as a whole in a bad light: if you don't bother supporting fundamental practical needs, asking me to use your experimental, half-engineered proof of concept is arrogant.
Absolutely. I know many engineers hate the word but this is simply extremely bad marketing.
I get it, you want to work on interesting scientific problems and/or you work for Jane Street (a huge financial company); but the lack of desire to circle back to certain basics and nail them once and for all sends a hostile message to me as a programmer looking to add OCaml to his tool-belt. It tells me "we don't care".
I even spent two evenings getting the tools set up so that I could install a library that apparently supports Unicode. I got nowhere. The core language might be interesting, but everything around it - including Unicode strings - is a mess.
I'm sure that very smart people have risen to the challenge of supporting strings in Haskell, and it is therefore possible for a sufficiently motivated user to process text decently after all.
But the problem is that strings must be a built-in feature in any programming language that wants to be taken seriously (with an exemption for specialized ones that don't need strings, like GLSL). What would you think of a hotel that doesn't have mattresses but allows guests to bring their own?
Huh, the problem isn't that people have made their own. The string types are all in the standard "boot" packages. The problem is that there is one that is obsolete from another era (String) and four usable ones, one for each combination of strict/lazy and Unicode text/bytes (Text and ByteString).
Haskell ends up with twice as many as, say, Python, because of the desire to have both strict and lazy versions.
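To make the count concrete, here is a minimal sketch showing all five types side by side, assuming only the `text` and `bytestring` boot packages mentioned above. Note how every boundary between them requires an explicit conversion, which is the interoperability friction being complained about:

```haskell
-- The five string types, side by side.
import qualified Data.ByteString as B
import qualified Data.ByteString.Lazy as BL
import qualified Data.Text as T
import qualified Data.Text.Encoding as TE
import qualified Data.Text.Lazy as TL

main :: IO ()
main = do
  let s  = "hello" :: String  -- the obsolete linked list of Char
      t  = T.pack s           -- strict Unicode text
      tl = TL.fromStrict t    -- lazy Unicode text
      b  = TE.encodeUtf8 t    -- strict bytes (UTF-8 encoded)
      bl = BL.fromStrict b    -- lazy bytes
  -- Round-tripping back to String takes three explicit conversions:
  putStrLn (T.unpack (TE.decodeUtf8 (BL.toStrict bl)))  -- prints "hello"
  print (B.length b)                                    -- prints 5
  print (TL.length tl)                                  -- prints 5
```

None of this is exotic; the point is simply that a library author must pick one of the five for each API, and callers pay a conversion at every mismatch.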
Languages are judged by their standard library, and standard Haskell strings are part of the problem and not part of the solution.
Multiple moderately bad string types, with strict typing that exacerbates interoperability problems, are worse than one really bad string type (like char pointers in C) or nothing at all.