OCaml 5.0 Alpha Release (ocaml.org)
226 points by momentoftop on June 15, 2022 | hide | past | favorite | 121 comments


For almost as long as I've been programming, I've found OCaml a fantastic language that just lacked some development tools and better concurrency. I still believe that under the right circumstances it could become a mainstream language.


I like the language, but I'm not sure that it actually targets any mainstream application. If you look at something like golang, it's very geared towards online server work because that's what the language maintainers wrote it for, hence a lot of focus on GC-induced tail latency. Python and its package distribution system are great for glue code that's going to pick from a wide library of modules. C++ is amazing for portability and performance / zero-cost abstractions. Java has its weird cult in enterpriseware and is also an OK glue layer for mobile UI stuff.

Comparatively, OCaml was initially written with symbolic operations in mind, and thus I personally think it's a great language for writing parsers, compilers, modeling systems etc. The type system makes it easy to avoid certain kinds of common errors in those programs and the tail latency of GC operations is typically not as important for those applications as overall throughput. I don't think that heavy symbolic computations are a mainstream target though. To be clear I think that's absolutely OK for a language not to aim for mainstream use; but if one wishes for mainstream use, then it's important to be clear about the kinds of use-cases within mainstream that will be targeted.
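To make that concrete, here's a toy expression AST of the kind I mean (a made-up mini-example, not anything from a real project). The compiler's exhaustiveness checking is what catches the "common errors": if you add a constructor and forget a match case, you get a warning.

```ocaml
(* A tiny expression AST: pattern matching must cover every
   constructor, or the compiler warns about the missing case. *)
type expr =
  | Num of int
  | Add of expr * expr
  | Mul of expr * expr

let rec eval = function
  | Num n -> n
  | Add (a, b) -> eval a + eval b
  | Mul (a, b) -> eval a * eval b

(* 2 + (3 * 4) = 14 *)
let () = assert (eval (Add (Num 2, Mul (Num 3, Num 4))) = 14)
```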

It has been a while since I played with the language though, so maybe things have changed since. Happy to learn more if I'm wrong and OCaml now has a clear mainstream target use case in mind.


Why does OCaml need a 'clear mainstream target use case in mind'? It's a general-purpose language. It can be used for anything that Java, Go, or Python can be used for, and many things that C++ can be used for to boot.

Sure, OCaml's heritage is descended from a theorem-prover helper language, but that doesn't mean it's forever stuck at that use case. Python was originally an educational teaching language; nowadays it's a data science and glue language. None of that was planned out in advance by the Python folks.

In fact even if you ask the Go team, Go is arguably the most targeted language of the ones you listed, in terms of its use case--and they were surprised that a lot of their userbase came from ex-Python devs who wanted something more reliable and efficient!

OCaml with version 5 is just hitting its stride. With the advances in the language, the multicore support, and tooling, it's going to be competitive with Rust for many use cases. It's worth the wait to see where it goes.


I think to gain adoption any general purpose language needs a clear niche to gain a foothold. If you are choosing a tech stack then choosing a language that is not already widely adopted is always a risk. The tooling is usually inferior, there are not necessarily high-quality libraries in domains you care about and you pay a cost in training new employees in a new tech stack.

Java was an OO, garbage-collected language that supported (sort of...) hot-swapping code. Go was a reasonably performant, garbage-collected language that compiles to static binaries and has great cross-platform support. Some grad student decided to write Spark in Scala, so you had to use it for Spark jobs. Rust is a systems programming language with zero-cost abstractions and no GC that lets you write mostly memory-safe code.

What is the killer feature of OCaml that makes it worth the investment?


OCaml has many (almost all) of the advantages that you attributed to the other languages, and adds many more besides. What makes them 'killer features' for those languages but not for OCaml?


Nothing really, but they are mainstream languages and OCaml is (currently) not. So if you really need one of those then there is already a more widely adopted option that gives it to you. It's not that OCaml is bad (I like it a lot) it's that to adopt it you need a really compelling reason to do so.


Not saying it needs to be planned in advance, but there needs to be a very strong competitive advantage in some use case, not just checking the feature boxes. Continuing with the case of Python users moving to Go, they're probably not doing so for interactive use cases or for access to scientific libraries...

Does checking the concurrency box make OCaml an awesome choice for some use case?

You mention Rust for example but that language is sort of built around a zero cost abstraction principle, which is very much not the case of OCaml, so I still think it's going to be viewed as risky for highly performance sensitive work.


Yeah I mentioned Rust because it's muscling its way in to many use cases which don't really need 'zero-cost abstractions' and can definitely afford a GC, e.g. say microservices. A lot of work has been (and continues to be) put into OCaml to make it very competitive in these areas.

For truly performance-sensitive work of course people are going to choose C/C++/Ada/Rust.


If I have learned anything, it is that software, not the language, sells the language.

Mainstream languages have killer software (Spring, Symfony, Pandas, whatever) which then prompts adoption.

Languages for mainstream programming are less of an issue than the software.

Haskell, Common Lisp, etc.: plenty of great languages lack that killer software or clear benefits over other mainstream languages.


I remember Python being already popular in 2000 as a scripting and glue language, a Perl replacement for people who recoiled from Perl, or whose coworkers rejected Perl. Then Eric Raymond wrote a glowing article about his first experience with Python[0], which brought it even more attention. Pandas didn't exist until much later.

Same thing for Spring. Spring came about as an attempt to make J2EE more usable, which is to say, it didn't exist until years after Java was already being widely adopted in the industry.

Checking on Symfony since I'd never heard of it, it was first released in 2005, which was after PHP had already become popular enough to become the archetypal "bad language" in the minds of people who had never used it.

I think what you're seeing is that the ecosystem that develops around languages reflects their popularity and the interests of their users. As a readable, low-complexity glue language, Python became popular with scientists and then (unsurprisingly in retrospect) people started using it for data analysis. Java became popular with big iron enterprise programmers, and so it sprouted enterprise frameworks where half your code was in XML. PHP was popular for small web sites, so it spawned a whole menagerie of web programming frameworks. A language without any ecosystem growing around it is probably a language that doesn't get used much.

For a more contemporary example of language adoption, look at Rust. The appetite for the language existed from the get-go based on its design and stated goals. There was a lot of excitement, even from people who said they couldn't adopt it yet because of the lack of ecosystem around it. As the ecosystem developed, more and more people adopted it the moment it was practical, establishing it as a mainstream, growing language prior to the existence of any killer software. In the future maybe there will be a killer Rust framework that brings in a lot of people with no interest in Rust the language, but Rust is thriving despite that killer app not existing yet.

[0] https://www.linuxjournal.com/article/3882


What you said is correct. But then for someone to build the "killer software", they need to be attracted to the language in the first place. The language needs to excite the programmer behind a (future) breakout project.

Sometimes the language is chosen by luck (e.g. company requires it) but many times a programmer will make an explicit choice to use that language.


If this were really true though, then wouldn't F# be bigger than it is?


F# suffers from being a stepchild in what concerns .NET product management.

They agreed to put it on the box, but never gave it the same love as VB and C#, or even C++/CLI, and at this point they could rename the Common Language Runtime to the C# Language Runtime, given how little attention they give to anything that isn't C# on newer workloads.

Had F# been given feature parity with C# and VB on Visual Studio tooling and core frameworks, the adoption scenario would be much different.


F# is great and it benefits from the .net ecosystem but at the same time the language sometimes feels like it is being held back now by not wanting to add new features that c# doesn't support, so its relation to .net is kind of both a blessing and a curse.

Previously it added a lot of stuff on its own like async, etc. which was cool but also resulted in compatibility issues when c# later added similar features.

Now the f# developers are very concerned with compatibility but it basically means that f# can't get new features until c# already has them. It's also limited by the runtime which is designed around c#.

For example, it doesn't support type classes because (among other reasons) that might end up being incompatible with future c# type-class like features several years down the line.

It's also hard to learn f# unless you already know some ocaml/haskell.

ocaml has failed to catch on that much so far but I think it does have potential, and adding multicore support/effects is pretty promising.

On the other hand the fragmentation with stuff like reason/rescript is pretty dumb.


From a pure pro-ocaml perspective I don't really appreciate rescript per se. But it truly does seem to have raised awareness of the language among web devs who otherwise don't have a reason to worry about much outside of js/ts. The timing is also good because typescript has directly shown devs the value of types, and taught them enough about them to understand what ocaml is really offering above ts-style static type checking.

That's my impression from my last two typescript jobs, anyway. I don't know what, if anything, will come out of this, but I can't see it being bad.


I am curious. What does Rescript have that Reason did not?

I know the syntax is now incompatible with OCaml's, so I see the broken eggs but I don't see the omelet.


I don't think it's that different at the moment but they apparently wanted to not be constrained by reason/ocaml compatibility going forward.

Reason seems to be pretty dead so now the relationship between ocaml and rescript is kind of like that between haskell and purescript.

This may be good if you only want to run it on the frontend but it's not as good if you're using ocaml on the backend and want to share code, but I guess there's js_of_ocaml

I also don't think the existence of rescript is in itself a bad thing, it's just that the whole confusion around reason/rescript may have harmed development of ocaml for a while.


> whole confusion around reason/rescript may have harmed development of ocaml for a while

OCaml is an anchor language for many important projects (e.g. Coq) and companies (INRIA, Janestreet etc). The momentum behind OCaml has grown and I don't think the rescript/reason separation affected it _that_ much.

But I am curious to know if rescript/reason issue harmed rescript. While rescript gained a lot of technical freedom from the breakaway from OCaml it lost some people along the way. Was the breakaway worth it in retrospect?


> if rescript/reason issue harmed rescript.

Personal anecdote: yes, it has.

I was very interested in Reason when it appeared, and it seemed to have immense momentum: exploring arguably better (or more familiar) syntax, tool integrations etc. I know that people ran regular OCaml workflows/projects with it.

And then the whole split happened ... why? "We don't want to be constrained by OCaml" while keeping all of OCaml's syntactic idiosyncrasies among other things doesn't sound like a proper, well, reason.

This is where I stopped being interested (and, I imagine, where many other people stopped too). Because a split in a niche miniature language (which it was at the time) means only one thing: not enough resources to continue with either one.

It doesn't help that the whole split was confusing to everyone. Good description here: https://ersin-akinci.medium.com/confused-about-rescript-resc...


I too lost interest in Reason/Rescript after Rescript became its own thing.

A lot of talent has moved onto other things (or stayed with Reason) with arguably minor gains for Rescript in terms of technical freedom gained.

Typescript is so dominant in this space that it really didn't make sense to split an already small community.


Oh, I have no idea; I didn't hear about any of these until after the schism or whatever, and haven't bothered to understand the differences and relationships. I use ocaml and hear js/ts people talk about rescript sometimes, so I looked into it instead of the others.


Exactly. Functional programming has been around for quite a while; it just recently became the FAD of the day. Just search YouTube for "functional programming in javascript". This will fade away in just a couple of years.

F# is an interesting language and yes, I used it at work a couple of times. However, finding a programmer who is into that is hard, and the HR "partners" will kill you because people like these can't be treated as cheap replaceable resources. So it was F# prototype -> C# production. Attempts to introduce a new language or tech stack usually failed because the decision makers (managers) were more concerned about "how many people can I hire within a month" than anything else - nobody was ever fired for picking Java, C# or JavaScript (TypeScript) for the next project and staffing accordingly.

Languages like OCaml, Haskell, Lisp (whatever flavour you pick) or Prolog will never become mainstream. Should they even? My favourite is one of them for my general hacking or research projects; not sure I'd like to use it in a corporate job (which I have right now). Small efficient team in a tech start-up? Hell yeah! Mainstream mundane programming? OMG NO! Horses for courses. Hearses are not mainstream vehicles yet all of us will need a ride in one occasionally. Does it mean they should become popular and mainstream? :-D


Functional programming is not just the FAD of the day. This is evident in the strong presence of functional programming in Rust, an imperative language. Functional programming is also the foundational idea behind React. As with most things in life, extreme ends of the spectrum rarely pan out. Pure functional programming languages are harder to work with, but functional programming has a ton of merit. It's here to stay.

Regarding Ocaml, the functional aspect of the language is not hard. I have trained several junior programmers to write code in it (ReasonML). It is not a pure functional language. The biggest challenge for most people is dealing with types.

Ocaml's standard library is a huge sore point. It also lacks a lot of proper tooling. The biggest problem with Ocaml/ReasonML is that they are unable to rally everyone around a unifying vision to gain traction.


I dare to disagree - I still think that functional programming is a FAD of the day right now. Like object orientation was the FAD of the day in the 1990s, when many then-procedural languages saw OOP extensions bolted on top (Perl where you need a bless() to work with an object? :-D ) and some even were like "yeah everything is an object so let's wrap every procedure into a class and use Command Pattern instead of closures and build a Kingdom of Nouns where people will name their classes after design patterns and everything will be perfect with a cup of hot Java").

Nowadays C++ has closures. What will become the FAD of the day in 10 years? Will we see embedded Prolog in C# 12 or C++31? Who knows? OOP has been with us since 1968 or so and hasn't disappeared anywhere, and functional programming features will not disappear from mainstream languages either. But the "cool new shiny silver bullet" will be something else. Like Simula 67 and Smalltalk-80 paved the way for OOP, Haskell and OCaml (et al.) have been paving the way for functional programming, and Prolog and Datalog have been paving the way for logic programming. They won't become mainstream when logic programming becomes the FAD of the day.

BTW you mentioned you taught a couple of junior programmers an ML-ish language. That's awesome! We all (programming community) need more legends like you.


Just a heads up: the word "fad" isn't an acronym, so there's no need to capitalize it (and it looks strange) the way both you and the person you're replying to are doing.


Thanks. English is not my native language.


>fad

By definition, a fad is an intense and widely shared enthusiasm for something, especially one that is short-lived; a craze. I don't see OOP enthusiasm as being short-lived.

Many are still living in the Kingdom of Nouns, and follow Uncle Bob's preaching with due diligence, and will judge you harshly if you don't make everything a class and don't use lots of design patterns even when they are not needed.


> [...] and some even were like "yeah everything is an object so let's wrap every procedure into a class and use Command Pattern instead of closures and build a Kingdom of Nouns where people will name their classes after design patterns and everything will be perfect with a cup of hot Java").

Without verdict about what else you wrote: What a fitting description, "kingdom of nouns". I'll steal that for my next anti-mainstream-OOP-everything-must-be-a-class-rant, if you do not mind.


I believe that expression originates from this blog post: http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom...


When you say "fad", do you also mean over-hyped?


In this case, yes.


> Ocaml's standard library is a huge sore point.

I think this is a bit overstated. It's roughly the same size as JavaScript's, missing regexes but having some stuff that JS lacks. And in lots of languages with large standard libraries, people tend to use standard library replacements anyway. Considering that there are also not a lot of people working on OCaml, the core team's time is better spent on stuff like multicore rather than HTTP servers and clients.

The standard library is also open to contributions. For example, in OCaml 4.14, 44 functions were added to the Seq module: https://v2.ocaml.org/api/Seq.html. String also gained a lot of utility functions for integer decoding in 4.13: https://v2.ocaml.org/api/String.html.
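To give a flavor of the 4.14 Seq additions (Seq.ints and Seq.take are among them), here's a small lazy pipeline; the example itself is made up:

```ocaml
(* Seq gives lazy sequences in the stdlib: Seq.ints is infinite,
   and nothing is computed until the sequence is consumed. *)
let evens =
  Seq.ints 0                            (* 0, 1, 2, 3, ... *)
  |> Seq.filter (fun n -> n mod 2 = 0)  (* keep even numbers *)
  |> Seq.take 5                         (* stop after five *)
  |> List.of_seq

let () = assert (evens = [0; 2; 4; 6; 8])
```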


Nit: from what I remember, and most unintuitively, Reason is the language/alternative OCaml syntax and ReasonML is its umbrella project.


The alternative Base/Core from Jane Street looks good though. Of course it splits the community, but the work done on it is very good.


They’re great, but Base is new(ish) and not complete, and Core is Linux-only (maybe it works on OS X too?)


The Unix bits were broken out of Core earlier this year; it’s now multi-platform. https://discuss.ocaml.org/t/ann-v0-15-release-of-jane-street...


That’s great to hear, I missed that update (still on 0.14).


It's not about functional programming, it's about managing state. And FP is good at managing state, so no it's not going away, specially with multiple cores and distributed systems. Even databases are learning from the patterns seen in FP.


But wait, OCaml is not just a functional language, even the "O" in its name stands for "Objective".


That doesn’t explain why languages like Go, Kotlin or Rust became mainstream, and why languages like OCaml or F# didn’t. This is the real question. Is it because most programmers prefer languages that are mainly imperative instead of functional? Is it because of where those languages originated? Is it because of the runtime, the tools, the documentation and/or the standard library, and not the language itself?


I think it's because influential companies in the tech space pushed them. For instance, Go is backed by Google, and they made a lot of rounds evangelizing Go.

Kotlin is a Jetbrains project and it certainly didn't hurt that Google made Kotlin the preferred language over Java for development on Android, and that Kotlin itself has great Java ecosystem compatibility.

For years, Rust was Mozilla's baby, and they did a lot of good work evangelizing Rust for its use case, and other companies adopted it as well, continuing the cycle.

There is nothing I can think of that is comparable for F#, OCaml, Haskell and many other fine languages


The corporate backing is a good working hypothesis for C, C++, Java, C#, Go, Kotlin, Swift, Dart, JavaScript, TypeScript and Rust. But it doesn't explain how PHP, Python and Ruby became mainstream. Maybe those ones are just outliers, products of a very special period of time when "dynamic" programming languages were popular (and it's easier for a small team to develop such a language) and when the web was growing very fast (and many devs were bored with the "ceremony" of "static" programming languages like Java).


They were first-round languages. When Python came out, it was largely a C / C++ driven world (Python predates Java by 3 years). Its original use case was around scripting C / C++ programs. If I recall correctly, one of the biggest use cases for Python early on was building test harnesses for C++ codebases in particular. Python's ability to basically be a "wrapper for C / C++ libraries" is still its main strength, though clearly the language has evolved since that time period. Pascal was a big player too back then, and has fallen quite substantially from developer mind-share. (In the 1980s, predating this era, there was a lot of work and energy around Object Pascal and getting it on every platform they could. It almost worked. This is what Embarcadero[0] is all about via Delphi[1].)

PHP was "easier" Perl. Perl was also ubiquitous at one time because of CGI and the web. PHP was one of the first languages to have the same ease of deployment story as Perl but a much easier syntax to deal with. Again, language space wasn't as crowded then. Certainly both were easier to use than C / C++.

It was easier to gain a foothold in the first round (web 1.0 era if you will) of software and development. There was simply less competition. Java didn't even come out till 1996.

The second era of languages, I think really starting with Go and continuing through to Rust, they faced a bigger uphill battle. There simply isn't the mindshare to capture as easily as there was back then.

Admittedly, I'm still not sure to this day why Ruby took off, other than that it offered a great developer-first experience as a language, from what I understand. People who use Ruby tend to really like it; it seems to have a stickiness there that other languages don't, maybe? I've always been confused by this one, and I admit that's due to a personal bias of not enjoying its syntax at all; however, there are clearly lots of people who do. I just prefer one true way of doing things.

[0]: https://www.embarcadero.com/

[1]: https://www.embarcadero.com/products/delphi/object-pascal-ha...


i think ruby took off because it was a better perl at a time people were really wanting one. python didn't really fill that gap because it did not embrace perl's shell scripting and quick text processing aspects, whereas ruby did, as well as providing a much cleaner and more consistent language. also the developer experience really is wonderful.


Python became mainstream thanks to being a saner Perl replacement, and Zope.

CERN and Fermilab were already using Python as an administration tool and for data analysis during the early 2000s.

PHP became mainstream thanks to being a better Perl for Web applications, and the only option in many ISPs.

Ruby has Rails to thank for its adoption, a product of Basecamp.


PHP, I feel, was mostly because all the old web providers you could sign up with to run your own site had LAMP stacks available, either installed by default or trivially installable, often with instructions.

Ruby I'm not sure what caused the hype, unless it was something like Heroku making Rails easy to deploy.


Didn't Python become mainstream until numpy/pandas? In my mind, data science was Python's killer app, but maybe that's just my revisionist brain. Similarly, I think Ruby had Rails as its vehicle.


The vast majority of startups in the previous decade were built on Python web frameworks: YouTube, Instagram, Pinterest, Reddit, etc.


Rust also brought something completely novel to the mainstream, so I wouldn’t put it next to Go and Kotlin, which are at most different combinations of existing features.


I'm pretty confident we will find a fairly direct correlation between how mainstream the language has become and how much money was poured into its development (i.e. headcount of engineers working on the project and their salary levels).

Now I know correlation does not equal causation. But you have to think it must have some effect when Google is paying its Distinguished Engineers--the likes of Rob Pike, Ken Thompson--plus at least several more engineers, project managers, and others. All to work on Go. Can you imagine what the OCaml ecosystem could do with that kind of money and dedicated engineering talent? Heck, any technology for that matter.


F# is a fantastic language! However, its official documentation is a bit lacking (like C#'s). Also, having developed some reactive applications, I'd say .NET Core needs some improvements before people can have fun developing desktop or web apps with it.


F# can't become big for the same reason C# isn't bigger, and Swift probably won't be either: it's locked into an OS-specific platform and very specific non-free dev tools. (Even if .NET is open now, it still took too long to be available on Linux, and now its reputation is as a Windows-only platform.)


F# isn't bigger because it's a functional language and everyone starts learning programming with a C based language.


> everyone starts learning programming with a C-based language

Idk about that, especially in the past decade. Python seems to be the most popular language that people pick up on their own as their first one.

As for the intro college courses for programming, they mostly seem to use Python or Java, with occasional C (haven't seen that one myself yet for an intro class), and one instance of OCaml (apparently Cornell teaches its intro to CS in it).



I think it could've been huge with Reason, but they decided to split the language, the development, and the community apart.


Was used to bootstrap Rust. Others clearly also think it's a great language!


I heard that 5.0 does not include (user visible) effects (yet?).

Is this planned for after 5.0? Anywhere I can learn more?

Getting an effects system with handlers in a (semi)large and mature language is a pretty big step.

The Koka language [1] has a brilliant implementation of algebraic effects. Sadly it's a research language that doesn't see much development.

[1] https://koka-lang.github.io/koka/doc/book.html#why-handlers


Effects are included in OCaml 5, but considered experimental. This doesn't stop us from building libraries that take advantage of them internally, in order to provide really nice external interfaces.

The best developed one is "eio", which uses effects (and io_uring on Linux) internally in order to provide a really high performance, direct-style IO library for OCaml. You can walk through some of the code here: https://github.com/ocaml-multicore/eio#getting-started

Also a video talk about our experiences with using effects for writing parsers, from the OCaml Workshop last year. https://watch.ocaml.org/videos/watch/74ece0a8-380f-4e2a-bef5...


The effect system is here, and it can already be used. However, the integration with the (surface) language and its type system is still a work in progress. Thus, to explore the use of effects, you need to use the experimental Effect module (https://github.com/ocaml/ocaml/blob/trunk/stdlib/effect.mli), which gives you access to low-level primitives for defining effects, installing an effect handler, and resuming a continuation. The module will be considered experimental in OCaml 5.0, and subsequent versions of OCaml 5 will refine the integration within the language. The objective here was to decouple the multicore runtime work from the higher-level language design questions.
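To give an idea of what those low-level primitives look like, here's a minimal toy handler following the shape of Effect.Deep.match_with from the 5.0 stdlib (Ask is a made-up effect, purely for illustration):

```ocaml
open Effect
open Effect.Deep

(* Declare an effect: performing Ask yields an int. *)
type _ Effect.t += Ask : int Effect.t

(* A computation that performs the effect twice. *)
let comp () = perform Ask + perform Ask

(* Install a handler that answers each Ask with 21 and
   resumes the suspended computation via its continuation. *)
let result =
  match_with comp ()
    { retc = (fun v -> v);   (* normal return *)
      exnc = raise;          (* re-raise exceptions *)
      effc = (fun (type a) (eff : a Effect.t) ->
        match eff with
        | Ask -> Some (fun (k : (a, _) continuation) -> continue k 21)
        | _ -> None) }

let () = assert (result = 42)
```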


I would also recommend reading the effect handlers tutorial in OCaml 5.0 manual: https://kcsrk.info/webman/manual/effects.html.


Is there any plan to have effect types like in koka, where the effects that a function performs are visible in the type, and handling an effect eliminates it from the function's type?


The medium-term plan is indeed to have such a typed effect system. This is one of the reasons why the effect system is still considered experimental: we expect to see some breaking changes once effects become typed.


They plan to mainstream effects in OCaml 5.1 IIRC.


Whenever an OCaml post comes up, the biggest complaint about the language is "no concurrency/parallelism". I hope this brings more people into the OCaml world.

Awesome job to all the devs.


For me it's the several stdlibs and Unicode strings.


Haskell has many preludes also but people seem to embrace that flexibility there.

I don't know much about the intricacies of unicode but as usual there are some OCaml libraries available. Are they incomplete/insufficient in some way?


Hoping this release will bring the below advantages.

1. A statically compiled language with GC-d runtime, which compiles quicker than golang

2. Something that brings algebraic effects to the mainstream and with it an arguably better model for concurrency / parallelism

3. Support value types to take advantage of modern CPU caches

Finally golang finds some real competition (from a very unlikely source though). Hoping ReasonML will become more popular with this release with true parallelism and nice concurrency.


ReasonML is now Rescript, and is still using the 4.06 compiler. I think the idea is to move ahead largely independently of OCaml, and a move to 5.0 now is probably seriously ambitious given the runtime overhaul.


So it's Reason, not ReasonML, which is the umbrella project's name, and Rescript is an incompatible syntax split off by the Bucklescript team (which previously compiled Reason to JS). Bucklescript's new name is... Rescript.

But not everyone agrees with the split, and work is being done on Melange to replace Bucklescript: https://github.com/melange-re/melange

Ultimately, js_of_ocaml can directly compile OCaml to JS.


> ReasonML is now Rescript

Apparently it's not. The whole thing is a mess :)

https://ersin-akinci.medium.com/confused-about-rescript-resc...


Reason and ReScript are two different projects. You can still use Reason to compile natively and to JavaScript via js_of_ocaml.


Can't wait to tinker with OCaml when 5.0 is stable. Pretty sure the community will invent amazing libraries for parallel computation as well.

Maybe finally Rust+tokio projects will have some actual competition!


OCaml is possibly my favorite language, along with Rust, and what I hope for from Santa is for projects like https://github.com/tezedge/ocaml-interop to become mature.

I think all these "properly typed" languages should aspire to have great interoperability: after all, they have types to help. But I realize there can be big technical difficulties in making it safe, in particular with garbage collection.


I'm a bit confused...

the readme of ocaml-interop says it's "inspired by" ocaml-rs

while ocaml-rs says it "uses ocaml-interop behind the scenes" and "also exports an interop module, which is an alias for ocaml_interop and the two interfaces can be combined if desired"

from a cursory look, ocaml-rs seems possibly the one to use, as the more comprehensive project


The development of the OCaml-Rust interface is a bit organic. `ocaml-interop` aims to be a low-level but safe interface. At Inria we worked with the author to integrate ideas about using Rust's type system to safely work with a GC. We have more ideas to make it better; if someone wants to commit engineering time in this area they should feel encouraged to contact me.


How does it compare to Rust? What do you like about each?


The garbage collector is nice because sometimes you don't want to think about memory management or the borrow checker.

The syntax is really cool, it looks like python but it's completely whitespace insensitive.

The module system is neat, and I actually like writing module interfaces (which are basically analogous to header files in C/C++). It makes dependency injection trivial, because you can have a module say "I depend on a module with this interface, but the caller can choose which one", which is more useful than it sounds.
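That dependency-injection pattern can be sketched with a functor. Everything below (the `Store` signature, the `Cache` and `Mem` modules) is made up for illustration; the point is that `Cache` declares the interface it needs and the caller picks the implementation:

```ocaml
(* A module type describing what we depend on. *)
module type Store = sig
  val get : string -> string option
  val set : string -> string -> unit
end

(* Cache works against ANY Store; the caller injects one. *)
module Cache (S : Store) = struct
  let get_or key default =
    match S.get key with
    | Some v -> v
    | None -> S.set key default; default
end

(* A toy in-memory implementation, e.g. for tests. *)
module Mem : Store = struct
  let tbl = Hashtbl.create 16
  let get k = Hashtbl.find_opt tbl k
  let set k v = Hashtbl.replace tbl k v
end

module C = Cache (Mem)

let () =
  assert (C.get_or "lang" "OCaml" = "OCaml");
  assert (C.get_or "lang" "other" = "OCaml")  (* cached on first call *)
```

Swapping `Mem` for a real database-backed module requires no change to `Cache` at all.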

It's also a functional language, so you have very little mutability, and it supports """monads""" with its super innovative "monadic let" syntax. (It's basically like overloading the semicolon, which probably doesn't make any sense, but it's really cool.)
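The "monadic let" is the binding-operator syntax (`let*`, available since OCaml 4.08). A minimal sketch for the option monad, which is what "overloading the semicolon" looks like in practice:

```ocaml
(* Define a binding operator for option: bind. *)
let ( let* ) opt f =
  match opt with
  | Some x -> f x
  | None -> None

(* Division that can fail. *)
let div a b = if b = 0 then None else Some (a / b)

(* Chained lookups short-circuit on the first None,
   but read like ordinary sequential code. *)
let compute () =
  let* x = div 100 5 in   (* x = 20 *)
  let* y = div x 4 in     (* y = 5 *)
  Some (x + y)

let () =
  assert (compute () = Some 25);
  assert ((let* x = div 1 0 in Some x) = None)
```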

Dune is a very pleasant build system and Esy is a very pleasant package manager. The language server is good. I'd say in this dimension it's competitive with Rust.

There's really quite a lot to like about OCaml. My biggest gripe with it is no typeclasses/traits. But I have hope that modular implicits will land within our lifetimes.


What exactly is Esy? I hadn't heard of it before and its website doesn't seem to provide an obvious answer to that question.

You say it is a package manager. Is it supposed to replace opam? If so, what does it offer that opam doesn't?


It's a package manager that replaces Opam. It doesn't require switches and it caches builds of packages across projects.


It's like Rust's cargo or Elixir's mix. A task runner, mostly.


> The syntax is really cool, it looks like python

I'm squinting really hard but not seeing the resemblance.


I said that because there's no curly braces and you don't usually have to use semicolons


> you don't usually have to use semicolons

That's compensated by the use of double semicolons :O)


Ha! That's only in the REPL, which I never use


For nested parallel computations (think Scientific Programming, where one would use OpenMP, Rust Rayon, etc), we have domainslib [1]. Eio, a direct-style, effect-based IO library is pretty competitive against Rust Tokio [2]. The performance will only get better as we get closer to the 5.0 release.

[1] https://github.com/ocaml-multicore/domainslib

[2] See the http server performance graphs at https://tarides.com/blog/2022-03-01-segfault-systems-joins-t...
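A small sketch of the nested task parallelism domainslib [1] provides (assumes the `domainslib` package is installed via opam; `~num_domains:3` is an arbitrary choice, roughly cores minus one):

```ocaml
module T = Domainslib.Task

(* Naive Fibonacci, just to have some CPU-bound work. *)
let rec fib n = if n < 2 then n else fib (n - 1) + fib (n - 2)

let () =
  let pool = T.setup_pool ~num_domains:3 () in
  let res =
    T.run pool (fun () ->
        (* The two async tasks can run on different domains in parallel. *)
        let a = T.async pool (fun () -> fib 20) in
        let b = T.async pool (fun () -> fib 21) in
        T.await pool a + T.await pool b)
  in
  T.teardown_pool pool;
  assert (res = 17711)  (* fib 20 + fib 21 = fib 22 *)
```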


Very impressive, thank you. I admit I'm not much of an early adopter -- hence my comment that I'll be waiting for the official 5.0 release.

I've scanned several articles (some made by you) and I very much like what I'm seeing.


An ML that competes with rust+tokio is an exact description of haskell. It's had the best multicore runtime and concurrency primitives of any general purpose language for quite some time. No need to wait.


My impression is that Haskell's laziness and focus on purity though make it rather a pain to work with, whereas OCaml is a lot more pragmatic like Rust.


That's why OCaml won over Haskell and Standard ML for me. The authors are not afraid to “get their hands dirty” with procedural constructs or even OOP for the cases where those things are useful.


This impression does not match reality. It's great to work with. I'd certainly take it over ocaml (having spent probably thousands of hours on both).


Can you give a few examples of why you picked Haskell over OCaml, please? And let's skip "elegance" and other pie-in-the-sky magic.

Let's talk productivity in commercial projects. Or making scripts for your own use (if you're not happy with bash/zsh/fish... which I'm not).


> An ML that competes with rust+tokio is an exact description of haskell.

Haskell only exceeds rust+tokio in terms of overall code elegance and succinctness.

When it comes to performance, rust+tokio is going to be much more performant by default as there will be no GC, no hidden traps with laziness, no subtle memory leaks etc.

Sure, a Haskell wizard will be able to reduce the gap further and solve issues, but it is going to be difficult to consistently beat rust+tokio. By the time said Haskell wizard makes changes to the code, the famed elegance of Haskell is going to suffer, as the code will be littered with strange incantations of strictness annotations, possibly some C FFI and other magic. This will not be the beautiful Haskell that you learn in textbooks. Only very few people know how to make Haskell truly fly. If you're one of them, then you're lucky!


> When it comes to performance, rust+tokio is going to be much more performant

This is nonsense without significant further qualification.

I guarantee the tokio scheduler and async stack model gives you net worse results for many real workloads.


While quoting me you omitted the phrase “by default”. I said “When it comes to performance, rust+tokio is going to be much more performant by default”.

What I meant was in the typical case, you can expect rust+tokio to be faster than Haskell. This is not a surprise as rust is much more low level, does not have a GC and its compiled output maps better to modern processors rather than the pure functional lazy style of Haskell.

Plenty of benchmarks across a variety of workloads and programs confirm that Haskell is slower than Rust on average. For webserver-type use cases (which use a lot of async), Rust is faster. Check out the TechEmpower benchmarks for example.


Including work-stealing scheduler and runtime? Ability to just spawn 50_000 tasks -- regardless of whether they're CPU-intensive or I/O bound -- and have the runtime handle it speedily?

I've heard Haskell praise a good amount of times but at the same time many people also said that doing actual work with it has more friction than it should, so I don't know.


I don't know if those things you ask are in Haskell specifically, but what I do know is that Haskell concurrency is pretty much best in class. You get channels (Control.Concurrent.Chan is in the stdlib), STM, green threads (forkIO). You can also get stuff like async (from the async library) or streamly (even higher level than traditional async). I'm not 100% sure if you can push insane speeds with it (even though Haskell is really fast, it's just very tricky to optimize correctly) like you can with Rust, but the developer experience on concurrency is just through the roof, imho (dare I say even better than rust). STM is just the best kind of magic.


Yes. Go spawn a million tasks in haskell. It will work fine. 50k is child's play for GHC runtime. Only serious competitor is BEAM.


Yep, that's what I was getting at: Erlang/Elixir do this effortlessly. I'd love for the popular system languages to have the BEAM runtime.

Well, based on your comment, I might reevaluate Haskell. Last time I was severely put off by lack of good tooling (but I did hear cabal was improving) and a fragmented ecosystem. Maybe things have changed.


both cabal and stack have improved, there's now a pretty good language server in HLS, there's ghcup to download and install both GHC and tooling and manage it (both on linux and on windows). And for quick one-off experiments I usually use nix (I don't fully use it, too complex for my brain, but being able to do

    nix-shell -p 'haskell.packages.ghc{version}.ghcWithPackages (pkgs: with pkgs; [any number of packages here])'
is just an insane superpower that lets me experiment with stuff in ways that most other ecosystems could only dream of)


You're probably aware of this but not many share the love for Nix. I tried it and got put off by the unnecessarily alien syntax. They honestly didn't need to invent their own language, there are plenty out there that are pretty close to what they are aiming at.

But it's not only that. It's a general problem of a high initial learning curve.

Modern tool inventors really have to finally learn that everyone is super busy these days. Make it brain-dead quick to learn or your tool will forever remain a niche curiosity for hobbyists.


I totally get that sentiment about nix, and actually I sort of share a variation of it too: I really want to learn it, and I like the featureset in theory, but I can't get used to it in practice, and I don't have the time to invest myself into something that difficult to get going.

The reason I brought up the nix command is that I only use nix, for haskell development, for that specific command: I found it once in a blog post, saved it, then put it under a function into my bashrc, and I use nix quite literally for only that purpose. I've done a lot of development on various functional languages (with a dayjob in F# that lasted 3 years) and being able to quickly experiment with libraries was something that I sorely missed when doing repl experimentation in those languages (I think F# recently got a #nuget directive, but that was after I stopped using it).


Curious and interesting.

Not to be the party pooper: didn't Docker work well for your Haskell use-case?


I personally haven't had much chance to use docker with Haskell specifically, but I imagine that it would definitely work, it would just take a bit more work to get going, so I just take the option that is less friction (at least for my specific usecases). Nix is also somewhat popular in the haskell community - especially when it comes to GHCJS (with tools like obelisk making it easier to develop cross-platform apps using web tech). So I just go along with the flow of the community, personally, especially considering that I use Haskell exclusively for my own personal stuff (sadly).


Yes, as they say back home, "when you are going I am coming back".

"Parallel and Concurrent Programming in Haskell" (2013)

https://www.oreilly.com/library/view/parallel-and-concurrent...


I've heard that many times, still skeptical after trying Haskell thrice.

Have Haskell's ergonomics improved?

I feel your snark is unjustified. You might be putting people in two extremes: wise elders and hip kids. There's a huge amount of people in-between however.

(Also, the saying actually is: "You're going to where I am coming back from".)

Find me something like Erlang/Elixir with the speed of Rust and I won't learn another programming language ever again.

No? Then the search continues.

So far Haskell hasn't impressed. In all honesty Rust is quite hard to put in that niche as well since its `async` stuff is extremely annoying and hard to get right but I guess the tries are still ongoing. OCaml is progressing but who knows when will they get there.


You might find an answer at Facebook infrastructure....

https://engineering.fb.com/2015/06/26/security/fighting-spam...


I'll go through that, thanks. But so far it still seems that Haskell is an acquired taste and that's a shame. You have to overcome a number of idiosyncrasies to get productive.

As a guy who went through at least 8 languages and 30+ frameworks, it gets tiring. I want something that ticks most boxes from the get go.


> Find me something like Erlang/Elixir with the speed of Rust and I won't learn another programming language ever again.

You're describing haskell again.

You haven't articulated why you have disliked haskell in the past.


Mostly the seemingly big initial learning curve; you have to get extensively onboarded in "the Haskell way" (monads are not hard to get but the community is hell-bent on avoiding wording that makes it easier to grok; why?).

That could be okay for many but as I get older, I tend to take people/organizations that require big upfront investment less seriously.

Example: one of the things Elixir has won me over with were its bite-sized introductions and practices. You can be a 3/10 Elixir dev and you can be a 8/10 one, and that's mostly depending on how many of the official tutorials you've covered. The road is mostly a straight line to an acceptable level of proficiency at the end of which you can start choosing to specialize.

Rust, OCaml, Haskell -- they all failed that test for me.

I picked Rust mostly because of the no-GC situation and because of `cargo`. Many older programmers handwave away the importance of good tooling and this is where they lose a lot of potential mind share that can rejuvenate their languages / ecosystems.

Example on this: OCaml's tooling. A lot of people in this ecosystem downplay the importance of a good task runner + builder + project manager. I spent half a weekend learning `esy` once and mostly tamed it by making it imitate mix/cargo, but it wasn't trivial. The end result is a build script that does 80% of what mix/cargo do. The exercise made me scratch my head wondering why what I did back then isn't upstreamed and made official, and why everyone is happy to pretend that building an OCaml project is a solved problem when it (very!) clearly isn't.

Haskell's cabal didn't fare better last I tried it -- admittedly that was more than a year ago.

If Haskell has good bite-sized lessons that lead to an actual real job productivity (less academic exercises, please!) then I'd be happy to give it a fair try and maybe make it a part of my toolbox.


I use Elixir at my company and I've gone through three phases: first, loving it for the polished ecosystem and functional programming; then getting frustrated because even though it is functional, it is not at all like writing ML-style code, and there are no types; and now I am starting to grok BEAM and OTP.

Specifically, I realized how many problems we solved using OTP that would be much more challenging to get right otherwise. We can use processes to get transactional behavior and spawn workers very simply.

We use event sourcing.

Our state snapshots are just a process that receives events. It was easy to evict snapshots by killing processes idle for too long. Beautiful!


Yep, all true, I love it myself. But nowadays it gets harder for me to love it due to the lack of strong static typing. :(

Test coverage becomes mostly wishful thinking. And it's extremely easy to do non-exhaustive pattern-matching which is something that just kills me.

It's absolutely true that Elixir is mega-productive though. And for a ton of projects out there it's good enough and more.


You've done your fair share of trying!

Your assessment reflects mine entirely. Elixir has that charm in that it leads you to productivity quickly, just as Go does. Someone could argue that's only because of much larger pool of users that's paved the path before.

Picking up Haskell again for fun, and the Effective Haskell book has been a fun learning experience. Not too beginner-ish, and doesn't take too much time to explain concepts.

I'll have to try esy and dune again sometime when OCaml 5's stable. Their commands are just different enough from go/cargo/npm to be annoying.


Does anyone know what are the main new features of Ocaml 5.0?


Essentially, OCaml 5.0 is OCaml 4.14 with a fully rewritten runtime which introduces support for parallelism (no more global runtime lock) and an experimental version of algebraic effects.
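A minimal sketch of what "no more global runtime lock" buys you (requires an OCaml 5 compiler; `Domain` is the stdlib module the new runtime exposes, and the two spawned domains really execute in parallel on separate cores):

```ocaml
(* CPU-bound work: sum the integers in [lo, hi]. *)
let sum_range lo hi =
  let s = ref 0 in
  for i = lo to hi do s := !s + i done;
  !s

(* Split the range across two domains and join the results. *)
let parallel_sum () =
  let d1 = Domain.spawn (fun () -> sum_range 1 500_000) in
  let d2 = Domain.spawn (fun () -> sum_range 500_001 1_000_000) in
  Domain.join d1 + Domain.join d2

let () =
  (* Gauss: 1_000_000 * 1_000_001 / 2 *)
  assert (parallel_sum () = 500_000_500_000)
```

Note that domains are heavyweight (roughly one per core); for fine-grained tasks you'd layer a library like domainslib on top.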


> no more global runtime lock

Perhaps Python devs should consider transpiling to OCaml ;)


The naming potential is incredible at least.


Parallelism [1] and native-support for concurrency through effect handlers [2].

[1] https://kcsrk.info/webman/manual/parallelism.html

[2] https://kcsrk.info/webman/manual/effects.html
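A tiny effect-handler sketch against the 5.0 alpha API described in [2] (the `Effect` module is explicitly experimental, so this interface may still change; the `Ask` effect and its handler are made up for illustration):

```ocaml
open Effect
open Effect.Deep

(* Declare a new effect: performing (Ask name) yields an int. *)
type _ Effect.t += Ask : string -> int Effect.t

(* A computation that asks its handler for two values. *)
let program () =
  let a = perform (Ask "a") in
  let b = perform (Ask "b") in
  a + b

(* The handler intercepts each Ask and resumes the computation
   via its continuation, here answering with the name's length. *)
let run () =
  match_with program ()
    { retc = (fun v -> v);
      exnc = raise;
      effc = (fun (type c) (eff : c Effect.t) ->
        match eff with
        | Ask name ->
            Some (fun (k : (c, _) continuation) ->
              continue k (String.length name))
        | _ -> None) }

let () = assert (run () = 2)  (* length "a" + length "b" *)
```

Unlike exceptions, the handler can resume the suspended computation, which is what makes direct-style schedulers like Eio possible.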


Checked in on OCaml a few years ago. I remember there being plans for improved concurrency/parallelism, set for the future. Any progress on this in this release?


That's exactly what version 5 brings: multicore is fully merged in now.


Cool, thanks for info!




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: