Hacker News

While I understand that with Graal and .Net you can ostensibly make native, static binaries for Kotlin or C#, I'm very skeptical that it works well in practice. In particular, I'm guessing it will feel like swimming upstream, fighting an ecosystem and build tooling which mostly assume you're running on a VM. And then there's the question of performance...

And of course, with Python your code will run ~100x slower than either of the above, your dependency management will suck, you won't be able to statically compile, and to top it all off everyone will give you recommendations for your ailments that take a long time to try out and inevitably fail miserably for one glaring reason or another. By way of example: "just rewrite the slow bits in C" -> you rewrite the slow bits in C -> your code is now slower because the marshaling costs exceed the gains, your build system is dramatically more complex, and you get the sheer joy of debugging segfaults and undefined behavior (yeah, I know Cython exists).
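The marshaling tax is easy to demonstrate without even writing any C: here is a minimal sketch (my illustration, not anything from the comment above) that calls libc's `labs` through ctypes once per element, so every call pays Python-to-C conversion costs that dwarf the trivial work done on the C side:

```python
import ctypes
import timeit

# On POSIX systems, CDLL(None) gives access to symbols already loaded
# into the process, which includes the C standard library.
libc = ctypes.CDLL(None)
libc.labs.argtypes = [ctypes.c_long]
libc.labs.restype = ctypes.c_long

data = list(range(-50_000, 50_000))

def sum_abs_via_c(xs):
    # One FFI call per element: each call marshals a Python int to a
    # C long and back, which dominates the trivial work done in C.
    return sum(libc.labs(x) for x in xs)

def sum_abs_pure_python(xs):
    # Same computation using only the Python builtin.
    return sum(abs(x) for x in xs)

assert sum_abs_via_c(data) == sum_abs_pure_python(data)

t_c = timeit.timeit(lambda: sum_abs_via_c(data), number=5)
t_py = timeit.timeit(lambda: sum_abs_pure_python(data), number=5)
print(f"ctypes per-element: {t_c:.3f}s  pure Python: {t_py:.3f}s")
```

On typical CPython builds the ctypes version usually comes out slower despite `labs` itself being native code, because the per-call marshaling overhead exceeds the work being offloaded. Real C extensions avoid this by crossing the boundary once per batch, not once per element.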



With the modern JVM you actually have to work quite hard to write native code (Rust/C++/C) that outperforms an equivalent Java/Kotlin/Scala implementation. And it is quite easy to perform worse with a native implementation. Of course that is subject to various caveats:

1. Anything running on the JVM will need 50-500 ms of startup time.

2. The JVM implementation will not reach max performance until the runtime has optimized and JIT'd the relevant code paths.

3. The VM itself has memory overhead, so your process will (all else being equal) probably require more memory.
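Caveat 1 is cheap to verify yourself. Here is a small harness sketch (Python used only as a neutral stopwatch; the no-op command is a stand-in so the sketch is self-contained, and `["java", "-jar", "app.jar"]` is a hypothetical substitute for your real entry point):

```python
import statistics
import subprocess
import sys
import time

def cold_start_ms(cmd, runs=5):
    """Median full process lifetime (spawn -> exit) in milliseconds."""
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        samples.append((time.perf_counter() - t0) * 1000)
    return statistics.median(samples)

# Stand-in no-op command; swap in e.g. ["java", "-jar", "app.jar"]
# (hypothetical) and a native binary to compare their startup costs.
noop = [sys.executable, "-c", "pass"]
print(f"median cold start: {cold_start_ms(noop):.1f} ms")
```

Measuring the whole spawn-to-exit lifetime of a do-nothing program isolates the runtime's fixed startup tax from any work your code does.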

If you do need to use Graal to generate native images, though, it can actually be quite nice. You can run locally (or in benchmarking environments) on the VM and get all the tooling and metrics that come along with that, but build a native binary to actually deploy. I agree that it can be kind of a pain, though.


> With the modern JVM you actually have to work quite hard to write native code (Rust/C++/C) that outperforms an equivalent Java/Kotlin/Scala implementation.

For what kind of code? I've never experienced comparable performance between JVM and C++ implementations, and I've had the misfortune of writing a couple of parallel implementations in recent years where it was literally an "apples to apples" comparison. C++ is faster by default, and it isn't particularly close, even if you ignore Java's startup and warmup time. I've never seen a native implementation run slower than a JVM one in my entire career, which has included a lot of Java.

This is the expected outcome, and it is easily explainable in technical terms: performance in modern systems is dominated by memory-handling efficiency, where C++ is very strong and the JVM is not.


Those are some very large qualifications, though, right? I think a better way of saying this is something like “you actually have to work quite hard to write native code that outperforms an equivalent Java/Kotlin/Scala implementation for a specific kind of long-running process which can amortize the startup costs and the JIT time.” And that can be true! (Though in my experience naïve Rust often substantially outperforms naïve Java/C#/etc.)

The other thing is that true warmup (not just startup) time to reaching optimization is a lot weirder and less predictable than people realize. See e.g. [1] and [2] for a fascinating analysis (caveat: about a decade old) of the actual warmup characteristics of a number of different VMs (Graal, HHVM, HotSpot, LuaJIT, PyPy, TruffleRuby, and V8). Spoilers: there are bizarre deopt cliffs, scenarios where it appears to optimize and then de-optimizes if you run it long enough, and programs which just never optimize… and this is on benchmark tests!

[1]: https://tratt.net/laurie/blog/entries/why_arent_more_users_m...

[2]: https://tratt.net/laurie/blog/entries/why_arent_more_users_m...
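The measurement idea behind those posts can be sketched simply: time every in-process iteration and look at the shape of the series, since a single aggregate number hides warmup, deopt cliffs, and runs that never optimize. A rough, simplified sketch of that harness idea (the real Krun-style setup controls for far more, and the window/tolerance numbers here are arbitrary):

```python
import time

def iteration_times(fn, iterations=200):
    """Per-iteration wall times, so warmup (or deopt) shows up as a trend."""
    times = []
    for _ in range(iterations):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    return times

def classify(times, window=30, tolerance=1.5):
    # Crude classification: compare the early iterations to a late window.
    # "warmup" if the tail is clearly faster, "flat" if similar, and
    # "slowdown" if the tail got slower -- the surprising case from the
    # posts, where a VM optimizes and then de-optimizes over time.
    early = sum(times[:window]) / window
    late = sum(times[-window:]) / window
    if early > late * tolerance:
        return "warmup"
    if late > early * tolerance:
        return "slowdown"
    return "flat"

ts = iteration_times(lambda: sum(i * i for i in range(20_000)))
print(classify(ts))
```

Run under CPython this will typically look flat (no JIT); the point of the posts is that under JIT-based VMs the per-iteration series takes on shapes that a single "time after warmup" number cannot represent.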


That's a fair point. But I think that "long-running processes which can amortize the startup cost and JIT time" covers a very large swath of software development.



