
In this space, it's more that you don't want to lose all of the progress made in these older libraries. Coming up with a new ecosystem is an immediate race to parity with the older one. And a lot of smart people were involved with building that. (So you aren't competing with a single idea or implementation, but with an ecosystem of them.)

Hats off for taking a good shot at it. But don't be surprised to see reluctance to move.



There's very promising work on upgrading older libraries and legacy code with a technique called verified lifting. It has been used successfully at Adobe to automatically lift image-processing code written in C++ to use Halide, and it guarantees semantic equivalence, so users can trust the lifted code.

Paper: https://dl.acm.org/doi/pdf/10.1145/3355089.3356549
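To give a flavor of what this looks like (a hypothetical sketch, not taken from the paper): first the kind of C++ loop nest a lifting tool ingests, then a hand-written Halide pipeline computing the same function. A verified-lifting tool would synthesize the Halide version automatically and prove it equivalent to the loop nest (e.g. with an SMT solver) rather than trusting a human translation.

    // Hypothetical example, not from the paper.
    #include "Halide.h"
    #include <cstdint>

    // Legacy kernel: brighten an image with a saturating multiply-add.
    void brighten_cpp(const uint8_t *in, uint8_t *out, int w, int h) {
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++) {
                int v = 2 * in[y * w + x] + 10;
                out[y * w + x] = v > 255 ? 255 : (uint8_t)v;
            }
    }

    // Lifted form: the same algorithm expressed as a Halide pipeline,
    // which Halide can then vectorize/parallelize independently of
    // the algorithm definition.
    Halide::Buffer<uint8_t> brighten_halide(Halide::Buffer<uint8_t> in) {
        Halide::Var x("x"), y("y");
        Halide::Func f("brighten");
        f(x, y) = Halide::cast<uint8_t>(
            Halide::min(2 * Halide::cast<int>(in(x, y)) + 10, 255));
        return f.realize({in.width(), in.height()});
    }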


Before reading the paper: is "verified lifting" kind of like "deterministic decompilation" — where you ensure, with every modification to the generated HLL source, that it continues to compile back into the original LL object code?

(See e.g. the Mario 64 decompilation, https://github.com/n64decomp/sm64, whose reverse-engineering process at all times kept a source tree that could build byte-identical copies of the original ROMs, even as it was increasingly rewritten by hand in an HLL.)
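Concretely, the invariant such a "matching" decompilation enforces boils down to a byte-for-byte comparison after every rebuild. Something like this toy checker (illustrative only, not the sm64 project's actual tooling):

    // Gate every source edit on the rebuilt binary matching the
    // original ROM exactly; any divergence means the edit changed
    // the generated code and must be reworked.
    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <vector>

    static std::vector<char> read_all(const char *path) {
        std::ifstream f(path, std::ios::binary);
        return {std::istreambuf_iterator<char>(f),
                std::istreambuf_iterator<char>()};
    }

    int main(int argc, char **argv) {
        if (argc != 3) {
            std::cerr << "usage: matchcheck <original> <rebuilt>\n";
            return 2;
        }
        bool ok = read_all(argv[1]) == read_all(argv[2]);
        std::cout << (ok ? "MATCH" : "MISMATCH") << "\n";
        return ok ? 0 : 1;
    }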


To clarify, I’m not in any way involved with the paper described in the article. Just interested in the space :)



