
Those programs communicate by serialization and deserialization, usually in bespoke, poorly documented data formats.

Unix doesn't put a strong emphasis on protocols. It just says "everything is text, except when it's not". It's not very helpful.



Modern unix gives you mostly low-level mechanisms, not inter-application protocols. It is better to leave it up to applications/administrators to figure out the best protocol (policy) for their use case, to make the best use of those mechanisms.

Those mechanisms aren't textual, they are byte-sequence based.

It turns out that many text-based protocols are more popular than binary/object alternatives. The penalty for textual redundancy is negligible/acceptable for many of them. Where it isn't, Unix lets applications use existing mechanisms (or create new ones) to implement specialized protocols (RPC, Protocol Buffers, ...).
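To make the mechanism/policy split concrete, here is an illustrative sketch (not from the thread): a minimal length-prefixed framing protocol layered by the application on top of a raw byte-stream mechanism, in this case a pipe. The `send`/`recv` helpers are hypothetical names chosen for the example.

```python
import os
import struct

# A pipe is a pure byte-stream mechanism: it carries bytes, not messages.
r, w = os.pipe()

def send(fd, payload: bytes):
    # Application-level policy: a 4-byte big-endian length header,
    # followed by the payload bytes.
    os.write(fd, struct.pack(">I", len(payload)) + payload)

def recv(fd) -> bytes:
    # Read the header, then exactly as many bytes as it announces.
    (length,) = struct.unpack(">I", os.read(fd, 4))
    return os.read(fd, length)

send(w, b"hello")
send(w, b"world!")
print(recv(r))  # b'hello'
print(recv(r))  # b'world!'
```

The kernel only sees writes and reads of bytes; the framing (where one message ends and the next begins) is entirely the application's choice, which is the "policy" half of the argument above.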


> Unix doesn't put a strong emphasis on protocols. It just says "everything is text, except when it's not". It's not very helpful.

OK, I probably misspoke by claiming it puts an emphasis on protocols, but I think it's fair to say that Unix does emphasize component integration by sharing data instead of trying to integrate through a common runtime.

I do think that having everything be (mostly) text is helpful, though. It's true that the data formats are often poorly specified, but that is compensated for by the fact that text is a format with a very rich set of tooling: text editors, regular expressions, parser generators, etc. all make it possible to capture, analyze, and manipulate the text data exchanged.
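As a small illustration of that point (my example, not the commenter's): even when a text format is ad hoc and undocumented, generic text tooling such as a regular expression can still slice it without any format-specific parser. The log line below is made up for the example.

```python
import re

# A line in a loosely specified, space-separated text format.
line = "2024-01-01 12:00:00 ERROR disk full"

# A generic regex recovers the fields; no dedicated parser required.
m = re.match(r"(\S+) (\S+) (\w+) (.*)", line)
print(m.group(3))  # ERROR
print(m.group(4))  # disk full
```

A binary format with the same documentation gap would instead require reverse-engineering its byte layout before any of this tooling could apply.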

Perhaps a better example of protocol-based integration would be the internet. It has its flaws as well, but it has also enabled collective engineering projects on a vast scale.





