> I've spent a few years in C#, Java, Swift, C++, JavaScript and Haskell. In all of them, the passing of time has always been a problem. What worked two years ago is broken today
Java, C# and JavaScript are obsessive about backwards compatibility (one of the reasons Java's changes to the language are moving so slowly).
There's nothing magical about Wirth's languages or Go. What worked two years ago will be broken today, and the language has very little to do with it.
> Switching several projects to Go mitigated this almost immediately
As in: you haven't run into things that are broken yet.
> Java, C# and Javascript are obsessive about backwards compatibility
This might be true for Java, but it has not been my experience with the other two, .NET especially. Incrementally migrating from 4.6.* to .NET Core 3 has been challenging, even though we've had external (expert) .NET consultants to help us.
> There's nothing magical about Wirth's languages or Go.
I don't think so either! On the contrary, they're anything but magical.
> What worked two years ago will be broken today, and the language has very little to do with it.
Almost two years on the dot since those rewrites happened, and nothing (to my knowledge) has broken. On the contrary, code examples and best practices from 2011 look exactly the same today. Writing Swift, anything older than 2019 made me sweat.
> As in: you haven't run into things that are broken yet.
If you ignore the trend and stick to a single version of any language you've mentioned, the churn would be minimized. So it is not a (core) language problem but a social or ecosystem problem. That's not to say it isn't a problem, but it probably has nothing to do with anything you've claimed in the OP.
You almost never want to get stuck on an old, unsupported language version. A language is defined by its culture and ecosystem as much as by the language itself.
> Almost two years on the dot since those rewrites happened, and nothing (to my knowledge) has broken. On the contrary, code examples and best practices from 2011 look exactly the same today. Writing Swift, anything older than 2019 made me sweat.
Currently working on a Java codebase that has been around for ~8 years, previously developed with JDK7, now migrated over to JDK8 where it's probably going to stick for the foreseeable future.
I think that there's a lot to be said about the orders of magnitude that people think of when talking about the longevity of the code:
- someone may believe that code running with almost no changes for 2 years is good enough, which may indicate relatively stable libraries/frameworks or approaches like deprecating functionality without explicitly removing it
- someone else may believe that code running with few changes for 10 years is good enough, which probably also indicates stability of the underlying platform as a whole (for example, JDK8 has been around since 2014, still receives updates, and will be maintained until 2026 or 2030 depending on the distribution), at the expense of a slower pace of change
- someone else might expect their code to work as well in 40 years as it does now, perhaps statically compiled code in very particular domains (e.g. code that typically has few dependencies and runs on hardware directly), though I'd argue that this is a bit of a rarity
One can probably make observations about the different libraries, frameworks, platforms, ecosystems, ways of thinking, and perhaps about us as a society based on that, though I doubt that I should necessarily be that person. Does anyone know of people who've made similar observations?
Regardless, I think it's interesting to look at all of this and wonder how long any particular language, piece of code or platform will survive. Personally, I really enjoy the ones that are developed at a slower pace and don't need constant churn to keep the code running.
Writing Elixir the last few years has been great from this perspective of stability. There are breaking changes occasionally but they're mostly small or relatively trivial to fix due to the functional programming nature of Elixir.
For embedded stuff Nim has been great as well: code from pre-1.0 Nim often just works, or requires a couple of module renames.
Alas, the Linux kernel seems to drop or swap API subsystems every time I hit refresh on LWN (OK, that's a bit exaggerated), but hey.
Doing embedded projects means I really don't want to rewrite code to keep up with fads, for a device that's intended to work for years.
> You haven't lived through switching to the new modules structure then :) That's quite a big breaking change (and poorly executed in my opinion)
Ugh. I was on the other side of this argument until you brought this out. It’s a great point. The actual language is very good at compatibility but the dependency management did go through a big painful change that some internal code bases I work with but do not own still haven’t worked up the motivation to deal with.
This bit me really hard as I only come into contact with Go occasionally.
First, I couldn't understand how GOPATH worked (many years ago :) ), then that changed, and recently (last year) I spent something like three days trying to figure out why some dependency wouldn't fetch (it was because of modules).
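For anyone hitting the same wall: under GOPATH, dependencies were resolved from a single global workspace, while modules declare them per-project in a go.mod file. A minimal sketch, assuming a hypothetical project (the module path and the pinned version are made-up examples, not from anyone's actual codebase):

```
// go.mod — created by `go mod init example.com/myproject`
// The module path is this project's import prefix; it no longer
// needs to match a directory under GOPATH.
module example.com/myproject

go 1.16

// Dependencies are pinned to explicit versions here instead of
// being fetched from whatever happens to be in the workspace.
require github.com/pkg/errors v0.9.1
```

Running `go mod tidy` then adds or prunes `require` lines to match what the code actually imports, which is usually the fix when a dependency "won't fetch" after the switch.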
:)
For a person who works with a language daily such changes are not as obvious.