> Has updating the Go compiler actually been an issue for you in the past? To me, with Go's stability, it's never been more disruptive than updating a library in practice
I've run into issues with several go version updates.
Off the top of my head, all of the following caused breakages:
1. go 1.4 making directories named 'internal' special and un-importable from outside their subtree. Cross-package imports that used to work started failing with a compiler error. (Layout sketch after the list.)
2. go 1.9 adding monotonic clock readings in a breaking way, i.e. this program changed output from 1.8 to 1.9: https://go.dev/play/p/Mi6cGCPd0rS (I know it looks contrived, but I'm not digging up the actual code that broke; a sketch of the same class of breakage follows the list)
3. The change of the http.Server default to serving HTTP/2 instead of HTTP/1.1 broke stuff. Of course it did. How can that possibly _not_ break stuff? (The opt-out is sketched after the list.)
4. The changes in 'GO111MODULE' defaults broke many imports whose go.mod files were malformed or incorrect. This one was quite painful for the whole ecosystem.
5. go 1.17 switched to silently truncating a lot of query strings. Of course that broke stuff; how could it not? https://go.dev/play/p/azODBvkb-zK (sketch after the list)
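To make item 1 concrete, here's a hypothetical layout (all paths invented for illustration) showing the kind of import that stopped compiling:

    // Invented layout:
    //
    //   example.com/app/internal/auth   package auth
    //   example.com/app/server          package server
    //   example.com/other/client        package client
    //
    // Once the rule applies, only packages rooted under example.com/app
    // may import the internal package.

    package server

    import "example.com/app/internal/auth" // OK: inside example.com/app

    // In example.com/other/client the same import now fails with:
    //   use of internal package example.com/app/internal/auth not allowed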
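I can't dig up the original playground program for item 2, but here's a minimal hypothetical sketch of the same class of breakage:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        t := time.Now()
        // Round(0) strips any monotonic clock reading; on go 1.8 it was a no-op.
        u := t.Round(0)
        // go 1.8: prints true (the two values are identical).
        // go 1.9: prints false (t carries a monotonic reading, u does not).
        fmt.Println(t == u)
    }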
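For item 3, the documented escape hatch is setting Server.TLSNextProto to a non-nil empty map; a minimal sketch (the cert/key paths are placeholders):

    package main

    import (
        "crypto/tls"
        "log"
        "net/http"
    )

    func main() {
        srv := &http.Server{Addr: ":8443"}
        // A non-nil, empty TLSNextProto map disables the automatic HTTP/2
        // upgrade, restoring HTTP/1.1-only behavior over TLS.
        srv.TLSNextProto = make(map[string]func(*http.Server, *tls.Conn, http.Handler))
        log.Fatal(srv.ListenAndServeTLS("cert.pem", "key.pem")) // placeholder paths
    }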
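And for item 5, again not the linked playground code, but the behavior change is easy to sketch:

    package main

    import (
        "fmt"
        "net/url"
    )

    func main() {
        u, _ := url.Parse("https://example.com/search?q=a;b&page=2")
        // go 1.16: ';' is treated as a separator: map[b:[] page:[2] q:[a]]
        // go 1.17: the pair containing ';' is silently dropped: map[page:[2]]
        fmt.Println(u.Query())
    }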
Those are all intentional breaking changes that were not fixed upstream (i.e. are "working as intended"). The unintentional breaking changes are vastly more common, from changed error messages that make string-based error detection fail (so many stdlib errors aren't exported that you have to do string matching; sketched below) to just plain dumb bugs in the stdlib. Those usually do get fixed in point releases. Take a gander at the release notes: many of the issues highlighted in those changelogs come from pain people hit during upgrades.
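To show the string-matching trap concretely, here's a sketch built around net.ErrClosed, which wasn't exported until go 1.16 (before that, matching the message text was the only option):

    package main

    import (
        "errors"
        "fmt"
        "net"
        "strings"
    )

    // Pre-go1.16 style: no exported sentinel, so match the error text.
    // Any rewording of the message silently breaks this check.
    func isClosedOldStyle(err error) bool {
        return err != nil && strings.Contains(err.Error(), "use of closed network connection")
    }

    // go 1.16 exported net.ErrClosed, enabling the robust form.
    func isClosed(err error) bool {
        return errors.Is(err, net.ErrClosed)
    }

    func main() {
        ln, _ := net.Listen("tcp", "127.0.0.1:0")
        ln.Close()
        _, err := ln.Accept() // fails: the listener is already closed
        fmt.Println(isClosedOldStyle(err), isClosed(err)) // true true
    }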
I think the majority of go version upgrades have had some amount of pain, and most of them have been far more disruptive than updating a well-built library.
I would much rather update just my fuzz-testing library in a commit, and be confident that it's only used in tests so CI is good enough to validate it, than have to update that and my http package and my tls package and my os package all at once and have to look for bugs _everywhere_.
I admit I wasn't bitten by these changes and had a much better experience overall. Thank you for the long write-up.
However, I think you only mentioned changes in major releases, whereas in this scenario (vulnerability fix) a minor release would suffice (the parent mentioned updating to a point release of a library). Did you also have issues with minor releases?
It's true that minor releases have been much less rocky, but I think the overall point still holds: upgrading a lot of things at once is more annoying than updating a few.
It's unlikely a fuzzing library has a security issue anyway, since it's only used in test code, so the more pragmatic concern is that new fuzzing features may be adopted slowly because, e.g., the feature requires go 1.20, but go 1.20 broke some part of net/http for the 10th time.
In case of security issues, the most recent two major releases get the fix[0]. This means you can stay up to six months behind the newest release and never be forced to do a major version update under time pressure.
Your complaints #3 and #5 are library changes, not compiler changes, yet you were complaining about them in the context of compiler updates. They would have bitten you exactly as much if net/http were not in the stdlib.
It's true that those are library changes, but I think when the parent post asked "Has updating the Go compiler actually been an issue for you in the past", they meant "updating the Go distribution", not just the compiler, since the two are conflated in practice.
And in a sense, yes, updating the go compiler was an issue in those cases, because updating the compiler forcibly updates net/http, with no option to decouple the two.