There’s no clear definition of major/minor/patch versioning in most languages. Amazon did this reasonably well when I was there, though the patch version was implicitly assigned and the major and minor versions required humans to follow the convention:
You could not depend on a patch version directly in source. You could force a patch version other ways, but each package would depend on a specific major/minor and the patch version was decided at build time. It was expected that differences in the patch version were binary compatible.
Minor version changes were typically source compatible, but not necessarily binary compatible. You couldn’t just arbitrarily choose a new minor version for deployment (well, you could, but you couldn’t expect it to go well).
Major versions were reserved for source or logic breaking changes. Together the major and minor versions were considered the interface version.
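To make the resolution rule concrete, here’s a minimal sketch of the idea in Python. The names and data layout are entirely my own, not the actual internal tooling: a consumer only ever declares the major/minor “interface version”, and the concrete patch is picked at build time on the assumption that patches within an interface version are binary compatible.

```python
# Hypothetical illustration of "depend on major.minor, resolve patch at build time".
# AVAILABLE maps (package, interface version) -> patch versions published so far.
AVAILABLE = {
    ("libfoo", "1.2"): ["1.2.0", "1.2.3", "1.2.7"],
    ("libfoo", "1.3"): ["1.3.0", "1.3.1"],
}

def resolve(package: str, interface_version: str) -> str:
    """A consumer declares only major.minor; the build picks the patch,
    assuming all patches within an interface version are binary compatible."""
    patches = AVAILABLE[(package, interface_version)]
    # Newest patch wins; consumers never pin this in source.
    return max(patches, key=lambda v: tuple(map(int, v.split("."))))

print(resolve("libfoo", "1.2"))  # "1.2.7" today; a later build might get "1.2.9"
```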
There was none of this pinning to arbitrary versions or hashes (though, you could absolutely lock that in at build time).
Any concept of package (version) set was managed by metadata at a higher level. For something like your last example, we would “import” pkg2 from 2025, bringing in its dependency graph. The 2025 graph is known to work, so only packages that declare dependencies on any of those versions would be rebuilt. At the end of the operation you’d have a hybrid graph of 2015, 2025, and whatever new unique versions were created during the merge, and no individual package dependencies were ever touched.
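A rough sketch of that “import” operation as I understood it, again with my own names and a simplified data model rather than the real system’s: copy the imported package and its dependency closure in at the source set’s versions, then rebuild only the consumers already in the target set that depend on something whose version changed. No package’s declared dependencies are edited.

```python
def import_package(target_set: dict, source_set: dict,
                   deps: dict, pkg: str) -> set:
    """Copy `pkg` and its transitive dependencies from source_set into
    target_set, returning the existing packages that must be rebuilt."""
    changed, visited, stack = {}, set(), [pkg]
    while stack:
        p = stack.pop()
        if p in visited:
            continue
        visited.add(p)
        # Pull in the source set's version if it differs from what we have.
        if target_set.get(p) != source_set[p]:
            changed[p] = source_set[p]
        stack.extend(d for d in deps.get(p, []) if d in source_set)
    target_set.update(changed)

    # Only pre-existing consumers of a changed version are rebuilt; the
    # imported packages come from a known-good graph and are left alone.
    return {consumer for consumer, ds in deps.items()
            if consumer in target_set and consumer not in changed
            and any(d in changed for d in ds)}

target = {"app": "1.0", "pkg2": "3.1", "libz": "2.0"}   # older version set
source = {"pkg2": "4.2", "libz": "2.5"}                 # the "2025" version set
deps = {"app": ["pkg2"], "pkg2": ["libz"]}
print(import_package(target, source, deps, "pkg2"))     # {'app'}
```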
The rules were also clear. There were no arbitrary expressions describing version ranges.
For the record, Amazon's Builder Tools org (or ASBX or whatever) built a replacement system years ago, because this absolutely doesn't work for a lot of projects and is unsustainable. They have been struggling for years to figure out how to move people off it.
Speaking at an even higher level, their system has been a blocker to innovation and introduces unique challenges to solving software supply chain issues.
Not saying there aren't good things about the system (I like cascading builds, reproducibility, buffering from 3p volatility) but I wouldn't hype this up too much.