I initially found the idea of monolithic repositories hard to digest. But now I think it's a good idea for some of the reasons outlined in the article. Namely, it's very easy to depend on other code that the organization has created.
In the open source world, I have found that some Unix distros use the same model. I know it's not as extreme, but the principle is quite similar. For example, in Nixpkgs all package definitions (which are actually code in the functional language Nix) live in the same repository, so they can all depend on each other in a very easy and transparent way.
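To make that concrete, here's a rough sketch of what a package definition and its wiring look like. The package itself is made up, but the callPackage pattern is the standard Nixpkgs convention:

```nix
# pkgs/tools/mytool/default.nix -- hypothetical package.
# Its dependencies are just function arguments; callPackage fills them in
# from the definitions that live elsewhere in the same repository.
{ lib, stdenv, fetchFromGitHub, curl, openssl }:

stdenv.mkDerivation rec {
  pname = "mytool";
  version = "1.2.3";

  src = fetchFromGitHub {
    owner = "example";
    repo = "mytool";
    rev = "v${version}";
    hash = lib.fakeHash;  # placeholder; a real definition pins the exact hash
  };

  buildInputs = [ curl openssl ];
}

# pkgs/top-level/all-packages.nix -- one line hooks it into the tree:
#   mytool = callPackage ../tools/mytool { };
```

Because the consumer just names its dependencies and the repository supplies them, there's no version pinning between packages inside the tree; a change to curl is immediately a change for everything that uses it.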
I remember the days when monorepo was the norm, and distributed version control was the weird, kooky idea. Mainstream programmers had knee-jerk notions that all managed environments were too slow.
For game development, a monorepo is simpler. If you're using git, you need some other software (git-annex, for example) to turn the media part of your repository into a monorepo; otherwise the asset files become a burden.
Do you really mean a monorepo, though, or just a project repo whose scope is one entire game? Because a monorepo for multiple games - including released and in-progress ones - seems likely to create a lot of pain in the long term. The release cycle for games seems much more suited to a release-branch model, which would kind of require per-game repositories, and some sort of package versioning for common dependencies. I guess with things like mobile games, where you have a constantly moving target platform, even ‘released’ games are live code, so maybe I'm just betraying an outdated ‘gold master’ kind of mindset here?
>For game development, a monorepo is simpler. If you're using git, you need some other software (git-annex, for example) to turn the media part of your repository into a monorepo; otherwise the asset files become a burden.
I think this has more to do with how git handles diffs than with monorepo vs. distributed. As you said, git lfs solves the issue with centralization, but that's not the same thing as a monorepo. You can still split all your libraries out into separate repositories in such a system without issue.
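For reference, the git-lfs side of that is only a few commands; the file patterns below are just typical examples of asset types:

```sh
# one-time setup on each machine
git lfs install

# store matching files as small pointer blobs, with the real content in LFS
git lfs track "*.psd" "*.fbx" "*.wav"

# the patterns are written to .gitattributes, which is committed like any file
git add .gitattributes
git commit -m "track binary assets with git-lfs"
```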
That's an approach. Don't know if I'd call it simple. Your monorepo would start looking like an artifact repository at some point, with multiple versions of products.
And Google's approach, I'm sure, requires a bit of standardization and tooling investment. It's not clear to me that equivalent conformance and investment in multiple repos and a package manager wouldn't work just as well.
>Your monorepo would start looking like an artifact repository at some point, with multiple versions of products.
It shouldn't; one of the big gains of a monorepo is that there's only one version of everything. You don't need to version your dependencies within the repo, which means you only need to maintain one version of any external dependency.
I wouldn't call getting every project, internal or external, on the exact same version of every dependency "simple". That's a lot of hand waving around a really hard problem.
It's hard to do if you don't start out that way early on, yes. I don't really think it's hard to maintain that state.
With multi-repo environments, by the way, you still need to do that kind of dependency version management for certain upgrades. Essentially, you can desync, but you have to occasionally re-sync everything. In a monorepo environment you can just prevent desynchronization.
It's a hard problem, but it's not necessarily harder than what you face with other approaches to dependency management. There's pretty much no approach to dependency management that isn't hard in one way or another.
A large chunk of third_party usage is open source Google libraries like Guava. Anecdotally, I don't end up using that many libraries from third_party.
As for JavaScript, the hassle of importing the entire transitive closure of an NPM library you want into third_party means it's much more attractive to go with NIH (not-invented-here) syndrome. I looked at importing ESLint, but it has something like 110 dependencies.
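If anyone wants to sanity-check that number, something along these lines gives a rough count (the exact figure moves around between ESLint versions):

```sh
mkdir eslint-closure && cd eslint-closure
npm init -y                 # throwaway package.json
npm install eslint          # pulls in the whole transitive closure
npm ls --all                # prints the full dependency tree
ls node_modules | wc -l     # rough package count; scoped packages skew it slightly
```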