Right, which is why I replied in the first place. The parent seemed to think LFS was black magic and didn't want to adopt git without it. In reality LFS is just another versioned file storage scheme similar to a Nexus repo or Linux distro package repo. Any FTP server and a script to push artifacts at the end of a build would get the job done.
I meant it seems like black magic to get it set up properly, especially on Windows.
> Any FTP server and a script to push artifacts at the end of a build
The binaries in this case aren't produced by the build; they are test input data, test output data, graphics files, etc., and without them a developer's build or tests will likely fail.
So any time a developer updates his working copy (git pull, svn update, whatever) the binaries need to be in sync, preferably automatically, since otherwise the source code itself is useless. It seems to me that LFS was designed to allow the user to not download all the binaries, but instead leave the small pointer files. That might be handy, but in my scenario I need ALL the binaries in place and up to date all the time. Basically I need to alias any syncing to also do the git lfs fetch.
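From what I've read, git lfs install is supposed to set up the smudge filter plus post-checkout/post-merge hooks so that a plain git pull already brings the LFS content down, and if that turns out not to be reliable, a wrapper alias would be the fallback. A rough, untested sketch (the alias name "spull" is just an example I made up):

    # one-time setup: installs the LFS hooks and smudge/clean filters
    git lfs install

    # belt-and-braces alias that always follows a pull with an LFS pull
    git config alias.spull '!git pull && git lfs pull'

Developers would then run "git spull" instead of "git pull".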
Any time the user commits changes (svn commit, git push, etc.) the binaries must also be pushed.
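If I understand the docs correctly, that side is covered by the pre-push hook that git lfs install adds: once the relevant file types are tracked, an ordinary git push uploads the LFS objects before the refs. Rough sketch of the tracking step (the *.png / *.dat patterns are just placeholders for whatever the test data actually is):

    # mark binary file types as LFS-managed; this writes patterns to .gitattributes
    git lfs track "*.png" "*.dat"
    git add .gitattributes
    git commit -m "Track binary test data with LFS"

    # a plain push now uploads the LFS objects via the pre-push hook
    git push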
I'd be very happy to adopt git, but for acceptance within the organization it needs that seamless integration of binary versioning without extra steps, and also file locking.
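For what it's worth, LFS does seem to have a locking feature these days, though it needs a server that implements the lock API (GitHub and GitLab do, as far as I know). Something like the following, with *.psd and the path just as placeholder examples:

    # mark a file type as lockable (files are checked out read-only until locked)
    git lfs track "*.psd" --lockable

    # take and later release a lock on a specific file
    git lfs lock path/to/file.psd
    git lfs unlock path/to/file.psd

Whether that counts as "seamless" is another question, but the mechanism exists.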