I have yet to want to defrag my computer or to worry about still-open deleted files.
Builds failing because I have a terminal open in the build output directory, or a text file open in an editor, is something I face far more often, and it annoys me more. (Or being unable to replace the binary of a service under development/testing, needing to stop the service, replace the binary, and start it again. Or log rotation failing because a logfile is open in Notepad. Or...)
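For what it's worth, the reason the "replace a running service binary" case doesn't bite on Unix is the rename-over pattern. A minimal Python sketch (the helper name `atomic_replace` is mine, assuming POSIX rename semantics):

```python
import os
import tempfile

def atomic_replace(path: str, data: bytes) -> None:
    """Replace `path` with `data` atomically (hypothetical helper).

    POSIX sketch: write a temp file in the same directory, then
    rename it over the target. A process that already has the old
    file open keeps reading the old inode; the name now points at
    the new contents. This is why swapping a running binary works
    on Unix without stopping the service first.
    """
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # make sure the data hits disk first
        os.replace(tmp, path)      # atomic rename on POSIX
    except BaseException:
        os.unlink(tmp)
        raise
```

The temp file must live in the same directory as the target, since rename is only atomic within one filesystem.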
Also see my link for a solution on Unix, where you can indeed fix this problem, or simply kill the process holding the file. I haven't needed to defrag my computer in the last 20 years, on Linux or on Windows, but hey, it makes me happy that my daily work is hindered for this hypothetical possibility. (Which could be, and is, solved in other OSs with appropriate APIs for the job.)
Also, the original post is about Windows installers... don't get me started on that topic (or Windows services), please.
I wasn't just talking about defragging. I was also talking about live volume shrinking.
> Also see my link for a solution on unix, where you can indeed fix this problem
Looping through every FD of every process just to find ones that reside in your volume of interest is... a hack. From the user's perspective, sure, it might work when you don't have something better. From the vendor's perspective, it's not the kind of solution you design for & tell people to use.
In fact, I think that "solution" is buggy. Every time you open an object that doesn't belong to you, you extend its lifetime. I think that can break stuff. Imagine you open a socket held by some server, then the server closes it. Then that server (or another one) starts up again and tries to bind to the same port. But you're still holding the socket open, so the bind fails, and the server errors out.
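To be concrete, the FD loop in question looks roughly like this on Linux (a sketch assuming `/proc`; the function name is mine). This version only stats and readlinks the `/proc` entries, which does not extend any file's lifetime; the lifetime problem arises once a tool actually open()s `/proc/<pid>/fd/<n>` to do something with it:

```python
import os

def fds_on_device(dev: int) -> list[tuple[int, str]]:
    """List (pid, path) for every open fd residing on device `dev`.

    Linux-specific sketch of what lsof-style tools do: walk every
    /proc/<pid>/fd directory. Inherently racy -- processes and fds
    come and go mid-scan, hence the broad OSError handling.
    """
    hits = []
    for pid in filter(str.isdigit, os.listdir("/proc")):
        fd_dir = f"/proc/{pid}/fd"
        try:
            for fd in os.listdir(fd_dir):
                try:
                    st = os.stat(f"{fd_dir}/{fd}")  # follows the fd symlink
                    if st.st_dev == dev:
                        hits.append((int(pid), os.readlink(f"{fd_dir}/{fd}")))
                except OSError:
                    continue  # fd closed while we scanned
        except OSError:
            continue  # process exited, or no permission to read its fds
    return hits
```

Note that without root you only see your own processes' fds, which is exactly why this doesn't work as a vendor-blessed mechanism.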
> or simply kill the process holding the file.
That process might be a long-running process you want to keep running, or a system process. At that point you might as well not support live volume shrinking or defrags, and just tell people to reboot.
> Also the original post is about Windows installers... don't get me started on the topic (or windows services), please.
This seems pretty irrelevant to the point? It's not like they would design the kernel to say "we'll let you do this if you promise you're an installer".
> I face builds failing because I have a terminal open in a build output directory or a textfile in an editor open is far more often [...]
Yes, I agree it's frustrating. But have you considered the UX issues here? The user has C:\blah\foo.txt open, and you delete C:\blah\. The user saves foo.txt happily, then reopens it and... their data is gone? You: "Yeah, because I deleted it." User: "Wait but I was still using it??!"
I have considered it. I never had any serious problem with it during 15 years of using desktop Linux as a developer machine. Grandma would have no more trouble than if she unplugged the pendrive a file was opened from and then tried to save it, for example... Modern operating systems have far worse and more user-hostile patterns.
And for live volume shrinking: the kernel can solve this problem if there is a need for it; this invariant isn't required for the feature, since the feature doesn't have to go through the same APIs offered for ordinary file-manipulation gruntwork. On Unix, unlinking basically disassociates the filename from the inode, but AFAIK the inode holding the block list still exists and is only cleaned up later, so it can be updated if its blocks are moved beneath the high-level filesystem APIs.
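The unlink semantics described above are easy to demonstrate; a POSIX-only Python sketch (the function name is mine):

```python
import os
import tempfile

def unlinked_file_demo() -> bytes:
    """POSIX sketch: unlink() removes the *name*, not the inode.

    An open descriptor keeps the inode (and its block list) alive,
    so I/O keeps working; the storage is reclaimed only on close.
    """
    fd, path = tempfile.mkstemp()
    os.unlink(path)                    # name is gone immediately
    assert os.fstat(fd).st_nlink == 0  # inode still alive, zero links
    os.write(fd, b"still writable")    # I/O on the fd keeps working
    os.lseek(fd, 0, os.SEEK_SET)
    data = os.read(fd, 64)
    os.close(fd)                       # now the blocks are actually freed
    return data
```

This is the same mechanism that lets a process keep logging into a rotated-away logfile: the descriptor outlives the name.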
> Never had any serious problem about it during 15 years of desktop linux use as a developer machine.
You're not the typical customer of Windows.
> Grandma would not have more problems than unplugging the pendrive where the file was opened from, and trying to save it, for example
Actually she would, because in that case writing to the same file handle would error, not happily write into the ether.
Also, you have one tech-savvy grandma. I don't think mine even knows what a "pendrive" is (though she's seen one), let alone try to open a file on one, let alone try to save her files on it, let alone use pen drives on any regular basis.
> You just made a strawman you are sticking to.
The only strawman I see here is your grandma using pen drives to save files.
What I'm pointing at are real issues for some people or in some situations. Some of them you might be able to solve differently at a higher investment/cost, or with hacks. Some of them (like the UX issue) are just trade-offs that don't automatically make sense for every other user just because they make sense for you. Right now Windows supports some things Linux doesn't, and vice versa. Could they be doing something better? Perhaps with more effort they could both support a common superset of what they support today, but not without costs.