Some people run complex simulations that take days or weeks, and a forced reboot in the middle ruins that progress. If someone needs to skip an update on their own machine and their own network, that is their right.
I have no idea why this scenario gets talked about so little. It is an edge case, but it's a huge one. There are entire ecosystems of Windows-only software where running batch operations overnight is a daily fact of life.
I'm nearly resigned to the fact that I may have to segregate engineering workstations to their own network and whitelist their traffic.
So what about the small minority? Do they get a "sorry not sorry, f*ck you" card?
You should assume that Windows users doing batch work overnight are people who have better things to do with their computers during the day, or who want their computer to work for them while they're not sitting in front of it.
Try considering the whole instead of splitting to only address the large majority.
> If you want to run batch operations, use the right product.
Sorry, but this does come across as a bit asinine.
What's the right product for running software written for Microsoft Windows, if not Microsoft Windows?
"Users running their applications" isn't just Grandma on Facebook, or your little sister playing Candy Crush Saga. Microsoft's unquestionable strength has been its support for enterprise, small-business and professional desktop users. There is a lot of deserved anger thrown their way as a result of deliberate decisions to force flexibility- and choice-limiting behavior down the throat of their core market.
My shop is small, but I can count COMSOL, SolidWorks, OrCAD, LabVIEW, various SPICE frontends, and a dozen other small odds and ends for instrument control and data capture without even leaving the engineering suite. All of which are going to be forever stuck on Win 7, at this rate.
Most of those are products that are not expected to run overnight in batch mode.
You may need things to run overnight with engineering, simulation, or rendering products that involve some sort of long-running process. But those normally have a server component that you install on Windows Server, and the client offloads work to it.
> Most of those are products that are not expected run over night in batching mode.
I'll be sure to let them know that long-running computations deskside are now only an experimental feature /s
In all seriousness though, we've managed for over a decade without the need for dedicated compute nodes and the costs they would entail. The only case I can make for them right now is that Windows now ships with unpredictable uptime.
The good part of this is that I prefer using Linux for these things anyway, and the instability of Windows has forced both our government clients and our office to put half our effort into supporting Linux correctly.
Good luck if you're in a branch which has Windows-only software, such as 2D/3D graphics. Blender is definitely offering some good competition, but as I hear it from CG artists, it's still not quite there yet for most projects. If your pipeline is dependent on Adobe software such as Photoshop, Illustrator or After Effects, then you are also stuck with Windows. Rendering (2D or 3D) jobs can often take days if not weeks to finish.
I agree with the person above your comment. For example, it is common to run some simulation loads on deskside systems.
However this IS definitely an edge case and it is possible to turn it off. I have run a physical host as a build agent that has uptime of months on Windows 10.
This is administrative competence, not a problem with the product or the use case.
WSUS + Windows server is definitely fine for most workloads though.
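For a standalone machine without WSUS, the usual lever is the "Configure Automatic Updates" group policy, which can also be set directly in the registry. The fragment below uses the documented policy values (AUOptions=2 means "notify before download"; NoAutoRebootWithLoggedOnUsers blocks automatic restarts while someone is signed in). How faithfully various Windows 10 editions and releases honor these has varied, so treat it as a sketch rather than a guarantee:

```
Windows Registry Editor Version 5.00

; Policy keys read by the Windows Update client, per the
; "Configure Automatic Updates" Group Policy documentation.
; AUOptions=2 -> notify before downloading updates.
; NoAutoRebootWithLoggedOnUsers=1 -> no automatic restart
; while a user is logged on.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
"AUOptions"=dword:00000002
"NoAutoRebootWithLoggedOnUsers"=dword:00000001
```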
I have this dreadful feeling that Microsoft is A/B testing this stuff, making sure that there is just enough inconsistency between sites to sow doubt and confusion.
Our company has a DC, but not that kind of control over client computers, and it would be a large overhaul (and out of my control) to change the whole network setup.
What is your point? That Microsoft should be better at security in the first place? That Sony should refrain from playing stupid games to avoid winning stupid prizes? That computer vulnerabilities are a contagious disease?
Man, you're in for an amazing time when you learn about that stupidity called the Internet of Things.
My point is that by connecting your machine to the internet it ceases to be solely a question of "your machine" and "your rights". There are other systems on the network too and you have a certain amount of responsibility to be a good citizen.
(and yes, I am well aware of, and utterly terrified/upset/bamboozled by, the Internet of Things)
I'm not saying never update, I'm saying let me update (manually) when I am not in the middle of using my computer for work (which may be a week or two at a time). The OS should get out of the way, not ruin what I'm working on.
Still, I see the software you're using as much a part of the problem as Windows here. Needing weeks of solid uptime to complete something is a challenge, and it doesn't seem like a realistic assumption for a developer to make about software running on Windows.
Why can't it remember progress and resume after a restart? That would not only make the impact of Windows Updates much smaller, but also power outages and the like.
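A checkpoint-and-resume loop is simple to sketch. This is a minimal illustration, not how any particular simulation package does it: the file name, the JSON format, and the squaring stand-in for the real computation are all made up for the example.

```python
import json
import os

# Hypothetical checkpoint file name for this example.
CHECKPOINT = "progress.json"

def run_batch(items, checkpoint=CHECKPOINT):
    """Process items one at a time, saving progress after each step,
    so a restart (update reboot, power loss) resumes where it left off."""
    start = 0
    results = []
    # Resume from the last saved state if a checkpoint exists.
    if os.path.exists(checkpoint):
        with open(checkpoint) as f:
            state = json.load(f)
        start = state["next_index"]
        results = state["results"]

    for i in range(start, len(items)):
        results.append(items[i] * items[i])  # stand-in for the real work
        # Write the checkpoint atomically: temp file, then rename,
        # so a crash mid-write never leaves a corrupt checkpoint.
        tmp = checkpoint + ".tmp"
        with open(tmp, "w") as f:
            json.dump({"next_index": i + 1, "results": results}, f)
        os.replace(tmp, checkpoint)

    return results
```

The atomic temp-file-plus-rename step matters: if the machine reboots while the checkpoint is being written, the previous checkpoint survives intact and at most one step of work is redone.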