
I will never understand why a company/government entity pays people a high salary and then gives them crap or low-end hardware. You should be maxing out dev machines, and the rest of the company should rarely see a spinner or have to wait on their tech. It's a huge waste of time that you are paying for. I once calculated how much time I wasted waiting on slow processes at a job and then multiplied it by my effective hourly rate... I could have bought multiple maxed-out MBPs with the money they lost; it's ridiculous. And it's not just the time they spend waiting: it absolutely slaughters productivity, focus, and motivation.
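A back-of-the-envelope version of that calculation, in Python (all numbers here are made up for illustration):

    # Rough yearly cost of waiting on slow machines; numbers are hypothetical.
    minutes_wasted_per_day = 30      # spinners, builds, AV scans, restarts
    effective_hourly_rate = 75       # $/hour, fully loaded
    working_days_per_year = 230

    yearly_cost = minutes_wasted_per_day / 60 * effective_hourly_rate * working_days_per_year
    print(f"${yearly_cost:,.0f} wasted per person per year")  # -> $8,625

Even with conservative inputs, that's a maxed-out MBP or two, per person, every year.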


It doesn't even have to be low-end hardware. I just gave back a laptop to a client I was working with in healthcare. It's a 16" MBP with an i7 and 32GB of RAM. The machine itself is really fast but gets bogged down by all the other crap it has to connect to. They had McAfee endpoint security on these things, and it would prevent any unauthorized apps from running as well as hog between 50-70% of the CPU during its daily "checks" (which ran every hour). I eventually had to patch a kernel extension (kext) just to get the endpoint agent to shut up and stop eating my memory.

The problem isn't necessarily the hardware (well, it is here); it's also the bundled software. Not only are these machines old, but they have to load all this extra shit that slows down the whole experience that much more.


Don't get me started on MDM and the like. It's a cancer for machines and causes all sorts of issues. I understand that some industries have compliance requirements that either force them to do this (or make them think they need to), but I will not work in those industries. Send me a clean MBP and trust me to do my job. In fact, more and more I'd rather just take a stipend and buy my own hardware; I've managed to effectively do this at my last 2 companies. Sorry, but I know my needs better than anyone else, and I'd honestly rather pay out of pocket and make it up on my salary than work on underpowered hardware.


MDM itself, on Macs, is pretty lightweight. The vendor used for MDM might install unneeded stuff, but the protocol isn't heavy.

On Macs, MDM is also needed for a bunch of enterprise settings, like pre-authorizing apps to do stuff or remote-locking 'missing' machines.


MDM?


Mobile device management: software that runs on your computer and talks back to a command-and-control-type server, where your IT dept can monitor the machine and push software/updates to it.


God, my work machine is like this. People (rightfully) complain about power use by crypto, but I wonder how much power is wasted by McAfee et al.? I wish there were a public outcry over it :)


For years, techies made fun of people who didn't run anti-virus software. Eventually, people came to accept that it was necessary. Then the anti-virus vendors launched a trove of crapware based on enterprise needs that ruined the concept of anti-virus. Now techies are advocating not running that crapware at all. There has to be an MBA course on this "how to build a bloated program designed by managers using all the buzzwords" type of thing.


Plot twist: McAfee is actually a crypto miner. They're secretly the most valuable company in the world.


It's astounding the amount of CPU that things use while doing nothing. Zoom takes a constant 4% CPU to show a login screen, and Slack uses 20% doing nothing. Turning off GIFs in Slack drops that back down to ~5%. But still: 5% for a chat app at idle.
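If you want to reproduce that kind of measurement yourself, here's a rough sketch using the third-party psutil package (the process names are examples and vary by platform):

    import time
    import psutil  # third-party: pip install psutil

    WATCH = {"zoom.us", "slack"}  # example process names; check yours with `ps`

    procs = [p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() in WATCH]
    for p in procs:
        p.cpu_percent(None)   # prime each per-process counter
    time.sleep(5)             # measurement window while the apps sit idle
    for p in procs:
        print(f"{p.info['name']}: {p.cpu_percent(None):.1f}% CPU at idle")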


The thing I wonder is why Apple doesn't fix this.

To be clear, it's not Apple's fault, but they do own the OS and the scheduler.

I've previously experimented with the kill command, stopping and resuming applications. Apple could do this automatically and reduce that 5% to 1%, or even 0.1%.
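For anyone who wants to try it, the stop/resume trick is just SIGSTOP/SIGCONT; a minimal sketch (the app name is only an example):

    import os
    import signal
    import subprocess

    def pids_of(name: str) -> list[int]:
        # pgrep -x matches the process name exactly; one PID per line
        out = subprocess.run(["pgrep", "-x", name], capture_output=True, text=True)
        return [int(pid) for pid in out.stdout.split()]

    def freeze(name: str) -> None:
        for pid in pids_of(name):
            os.kill(pid, signal.SIGSTOP)  # suspended: 0% CPU, memory stays resident

    def thaw(name: str) -> None:
        for pid in pids_of(name):
            os.kill(pid, signal.SIGCONT)  # resumes right where it left off

    # e.g. freeze("Slack") while you focus, thaw("Slack") to catch up on messages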

This assumes that Slack is doing something over and over many times, rather than just being super slow at doing the thing once.

Slack is a chat app, and as you say has no business requiring 5% of a CPU. If it’s doing that because it checks for new messages more often than once per second, Apple can help them with that. If they actually require more than a second to check for messages, then their requirements or their developers need to change.


Oh, it's not low-end hardware, it's endless enterprise crapware designed to secure stuff. Enterprise malware proliferates like admonishing signs in a government office, and for the same reason: it's easy. And it has the same effect: useless clutter that makes the place worse.


At my current position I've been using the same 2015 MBP since... 2015. As time goes on, corporate keeps deciding more security issues exist, and they keep shoveling on the 'security' software. So not only is my hardware aging; they're making it worse by artificially slowing it down.

The last agent they installed had a nasty habit of pegging the CPU at 100% and locking up virtual machines for 10-15s at a time whenever there was heavy disk activity. Luckily I still have root on the local machine, so I wrote a quick script to loop and kill that process whenever it spawned. I'm not bragging about that, but it is what it is, I guess.
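The script was basically a watchdog loop; something like this sketch (the agent's process name here is hypothetical, and killing another user's processes needs root):

    #!/usr/bin/env python3
    import subprocess
    import time

    AGENT = "SecurityAgentDaemon"  # hypothetical name of the offending process

    while True:
        # pkill -x matches the exact name; exits nonzero if nothing matched
        subprocess.run(["pkill", "-9", "-x", AGENT],
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        time.sleep(5)  # re-check every few seconds in case it respawns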


One big problem is misaligned incentives. At most big orgs, you have an infosec dept with the authority to put all kinds of virus scanning etc. on every computer in the org. They get rewarded if there are no incidents; they do not get penalized if they make your machine impossible to use.


The main priority of an IT department is security.

The most secure system is one that isn't being used.

Therefore, the main priority of an IT department is to make the systems as difficult or unpleasant to use as possible.


The same goes for the guy in the purchasing department who gets rewarded for his cost-saving measures when he goes for Acer's cheapest laptop with a Pentium and 8GB of RAM rather than something with a mid-range CPU and 16GB of RAM.


> they do not get penalized if they make your machine impossible to use.

An "impossible to use" machine won't have any incidents reported against it


Exactly.


Well-designed security programs shouldn't slow down your PC. This is a failure of the infosec department. Sysmon/Windows logging can do most of the hard work with very few resources.


I worked at a company where all software was designed around a 1024x768 resolution. They decided to upgrade from CRTs to LCDs. I strongly suggested they buy 17" LCDs with that same native resolution; instead, they saved a few hundred dollars per person by choosing smaller 15" 800x600 LCDs. Users either had to deal with the headache-inducing blurriness of a non-native resolution or spend half their days scrolling their screen up and down and left to right. These people were used to keyboard-only interactions and now had to keep one hand on the mouse. It was maddening. Half of their staff, ~50 people, quit over just that one thing.


Hardware is not the problem. I had a truly insane-spec laptop (insane for what I was doing) and still frequently lost productivity to waiting. It's the AV, endpoint, HIPS/HIDS, etc. software they throw on, along with the constant updates and restarts, that really causes the issue.


I've heard an alternative, proposed often at Google: give devs the weakest systems possible: a slow, flaky net and slow machines with minimal memory, so they are forced to make efficient systems for people in third-world countries (key to rapid growth for many products).

Personally I don't care much about my laptop except that it can drive a large external display and has fast networking. I only use it to connect to a VM I make in the cloud, which is multiples faster and more powerful than my MacBook Pro.


And let's tie one hand behind their backs and cover one eye while we're at it. I think the idea of giving devs trash hardware because "it's what our users use" is the worst possible solution to that problem. Have the devs or another team test on lower-end hardware before pushing a version; don't hobble them while they are creating.


Here's the argument that Brad Fitzpatrick (who espoused this idea) gave: at the time, he was creating LiveJournal and had a bunch of Russian hackers building the site. He noticed that every time they made pages load faster or work better, more people used the site (and site growth is the single most important metric), and that more and more people were coming from countries with poor internet infra.

So he made his workers use crappy computers and a dodgy network he set up. He claimed it made them build better applications, but I can easily say that if the job hadn't been critically important to me, I'd have given my opinion and moved on.

I build software for Aristocrats and I expect my tools to be first class. That said, I've reached a level of trust where my leadership trusts me to manage millions of dollars worth of cloud inventory, and much of my messaging to my users is: "please do not attempt to save $400 by using $1000 of your time".


A better approach is to simply buy your devs 2 computers:

1 fast PC for speedy development and 1 slow laptop for testing. It makes no sense to give developers a slower computer simply to make them develop better. It's like saying that post office workers must now use bicycles instead of motor vehicles in order to encourage them to find shortcuts. I'm not a successful CEO, though, so maybe I'm just talking out my ass.


Yes, as the ChromeOS person mentioned upthread, you have a dev infra that is fat and a user-testing environment that represents the user experience. Devs don't need to hobble their machines. I argued with Brad about this many times...


Well, that would help with making technology more accessible.

Developers, especially highly paid ones, are as selfish as anyone else on the planet. If you have a strategy to make them care about literally a single other person, I suggest you go for it.


A little voice in the back of my head knew accessibility might be brought up with that comment. Accessibility falls into the same category for me as users with slow machines: yes, we should be doing it, and the right way to do it isn't to hobble developers. Lack of accessibility (and low-end testing) is often due to management refusing to allow the extra time, not because developers are just selfish. I can't tell you how many times I've been promised more time or more devices to test on, and then we get close to a deadline, no one cares anymore, and questions about said promises are met with "just ship it, we can do that later" or similar.


According to your argument, I should be blindfolded and use a screen reader while I'm developing. That's just absurd.


My argument was that it is a rational business decision to sometimes constrain your developers by making them use what the users will, as the upper comment said people at Google often propose. If you do not, then they get utterly out of touch. I didn't say you have to do it all the time.


Making software and using software are different


In ChromeOS we have slow devices, and we had a slow test network in the office, but they're test machines, not primary development laptops. Most of us develop on a Pixelbook or MBP SSHed into a beefy desktop or cloud VM.

Testing on slow devices is important but using a slow device as primary dev machine would just waste too much time.


No need for that, just test everything in a throttled VM. They are just too lazy to optimize.


TFA seems to be from the perspective of said users, most of whom are not devs.


> maxing out dev machines

This is how we get software catastrophes like Teams. There is no earthly reason for it to suffer from the bugs and performance issues that it does, and I have to believe the developers responsible for it are completely unaware of how much this software sucks to use, because of their hot-rod developer workstations. They still dogfood at Microsoft, don't they?


Failing to test on lower-end hardware or refusing to optimize is not the result of giving devs good computers; it's management that doesn't care. Doesn't care to test on lower-end hardware, doesn't care about how much RAM/CPU it uses, doesn't care how laggy it is. Even if a developer wanted to do this work, good luck getting management to sign off on it; I've seen this happen more times than I can count.


> They still dogfood at Microsoft, don't they?

Ha ha ha, no, probably not. A relatively recent post from an alleged ex-Microsoft employee said that all the designers use Macs, which might be the reason Windows keeps getting worse.


I use a Mac for work in a corp environment and don't experience any of these insane problems.


Part of this is that IT is viewed as a cost center (well, because it is), and often "owns" the expense of issuing laptops. IT's objective is to ship you a laptop that will adequately run the software you need to perform your job duties. How efficiently you perform them becomes a problem for you and your manager.


I understand the basic reasons why, but anyone who actually examines this has to come to the conclusion that issuing sub-par tech is penny-wise and pound-foolish. Heck, for developer satisfaction alone I think it's a good idea to upgrade machines yearly, or every 2 years at most. Cycle those machines through your org or resell them. The depreciation lost in resale is tiny compared to the increased efficiency and how happy your devs are.


In this case we aren't even talking about developers who are struggling with slow computers: even generic office workers struggle with slow machines, and it impacts their productivity.

I'm not defending the practice, and I am firmly in the "high-performance orgs require people to have the best tools you can reasonably buy" camp.


There's nowhere in the ledgers to record employee happiness.


A company's job is to provide a service or sell a product. The idea that doing that is a "cost" that needs to be reduced is ass-backwards.

Bean counters will, by contrast, rarely cut their own budgets because they control the budget.


> well, because it is

Is it? Without IT can a business even run?

> and often "owns" the expense of issuing laptops.

That's just accounting done wrong. The correct way to do accounting around internal IT is to have IT bill projects/departments for their use.


You're right but sadly neither of us really shape business culture.


Sometimes companies spend extra to slow their employees down. We hired a dev team to develop a custom dev environment that takes twenty minutes to stand up. Our old off-the-shelf tech took about thirty seconds but lacked all the buzzwords of today.

VMs and plain Docker containers are for suckers, apparently. We went from one not-like-prod environment to another, but at least this one costs us a few million in salary per year to create. Someone is getting a promotion, right?


I agree with you (and that's basically what we do), but there's a valid point of view that if you're developing on machines 4-8x faster than your users', you might not notice and fix small performance issues, because your computer takes them under the threshold of perception. (That doesn't mean your devs should suffer all the time on their daily driver, but it is a concern that should have some process to address it.)


Haha, I knew that line about "maxing out dev machines" might get a response about machines being faster than users'. I'm fully in support of testing software on lower-end hardware or in VMs that are scaled to what your users might be using. That said (as you also pointed out), forcing devs to suffer because "this is what our users use" is just silly.


I had a company at one point come out with a detailed justification of who gets what. Only the data analysts were allowed to get two monitors; this was actually spelled out in the policy. None of the standard laptops had high-end GPUs, even those assigned to people doing CUDA ML research - they had to rely on client machines, which meant remote work was cut off at the knees.


They don't buy a laptop but a seat for some number of years. So they're guaranteed X laptops for Y years.

The selling company is obviously going to min/max this contract as much as they can. You could order a bunch of laptops if you had the authority, I guess, but they will not be making it onto the network, since that is under contract too (probably with the same company).


I understand that's how they are used to operating; I just think it's stupid. I'm sure some beancounter likes everything nice, neat, and predictable, but that doesn't mean it's a good idea. If your org has differing compute needs that are all subsets/supersets of each other, then cycle machines through your workforce (buy new and rotate hardware down to those who need less compute). If that's not possible, or the numbers of people in each group are wildly different, then suck it up as a cost of doing business and get the beancounters focused on how to write off as much of the machine as they can with depreciation.


It's because the people making the decisions about what computers everyone else should get buy themselves the best computer possible and don't feel the pain. Surely X company can't afford to get everyone the best MacBook possible. That would be like a few thousand more per device per employee!!!


Often the hardware procurement, maintenance and network/account operation are outsourced to an IT contractor. They charge top dollar and provide as little as possible. They are often big multi-national firms.

Companies have been doing this for so long they’ve lost all knowledge of how modern IT is supposed to work.


The question is why those companies/government entities are the most stable and profitable, far more than the startup that gives 32GB RAM workstations to fresh graduates. I'm not justifying those inhumane working environments, but they seem to just work for shareholders.


Basically, big orgs have more to lose than gain because they are so big. Crippling employees' machines when you have 15 employees, to reduce a risk that has a 5% chance of occurring per employee, is a bad tradeoff. Whereas if you have 1500 employees and the same 5% per-employee risk, crippling the machines prevents an expected 75 incidents. Also, you're only really inconveniencing a certain portion of people; the secretaries, HR, and marketing departments that just use their computers for email and Facebook don't notice a problem.
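The expected-value arithmetic, assuming (as above) that the 5% risk is per employee:

    risk_per_employee = 0.05  # assumed chance of an incident per employee

    for employees in (15, 1500):
        expected = employees * risk_per_employee
        print(f"{employees} employees -> {expected:g} expected incidents prevented")
    # 15 -> 0.75 (hardly worth crippling everyone); 1500 -> 75

At small scale the mitigation costs more than the risk; at large scale the math flips.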


In places such as the Air Force, I always assume that kickbacks and other corruption are part of the culture. I won't be surprised if they pay higher $$ for shittier equipment.



