Why do people still expect modern software to run at incredible speeds on hard disk drives?
Run old technology with old software. It is ridiculous that consumer hardware is still being sold with crappy 5400 RPM disks in 2021. No, I have no interest in optimising my software for speeds that are 1/10th of my broadband.
Unless we're talking archival or huge storage necessities, stop complaining about a modern OS or games running slowly on a technology that hasn't realistically been updated in 20 years.
Do you expect Windows 10 to run on a Pentium II with 512MB RAM too, just because some version of Linux does?
> Why do people still expect modern software to run at incredible speeds on hard disk drives?
Modern software usually doesn't run at incredible speeds even on an NVMe drive with top-of-the-line hardware. Modern software is often just slow. What can I do on a modern operating system today that I couldn't do on a predecessor released 20 years ago? What justifies Windows 10 feeling slower to use than Windows XP, while running on hardware that is tens of thousands of times faster?
Under this lens, what really defines modern software is slowness. Take away the advantage of orders of magnitude of hardware improvements and you would be left with something unusably slow.
Exactly. I am of the opinion that anything that was possible in computing 15 years ago should be perceptibly instant today. There are obvious exceptions, such as cross-continent communication being limited by the speed of light, but that should be the general rule. Instead, we get software that's written as if Dennard scaling were still occurring.
When I was a child, I had a reverent wonder for technology and all it might do in the future. Realisations like this replaced that with a dry cynicism. My inner child is disappointed.
When I was a child, I was thinking how amazing it would be if I could store all my games' floppy disks on a single, network-accessible drive so that I could access them all without installing and swapping them all the time.
Nowadays I have a NAS that stores both the raw media and installation scripts for those games, and I have wine/dosbox/mame/mess startup scripts for many of them. Can't say that my inner child is disappointed about that, but building and curating that system took a lot more effort than I'd like to admit.
I don't buy games that require online activation, because I can't store and play them the same way. My inner child is very disappointed about that.
My desktop built in 2020 is perceptibly instant today. The slowest-starting application is VS Code, which takes 2 whole seconds to start cold.
None of the figures shown in the article have any relevance if one has even a slight knowledge of one's operating system and isn't running on decades-old technology.
I dunno, I think I disagree. I have much higher expectations of software. I have these expectations simply because most of what we do these days we could do decades ago, and using orders of magnitude fewer CPU cycles than the "state of the art" now (e.g. your example of VS Code). Most software falls far, far short of these expectations, which, IMO, is an ongoing embarrassment for the entire field.
Even your example of 2 whole seconds seems wayyy too slow to me. I don't even tolerate that sort of delay when working on my 10+ year old commodity hardware --- I've already tweaked my tools to be totally instantaneous so I can continue typing as soon as I release the shortcut key.
Am I the only person who has a laptop that boots as soon as I push the power button, and also the only person who remembers when doing an OS update was an entire day's work that involved babysitting the whole process to swap out disks and make sure nothing broke? My wife recently threw her hands up and said “Five minutes?” when I told her how long her over-the-air software update would take on her phone. There are plenty of things that could be better about new systems, and I would love nothing more than to see more optimization of our software and hardware, but let’s not look too long through those rose-colored glasses. The past was painful too. Maybe we’re stuck playing whack-a-mole, but maybe things will get better.
I haven't had hardware with a spinning disk in active use in years, so my experience might be coloured by the improved performance from SSDs and may thus be irrelevant to whether things are too slow with a HDD, but that's how I also see it.
Even if you don't go so far back as to require swapping disks, dist-upgrading a desktop Ubuntu or Debian system still took several hours ~10 years ago. Booting the system (again Linux, because that's what I had) took at least a minute, probably more. My laptop that's not bleeding edge by any standard is not booting instantly to the desktop, but it's definitely faster than that.
It is in some sense eye-opening to think that pure desktop latency might not have improved over the years. It makes you wonder if things could be much faster still, and being limited to old enough hardware might also change one's perspective. But to me, having a reasonably snappy desktop experience on a laptop from ~7 years ago without having to resort to lightweight desktop environments or older software is something that would have been outlandish for most of the history of PCs.
Being blunt: you're a terrible developer if you think like that.
What the heck does it matter if 5400 RPM disks are old technology or not?? Do you have any actual reason that your program runs slowly on such hardware?
It's very arrogant to assume that since you can afford to upgrade to the latest technologies, then everyone who can't should be left behind. There are probably billions of people using spinning rust in their machines. If your software does not run with acceptable performance then that's a problem with your software, not their hardware.
The issue is that the speed -characteristics- are so very different between HDDs and SSDs, not just the raw speed. Decent performance of anything with IO on an HDD requires that the IO is specifically structured to minimize seeks and maximize sequential access. That takes time and skill. Since very few modern machines lack an SSD, it is understandable to simply skip this difficulty. That time and effort is often better spent elsewhere, and this is a valid tradeoff to make.
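To make the "structured to minimize seeks" point concrete, here's a minimal sketch (in Python, with made-up block sizes and counts) of the classic trick: collect a batch of reads and service them in sorted offset order, so a spinning disk's head makes one sweep across the platter instead of bouncing back and forth:

```python
import os
import random
import tempfile

BLOCK = 4096
NBLOCKS = 256

# Create a scratch file of NBLOCKS blocks, each filled with its index byte.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    for i in range(NBLOCKS):
        f.write(bytes([i % 256]) * BLOCK)

# A batch of block indices requested in arbitrary (seek-heavy) order.
wanted = random.sample(range(NBLOCKS), 64)

# HDD-friendly strategy: service the batch in sorted offset order, so the
# head sweeps the platter once (an "elevator" pass) instead of seeking
# randomly for every request.
results = {}
with open(path, "rb") as f:
    for idx in sorted(wanted):
        f.seek(idx * BLOCK)
        results[idx] = f.read(BLOCK)

# Hand the data back in the originally requested order.
data = [results[idx] for idx in wanted]
assert all(d == bytes([i % 256]) * BLOCK for i, d in zip(wanted, data))
os.remove(path)
```

On an SSD both orders perform about the same, which is exactly why developers who only ever test on SSDs never feel the pressure to do this.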
That depends. Many developers are now using Electron, .NET and other tech to develop software. Those are heavy and do not like old computers with limited resources.
My desktop products are made in Delphi (compiles to native code) and are very responsive and frugal on resources. Run fine on very old hardware. Desktop products done in C++ would be even better but I am not masochistic enough to develop GUI in C++. At least for now.
Is performance really comparable between .NET and Electron now? Neither is my specialty, but anecdotally I see far more criticism of Electron for its slowness.
In theory .NET should be "better". In practice this really depends on way too many factors including how experienced were the programmers who implemented the software. In any way I am not an expert in this area.
I do use VS code as for some tasks there is no viable alternative for me and since my development workstation is a monster it runs fine. My own desktop apps however are native and I test them on really shitty hardware where one would not dream of running VS Code / Slack / whatever.
For fun I've compiled my main desktop product to 32 bit and it runs on a friggin old netbook just fine. Also from my point of view speed of development is not any slower when developing native applications. So unless client specifically requires it or I am doing some front end in browser my desktop apps are all native and coded in Delphi/Lazarus and my backends are C++.
I doubt you'll see any performance difference between Delphi and C++ (even with the most optimizing C++ compilers) outside of heavy number crunching anyway.
For whatever reason, most of my software (desktop and servers) ends up having this "heavy crunching" part in it. I usually do not create products that only serve as a simple conduit to a database.
Neither do i (in fact i have used Delphi and Free Pascal for decades now and i never touched a database :-P) - most of my stuff is graphics and geometry related, and yet i haven't seen much of a difference between my C or C++ code and my Free Pascal code (which AFAIK has worse optimizations than modern LLVM-based Delphi). Of course i refer to optimizations of the generated code; algorithmic optimizations are another topic.
The only time i saw a difference was when i made a raytracer benchmark[0] explicitly to benchmark the codegen, where the Free Pascal codegen at its best was at 177% of the speed of Clang 8 at its best. In my experience this is not a realistic metric though (according to the benchmark my C 3D engine should be five times slower if compiled in BC++ than Clang, but in practice, since it does a variety of things rather than just a single thing, the performance difference is barely perceptible).
Can you explain why you expect modern software to run on old hardware first?
Should I be surprised that my turn of the millennium PC can't even run Slack? Yes, in absolute terms it's a bit ridiculous, but we're not talking philosophy here, Moore's law is a thing and software has been getting more complex as hardware got faster. Deal with it.
In a lot of cases new software isn't delivering utility over old software, yet consumes more resources. Users justifiably feel miffed that they're expected to buy new hardware merely to keep up.
There are Slack clients out there which use a hundredth of the resources of the official one, and they can run on these old machines too. The problem isn't resource-intensive features that weren't possible before; it's delivering a service we've had for decades with far worse resource usage, just because you can.
Maybe they're referring to Ripcord. I haven't used it for Slack though, and I don't know what Slack's position is on third-party clients - using them with Discord is liable to get you banned.
> Can you explain why you expect modern software to run on old hardware first?
People are expected to run modern software for reasons of security, to retain interoperability with the latest standards, or to reduce support costs for the vendor. It is not very surprising that people don't want to upgrade their hardware for reasons that are outside of their control.
"Deal with it"? How is that an argument? Things are getting worse, and that should not be a problem then? If software is getting more complex, what features is it bringing or adding? I mean, surely it's not more complex without a good reason.
If you quote Moore's law, let me quote Wirth's law: "software is getting slower more rapidly than hardware becomes faster."
As a C & C++ dev that has always cared that my software works on machines and networks that are a fraction of what I use for testing, I get so tired of careless programs that have clearly been designed for the latest beefiest machines that the devs had at that time.
Same for the web. It should be mandatory by law to verify websites against the baseline of a 2G connection. (just... half kidding)
It is not my place to justify why modern software is so inefficient, pointing the finger to Windows 10 in particular as if it's an outlier.
Why are YOU using inefficient languages, heavyweight virtual machines, GB-sized binaries, Electron and Javascript to write your applications? Why is your boss asking you to add feature upon feature upon feature? There is a reason we're in this place, but regardless, my point is: stop complaining that everything is slow on your old PC. You know exactly why that is, and it is your fault as much as mine.
It's because of us, software engineers, that everything is slow, so it is completely hypocritical that on HN of all places one has to explain that sadly to run modern software one needs modern hardware.
We've put ourselves in this situation and now there's a lot of surprised Pikachu faces around, complaining about Windows getting slower and alt-tabbing to their day job writing yet another crappy Javascript abstraction layer on their shiny M1 MBP.
It's a bit more complex than that. VMs are a necessity because we need to run the same code on a Mac, a Windows and a Linux machine. Since these three OSes don't want to agree on a common standard, we are left with applications that ship their own JRE etc. In some cases this isn't terrible, as a runtime environment can take up just under 100MB and performance isn't that bad.
Where do people even pick this sort of stuff up to take with them into industry? Are they teaching Electron development now in undergrad CSE programs instead of C?
Doesn't compiling C++ programs still need more resources than running AAA games, with builds going for 20min / 1h / 2h / 4h?
What were they teaching those compiler engineers and language designers back in the day? Nothing about writing performant code, performance tests, regressions and yada yada?
This is actually true, LLVM had no performance regression infrastructure until recently, even though it was marketed as the fastest compiler.
C++'s main speed issue is of course that it's the most text-based language ever, and many entire libraries are implemented as headers, largely because templates have to live there.
I am using an AMD Turion with 2GB of RAM with Void Linux and XFCE.
I don't care about software "engineers" if you are a bunch of engineer wannabes; in my country, in order to be an engineer you should be able to write your own OS from scratch and learn lots of linear algebra.
That argument would be more convincing if Microsoft hadn't systematically pushed the owners of older PCs to update them to run Windows 10. How many millions of PCs dating from the early to mid 2010s are now stuck with the results despite their hardware still working normally?
The same users who tried to keep Windows XP forever. Microsoft got burned once after maintaining one version well beyond its due date, and decided to go to the other extreme.
In my experience, it wasn't that people wanted to keep XP forever, it was that Vista was worse. When 7 arrived, the number of XP holdouts fell quickly.
The trouble with 7 in that situation is that either 8.1 or 10 needed to be the next good version, but they weren't. Do you know anyone who actually wants their system to change its UI every six months? Or to install updates whether they agree with them or not, even though those updates can have serious consequences if they go wrong and cause inconvenience even if they work? Or who likes having their own computer phoning home without their consent? Or having ads inserted into their daily user experience?
If Microsoft was producing an operating system people actually wanted, they wouldn't need to push dodgy updates to get people to use it.
> Why do people still expect modern software to run at incredible speeds on hard disk drives?
> Run old technology with old software. It is ridiculous that consumer hardware is still being sold with crappy 5400 RPM disks in 2021.
You pretty much answered your own question. Whether you like it or not, people are still using hard drives because hard drives are still being sold. Just because those people prioritize capacity over performance, for a given price point, doesn't mean they want performance to be ignored altogether.
My 486 ran totally fine on 1.44MB floppy disks as well, what is your point?
As I mentioned downthread, you can get 1TB Samsung SSDs right now for $80. Not the best one, but still faster than a hard disk. NVMe drives are a little more expensive than that, and even faster. The price is not really a valid excuse.
Yes, my 2TB NVMe is $300, and does sequential read/writes of 3,000 MB/s, roughly 20x the sequential throughput of an HDD (and orders of magnitude faster at random access). It's not space age technology, solid state storage has been consumer technology for 2 decades already.
>> As I mentioned down thread, you can get 1TB Samsung SSDs right now for $80.
You ever think about the college kid who can't afford that price? What about the family of five living on food stamps who can't just pony up the money for better hardware? What about large swaths of the population who are on fixed incomes?
The way software companies are going and your general attitude is, "Eh, this is old technology, anybody should be able to afford it, what's the big deal?"
Totally tone-deaf to the poor, people living on the edge, and others living on a fixed income. My mother-in-law is pushing 80, and I had to build her a new PC since she couldn't afford to purchase a new desktop to do her taxes on. Our state now requires you to file electronically, and, shockingly, her tax software no longer runs on her 8-year-old desktop.
So instead of helping her learn one of the many (free) web-based tax applications, you built her a new computer to run a new (paid) version of the tax software she was already using? How does that even make sense?
She tried several of the "Free" web based tax apps, and always had problems with them. And yeah, I built her a PC that could run a current version of the software she paid for.
It makes sense when you want to help someone to use the software they already paid for, instead of pointing an elderly person to the web and saying, "See? They have FREE versions, go grab one and figure it on your own!"
Sorry, but that to me is kind of a callous solution compared to what I did.
> It's not space age technology, solid state storage has been consumer technology for 2 decades already.
You're exaggerating. The first really plausible consumer SSDs appeared in around 2006 or 2007, so maybe 15 years tops. For instance, Dell offered a computer with 32 GB of SSD storage in mid-2007. This was already not much storage at the time, and the drive alone cost over $500. This was arguably not a consumer-affordable product at that price, and the laptops involved were actually the Latitude series (targeted at businesses) anyway. Apple followed this up in 2008 with a 64 GB SSD, but this was a $1000 upgrade.
Suffice it to say that a vast majority of people were only buying (and could probably only afford to buy) computers with rotational drives well into the 2010s. Many of the computers they bought are still running today. Many affordable computers that shipped with Windows 10 still had rotational drives. It's absurd to overlook a whole class of people who can't, unlike the wealthy, upgrade their MacBooks every 2 or 3 years. They probably don't have MacBooks to begin with. Parents are passing old hardware between children like hand-me-downs. That's before you even leave the United States.
I think your comment misses the point at a deeper level, which is that most people in this thread already have SSDs (I certainly do), but are disappointed that performance matters so little to many developers that they are fine with seeing software run at the same speed on today's insanely fast hardware that its predecessor did 20 years ago. Software is much more powerful than it was in the days of floppy disks, but it's not much more powerful than it was 15 years ago. We've seen a generational increase in performance, but we've wasted it. I think that's worth some disappointment.
On my 2012 mbp I have the OS and software running on the SSD in the drive bay, and a beefy hdd in the disk slot for storage. I set this up back when a 256gb SSD was pretty expensive, but maybe that sort of strategy would work for you now that SSD prices have gone up again, having a fast drive for software and the system and a big drive for storage.
I've had massive issues with Ubuntu on a HDD. Every time a big file was added to disk (like a download), the indexing system would run and render the whole system unusable for minutes (100% disk and CPU). Upgrading to an SSD solved the issue.
That depends how good Linux's support for very-low-priority background tasks is.
UNIX priorities don't handle this situation well because a low priority process is still technically immediately runnable and can cause priority inversions by clearing out all kinds of system caches that your foreground app was using.
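A quick illustration of the point above: POSIX "niceness" governs CPU scheduling only and says nothing about disk access, which is why a nice'd background job can still thrash the disk and evict caches the foreground app depends on. (On Linux the disk side is handled separately, via the I/O scheduling classes exposed by `ionice(1)` / `ioprio_set(2)`, e.g. the "idle" class.) A minimal Python sketch of the CPU-only half:

```python
import os

# os.nice() adjusts CPU scheduling priority only; an I/O-heavy background
# job at maximum niceness still competes for the disk on equal footing.
current = os.nice(0)   # read current niceness without changing it
lowered = os.nice(5)   # make this process 5 steps "nicer" (lower CPU priority)
assert lowered >= current
print("niceness went from", current, "to", lowered)
```

Unprivileged processes can always raise their niceness (lower their priority); lowering it back requires privileges, which is part of why "just nice it" is the only knob most software ever reaches for.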
I'm usually frustrated that I've got dozens of GB of unused RAM, and some horrible 32-bit app is thrashing and paging to disk because it's getting close to using 2 GB.
>Do you expect Windows 10 to run on a Pentium II with 512MB RAM as well, because some version of Linux does?
Void Linux, Debian, Slackware... would run just fine. Maybe even 720p video with a good Radeon 9800 card on the AGP bus. But a Pentium III with SSE would run much better.
More often than not I find modern software regresses in terms of features compared to "older" software. An obvious example is software that has both a Win32 version and a UWP counterpart. The UWP version is usually stripped down in features, buggy and more resource-hungry, e.g. Windows 7 Photo Viewer vs the Win 10 Photos app.
One would associate "modern" with something that is improved, better, faster. But that's not the case with the current state of software.