Hacker News

If you have moved between homes a few times, you'll notice how you seem to occupy all available space eventually. It's a good optimization strategy, I think. There's just no reason to waste time tuning software for optimal efficiency if the user's hardware is powerful enough that they'd never even notice. Yes, that means software is less efficient than it was, old devices have more trouble running new apps, and doing a bunch of things at once bogs computers down faster—but for the majority of use cases, that is not a concern.


I see your point, but…

> There’s just no reason to waste time tuning software for optimal efficiency if the user's hardware is powerful enough that they’d never even notice.

…but users do notice. They notice that their browser is slow, that 8GB of RAM isn't enough, and that every app they download is hundreds of megabytes or even multiple gigabytes. Only very rarely can apps justify being that large.

I build native apps and pick native apps over alternatives and the experience is much, much nicer.

Whether users could articulate what they're experiencing is a different question. The bad experience we're talking about has been normalised over the past 15 years or so.


> Whether users could articulate what they're experiencing is a different question. The bad experience we're talking about has been normalised over the past 15 years or so.

I could counter that by saying people have had unpleasant experiences with software before, too; software being way too technical and non-intuitive is one example of that.

But my point isn't just that users won't notice anyway; it's that optimising for efficiency isn't a good strategy anymore—at least from the perspective of commercial software vendors. Performance tuning reaches the point of diminishing returns very quickly.


Optimising for older hardware might not be a good commercial strategy, but I still get a kick out of making my software nimble and fast.

I like to joke that giving engineers fast machines is a mistake, and that having them develop on 8GB dual-core Celerons would help them optimise their software.

OTOH, it’s also a valuable strategy to give them, today, the machines that’ll be mainstream when the product is finished. That one I learned from Alan Kay and the Alto.


I built some native apps for macOS recently, and a 10MB installer with a memory footprint below 100MB even after running for months makes me wonder what would happen if all of my apps were truly native.


I built a modern equivalent of an old classic Macintosh app—Stapler—and the file size is roughly the same if you take into account the fact that my new app contains binaries for both Intel and Apple silicon architectures. https://news.ycombinator.com/item?id=41216055


macOS is the best platform for wondering that, too.

Apple's ecosystem is incredible: all the APIs they provide, all the neat little native functionality you can build into your apps, virtually all of it for free, and a user base that cares about the quality of its apps and is willing to pay for it.

There's no better platform for building a native app, in my opinion.


If we were to run with that analogy, it would be akin to saying that someone goes out to buy a new couch only to discover that they have to buy a new house in order for it to fit inside. That couch may, or may not, seat an additional person. Also, I doubt that many people would be able to occupy the space of a house that is over 1000 times larger (or a million times larger, if you bought your first house in the early 1980s).

Don't get me wrong: I recognize that there are legitimate reasons for some of the increased size of software. I am also willing to accept that some inefficiency is justified in order to improve the quality of life for developers. On the other hand, we should not be ignoring efficiency for the end user solely in favor of efficiency for the developer.


I actually just talked to a guy who bought a new house with a huge barn attached so he could park his huge motor caravan in it. So the analogy isn't too far-fetched, IMHO.

> On the other hand, we should not be ignoring efficiency for the end user solely in favor of efficiency for the developer.

I don't ignore it either. But the economic incentives are stacked differently.


I guess that's really the thing: when a 2TB SSD is $100 and 64GB of RAM another $130, there's just not much ROI in trying to make the app more efficient. Your users won't care.

I personally still value it, and I try to build super-efficient apps, but... I'm the minority. And I can understand why.


And as a percentage of resources, modern apps aren’t necessarily awful. If your IDE takes a gig of RAM, that’s a lot in absolute terms, but it’s only 1.6% of that 64GB system.

Sure, it can add up, and we shouldn’t waste resources just because we can, but I’m not going to lose sleep trying to optimize it when the rest of my system still has the other 98.4% of RAM available for use.


Except most users aren't running a developer's machine and have less than 8GB of RAM. I think there is also still an awful lot of laptop users with 1366x768 screens, as that was still the default resolution on the cheapest laptops a few years back.


Arguments in favor of poor quality are always so convoluted.


Well, my apologies. Let me try to phrase it in less complicated terms:

Modern computers are very powerful. To make cool apps, developers don't have to work extra hard to make their apps super-efficient. Developers are very expensive, so companies do not let them work on things longer than necessary if it will not earn them a lot of money. Most people buy apps for the things they let them do, not because they are very fast. This is why many companies don't make super-efficient apps anymore.


High quality doesn't mean more time. It's a mistake to think that quality is extra work that is done after the functionality is completed.

If built properly from the beginning, fewer resources will be needed, including developer time.



