You can achieve something close to that with websites as well though.
You can use service workers and caching strategies so that the user only "downloads" the boilerplate - i.e. page logic, JavaScript, HTML structure - and the current content of your website once, and afterwards only downloads new or changed content.
This is the whole idea of the offline-first model, which was originally designed to let users on mobile devices with spotty internet connections use your (web) app without too many hiccups.
Granted, this takes some knowledge and development effort, but if your goal is to save bandwidth, it's absolutely possible - and it's something more development teams should think about.
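For illustration, a minimal cache-first service worker along those lines might look like this (a sketch; the file names and cache name are hypothetical):

    // sw.js - cache the static "boilerplate" once, serve it locally afterwards
    const CACHE = 'boilerplate-v1';
    const BOILERPLATE = ['/', '/app.js', '/styles.css'];

    self.addEventListener('install', (event) => {
      // Download the shell exactly once, at install time
      event.waitUntil(
        caches.open(CACHE).then((cache) => cache.addAll(BOILERPLATE))
      );
    });

    self.addEventListener('fetch', (event) => {
      // Serve from the cache when possible; only hit the network for
      // content that isn't cached yet (e.g. new or changed articles)
      event.respondWith(
        caches.match(event.request).then(
          (cached) => cached || fetch(event.request)
        )
      );
    });

Register it from the page with navigator.serviceWorker.register('/sw.js'), and the boilerplate is transferred once per cache version rather than on every visit.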
> bandwidth costs are high for both producers and consumers.
But producers are not really paying for bandwidth - they are paying for control. At any point in time they can turn off the tap on users, who have no recourse. Unlike with local executables, there is nothing to hack.
That enables SaaS, which is the ultimate rent-seeking business model: you "own" something only as long as you remain indentured to the company. From that perspective, the company will be more than happy to spend $5 p/m on a user who pays $20 p/m.
I've been surprised by how many use cases I've come across lately that point back to XSLT.
This is one example. A website could very easily be written such that all of the XSLT, CSS, and JS are cached. The only data that would be refetched on each visit is the XML itself, and proper cache headers could even ensure that it's only fetched when the content has changed.
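For instance (file names hypothetical), the XML links its stylesheet via a processing instruction; the .xsl/.css/.js can be served with long-lived Cache-Control headers, and the XML with an ETag so an unchanged document costs only a 304 response:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- posts.xml: the only resource refetched per visit. Served with
         e.g. "Cache-Control: no-cache" plus an ETag, so the browser
         revalidates and gets a cheap 304 when nothing has changed. -->
    <?xml-stylesheet type="text/xsl" href="/posts.xsl"?>
    <posts>
      <post draft="false">
        <title>Hello, XSLT</title>
      </post>
    </posts>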
Another example: there was a Shoptalk Show episode recently about web components. It came up that one feature that would make them really useful is HTML syntax for logical flow (for loops, if/else, etc.). That's exactly what XSLT did.
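A sketch of what that looks like in XSLT 1.0 (the version browsers actually implement), transforming the XML above:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- posts.xsl: "logical flow" expressed in markup -->
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/posts">
        <html>
          <body>
            <ul>
              <!-- the "for loop": one <li> per <post> element -->
              <xsl:for-each select="post">
                <li>
                  <xsl:value-of select="title"/>
                  <!-- the "if/else": branch on the draft attribute -->
                  <xsl:choose>
                    <xsl:when test="@draft = 'true'"> (draft)</xsl:when>
                    <xsl:otherwise> (published)</xsl:otherwise>
                  </xsl:choose>
                </li>
              </xsl:for-each>
            </ul>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>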
Writing XSLT is a real pain until you really learn it; the mindset is very different from most common languages. And it hasn't been touched in probably 20 years, with at least a few features that were simply never implemented in Firefox. It sure is an interesting tech though, and it made a lot of sense when the web wasn't about web apps and advertising.
Apps, whether carefully crafted or not, are more often than not just a means to exercise more control over users - at least all those newspaper or <whatever> browsers. That is, they are an almost perfect tracking device.
If it's a genuine app which needs a dedicated interface (e.g. gaming, hiking, navigation, even Android Auto in a HUD in my car) that's more or less unavoidable. But all these "readers" should stay in classic web browsers.
(Or aspirational friend, or girl outta your league, or whatever we want to call it :)
But, seriously, by bringing it up you have very aptly, I think, put your finger right on one of the main underlying issues here. Craft. Excellence. Standards.
With apps you download the executable once, and it updates infrequently, saving bandwidth. Carefully crafted apps produce smaller binaries as well.