> For every extra cycle a hardware engineer can squeeze out of silicon, you will find a programmer adding one hundred cycles to their program[6]. In other words, I fear that once devs figure out how powerful the M1 is, they will throw more "features" at it.
Can we please stop importing the "growth" fallacy into software ecosystems? Haven't we learnt enough about how unsustainable and damaging this is?
I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.
> I hope Electron/CEF die soon, and people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.
Web technologies are fine, but what we really need is some kind of lightweight browser which allows you to use HTML/CSS/JS, but with far lower memory usage. I found https://ultralig.ht/ which seems to be exactly what I am looking for, but the license is a major turn-off for most paid services. It makes sense for smaller, indie projects to adopt it, but I haven't seen many "desktop apps" using it in the wild.
What I'd really like to see with CEF et al is JS being dropped, in favor of directly controlling the DOM from the host language. Then we could, for example, write a Rust (or Kotlin, Zig, Haskell, etc) desktop application that directly manipulated the DOM and had it rendered by an HTML+CSS layout engine. Folks could then write a React-like framework for that language (to help render & re-render the DOM in an elegant way).
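For a taste of what that programming model could feel like, here is a minimal sketch using the existing web-sys crate. To be clear, this is not the engine described above: web-sys still routes every call through the wasm-bindgen JavaScript bridge, so it only illustrates the shape of host-language DOM manipulation, not a JS-free implementation.

    // Minimal sketch: building DOM nodes from Rust via the web-sys crate.
    // Caveat: web-sys goes through the wasm-bindgen JS bridge today; the
    // comment above imagines dropping that bridge entirely.
    // (Cargo.toml needs the web-sys features "Window", "Document",
    // "Element", "HtmlElement", and "Node" enabled.)
    use wasm_bindgen::prelude::*;

    #[wasm_bindgen(start)]
    pub fn run() -> Result<(), JsValue> {
        let window = web_sys::window().expect("no global window");
        let document = window.document().expect("no document on window");
        let body = document.body().expect("document has no body");

        // Create and attach an element directly from the host language.
        let heading = document.create_element("h1")?;
        heading.set_text_content(Some("Hello from Rust"));
        body.append_child(&heading)?;
        Ok(())
    }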
Ultralight (https://ultralig.ht/) looks pretty cool. I think another possible option is Servo (https://github.com/servo/servo) – it was abandoned by Mozilla along with Rust during their layoffs a while back (but the project still seems to have a decent bit of activity). It would be great if some group of devs could revive the project, or a company could fund such a revival.
Eventually, we'll need to reflect on whether HTML+CSS is really the best way to do layout, and perhaps consider porting the Android/iOS layout approach over to desktop. Maybe WPF/GTK/Qt/etc even got things right, and HTML+CSS isn't the best way to do layout.
We already know HTML/CSS are not the best way to do layout. We already have many desktop-based UI frameworks for creating cross-platform desktop apps.
GTK, Qt, etc., which are also superior to the single-app, small-screen-focused Android/iOS UI frameworks. More importantly, unlike iOS/Android, they're even cross-platform.
HTML/CSS/JS are successful for the same reason Electron exists: the success of the web means there's a huge developer base that knows how to work with HTML/CSS/JS, the original lingua franca of the web, and there are tons of libraries, frameworks, tools, components, etc. available in the HTML/CSS/JS world that make development easier and quicker.
> HTML/CSS/JS are successful for the same reason Electron exists: the success of the web means there's a huge developer base that knows how to work with HTML/CSS/JS
IMHO the actual reason is a different one. The implementation of web standards enabled cross-platform engines that provide every UI customization one could possibly need for a SaaS product. This approach brought many benefits for businesses and developers alike, such as:
- User data stays in the walled garden
- Users most probably already have a web browser installed
- Responsibility for keeping the runtime (the web browser) up to date is on the user's side
- Automated updates (from the user's POV)
- No installation instructions
I think it was as much a business decision as it was a decision by developers to bet on HTML/CSS/JS instead of GTK, Qt, etc.
Flutter does something like this, rendering directly on a canvas and eschewing HTML, CSS, and JS altogether. It works pretty well. With WASM in the future, though, I suspect many other languages will adopt a similar paradigm.
> directly controlling the DOM from the host language
I think your path to this future is getting a DOM API specced for WASM (as in, not going via a JavaScript bridge). WASI might help.
If and when that happens, you could repurpose that same API without necessarily needing to compile to WASM. The biggest hurdle is that the current DOM API is a JavaScript API, not an API described in terms of e.g. memory layouts.
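To make that concrete, here is a purely hypothetical sketch of what a non-JavaScript DOM import surface might look like from a WASM module's point of view: opaque integer handles instead of JS object references, and (pointer, length) pairs into linear memory instead of JS strings. Nothing like this is specced today; the "dom" module and its function names are made up for illustration.

    // Hypothetical WASM-native DOM imports (the "dom" module is invented).
    // Strings are passed as (pointer, length) into linear memory, and
    // nodes are opaque u64 handles rather than JavaScript objects.
    #[link(wasm_import_module = "dom")]
    extern "C" {
        fn dom_body() -> u64;
        fn dom_create_element(tag_ptr: *const u8, tag_len: usize) -> u64;
        fn dom_set_text(node: u64, text_ptr: *const u8, text_len: usize);
        fn dom_append_child(parent: u64, child: u64);
    }

    pub fn hello() {
        let tag = "h1";
        let text = "Hello, world";
        unsafe {
            // The host resolves handles and reads strings straight out of
            // the module's linear memory -- no JS bridge in the path.
            let heading = dom_create_element(tag.as_ptr(), tag.len());
            dom_set_text(heading, text.as_ptr(), text.len());
            dom_append_child(dom_body(), heading);
        }
    }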
The majority of users who own an M1 are never going to push it to 100% utilisation.
So whether their productivity app is written in Electron or hand-crafted assembler isn't going to make much of a difference to the end user experience.
Electron and similar apps serve a legitimate purpose in the marketplace, which is to let developers deliver a cross-platform app for roughly the cost of a single-platform one.
A machine that is 99% idle isn't running at 1% utilization all the time. It's idling 99% of the time and running flat out 1% of the time. The user pushes at least one core to the limit with every action they take, even if only for microseconds.
This is why performance matters even with a machine that is mostly idle. A faster system will respond faster to events. It will have lower latency when communicating over a network. It will handle user inputs faster. Users will complain that a program takes a second to respond even if the load average is 0.01.
(I know CPUs enter lower-performance states when lightly loaded. They can go from idle to their highest performance state in much less time than it takes for the user to notice latency.)
> whether their productivity app is written in Electron or hand-crafted assembler isn't going to make much of a difference to the end user experience
Subjectively, every Electron app I've used, including the locally beloved VSCode, has been a painful UX. I can always tell. Nothing makes a fast computer feel slow quite like them; it's amazing. I would settle for React Native at this point.
Electron is an example of modern software that's slow. And it's both common and exceptionally slow.
I think most software is designed to load once and never be quit; you're probably just supposed to leave Photoshop running in the background and let it swap to disk if needed.
But I have a hard time running like that, I quit programs when I’m done using them.
I wonder if there is a name for the idea that the more you earn the lazier you get.
Sysadmins used to know a lot but earn a pittance compared to "devops" or "SRE"; then those more expensive folks outsourced the majority of their hard work to cloud vendors who charge 5-11x the cost for compute.
Developers earn 5-10x more than 15 years ago, yet continue to choose solutions for their own convenience and time saving, stating that it's better for the company if they have to work less.
Surely at some point it's just your job to consider the entire ecosystem and your contribution to it.
Because they -are- providing value, and because dev resources are limited. If you're a dev, do you think your company would want you to spend 2 weeks on building a new feature, or on "considering the entire ecosystem"?
And more than that - if you have a team that only/mostly has web experience, they're going to use Electron, and... what kind of company would want them to stop and go learn native app development, or hire extra people just to get a native app?
Note: I fully agree with you, I do feel that as a profession we really have descended into a weird "ship the MVP, ship new features non-stop, forget quality" reality, and I dislike it. I'd love it if everything was resource-thrifty native apps, and dev teams could stop chasing new and shiny things and work on improving their existing apps until they were shiny polished diamonds. I'd love to point to an app and say "see? this is 20 years of development, look at how nice it is, how perfectly thought out everything is, how everything just works flawlessly".
But the fact that that's not happening is not because devs are lazy, or not worth their money. It just doesn't make business sense.
> I'd love to point to an app and say "see? this is 20 years of development, look at how nice it is, how perfectly thought out everything is, how everything just works flawlessly".
I don't think developers earn 5-10x more than 15 years ago. Assuming you're using something like $250k as a typical salary (for dev jobs at US tech hubs), I guarantee you the typical equivalent in 2008 was not $25-50k.
Average income in a tech-hub area in 2007 was $85,355 for "Software Developers"[0]
The salary for developers that we discuss on HN seems to be in the region of $250k base, but that's not factoring in TC, which seems to have made up the bulk of income for the last 5 years. It's not unheard of around here to hear people talking about pulling in half a million if they are ex-FAANG. Which is nearly everybody these days.
Sure, if they're calling into an Electron front-end. Otherwise, each front-end is different, so no amount of C++ and dev experience is going to take your Android UI calls and transform them into iOS UI calls. So you'll have to build your own wrappers over each native UI framework to interact with, costing more money.
Ah, you live in a parallel universe that is the same as ours, except that Qt and the other frameworks that do exactly this without wasting a ton of resources don't exist?
On the contrary, decent developers should shame shoddy ones every chance they get. Maybe that would keep them from making horrible software that millions of people have to suffer every day.
Keep hoping, because it won’t happen. Practically no consumers care except for other software engineers complaining loudly on HN, and meanwhile companies save millions by not handcrafting their CRUD apps in assembly.
The problem is having to target 5 operating systems. An extremely complicated web app runs on Linux, Windows, Mac, Android, and iOS. Carefully crafted native UI widgets target only one platform at a time.
If you want to get rid of Electron, you personally have to volunteer to be responsible for 5x the work of anyone else, and accept bootcamp-graduate pay. Do you see why it's not happening?
Arguably the success of macOS has caused the expansion of Electron, because developers are using Macs and want to use Macs even when programming tools for Windows.
> Practically no consumers care except for other software engineers complaining loudly on HN, and meanwhile companies save millions by not handcrafting their CRUD apps in assembly.
And you know that because you've analyzed the whole market and know the opinion of the users of every piece of web software? Definitely not because you live only in a bubble of other software engineers, so you're not exposed to the complaints of common folk?
No end consumers I know, nor acquaintances of other people I know, know or care about Electron. You might say that I'm in a bubble, but at that point it's just the No True Scotsman fallacy.
You can make slow & buggy software anywhere, with any stack. Teams itself shipped a whole release focused on sucking significantly less & was getting much faster.
Discord has gotten pretty solid, and VSCode is fast. The hate against Electron feels so selective & antagonistic: it picks its targets, never acknowledges any value, never recognizes any points for the other side. It is just incredibly lopsidedly biased. Over what doesn't feel real to so many.
I was hoping Tauri would be the one, by using a shared-library model. That would drop memory consumption significantly. But they use WebKit on Linux, and I'm sorry, but I want a much better base that supports more modern capabilities.
Objectively he's right, and there's a reason the iPhone mini was discontinued. The demand for mini phones is niche but very loud online. It isn't economical to support a mainstream product for the handful of people who are actually willing to pay for that specific niche feature.
Kinda like headphone ports and microSD slots on Android phones. Are there models that do this? Yes! Is anyone willing to pay more for it? No! But they are very, very loud online about it.
There do exist smaller Android phones as well; just as with headphone ports, nothing is stopping you from buying one instead of complaining on the internet!
> People want a small phone, but perhaps not if it costs like 6 phones? How is this hard to understand?
The iPhone mini wasn't priced like 6 phones, and it still didn't sell well. Again, you can't seem to grapple with the reality here: what you want is a very niche thing, and it doesn't sell well as a mass-market product regardless of price. That is, after all, why Apple stopped making that size once again.
The iPhone SE is about as small as the market wants. Below that is the domain of mini phones - which do exist, but mostly from smaller Android vendors.
> No they don't. There is NOTHING on the market currently that is sized and priced like a samsung S5 mini today.
How do you know they care but have no choice? Are average people besides devs talking about how slow their web apps are?
For your small phone example, even Apple is cutting production of their smaller phones because not enough people are buying them. The entire premise really is overblown online.
The arrogance to believe that their software is so well written already that the only way to improve its performance is handcrafted assembly.
Just changing their high-level data structures alone would yield 10x improvements, easily.
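A toy illustration of the kind of change meant here (not taken from any real app): the same membership check done with a linear scan over a slice versus a hash lookup. For n entries the first is O(n) per call, the second is O(1) on average, and that difference alone can dwarf any amount of low-level tuning.

    use std::collections::HashSet;

    // Linear scan: walks the whole slice in the worst case, O(n) per call.
    fn is_blocked_slow(blocked: &[u64], id: u64) -> bool {
        blocked.iter().any(|&b| b == id)
    }

    // Hash lookup: a single probe on average, O(1) per call.
    fn is_blocked_fast(blocked: &HashSet<u64>, id: u64) -> bool {
        blocked.contains(&id)
    }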
If they knew the main reason they have to shell out hundreds of dollars to upgrade their computers and phones is lazy/shoddy developers, they would care.
Nothing says you have to use it to make a game. I recall seeing articles here on HN about using it as a cross-platform app development framework. GDScript may be less appealing than JavaScript, however.
Electron is quite efficient. You get a modern GPU-composited UI, and JavaScript is by far the most highly optimized scripting language out there.
The problem with Electron is RAM, disk space, and download size, which is because every app needs to ship its own copy of Blink. This is a solvable problem, but nobody is incentivized to solve it.
A similarly bad problem is that Electron apps by default do not adhere to operating-system conventions and APIs, and often do not use the features that make the OS good.
For example, all the Electron apps I use have their own implementation of a spell checker rather than hooking into the system one. Normally in macOS, if I add a word to the system dictionary, all apps will learn that word, but in each Electron app I have to do it again, and again, ...
Many also implement their own UI widgets, like basic buttons and input fields, whose UX is not at all in line with that of native apps.
Most consumers don't care about native controls. In fact, for a company it's more important to have a unified UX across all devices so that the user can pick up easily from one to the other, e.g. MacBook to iPhone. Slack is a good example of this.
Strong disagree with your assumption. I __hate__ unified controls across devices. When I am on an iOS device I want iOS controls, not the same ones I have on Windows, and on Windows I want Windows controls, not something kind of like macOS. And I am 100% certain pretty much every consumer would actually agree if they were given the opportunity and knowledge about the situation; the "unified" look is just something developers like me lie to ourselves about, since it is easier to do with Electron than to spend the time and energy on a properly good user experience.
Well, I can tell you, having done user interviews about precisely this topic for our teams, that users really don't care about device-specific controls; we've found that it's really devs who care a lot about it, and even then it's been specifically Apple users. Windows or Linux users didn't care as much, although Linux users cared a bit more than Windows users. Most of our interviewees said they either don't care or prefer unified controls per app.
No, we never asked them about our own application specifically, it was about what they preferred generally across all of their devices. So, no flaws here. You may just be projecting your own bias for OS specific controls onto the general populace.
I highly doubt your research was so perfect, since it seems it wasn't even peer reviewed and published, so that conveniently nobody can point out methodology flaws.
I never said it was some peer-reviewed study, just that we did it for our team to figure out how to lay out the UX; that doesn't mean it's not statistically and methodologically valid. It seems you are also trying to project your biases onto my comment so as to question any sort of research that we conducted.
Notice that the Slack app has a similar layout on both mobile and desktop, with a list of teams on the left, followed by channels, followed by the channel content. There are no device UI framework specific elements anywhere, it doesn't use iOS or Android style buttons nor is it designed with Fluent Design on Windows or HIG on macOS.
Agreed that this is a good example where designing for everyone ends up with a worse product for everyone. Same thing goes for Teams, GitHub Desktop, and Postman.
It's horrible having to constantly context switch _within the operating system I am using right now within this very moment_. Context switching when switching devices is not a problem at all.
If somebody just made Electron into an HTML/CSS/JavaScript engine capable of opening just the HTML/etc. parts from, say, an HTTP server, and then shipped it to all users worldwide...
The RAM usage is from JavaScript heaps, not from multiple copies of the binaries.
(There basically aren't memory metrics that can show the effect of binary sizes on macOS. It'd show up as disk reads/page ins/general slowdowns instead, but SSDs are so fast that it's very hard to see one.)
> people get back to building applications that don't consume hundreds of megabytes of RAM to render a hello world.
Wasn't OS X's whole new paradigm a switch to high-level languages and frameworks, with compositing window management to display super fancily rendered hello worlds?
Decrying cycle burning for the sake of high-level abstractions feels like a complete rebuttal of everything the OS X/macOS ecosystem stood for.
> Wasn't OS X's whole new paradigm a switch to high-level languages and frameworks, with compositing window management to display super fancily rendered hello worlds?
No, not really. You're making it sound like Classic Mac OS applications were written in assembly, when they were mostly written in C++. Switching to Objective-C was more of a lateral move, not a clear step forward in time or in level of abstraction.
The Mac OS X graphics stack was a pretty big change from Classic Mac OS, but it wasn't that much of a change from NeXTSTEP, which worked with 12MB of RAM + 1.5MB VRAM. NeXTSTEP just wasn't as eye-catchingly colorful. Early OS X did pioneer the gratuitous use of effects like translucency with blurs in desktop GUIs, but since this was the period when GPUs were becoming mainstream, these effects weren't egregiously resource-intensive.
And considering that OS X was adapted with relative ease to power a smartphone with 128MB of RAM (with the graphical effects turned up to 11), it definitely wasn't fundamentally bloated in the way that today's Electron apps are.
Many Obj-C libraries were wrapping layers on top of the Carbon APIs, and in terms of abstraction Obj-C is also a level above C++. Passing everything as messages is an additional layer of indirection.
NeXTSTEP itself was a big change compared to X or Windows's GUI stack. NeXT machines were beefy and built for heavy workloads, so the user experience wasn't problematic, but it was still more resource use than the standard stacks of the time. I don't see that as a bad thing, but we should acknowledge that resource economy wasn't a goal of either NeXT or Apple at that time.
> 12MB of RAM + 1.5MB VRAM
NeXT launched in 1988, basically around the time when "640K should be enough for everyone" was a meme. Even in 1996, 12MB of RAM was nothing to sneeze at. So sure, those are not enormous numbers, but they're far from the lower end of what was sold to consumers at the time.
> smartphone with 128MB of RAM
We should take a minute to note that 128MB of RAM was not small for a phone at the time.
Otherwise, sure, Electron apps use a lot more than that, but the point was always to pay for powerful hardware and use it to its full potential to run fancy apps. People loathe Electron apps' bloat, but ease of development has also always been one of the core values. Better to have apps that aren't fully optimized than no apps at all.
Electron’s bloat and power consumption wouldn’t bother me so much if the performance was actually good on a fast machine. But the experience is awful compared to a native app.
My guess is that Slack, for instance, would never make a full-blown native client for macOS. If it wasn't Electron, they'd probably have gone with Mono or another cross-platform stack, and we'd see performance lag and memory bloat compared to a handcrafted native app.
I say that looking at macOS's pretty big market share compared to what it was a decade or two ago.
And we see tons of users choosing VSCode or IntelliJ's offerings over TextMate, Codea, or even BBEdit, so the market for handcrafted, delightful editors is also pretty slim. And that trend has been there since the Eclipse days, so nothing new either.
All in all, I really think the choice comes down to electron/cross compiled apps or no apps, in many many cases.
OS X used a high-level framework (inherited from NeXTSTEP) that could, at one point, produce a very fancy hello world (or a highly sophisticated application) on a 25 MHz 68030 with 8 MiB of RAM.
To the CPU it always boils down to sequences of instructions operating on data, so the syntactic sugar put on top has to be undone first; the abbreviations have to be unrolled, one way or another.
But more importantly, it's really not about the lines of code you write once, or very few times by comparison -- it's about what the computer does every time the application starts up on any machine, or every time any user does something. Obviously, to an extent. You wouldn't write a thousand lines of code instead of 100 to make something that is already very fast 10% faster. But I certainly would write 10 lines instead of 1 if it meant the code was 10 times faster. It's just typing, after all.
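A toy example of that trade (invented names, not from any particular codebase): a terse one-liner that lowercases an entire document on every lookup, versus a few extra lines that pay the conversion cost once and reuse it across lookups.

    // One tidy line that re-lowercases the whole document on every call...
    fn contains_terse(doc: &str, word: &str) -> bool {
        doc.to_lowercase().contains(&word.to_lowercase())
    }

    // ...versus a few more lines that do the expensive conversion once up front.
    struct SearchableDoc {
        lowered: String,
    }

    impl SearchableDoc {
        fn new(doc: &str) -> Self {
            Self { lowered: doc.to_lowercase() }
        }

        fn contains(&self, word: &str) -> bool {
            self.lowered.contains(&word.to_lowercase())
        }
    }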
My point was that CPU cycles mean nothing now. Compared to programmer cycles they are effectively free. So stuff it into RAM and forget about it. Why not? Complexity is bad, and being able to hide it (effectively) is good. I shouldn't be thinking about pointer allocation when building a text input box for a UI.