
> Given this goal, you might be surprised to learn that Finda is built with Electron, a framework that’s often decried for being the opposite of fast.

I thought the criticism was more about its memory usage, not that it wasn’t fast.



In fact, there are many performance downsides when you think about Electron:

- it is as slow as any other web rendering engine

- it uses as much memory as any other browser

- you have huge binaries

The key benefit of Electron is that you can easily build cross-platform apps which run on every notable platform. Many of Electron's general weaknesses can be eliminated by building a Progressive Web App, but that doesn't solve the performance challenges that come with building a web app.

Performance-wise, web apps are very slow compared to C/C++/Rust/Go. For a C program, a single millisecond is a long time. For web apps, you struggle to fit everything into a 16ms window so as not to break your 60fps experience.

But the point is that while web apps are in fact much slower, they can be fast enough. And if you have a technology that is very platform-independent and lets you build good experiences, then you'll probably find some nice use cases.


>The key benefit of Electron is that you can easily build cross-platform apps which run on every notable platform.

I've been a bit confused about the draw of this feature. I understand that there are a lot of JavaScript developers out there, and things like Electron allow them to use their JavaScript knowledge to make applications on many platforms. But there are now a large number of GUI libraries (GUI being a typical obstacle for cross-platform support) that can target all these platforms (and sometimes more) as well: GTK+, Qt, and WxWidgets, to name a few. And they have bindings across many languages! Of course, making something completely custom-looking with these libraries may be more of a hassle than an HTML/CSS document.

Maybe Electron's popularity _is_ primarily driven simply by the high numbers of JavaScript programmers out there.

I guess it's like you pointed out: if you can get it to be fast _enough_, it's acceptable. Kind of similar to interpreted vs compiled languages.


Practically all the libraries you mentioned have their quirks and don't have a native feel, even though they pretend to. Electron apps don't pretend anything; the GUI elements are different but consistent.

But apart from that, you're right: the main reason for Electron's popularity is that more and more programmers know JS quite well, whereas desktop libraries seem less and less relevant each year, which makes me very sad.


The problem with Qt, GTK, etc. is that you can't use them on websites[1]. Meanwhile you can build most applications as websites, which don't require any installation and are available everywhere and instantly.

I am a Linux desktop user myself, so I have some sympathy for Qt, GTK and software distribution via package managers. Nevertheless, as a developer I like the web better, as that is a platform which allows me to easily distribute my applications to anybody on any platform within seconds.

Performance, on the other hand, is a topic by itself. The problem I see with web applications here is that it's much easier to create a badly performing program with web technology than with compiled languages and toolkits. But that is something you can learn, and if you're creating performance-critical software, easy distribution might not be your top priority.

[1]: I have seen GTK rendered on an HTML canvas some years ago, but as far as I know nobody actually uses that kind of tech.


For that matter, I remember seeing some Qt thing for targeting websites too. But I agree with your points. Lately I've had to write UIs that actually need decent performance (client-side custom rendering and data processing requirements), so my preferences are biased.


The Broadway backend is an amazing bit of technology.


A browser rendering engine offers a lot more flexibility, expression, and modern feeling than most UI libraries that work natively.


All of our industry's GUI mindshare has been in the web space; most of the desktop GUI libraries have been relatively ignored.


Is there a comparison online that shows the platform availability of each approach?


Wikipedia had this: https://en.m.wikipedia.org/wiki/List_of_widget_toolkits (under High-level widget toolkits). Not too detailed though.

In terms of language support, I've seen bindings for the more popular ones on all major languages and even some more minor ones. There's probably better comparisons out there; this one doesn't have Electron at all (since Electron is more of an environment/VM than a GUI-only toolkit).

Often running in a VM bubble has its own features/benefits though. For programmer benefits, I can't think of a better example than Smalltalk VMs. But I digress...


I wonder how we define "fast enough", though.

At first, 16ms sounds a little ridiculous, given that the display and our hardware input devices (keyboard or mouse) already take away a few ms.

But when we say 16ms is a single frame in a 60fps experience, all of a sudden it doesn't sound like a lot. I mean, I'm expecting VR to be in the 120 or 240fps range.
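A quick back-of-envelope (the refresh rates are just the ones mentioned in this thread) shows how fast the per-frame budget shrinks as the target rises:

```javascript
// Per-frame time budget in milliseconds for a given refresh rate.
const frameBudgetMs = (hz) => 1000 / hz;

for (const hz of [60, 120, 144, 240]) {
  console.log(`${hz} Hz -> ${frameBudgetMs(hz).toFixed(2)} ms per frame`);
}
// 60 Hz is ~16.67 ms; 120 Hz halves that to ~8.33 ms;
// 144 Hz gives ~6.94 ms; 240 Hz leaves only ~4.17 ms.
```

Doubling the frame rate halves the budget, so everything that was "fast enough" at 60fps has to get twice as fast.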


There are already monitors that can go as high as 144Hz (and some go higher), and I've been eyeing one for quite a while. Hearing that 16ms is the target just makes me disappointed.

(It may be the right target for 99.9% of their users currently and so a sound decision to make, but I can't help but think in the future they might struggle with the decision. Note: 99.99+% of people probably literally do not care, but I do.)


Anecdote 2: having a 144Hz monitor is so pleasant that using a 60Hz monitor causes noticeable annoyance, even for web browsing.


Anecdote: I love my 144hz monitor for gaming. I can tell when the underlying settings have changed it back to 60, and it annoys me greatly. Highly recommended.


Got a G-Sync 144Hz monitor. Not planning on going back to 60Hz, to put it mildly.


I guess the question is, what are you willing to give up to hit 120 or 240 fps? Most games that I'm aware of probably won't hit 120 fps today even when run on the most powerful hardware at the lowest graphics settings. Even if you take games from a decade ago on today's best hardware, you probably won't see 240 fps.


That's a pretty big exaggeration. Taken from the first GPU review I looked at, a GTX 1070 Ti can run DOOM (the new one) at ultra settings at 170 FPS. [1] Sure, that's a bit faster than an average gamer's card, and DOOM is better optimized than many games. But the most powerful hardware running a decade-old game at low settings will easily get several hundred FPS. (It's hard to find benchmarks of this, though, so I settled for a recent game at max quality.)

The counterexample is games which were designed for fixed 30fps which fall apart at higher framerates, but those are generally limited to Bethesda games and Japanese console ports.

[1] https://www.anandtech.com/show/12373/the-evga-geforce-gtx-10...


I know the Counter-Strike: Global Offensive community always goes for extremely high FPS. I found this without much effort [1], where they discuss how to raise the FPS cap from only 300. It's a game from 2012.

[1] https://steamcommunity.com/app/730/discussions/0/15464478762...


But not at 4K... In the end, I'd rather hit 60fps at 4K before 144fps at 1080p.


Most applications I have tried feel slow as well, especially when they've been running for a while.


That might be caused by GC, which is ultimately caused by memory usage and the runtime attempting to reduce that.
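To illustrate the GC angle with a hypothetical sketch (the function names and sizes are made up): an app that allocates fresh objects on every frame hands the garbage collector new work each frame, while reusing a preallocated buffer leaves it nothing to collect.

```javascript
// Allocation-heavy style: builds a fresh array of objects every frame.
// Each call produces garbage the GC must eventually clean up, and those
// collection pauses are one way long-running apps start to feel slow.
function churnyFrame(n) {
  const points = [];
  for (let i = 0; i < n; i++) points.push({ x: i, y: i * 2 });
  return points.length;
}

// Pooled style: one preallocated flat buffer, reused every frame,
// so steady-state frames allocate nothing.
const pool = new Float64Array(2 * 1000);
function pooledFrame(n) {
  for (let i = 0; i < n; i++) {
    pool[2 * i] = i;
    pool[2 * i + 1] = i * 2;
  }
  return n;
}
```

Both compute the same thing; the difference only shows up over time, as the first style's accumulated garbage forces the runtime to pause and collect.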



