
> I think there’s a bit more nuance to it. React (and other vdom implementations) try to be as efficient as possible when diffing / reconciling with the DOM. Sometimes this can result in improved performance, but there are also use cases where you’ll want to provide it with hints (keys, when to be lazy, etc.). https://reactjs.org/docs/reconciliation.html

The key part is remembering that every one of those techniques can be done with the normal DOM as well. This is just rediscovering Amdahl's law: the cost of <virtual DOM> + <real DOM> cannot be smaller than <real DOM> alone in the general case. React has improved since the time I found a five-order-of-magnitude performance disadvantage (yes, after using keys), but the virtual DOM will always add a substantial amount of overhead to run all of that extra code, and the memory footprint is similarly non-trivial.

The better argument to make is your last one: React improves your average code quality and makes it easier to focus on the algorithmic improvements, which are probably more significant in many applications and could be harder to make depending on the coding style. For example, maybe on a large application you found that you were thrashing the DOM because different components were triggering update/measure/update/measure cycles, forcing layout recalculation, and switching to React was easier than using fastdom-style techniques to avoid that. Or simply that while it's easy to beat React's performance, your team saw enough additional bugs managing things like DOM references that the developer productivity was worth a modest performance impact. Those are all reasonable conclusions, but it's important not to forget that there is a tradeoff being made, and to periodically assess whether you still agree with it.
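The update/measure thrashing described above can be sketched like this (a hypothetical illustration, not code from any project discussed here): interleaving a style write with a layout read forces a synchronous reflow on every iteration, while grouping all writes before all reads lets the browser do a single layout pass.

```javascript
// Anti-pattern: each iteration writes a style and then immediately reads
// a layout property, forcing the browser to recalculate layout every time.
function layoutThrash(items) {
  for (const item of items) {
    item.el.style.width = item.target + "px"; // write
    item.measured = item.el.offsetWidth;      // read: forces a reflow
  }
}

// Fix: do all writes first, then all reads, so the browser can batch the
// layout work into one pass.
function layoutFriendly(items) {
  for (const item of items) {
    item.el.style.width = item.target + "px"; // writes only
  }
  for (const item of items) {
    item.measured = item.el.offsetWidth;      // reads only
  }
}
```

This is essentially what fastdom-style libraries enforce for you by separating "measure" and "mutate" phases.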




I agree. I am curious, though, about how substantial the memory and diffing costs are. I don’t mean that in an “I doubt it’s a big deal” way; rather, I’m genuinely curious and haven’t been able to find any literature on the actual overhead compared to straight-up DOM manipulation. I would imagine batching updates to be an advantage of the vdom, but only if it’s still that much lighter weight (seeing as you can ignore a ton of stuff from the DOM).


> I would imagine batching updates to be an advantage of the vdom but only if it’s still that much lighter weight (seeing as you can ignore a ton of stuff from the DOM).

There are two separate issues here. One is how well you can avoid updating things which didn't change. For example, at one point I had a big table showing progress for a number of asynchronous operations (hashing + chunked uploads), and the approach I used was saving the appropriate td element in scope so the JavaScript was just doing elem.innerText = x. That's faster than anything which involves regenerating the DOM or updating any other property which the update didn't affect.
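A minimal sketch of that pattern (function and variable names are illustrative, not from the original project): build each row once, keep a direct reference to the cell that changes, and make every later update a single property write with no diffing or re-rendering.

```javascript
// Build one table row and hand back a reference to its progress cell,
// so later updates touch only that node.
function makeProgressRow(doc, label) {
  const tr = doc.createElement("tr");
  const nameTd = doc.createElement("td");
  nameTd.innerText = label;
  const progressTd = doc.createElement("td");
  progressTd.innerText = "0%";
  tr.append(nameTd, progressTd);
  return { tr, progressTd };
}

function updateProgress(row, percent) {
  // No diffing, no re-render: one direct mutation of the saved element.
  row.progressTd.innerText = percent + "%";
}
```

In a browser this would be called with `document`, the returned `tr` appended to the table body, and the upload's progress callback would only ever call `updateProgress`.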

The other is how well you can order updates. The DOM doesn't have a batch update concept, but what is really critical is not interleaving updates with DOM calls which require it to calculate the layout (e.g. measuring the width or height of an element which depends on what you just updated). You don't necessarily need to batch the updates together logically, as long as those reads happen after the updates are completed. A virtual DOM can make that easy, but there are other options for queuing them, such as tossing updates into a queue which a requestAnimationFrame callback flushes.
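That queuing idea can be sketched in a few lines (helper names are made up, and this assumes a browser; a microtask fallback keeps it runnable elsewhere): mutations go into an array, and a single requestAnimationFrame callback runs them back-to-back, so layout reads done elsewhere never interleave with the writes.

```javascript
// Minimal write queue: DOM mutations are deferred and then run
// back-to-back in one animation frame, so the browser recalculates
// layout at most once after the whole batch.
const pendingWrites = [];
let flushScheduled = false;

function scheduleWrite(fn) {
  pendingWrites.push(fn);
  if (!flushScheduled) {
    flushScheduled = true;
    // Fall back to a microtask when requestAnimationFrame is
    // unavailable (e.g. outside a browser).
    const schedule =
      typeof requestAnimationFrame === "function"
        ? requestAnimationFrame
        : (cb) => Promise.resolve().then(cb);
    schedule(flushWrites);
  }
}

function flushWrites() {
  flushScheduled = false;
  // Drain the queue and run every mutation with no layout reads between.
  for (const fn of pendingWrites.splice(0)) fn();
}
```

Reads (offsetWidth, getBoundingClientRect, etc.) would then be done after the flush, or queued separately, which is essentially the measure/mutate split that fastdom provides.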


So you could probably describe a vdom as a smart queue, where how smart it is depends on the diffing and on how it pushes those changes, all abstracted away from the developer. It's bound to be less efficient than an expert doing it by hand (like an expert writing assembly vs. C), but just like any other abstraction it has both pros and cons.

The question is whether the abstraction is worth the potential savings in complexity (which maybe is not the case, but I sure do love coding in Elm).


Also whether there are other abstractions which might help you work in a way which has different performance characteristics. For example, I've used re:dom (https://redom.js.org/) on projects in the past, LitElement/lit-html are fairly visible, and I know there are at least a couple JSX-without-vdom libraries as well.

There isn't a right answer here: it's always going to be a balance of the kind of work you do, the size and comfort zones of your team, and your user community.


Very interesting, thanks for pointing out re:dom. I took a look at their benchmarks and some vdom implementations compare very well to re:dom. I was pleased to see Elm’s performance. So it seems like it can be done well when you want it. https://rawgit.com/krausest/js-framework-benchmark/master/we...



