I agree with this assessment of the 'why' - and it's why I mentioned things like Turbolinks and LiveView. What you mention here seems like a convoluted solution to the problem, but I realize it's how it's done, and I've written thousands of lines of code doing it.
In fact, the original AJAX stuff in the early '00s typically did a primitive version of this, since it was the most obvious solution during the era of server-side rendering: just have the server render the new comment and slap it in as-is. Instead of extending the reach of that concept, we shifted towards generic data access APIs, pushing all the complexity to the client, which has resulted in the gigantic mess we see.
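The pattern is almost trivially simple. A minimal sketch, assuming a hypothetical `/comments` endpoint that responds with a server-rendered HTML fragment (the endpoint name and markup are made up for illustration):

```javascript
// The early-AJAX pattern: no client-side templating at all.
// The server renders the new comment's markup; the client
// inserts it into the page verbatim.
function insertRenderedComment(listEl, html) {
  // Trust the server's markup as-is - "slap it in".
  listEl.insertAdjacentHTML('beforeend', html);
}

async function postComment(listEl, text) {
  // Hypothetical endpoint that returns e.g. "<li>...</li>".
  const res = await fetch('/comments', { method: 'POST', body: text });
  insertRenderedComment(listEl, await res.text());
}
```

The client stays a thin shell: one generic "insert this HTML here" helper instead of a data API plus a client-side rendering layer.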
I remember writing a basic client-side templating system in literally 2002 that naively regenerated the whole page in JavaScript in response to AJAX API updates - you wouldn't notice unless you looked at the CPU being pegged and realized your computer was useless for multitasking while that site was open. It was clearly a bad idea. Little did I realize at the time that the next ~20 years of web software development would take that approach and just try to optimize it.
Asking the server for new HTML means you can't show changes until you've done a roundtrip with the server. That means no optimistic rendering or immediate feedback for local changes, such as a comment being made or an option being set in a modal that opens up related settings, etc. I think it's extremely useful to have a system that allows the client to respond to changes locally on its own as the default case rather than treating local interactions like a strange one-off exception.
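A minimal sketch of what "responding locally by default" looks like, framework-free (the store shape, the temporary-id scheme, and the function names are all illustrative assumptions, not any particular library's API):

```javascript
// Optimistic local state: apply the change immediately so the UI
// can render it, then reconcile once the server round trip finishes.
function makeStore() {
  const comments = [];
  return {
    comments,
    // Show the comment right away under a temporary id.
    addOptimistic(text) {
      const temp = { id: `tmp-${comments.length}`, text, pending: true };
      comments.push(temp);
      return temp.id;
    },
    // Swap in the server-assigned id when the response arrives.
    confirm(tempId, serverId) {
      const c = comments.find((x) => x.id === tempId);
      if (c) { c.id = serverId; c.pending = false; }
    },
  };
}
```

The point is that the local interaction is the normal path - the comment appears instantly - and the server reply is just a confirmation step, rather than the render being blocked on the round trip.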
React only re-renders the components whose props or state change. There seems to be a popular misunderstanding that React makes you re-render the whole page in response to changes, but that's not how it works.
I'm aware of how React works. Server round-trip time alone is not a reason to push to the client. Light is fast. Computers are fast. See: Phoenix LiveView.
> Instead of extending the reach of that concept, we shifted towards generic data access APIs, pushing all the complexity to the client, which has resulted in the gigantic mess we see.
So, like, fat clients done badly? I had a Usenet client in 1993 that provided just as good a forum experience as we have now, in 16 bits and 4 megs of RAM.