The original tradeoff was mega-cap FAANG companies trying to offload processing to the client. There never was an organic open-source push for SPAs or front-end JS frameworks. They add a ton of tech debt and degrade the UX. Premature optimization and an anti-pattern for everyone but a handful of companies, imo.
The old world was a complex web stack of strange templating languages hacked onto languages that were sometimes invented before HTML was even a thing (see: Python), spitting out a mix of HTML and JavaScript.
Then there was the fact that state lived on both the client and the server and could (would...) easily get out of sync, leading to a crappy user experience or even lost data.
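A caricature of what that looked like - server template output duplicating the same state into HTML and client-side JS (the cart counter here is invented for illustration):

    <!-- Typical generated output of the era: HTML and JS emitted together. -->
    <span id="cart-count">3</span> items in cart
    <button id="add">Add item</button>
    <script>
      // The template bakes the same value into client-side state...
      var cartCount = 3;
      document.getElementById('add').onclick = function () {
        cartCount++; // ...which drifts: the server still thinks it's 3
        document.getElementById('cart-count').textContent = cartCount;
      };
    </script>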
Oh and web apps of the era were slow. Like, dog slow. However bloated and crappy the reddit app is, the old Slashdot site was slower, even on broadband.
> They add a ton of tech debt and degrade the UX.
They remove a huge portion of the tech stack: no longer do you have a backend managing data, a backend generating HTML+JS, and a frontend that is JS.
Does no one remember that jQuery was used IN ADDITION TO server-side rendering?
And for what it's worth, modern frameworks like React are not that large. A fully featured, complex SPA with fancy effects, animations, and live DB connections with real-time state updates can weigh in at under a megabyte.
Time to first paint is another concern, but that is a much more complicated issue.
If people want to complain about anything, I'd say complain about ads. The SINGLE 500KB bundle being streamed down from the main page isn't what's taking 5 seconds to load. (And good sites will split the bundle into parts and prioritize delivering the code needed for first use, so the real cost is however long 100KB takes to transfer nowadays.)
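For what it's worth, a minimal sketch of that kind of splitting with a dynamic import() - './editor.js' is a hypothetical heavy module, invented for this example:

    // Ship only what the first paint needs; pull the heavy code on demand.
    // Bundlers (webpack, Rollup, etc.) split the output at import() boundaries.
    document.querySelector('#open-editor').addEventListener('click', async () => {
      const { openEditor } = await import('./editor.js'); // fetched lazily, cached after
      openEditor();
    });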
> Oh and web apps of the era were slow. Like, dog slow. However bloated and crappy the reddit app is, the old Slashdot site was slower, even on broadband.
Just those that attempted to realize every minuscule client-side UI change by performing a full-page server-side render. Which admittedly were quite a few, but far from all of them.
The better ones were those that struck a good balance between doing stuff on the server and on the client, and those were blazingly fast. This very site, HN, would probably qualify as one of those, albeit a functionally simple example.
SPAs are just a capitulation in the face of the task of striking this balance. That doesn't mean it is necessarily the wrong path - if the ideal balance for a particular use case is very client-side heavy (think a web image editor application), then the availability of robust SPA frameworks is a godsend.
However, that does not mean it is a good idea to apply the SPA approach to cases in which the ideal balance is to do much more on the server side - which in my opinion applies to most of the "classic" types of websites we have known since the early days, like bulletin boards, for example.
> Oh and web apps of the era were slow. Like, dog slow. However bloated and crappy the reddit app is, the old Slashdot site was slower, even on broadband.
Which reddit app are you talking about, the redesign or old.reddit.com? I ask because the old version of reddit itself certainly wasn't slow on the user side; iirc reddit moved to the new SPA because their server-side code was nigh unmaintainable and slow due to bad practices of the time.
> Time to first paint is another concern, but that is a much more complicated issue.
That's the thing though: with static sites where jQuery is used only for updates to your data, the initial rendering is fast. Browsers are really good at rendering static content, whereas predicting what JS is going to do is really hard.
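Something like this pattern, assuming a hypothetical /api/comments endpoint - the HTML paints immediately, and jQuery only patches in the dynamic bits afterwards:

    <ul id="comments"><li>First comment (rendered on the server)</li></ul>
    <script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
    <script>
      // Fetch only the new data and append it to the already-visible page.
      $.getJSON('/api/comments?since=latest', function (comments) {
        comments.forEach(function (c) {
          $('#comments').append($('<li>').text(c.body));
        });
      });
    </script>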
The new reddit site on desktop is actually really nice. Once I understood that it is optimized around content consumption, I realized it is an improvement for certain use cases. Previously, opening comments either opened a new tab or navigated away from the current page, which meant that on hitting back the user lost their place in the flow of the front page. The new UI fixes that.
Mobile sucks; I use RIF instead, or old.reddit.com if I am roaming internationally and want to read some text-only subreddits.
> That's the thing though: with static sites where jQuery is used only for updates to your data, the initial rendering is fast. Browsers are really good at rendering static content, whereas predicting what JS is going to do is really hard.
Depends how static the content is. For a blog post? Sure, the content should be delivered statically and the comments loaded dynamically. Let's ignore how many implementations of that are absolutely horrible (Disqus) and presume someone at least tries to do it correctly.
But we're all forgetting how slow server-side rendering was. 10 years ago, before SPAs, sites took forever to load not because of slow connections (I had a 20 Mbps connection back in 1999; by 2010 I was up to maybe 40; not much has changed in the last 10 years) but because the server side was slow.
If anything, more content (ads, trackers...) is being delivered now in the same amount of time.
New reddit makes it easier to push ads; any other motivation for its implementation is an afterthought. There's plenty of valid criticism that can be levied against the claim that the redesign is "superior" by default. And I think we often confuse the amount of information with the quality of information exchange. Due (mostly) to the ever-increasing numbers of new users it desires, you could easily make the point that the quality of content on reddit has nosedived. Optimizing for time on site is not the same thing as optimizing for time well spent.
Reddit as a company obviously wants more users; a design that lets people scroll through images ad nauseam is certainly better for that than a design that is more information dense. So if that's something you'd cite as an example of "better in certain use cases" then I agree; otherwise there's plenty of reasons to use old.reddit.com from an end user's perspective.
Even if everything you said were true (it's definitely not!), that doesn't explain why the web is bogged down with entirely static content being delivered via beefy JavaScript frameworks.
10 years ago it was static content being delivered by ASPX, JSP, and PHP, with a bunch of hacked-together JS shipped to the client in an attempt at a rich user experience.
It still sucked. It just sucked differently. I'll admit it was better for the user's battery life, but even the article shows that it was not any faster.
I don't know where this misconception came from - XMLHttpRequest was invented by Microsoft for use in Outlook Web Access; Gmail was essentially a copy of that.
The first web versions of Outlook were plenty fast and usable on my then-workstation (PIII 667 MHz w/ 256 MB). In fact, a lot of the web applications made 15 years ago were fast enough for comfortable use on Pentium 4 and G4 CPUs, because most used combinations of server-side rendering and vanilla JS. It was painful to develop, sure, but the tradeoff in place now is severely detrimental to end users.
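For reference, the vanilla plumbing those apps ran on looked roughly like this (the /inbox endpoint is made up):

    // Classic XMLHttpRequest - the API Microsoft shipped for Outlook Web Access.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/inbox?unread=1');
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        // Patch just the part of the server-rendered page that changed.
        document.getElementById('inbox').innerHTML = xhr.responseText;
      }
    };
    xhr.send();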
Let them experience the pain of rebuilding that old wheel :)