The caveat in moving state to the client is that it's a huge perceptual shift for developers, with a steep learning curve.
From my perspective we're coming full circle back to client/server desktop apps... only instead of C++, we're doing it with js inside a browser container... I've done ActiveX controls and Flash components... It's not that much of a stretch.
Agreed, although I'm not fully convinced the web is fit for "web apps", or that "web apps" even make sense. At least not with current technology.
Developing apps to run in the browser still implies a lot of redundancy: you are forced to work with solutions that weren't created for interactive applications in the first place (HTTP, the DOM), limited to one language designed by committee (JavaScript), and code reuse is minimal. Everything just feels hackish (at best) compared to native frameworks (e.g., Apple's Cocoa), and in the end it's hard to achieve good UX and compatibility.
I would rather see more people deploying native applications with great UX backed by REST APIs than shoehorning apps into browsers (which breaks the web).
1) Developers are worried that a user won't try their app if they have to install something on their computer.
2) Compatibility: support anything that can render HTML, although this may no longer be the case with the number of Chrome-only apps I see.
3) Firewalls: you can get out on port 80 pretty much anywhere; that random port number you decided to use for your app, not so much.
4) A few years back, web programming was seen as the "easy" way to get into development, since writing relatively limited PHP was a lot easier than wrangling C++ and the Windows SDK. Therefore web developers reached critical mass.
Java applets and Flash did a reasonable job with 1 and 2, but they seem to be some of the most hated parts of the web. I think part of this may be because they are seen as "too powerful".
People want the web to be lightweight.
The problem with Java applets and Flash is that they are a broken model inside another broken model - they shoe-horn their own VMs to run applications inside a browser. It's the ultimate hack.
The web, as it was originally envisioned, makes perfect sense: HTTP, URIs and hypertext to provide navigable content, period. On the other hand, building interactive applications by manipulating the DOM while reinventing UI patterns and widgets over and over again seems like a hack that grew to enormous proportions.
I feel the reason why "web apps" are so popular is that they let small startups develop more-or-less cross-compatible products faster and with fewer resources than developing a web service and then hiring 4 engineers: one for a Windows client, one for a Mac client, one for an iOS client and one for an Android client. The fact that browsers all follow more-or-less the same standards and all run JavaScript turned them into the "write once, run everywhere" platform that Java failed to deliver, but in my opinion it's still (very) far from ideal.
Frankly, I'm wondering how nice a GUI you can write inside Inferno. Just have everyone install an Inferno client and serve your app as a mountable filesystem containing the VM code to JIT as a binary file.
Java applets and Flash aren't hated because they're powerful. It's because they're perceived as slow. Java applets used to make my entire browser freeze up for a few seconds when they loaded. Before installing AdBlock, I would regularly see Flash ads chugging along at 100% on one of my CPU cores.
It's a shame that your post is disappearing into the gray background - I feel something similar every time I have to use Javascript & HTML for a rich interface.
Every time that happens, I wish I was using GTK or QT instead.
This is a recurring cycle in application development. We've gone from mainframes and minicomputers serving plain text to dumb terminals, to programs running on personal computers accessing applications and data on servers, to web servers serving structured text to 'dumb' browsers, to powerful in-browser runtime engines accessing applications and data over the web.
It is another iteration of that cycle, only this time we have something to lose, namely the web. We are regressing to client/server with a bunch of ever-changing single-site APIs and shoddy client code third parties can't readily fix, and these are destroying the world-wide web of repurposable content in open formats at stable addresses.
Aren't we just embracing the difference between a site and an API? It's hard to do both well at the same URL. The API provides the repurposable content in an open format, and the site itself is free to experiment with different presentations. Is that so bad?
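To make that concrete, here's a rough sketch of the split I mean (hypothetical Express routes and data, not anyone's actual code): the API hands out the open, repurposable format at a stable address, and the HTML side can reinvent its presentation as often as it likes.

    // Hypothetical illustration only: same content, two surfaces.
    const express = require('express');
    const app = express();

    const articles = { 42: { title: 'Hello', body: 'World' } };

    // API: stable, repurposable JSON that other clients can build on
    app.get('/api/articles/:id', (req, res) => {
      res.json(articles[req.params.id] || {});
    });

    // Site: free to experiment with presentation without breaking the API
    app.get('/articles/:id', (req, res) => {
      const a = articles[req.params.id] || { title: 'Not found', body: '' };
      res.send(`<h1>${a.title}</h1><p>${a.body}</p>`);
    });

    app.listen(3000);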
Before devs started experimenting with client-side rendering, all sites' content was amenable to the same set of tools. Now there are more and more services with broken frontends and unique APIs which are incompatible with everything else and aren't even stable—when you can rev your own client js instantly, you don't know or care whether any other clients broke. I'm stuck using only one client (your js) that works at all and I can't fix it, which is almost everything that client/server got wrong the first time around. It's not impossible to carefully implement a stable form-compatible API in common with a bunch of similar sites, but I don't see it happening.
It's the prisoner's dilemma: yeah, everyone on the web would be better off if businesses weren't doing this, but individual businesses will continue to do it as long as it's in their best interest to retain very tight control of and limit third-party access to the data and meta-data they capture from users.
I'm not sure how or even if the infrastructure of a distributed system like the web could be engineered so as to prevent this kind of situation. Perhaps the solution is to build in a system of financial incentives -- not unlike what the Bitcoin folks have done to solve the Byzantine Generals problem. It's an interesting problem.
For usability, response time is everything. So for the moment, if people want to build richer apps, they will have to be asynchronous. But do you think there will ever come a day when you can assume the network is always fast and we will be able to go back around the circle to the mainframe again?
We have ways to transfer only data (versus data + markup + DOM triggers) on partial requests, which is even less taxing than rendering a full static page.
Is it really "less taxing"? An app page that polls every few seconds to get back an empty JSON block is going to be eating up the network radio on a smartphone constantly, vs. just getting back a 120k HTML page and a few CSS/image requests one time, which may then sit there and be read (or a form filled out) for the next few minutes.
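Just to spell out the pattern I'm objecting to (a minimal sketch; the endpoint and render function are made up):

    // Polling: wakes the phone's radio every few seconds, usually for nothing.
    const render = (data) => console.log(data); // placeholder for real UI code
    setInterval(async () => {
      const res = await fetch('/api/updates'); // often comes back as {}
      render(await res.json());
    }, 5000);

    // Versus the plain-page case: one GET for ~120k of HTML plus a few
    // CSS/image requests, then the radio can sleep while the user reads.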
I just submitted a talk on this very same notion, "all of this has happened before, and all of this is happening again", in regards to the rebirth of client/server.
It shows up in mobile apps, browser-based JS apps, etc.
Now that native app-centric ecosystems exist, the client-server model is more pronounced. Supporting these different clients forces this perception shift toward a more client-agnostic server back end.