A lot of people seem to think that Single Page App frameworks like Angular/Ember are suitable for use on the public facing client side. I've always believed that SPAs are meant to be behind a login, where you don't have to also deal with spiders and other sub-optimal browsing devices, and you have a little bit more wriggle room when it comes to routing and web history.
Just look at Blogger...their client-side rendering is annoying as all get out. It's just a blog post, render it server side and give me the content, then sprinkle on some gracefully degrading JS on top to spice it up.
I say this as a huge proponent of Angular who uses it for all his web app projects, but who also wouldn't ever use it on a public facing application.
I agree that things like blogger are a great example of what not to do, but I'd go further and say that treating something as a single page web app running on the client side throws away most of the advantages of the web:
URLs which can be stored and shared and are idempotent
Mostly stateless operation for anonymous use (fast to serve/load/cache)
Document formats that anything (including dumb crawlers and future gadgets) can parse and reuse in unexpected ways
What you call suboptimal browsing devices are what makes the web special and distinct from native apps. These are not trivial advantages, and most websites would benefit from these strengths of the web, even if they are an app.
As an example of where something like a single page app can shine on a public site, I've seen chat software built that way which worked really well (using socket.io I think), but only because people didn't care about sharing individual messages and the chat was ephemeral.
> Document formats that anything (including dumb crawlers and future gadgets) can parse and reuse in unexpected ways
As in the article, you can use phantomjs to serve up static HTML to crawlers. They are correct in that it does slow you down and add complexity.
The main problem I think is that SPA tech is still immature and getting all the moving parts to build a public facing SPA working together is a time sink.
It's not really a single page application if you are serving separate pages is it? BTW, the page you linked to says 'Uh-oh! Couldn't find that page.' before loading and displaying the content... ouch.
One of the things I love about the web is that it uses incredibly simple building blocks like simple html pages at defined stateless URLs, dumb servers and dumber clients, and builds complex systems through their interaction. I'd be very wary of solutions that drop those advantages.
There are certainly technical solutions possible for almost any problem with Angular or client-side apps in general, but I'm not sure that rendering everything client-side really gives you enough advantages to warrant it for many websites. What do you see as the main advantages of this approach, and do you see it spreading everywhere eventually?
Every website is different and what suits (say) a chat application will not suit a document-orientated website at all. There's certainly room to explore both approaches or even mix them at times.
You make some really good points, but I think simple json documents are much simpler and easier to re-use by other clients in interesting ways than simple html pages. I think the API + client (note, not just traditional web browser) rendering is actually a more "pure" interpretation of what the web can be - data sources and data consumers that interpret and present that data on behalf of their users.
I'm also not sure that rendering everything client-side is advantageous enough to warrant its current popularity (hype...), but I do see some advantages. Firstly, I think it is a better separation of concerns - the server is in charge of data discovery, persistence, consistency, and aggregation, while the client is in charge of determining how that data can be most useful in the current context. In practice, this means it is possible to have different front-ends for the same back-end. Admittedly, that is certainly not always a necessary or desired feature. The separation also makes it easier to build the front end and back end of an application separately from one another, and possibly even in whichever order you prefer. That can be a good thing, though I don't think it's really taken advantage of very often. I also think that true web applications can be made to feel much snappier and closer to native. The line between what should and shouldn't be considered an "application" is unfortunately blurry (the Blogger example is a good one).
> I think simple json documents are much simpler and easier to re-use by other clients in interesting ways than simple html pages. I think the API + client (note, not just traditional web browser) rendering is actually a more "pure" interpretation of what the web can be - data sources and data consumers that interpret and present that data on behalf of their users.
This is an interesting point - if you are representing numeric data like chart datapoints, a representation like json might make it cleaner and more reusable by other services or clients. Of course much data is actually formatted documents or snippets of text, in which case json is not such a good fit and html is perhaps better. In many ways html is a worse is better solution, but that is probably part of its strength - it is very easy to get started with and munge to extract or insert data.
I'm not sure a separation of concerns between server and client is necessary and helpful to all apps, though I'm sure in some cases it is useful (for example serving the same JSON or XML to a mobile app, a client-side app and some desktop client, or separate teams working on both). But then a server-based solution can easily output both formatted HTML for traditional clients (web browsers, which are now and in the future ubiquitous) and a JSON or XML representation for other clients - this sort of separation of concerns between data and presentation is not really exclusive to client-side solutions.
I'm not sure it makes much sense to refer to a JSON packet as a "document". HTML is truly meant to represent documents, with embedded semantics. JSON is really meant to represent data or objects in the most abstract sense. It has no notion of embedded semantics.
" I'd be very wary of solutions that drop those advantages." They are called native applications. I can think of some useful ones over the years, particularly for people who produce rather than consume. I notice that my Bosch drill isn't available for seo and mashing :) Seriously though, it depends on your perspective. What's wrong with saying I'd like to make a native app but use the web as a delivery/installation mechanism and that's all?
Nothing really, there's room for all approaches to be explored.
I suspect the concept of native APIs (desktop or mobile) will eventually disappear, but it'll be an interesting journey if we ever do reach that point and would take decades.
Turns out I wasn't using Iron Router properly. My bad.
It is a single page application if you don't make your browser reload the page from the server each time you navigate within your app. URLs here are implemented using HTML5 pushState -- the browser isn't loading or refreshing the page when the URL changes, except for the first page load.
My point is you get the best of both worlds there: static, representative URLs that live forever (as they should); and the responsiveness you get when you only need to load data and move some divs around instead of reloading everything from scratch each time.
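Outside of any particular framework, the underlying mechanism is just the History API. A minimal sketch (renderPost and the state object are hypothetical):

```js
// pushState changes the URL without a reload; popstate restores the right view
// when the user navigates with back/forward.
function navigateTo(url, state) {
  history.pushState(state, '', url); // URL updates, no page load
  renderPost(state);                 // swap the content in place
}

window.addEventListener('popstate', function (e) {
  if (e.state) {
    renderPost(e.state);             // back/forward: re-render from saved state
  }
});
```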
In fact Meteor takes things even further with latency compensation: it predicts how things will change as you interact with the app and does eventual consistency with the server state. This makes updates/deletes feel even faster.
But yeah, it's a trade off. And right now it's a big trade off -- my productivity has dropped in some places, compared to writing a simple app in express or Rails.
I think the term "Single Page App" is a bit deceiving in usage sometimes. My idea of a SPA is one where the client downloads the bulk of the application code on first page load, and then only talks to the server with data-based API calls (JSON usually). The predownloaded client then just renders that data, rather than downloading an entire new template to render on the whole screen.
This interface style does not require any visual page refreshes to load new content, but it also still can support routing and deep-linking.
Discourse is essentially a SPA (see http://try.discourse.org/) and designed to be public-facing. It does a good job at providing a very bare, lightweight interface for people with JavaScript disabled and, I'm assuming, for web crawlers.
Most websites shouldn't be SPAs. One can still use Angular to code widgets in a regular page-based site, without using a JS router.
It's just that devs are getting lazy: they throw up a RESTful server app quickly, then don't want to deal with any view layer on the server and do everything in client-side JS. For some projects it makes little sense.
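For the widget approach, here's a minimal sketch: bootstrap Angular onto a single DOM node and leave the rest of the page as ordinary server-rendered HTML. The module name, endpoint and element id below are hypothetical, not from any particular project.

```js
// A self-contained "comments" widget inside a normal server-rendered page.
var widget = angular.module('commentsWidget', []);

widget.controller('CommentsCtrl', ['$scope', '$http', function ($scope, $http) {
  $scope.comments = [];
  $http.get('/api/comments').then(function (res) {
    $scope.comments = res.data;
  });
}]);

// Bootstrap only this element; no ng-app on <html>, no router involved.
angular.element(document).ready(function () {
  angular.bootstrap(document.getElementById('comments-widget'), ['commentsWidget']);
});
```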
I think the key word is "App." There is a difference in nature between a web app and a web page. Both can be built using the same underlying technologies, but the goals are very different.
Knowing which one you are building can greatly inform your choice of framework.
Agreed - there is a wealth of applications that are not in the massively scaled consumer market, and making those clean, easy to maintain and deliver is an enormous win. That said, there is also a wealth of consumer apps that don't have massive market share, so the market for learning these lessons is pretty rarefied.
Any sort of standard server side rendering, or possibly a front end framework that can be rendered server side as well (I think React can do that, same as Backbone).
Basically, SPA frameworks are useful when you are working with lots and lots of data moving back and forth. A good example is something like Intercom.io's interface. They have tons of tables and modals and data flying around. This isn't conducive to the standard browser request -> server render -> load whole new page on the client. It's just too slow. When you're interacting purely with data in a master interface, SPA frameworks are the way to go. And it isn't even a matter of literal page loading and rendering speed, it's the fact that refreshing the view with a whole new page on each link click is a context change that adds up when you're managing a lot of data or performing a lot of small tasks in an application.
But something like Blogger, where you're reading just some content, maybe some comments...there's no real benefit from loading it in a SPA environment. Render it server side, cache it, and fire it to my browser as fast as possible.
> Render it server side, cache it, and fire it to my browser as fast as possible.
Shameless plug, but we've been trying to do something similar with Forge (getforge.com). We built a Javascript library called TurboJS that precompiles a static HTML build into a JS manifest and loads it all. It's SEO-friendly and super-fast. Our other site, http://hammerformac.com/ uses it, for example.
Precompiling static HTML is a strict sub-problem of the problem most people in this comment thread are referring to, which is development incentives and SEO characteristics for web pages that have non-trivial amounts of dynamic content.
I've started using PJAX where the rendered page does not have to change when its source data does, and where you don't have large tables/calendars that would have to be re-rendered when one data value changes.
Development is significantly faster, less error-prone, easier to maintain. Development can also be given to people with lower skill levels.
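For reference, a minimal jquery-pjax setup is roughly the following; the link selector, container and timeout are assumptions, not a recommendation:

```js
// Clicks on opted-in links fetch just the page fragment and swap it into #main,
// while pushState keeps the URL shareable and bookmarkable.
$(document).pjax('a[data-pjax]', '#main', { timeout: 2000 });

// If the fragment request times out or the browser lacks pushState,
// jquery-pjax falls back to a normal full page load on its own.
```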
"node.js + express + jade" is a fairly common stack, you can even try to move an existing complex angular.js code there because it's written in the same language
The Blogger example doesn't seem fair: Any app can overload the user with too many animations or other distractions.
At Thinkful (http://www.thinkful.com/) we're building our student / education app in Angular, and are moving all browser-side code to Angular as well – both public and private.
In a lot of our splash or SEO-enabled content we're not making use of all of angular's features, but the upside of using it is that we have a single, unified codebase in which we can share libraries, team knowledge and design patterns. Simply put: Using Angular everywhere allows us to keep DRY. Testing the front-end using purely Angular is yet another core asset at Thinkful.
One framework for writing code and testing is much better than a hybrid of server-side rendering and Angular.
Our biggest challenge was SEO, but this was reasonably easily solved with using BromBone (http://www.brombone.com/).
There are reasons to stick with non-angular or JS frameworks, so it's not always a slam-dunk. For example, if Thinkful had millions of SEO pages that we needed to load as fast as humanly possible Angular would be a bit much... But that's not what we're optimizing for: We're building a phenomenal user-experience that we can support long-term, is well tested, can have a lot of developers use, and can have non-developers do their job inside our codebase (everyone at Thinkful codes).
For all this and more Angular has proven a great choice for both logged-in AND public sites.
Totally agree, but you still have the analytics/tracking problems. Unless you're not tracking clicks/activities/views on your various features. But if that's the case, you have bigger problems :)
Some people use screen readers, text-mode browsers, IE due to stupid work/school policies, etc. Some people like automating their workflow, which can involve scripted browser interactions. Some people actually care about security and privacy, and so run NoScript, etc.
Screen readers and browser automation can run javascript. Sure, it may be more pure and perfect for everyone to write websites that don't require javascript, but the economics of building websites doesn't support it.
I had the opportunity to work for a million dollar budget client project, about a year ago. (Obviously I'm bound by an NDA, so I won't go into specifics). You can think of this site like an oDesk/Freelancer competitor, but with some social features.
We also had another team from California working on this project, who consistently insisted that we go with Angular for a project of such complexity. Back then, on HN, everybody was writing about how awesome Angular is, and how you must use it for your next project and so on. It reminded me of the early MongoDB days here.
I was under constant pressure from my client too, since he was also reading a lot about Angular everywhere and the Californian company had him almost brainwashed in favor of Angular. After already falling into the trap of the MongoDB buzz (I used MongoDB for the wrong use-case and suffered the consequences), I decided to carefully evaluate the decision to go with Angular for the project.
After about 6 months of using Angular for a different medium-scale project, I decided against it for my client. I realized that Angular is the all powerful Battle Tank. It can do everything you want it to. But it's very tempting to choose a battle tank when all you need is a sedan to get you from home to office.
Angular has its own use-cases, but for the most part what I observed was that you could get a lot of mileage without using Angular, with just plain jQuery+Knockout (or any other similar framework of your choice) for most of the front-end.
In a simple calculation that I made (to pitch to my client), I estimated easily 25% savings in time (and thus money) by not going with Angular for our project. (YMMV)
Usually I tend not to open my mouth about/against angular here because most HNers seem to like Angular a lot and they downvote without a reason just for having a different opinion. But, I am really glad someone wrote a blog post about this.
Although Angular is aimed at wide/single-page apps, I find it perfectly suited for small scale stuff. It's like a sedan, if you wish, but with all the safety airbags and comfort of an MVC and the possibility to mount a cannon if you move out into the jungle. It gives you a level of confidence I'd find difficult to abandon.
Also, I don't see how jQuery + Knockout is so different from taking the Angular/Ember road? (edit: I see why from a legacy point of view)
Finally, the comparison with MongoDB is misleading: Angular & co. bring your code from a procedural mess into a well-known structured area, whereas moving away from SQL didn't improve your architecture, it just proposed another one.
One of the things that appeals to me about angular is that you can compose your application of smaller angular apps. I think what you're talking about (and the original article) mostly are about single page javascript apps rather than angular specifically. One perk of angular is that you can use it almost exactly like knockout, as a way to get nice two way data binding, without having to go full on SPA. Unlike knockout, it also gives you a nicer enforced structure for your mini apps.
That said, I still go knockout for this sort of thing most of the time because their support for legacy browsers wins out, and in the industries I do work for old versions of IE are sadly common.
I find Angular really helps when you have a fairly complex single-page app that has non-trivial interactions. The complexity tradeoff you make using it is not worth it when you don't have those needs, especially when you have a team of people that need to be up to speed working on it.
Wait, Wait, WAIT. No to angular, but you then casually throw in that instead you went "plain Jquery+Knockout", as if Knockout is commonplace? How is that plain?
Knockout is not commonplace.
They're just alternatives. One is Microsoft's, one Google's. They're both client-side frameworks.
Your post makes no sense at all. All it sounds like is that you're familiar with knockout but not angular and managed to convince the client to use something that everyone else wasn't familiar with, only you.
In fact this post is all about not using client side frameworks like Angular. The arguments against it are the same as the ones you'd use against Knockout.js. So your post makes even less sense.
Knockout is a common data binding library in the JS world. It does one thing and only one thing very well - data binding. Sure, it might have (little) features here and there that allow you to do other things, but its core feature is data binding.
>They're just alternatives. One is Microsoft's, one Google's. They're both client-side frameworks.
No, they're not. Knockout is a data binding library. One of Angular's several offerings is data binding.
>..and managed to convince the client to use something that everyone else wasn't familiar with, only you.
That's an assumption. I never said that nobody knew it except me. Sorry if it wasn't clear from my original post, but everyone in my company knows all the major frameworks - Angular, Knockout, Ember, etc. etc. We never get religious over this stuff and always use what's best for the project in hand.
>The arguments against it are the same as the ones you'd use against Knockout.js.
You completely missed my point - Angular is X+Y+Z, my suggestion is to use a framework for X, if you need mostly just X for your project. Replace X with any framework you want, including Knockout. It does not make sense to use a framework that offers X+Y+Z when you need just X or Y. That's my point. I'm sorry you feel offended.
Just to be clear - I'm not advocating any framework by name, including knockout in particular, I just mentioned it because I was documenting my use-case in my original post. Use what suits the best for your project and not because you read about it on HN/Slashdot/etc.
Please calm down your tone and don't get religious about this stuff.
Yeah, Angular and Ember both have their own 'runtimes' essentially which is in stark contrast to knockout, which really is just a library, and there really is a fundamental difference.
I've had a primarily positive experience criticising it and the whole javascript-mandatory mentality in general. Perhaps comparing it to early-2000s Flash sites helps.
This may as well be titled: "Why we're paying for re-discovering that client-heavy apps are hard or bad." Angular, or <insert hot new JavaScript framework>, doesn't particularly matter.
Twitter learned it[1].
Lots of us learned it when we were experimenting as Web 2.0 was being born. Things were far more obvious, far more quickly then, as bandwidth and resources weren't anywhere near what they are today. Back then, we quickly realized that just a couple of delayed asynchronous calls could cause your app to slow to a halt and feel sluggish.
That's not to say it can't be done[2], it's just to say that, thus far for the most part, folks end up discovering reasons why they didn't "do it right" too late over and over. I could be wrong, but I feel like there's been a few posts to Hacker News within the past couple months with similar sentiment.
When people start suggesting client-side rendering, I usually ask something along these lines:
Why on earth would you leave something as simple as textual document model creation up to the client's 5 year old machine that is busy playing a movie, a song, downloading a torrent, doing a Skype call, and running 15 other tabs, when your server is sitting around twiddling its thumbs with its 8 cores, scalable, SSD and memory-heavy architecture?
To answer that question -- for the same reason you'd prefer mobile apps on a smartphone. Even in resource-constrained hardware, it makes sense to do things client-side if that's less expensive: no need for server communication on some or all of the app, since it can be cached locally. Sometimes your app can be more expressive than a sequential history of documents loaded one page at a time.
Now, do people really think that way when they adopt these frameworks? Nope. I mean, they might think about speed, but we all know that loading a bit of static HTML and CSS is faster than any JavaScript execution.
That said, I'll ignore my point above and get a bit technical here: Unless you're using Opera Mini, client-side rendering is indeed all we have for "textual document model rendering". That's what we call "HTML" folks when we're not "viewing source". So ... I'd give the client-side a bit of credit here, things will improve with time.
Use the right technology for the job. And that advice keeps changing. Right now, I'm most influenced by http://www.igvita.com/slides/2013/breaking-1s-mobile-barrier... but once you've got caching/native, it's a whole different game. And if you add pre-fetching...
> That said, I'll ignore my point above and get a bit technical here: Unless you're using Opera Mini, client-side rendering is indeed all we have for "textual document model rendering".
Semantic nitpicking. It's obvious that the grandparent speaks about templating, which can be done both on client and server side.
Honestly, I'm really tired of people who pretend there is no difference between serving up HTML and serving a program that constructs that HTML. The difference is that in the second case you cannot get the content without executing the program written by someone else with all the relevant implications.
Also, people often miss another important fact: server-side rendering can be cached and shared across clients. Client-side templates must be executed by every client separately.
> Even in resource-constrained hardware, it makes sense to do things client-side if that's less expensive: no need for server communication on some or all of the app, since it can be cached locally. Sometimes your app can be more expressive than a sequential history of documents loaded one page at a time.
> Now, do people really think that way when they adopt these frameworks? Nope. I mean, they might think about speed, but we all know that loading a bit of static HTML and CSS is faster than any JavaScript execution.
You sort of gathered the problem up into a nutshell. People aren't thinking of separating the need for server communication all the way through. They aren't thinking the right way when they adopt the frameworks. They aren't thinking about what they don't know. That's okay, they can't. But, continuing to push the idea that they can is not helping anyone.
Not only is this affecting the actual performance of the app, it's affecting analyzing it, testing it, making it properly available to SEO, and probably other things not illustrated in this common revelation of an article. This isn't just one problem, it's a slew of problems that get so bad that ultimately the entire system needs to be rewritten. If it was just slowness, that'd be one thing, but it isn't.
> That said, I'll ignore my point above and get a bit technical here: Unless you're using Opera Mini, client-side rendering is indeed all we have for "textual document model rendering". That's what we call "HTML" folks when we're not "viewing source". So ... I'd give the client-side a bit of credit here, things will improve with time.
You're right, I should have said "textual document model generation". I've edited my comment as such. The act of rendering the model isn't normally the problem. The problem is that these frameworks rely on the client to turn their representation of the model into something else. They're converting something the browser doesn't natively understand into something the browser does understand, then using the browser to run a whole bunch of commands to generate representations of objects that can then be displayed on the screen.
Wouldn't it be nice if you could've skipped all that and just delivered clear instructions on how to render the information from the get-go?
Listen, SOAP is ugly. We all hate using it. But it's there and it's a standard for shit that matters because humans are fallible. We're not good at knowing what we don't know or how things may change. Every time a shortcut is taken, something else will need to be done down the path to ensure stability of the system at a later date. Often times the cost of that is human intervention.
Developers and big companies discover over and over why the nice and easy:
"Throw up a REST bro, JS that shit in Chrome, and get a back end as a service for the rest. We don't even need to worry about the fact that the server and client are different anymore! We can CODE ALL THE THINGS in one spot and not have to learn anything further! Isn't that great? Can we get our VC money now?"
...isn't sustainable.
Again, I'm not saying that it can't be done. I'm saying that it's really hard, you have to pay very close attention, and you need to know a lot up front.
Otherwise you get stuck trying to fix things you didn't know you didn't know. And you write another one of these blog posts.
Cool. Can't argue with any of the above. And sorry for getting nit-picky earlier.
I'd point to react.js as a framework that encourages fast code while discouraging knee-jerk adoption by calling itself a view layer, batteries not included. Of course, that doesn't help people realise whether JS MVC, PJAX or PHP is the best choice for the job. But it does highlight certain desires for client-side rendering that's more generic than "insert this block of HTML here". Like angular, it's most useful when you need to repeat yourself a few times on a page as each new piece of data has to be rendered in real-time to a page. If static, you should have fewer worries. Usually with JS, if it's easy, you're doing something wrong, or using Dart :p (Okay, the last was tongue in cheek)
That Quora link is behind a sign up wall - would you mind giving a précis?
Additionally I have to say Angular (or any client side framework) seems a poor choice for a consumer facing content driven site. Apps are for actively doing something - not passively reading. Or am I missing the point?
I was able to read the first answer without signing up.
And I'm not sure I'd say passively reading is something we ever do on the web. Consider nytimes.com redesign -- it uses app concepts for a sidebar while devoting all attention on the prose in front of you. You can even navigate using arrow keys, though that could be improved: first time I do it, use a popup to let me know what happened and how to undo. The point is, the app-ification of the web is upon us, we just have to find language and frameworks that will best support it. Both client-side and server-side are necessary at points.
The app-ification of the web implies one does something with an app. Something that produces, creates or alters - simply having easier navigation for reading an article does not, to me, qualify. I already have a very good framework for reading - text and books.
We are going to see a million and one ways of presenting the same article or the same TV show, and a million of them will be crap. The other - who knows. I just hope it's worth living with the million others - I will prefer to avoid them and wait for the one announcement.
I need to stop clicking on the "why we left x for y" articles on HN. Mostly people have picked the wrong tool for the particular job and the articles are just an embarrassment.
Obviously SPAs take a lot of extra work to make search engine friendly and are probably going to be the wrong tool for the job for any site which requires that. Much of the web isn't searchable and doesn't want to be searchable. If you are writing a web app to solve some business problem which sits behind a login angular really isn't a problem.
Think of the millions of poorly maintained and inflexible VB and Java business apps out there that are due to be replaced and the employees who are wanting to do business on the road with their macbooks, chromebooks and tablets. There is your market for Angular.
The problem is that most people only read about "the new fancy JS frameworks" and then they choose them too.
Most articles are so optimistic (because it is new, cool, fun) that it is hard to tell whether the "tool" is the right one or not; you only see it when you use it.
So I am glad to see when people / companies write about their experience with the "new" technologies.
Everybody can then verify whether the tool is the right tool for a project/problem or not.
E.g. you write "If you are writing a web app to solve some business problem which sits behind a login angular really isn't a problem".
When somebody reads this, they think "cool, Angular is the right tool for a backend application behind a login".
Slow... depends on what you're used to; I've worked with Java / Maven and such, and one step worse, Scala; if you want slow, go for those.
Complex. The author links to a certain gruntfile[0] as an example of a large, unmaintainable gruntfile, but apparently people forget that a gruntfile is just a Javascript / NodeJS file, and thus can be broken up into more manageable chunks - like any code[1]. Alternatively, there's newer, less config, more code build tools like Gulp.js[2].
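For example, the Gruntfile can shrink to a loader that pulls per-task config from separate files. The grunt/ directory layout below is an assumption, not a convention of the linked project:

```js
// Gruntfile.js
module.exports = function (grunt) {
  var config = { pkg: grunt.file.readJSON('package.json') };

  // grunt/less.js, grunt/uglify.js, grunt/watch.js, ... each export one task's config.
  grunt.file.expand('grunt/*.js').forEach(function (file) {
    var task = file.replace(/^grunt\/|\.js$/g, '');
    config[task] = require('./' + file)(grunt);
  });

  grunt.initConfig(config);
  require('load-grunt-tasks')(grunt); // load every grunt-* plugin listed in package.json
  grunt.registerTask('default', ['less', 'uglify']);
};
```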
#4 is also no longer valid; Angular's Protractor[3] wraps around Selenium etc and deals with angular's asynchronous behaviour, as long as you stay within angular's framework.
And #5 is to be blamed on the developer for not paying attention to performance / total load times, not on the framework.
I'm defensive, but then, I don't have a public-facing app.
The idea is to keep a lot of the advantages of the traditional web development model, but, via HTML5-style attributes, RESTful URL design and partial driven UX, achieve a better UX.
It's not for everyone or for every problem, and it is still in pre-alpha (we are going to change from a preamble to HTTP headers for meta-directives, for example) but, if you find Angular too heavy-weight and foreign for your UI, it might be of interest.
Please contact me if you are interested in contributing.
I think this method is the way forward for most document based sites/apps. Last year I built something similar to intercooler (nice lib btw) for our product to great success. Makes things so much simpler to develop and maintain.
Have you tried react.js [1] ? If you use node to serve your content, you can pre-render the initial state of your app. When everything loads up, react will take a checksum of the rendered portions to ensure that it doesn't re-render the same DOM. This should come close to solving your SEO/test issues with minimal work.
In my opinion, a setup like this is close to what the next big wave of frameworks will use.
You can break your layout up into parts and have a site that is partially dynamic and partially static. You just pass the html that react renders to your templating engine.
Getting everything set up correctly can be a little hassle, but gulp is fast enough when doing a watch on the compilation step. Of course, because everything is javascript you share the exact same component code between client and server.
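A rough sketch of the server half of that setup; the Express app and CommentList component are hypothetical, and the API names track React around 0.12 (older releases spell it renderComponentToString):

```js
var express = require('express');
var React = require('react');
var CommentList = require('./components/CommentList');

var app = express();

app.get('/comments', function (req, res) {
  // Render the component to an HTML string on the server. The markup carries
  // React's checksum, so the client-side React.render() call over the same node
  // adopts it instead of re-rendering identical DOM.
  var body = React.renderToString(React.createElement(CommentList, { comments: [] }));
  res.render('layout', { body: body }); // hand it to whatever template wraps the page
});

app.listen(3000);
```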
Haven't tried it, but I heard great things about it from bradfitz, who uses it on http://camlistore.org/ (and whose opinion definitely deserves respect). Is React an all-or-nothing thing, or can you sprinkle it in certain places on your site without needing to make the whole site in React?
> you can pre-render the initial state of your app
This, I think, is the killer feature of Node, and the reason I'm slowly transitioning from Python for new web projects. You can reuse your server-side templates client-side (without worrying about, say, reimplementing Handlebars Helpers in your server-side language), and can easily render full HTML templates for the client that get enhanced when the client-side JS loads. This also solves UI nuisances -- like your server's markdown renderer being different to your client-side preview (grr).
Meteor and Derby are obviously heading down this path, and while I'm not sold on the rest of Node and the general JS style, having the same language in the browser and the server is too much to pass up.
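As a concrete example of that reuse, one Handlebars template file can drive both renders; the paths and data here are made up:

```js
// Server side (Node): compile the same .hbs file the browser will use.
var fs = require('fs');
var Handlebars = require('handlebars');

var source = fs.readFileSync(__dirname + '/templates/post.hbs', 'utf8');
var template = Handlebars.compile(source);

// First response (and crawlers) get fully rendered HTML...
var html = template({ title: 'Hello', body: 'Rendered on the server' });

// ...and the browser loads the identical template (precompiled or inlined),
// so client-side re-renders never drift from the server's output.
```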
One thing people don't really think about is that this whole notion of SPA is kind of a pipe-dream, overkill. There might be some apps that are really a single page application (likely just those simple, demo add task apps), but most really can be broken down and composed of many mini-applications.
For example, how often do your users go to their settings page? Does that need to be part of the SPA? Have a complex settings page that's composed of 5 or 6 tabs and 20 different user interactions? Maybe the settings page is its own mini-SPA.
How does a user flow through your app, do they really need every screen bundled under a single SPA?
Routing issues, complexity, code dependencies, etc...are all good reasons to not make one monolithic application, even if it's behind a login.
Likely your SPA should really be an app composed of a bunch of smaller SPAs. Your search functionality...mini app, your separate workflows...a mini app, your timeline...mini app, history view...mini app, etc...
Breaking your app down into a bunch of smaller SPAs has a lot of advantages and implicit modularization, as well as productivity gains when working on bigger projects with bigger teams.
I have one project using Angular in this manner, and it's great for it (I don't use Angular's routing system). Another project is using Knockout.JS. I prefer the Knockout method when integrating with a lot of 3rd party/external jQuery components.
In general for web apps, you won't go wrong using different pages as a module system. It's proven, and when your app gets big enough, you don't necessarily have to worry about a huge up-front download.
BTW, if you develop web apps to be shimmed into native apps, like PhoneGap or something, then definitely look into the routing aspects of these libraries.
We're using Backbone+React so this may not be applicable.
However...
“You can separate your dev and production build pipelines to improve dev speed, but that’s going to bite you later on.”
In my experience, you must separate dev and prod pipelines. It has never bitten me because I make hundreds of dev (local) and dozens of kinda-prod (staging server) builds a day.
For dev builds, Grunt just compiles LESS but doesn't touch the scripts, so there is literally no delay there. In the dev environment, we load scripts via RequireJS so there is no need to maintain a properly sorted list of scripts either.
For production, we concat them with grunt-contrib-requirejs with `almond: true` so RequireJS itself is stripped out completely. Production build takes time (10 to 15 seconds; we're also running Closure Compiler), but it's never a problem.
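For anyone who hasn't used almond, the underlying r.js build profile boils down to something like this; the paths are assumptions, and the Grunt task mentioned above just wraps options of this shape:

```js
// build.js -- r.js build profile
({
  baseUrl: 'src/js',
  mainConfigFile: 'src/js/config.js',
  // Ship almond (a tiny AMD shim) instead of the full RequireJS runtime.
  name: '../vendor/almond',
  include: ['main'],
  insertRequire: ['main'],
  out: 'dist/app.min.js',
  wrap: true,
  optimize: 'uglify2'
})
```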
Even adding JSX (Facebook React) compilation didn't cause a slowdown for dev builds because grunt-contrib-watch compiles them on change and puts them into a mounted directory.
Yes, we did use separate dev and prod pipelines when we used AngularJS. (We used https://github.com/ngbp/ng-boilerplate.) It took 2-3sec for the dev build (most of which was taken up by recess) and 30-45sec for the prod build (primarily JS uglification). However, probably 5-10 times we deployed a broken site because either 1) the LESS compiler changed the order of our CSS rules or 2) we used AngularJS DI syntax somewhere that ngmin couldn't handle. We fixed the issues and added better linting, but it's still one more thing to think about (and the theme of this article is that it was death by a thousand cuts, not one big show-stopper).
I understand how this could be really frustrating. We solved this problem (well, we never really had it, but we solved the ability of this problem to come up) by having a suite of unit and functional web UI tests. Combine that with a CI environment that runs these tests and creates builds only when the tests pass. Ngmin can be a little flakey if you get into fringe situations, but deploying broken code is a testing problem, not a toolset problem.
I am curious to know why you went for a full JS app approach from the beginning, knowing that your app would be very dependent on content that needed to be indexed by search engines?
(OP here) This was definitely a bad choice on our part (my part, as I'm the one who made the decision originally). It was super simple to get started with AngularJS, and I had a fair bit of experience with it. Part of the motivation for this post is to tell other developers in a similar position (who have used AngularJS/Ember/etc. in the past and are starting on a new project) that they should consider server-side generated pages as well, even though they are often considered old-fashioned.
Thank you for sharing your experience, and I hope that this post teaches people a lesson about context (e.g. in your case you were using the wrong tool for a certain context). Consider that server-side and client-side rendering are not competing with each other to claim which is better; it always "depends" on the context in which they are implemented. Lastly, server-side rendering is not something that should be perceived as "old-fashioned"; there is a duality between the two techniques that addresses specific problems, and it is our job to choose what is best for the job.
Hey OP! We did the same EXACT thing as you. Our end user experience was super slick using Angular, but this doesn't bode well for things that, well, aren't users (we migrated back to Rails). Your post looks like it got HN-hugged so I can't read the whole thing. Either way I'd be happy to share (so we can compare/contrast) our own findings and conclusions via email. As you said, educating other developers is great!
I'm well-versed in Angular, and I too made that mistake on one project as well. There's something to be said for picking the right tools for the right job.
I have to disagree with most of this article.
1. Bad search ranking and Twitter/Facebook previews
Don't force your public side to be strictly Angular. Serve normal pages and use Angular for your interactive components. Let Google index a well formed DOM. Use a full Angular stack for your non-public-facing application (a SaaS application). You don't want to index this anyways.
2. Flaky stats and monitoring
Use event driven metrics from your API and/or client side. Track everything in the sense of user, controller, action, params. Blacklist sensitive data. Derive metrics with funnels: user did x actions, returned and subscribed. Conversion! It's all there, just understand your events.
3. Slow, complex build tools.
You're not limited to Grunt or Node. For example we use Rails and use our own build scripts and generators to build fullstack Angular apps. Easy breezy.
4. Slow, flaky tests
There is room for improvement. But Jasmine and Phantom can get the job done. And let's not forget we're also testing our API. Use your go-to testing framework and let Jasmine/Phantom do the client front-end testing.
5. Slowness is swept under the rug, not addressed
Precompile your Angular templates and only wait for API responses. Don't fragment your page load into separate requests. Resolve all the required data beforehand in the route provider.
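A sketch of that resolve pattern with ngRoute; the route, endpoint and controller names are hypothetical:

```js
app.config(['$routeProvider', function ($routeProvider) {
  $routeProvider.when('/projects/:id', {
    templateUrl: 'partials/project.html',
    controller: 'ProjectCtrl',
    resolve: {
      // The route does not render until this promise resolves,
      // so the template never flashes empty bindings.
      project: ['$http', '$route', function ($http, $route) {
        return $http.get('/api/projects/' + $route.current.params.id)
          .then(function (res) { return res.data; });
      }]
    }
  });
}]);

// The resolved value is injected into the controller by name.
app.controller('ProjectCtrl', ['$scope', 'project', function ($scope, project) {
  $scope.project = project;
}]);
```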
I’m as big an Angular evangelist as anyone, but that bit about search rankings is an absolute killer.
You talk about these server-side webkit parsers as tricks that “slow things down,” which indicates that you at least ultimately got them working. I never got that far.
As someone who is familiar with AngularJS but hasn't used it much, the idea that the best answer is running WebKit on the server, indexing your client-side generated pages and dumping them out into a Google-indexable static resource just blows my mind.
What's really interesting is that "google" didn't see this coming - they made angular, they should in theory be huge advocates of SPA, but in reality their primary product doesn't support it well at all.
The angularjs.org sites are indexed, just by using a trivial nginx directive to selectively serve partials to crawlers. All of the angularjs.org apps still rank fairly well using this strategy.
This isn't always appropriate of course, those apps aren't really relying on data from a database, they're proper SPAs. But as noted by others, there are ways to fool crawlers if it's something you want to do.
Generally though, the feeling is that crawlers not executing JS is going to end, and the problem will go away on its own.
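The nginx details vary per site; the same idea expressed as Express middleware (not their actual config - the snapshots/ directory and the _escaped_fragment_ parameter from Google's AJAX crawling scheme of the time are assumptions) looks roughly like this:

```js
var path = require('path');

// Crawlers following the escaped-fragment scheme get a pre-rendered HTML snapshot;
// everyone else falls through to the normal JS app shell.
module.exports = function serveSnapshotsToCrawlers(req, res, next) {
  if (typeof req.query._escaped_fragment_ === 'undefined') return next();
  var page = req.path === '/' ? '/index' : req.path;
  res.sendFile(path.join(__dirname, 'snapshots', page + '.html'));
};
```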
If you're OK with delegating the scraping process to a SaaS then you may be interested in SEO4Ajax [1], which will make it much simpler. Disclaimer: I'm one of the co-founders.
For the sake of correctness — recent versions of PhantomJS are not dependent on Xvfb or any other variant of X, and there are grandmotherly prepared binary builds on the official website (kinda statically linked, so no dependency on a system WebKit either).
Not that it changes the author's arguments that much, but just worth pointing out.
I'm currently playing with Fay (a Haskell-to-JS compiler)... It's awesome.
It type checks like Haskell and allows code sharing between the server side and client side of the app. This means I can use code to generate a complete HTML site (for SEO purposes) when the URL is hit directly, and modify the DOM from there once the app is loaded... with the same code!
Obviously this code sharing is mostly interesting for apps written in Haskell. But I'm so excited about it that I had to share... :)
G'luck! The "javascript problem" (try google for that) is a hard one.
[edit] I call it "playing with Fay", but I'm certain this will end up in production for me.
I used AngularJS + Express for a reasonably sized custom news app for a client. It does everything from payments to security, ya know, the whole shebang.
I agree with the first point, but only if you're still in the days of SEO trolling. Frankly it's just not as important if you're doing your other marketing aspects right.
For #2, I think there are plenty of ways to build in analytics. We use angularlytics and it works pretty well. Took me like 5 minutes to set up.
#3 - Yeoman. Generator Angular. Here's how I do it:
1. Make a client dir, and yeoman up your project with generator angular.
2. Make a server dir and setup an express server.
3. Grunt serve your client dir
4. Make it so express watches your .tmp and app folders for local dev
5. Run your express server
6. When you're ready to serve it, Grunt build to a dist folder in your server folder
7. For production, have express serve the dist folder
Yeah kind of dirty (since you're running two local servers for dev), but hell, it's fast as can be to setup and a pleasure.
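Steps 4-7 boil down to a few lines of Express; the folder names below follow generator-angular's defaults, and the port is arbitrary:

```js
// server/app.js
var express = require('express');
var path = require('path');
var app = express();

if (process.env.NODE_ENV === 'production') {
  // Step 7: serve the Grunt build output.
  app.use(express.static(path.join(__dirname, 'dist')));
} else {
  // Step 4: serve the generator's working folders directly during development.
  app.use(express.static(path.join(__dirname, '../client/.tmp')));
  app.use(express.static(path.join(__dirname, '../client/app')));
}

// ...API routes go here...

app.listen(process.env.PORT || 9000);
```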
#4 Tests? If you're doing tests of any sort, they're bound to slow you down to an extent.
#5 Isn't this applicable to all web apps? Mistakes and mismanagement of loading resources is a problem for anything.
Sure it has its problems, but there's just far too much productivity to be gained from using it. For example, Ajax animations are beyond time saving.
The real problem with angular is the terrible docs ;)
Really, this seems to be more of a case of the wrong tool being used for the wrong job than a tool with flaws and no real use. AngularJS positions itself as a fit-all solution for great Javascript based applications, when in reality it is meant for use only in an authenticated user setting. Look to an application like Asana (built on a similar internal Javascript framework), you only get the Javascript application version after you've logged in, not before.
It's like creating an online store and deciding to choose MongoDB or any other NoSQL-branded database, then discovering it doesn't support transactions and having to move over to an RDBMS like MySQL or PostgreSQL. The caveats listed in the article are definitely true though. As someone who's used AngularJS enough to know its downfalls, it's definitely not a one-size-fits-all solution, and much like anything it comes with both its own pros and cons.
It's important you spend the extra amount of time when planning your project to ensure you choose the right tools for the right job (well at least at the time). If your requirement is to be indexable via search engines, choose a solution that allows that and so on. Don't use something just because it's the flavour of the day on the HN front-page.
> 1. Bad search ranking and Twitter/Facebook previews
This problem is patently obvious to the most cursory examination of single-page applications. If SEO is important, and you want to do an SPA, then you must be willing to bear the cost of addressing HTML requests. For my startup, I wanted to keep things DRY, which led me early on to the Nustache engine for ASP.NET, allowing me to use the same Mustache templates on server and client. This doesn't have anything like the complexity described in the article.
> 2. Flaky stats and monitoring
Simply not true. Using Google Analytics and Backbone, you simply listen to the Backbone.history:route event and fire off a pageview using the Google Analytics API.
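Concretely, that's a few lines (this sketch assumes the Universal Analytics ga() snippet is already on the page):

```js
// Fire a pageview for every client-side route change.
Backbone.history.on('route', function () {
  var fragment = Backbone.history.getFragment();
  ga('send', 'pageview', { page: '/' + fragment });
});
```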
> 3. Slow, complex build tools
Complex, yes. Slow? Using r.js, no slower than a typical static language build.
> 4. Slow, flaky tests
Slow, yes, but no more so than desktop app test automation. I've found PhantomJS with QUnit (unit-testing), and CasperJS for integration testing to be quite reliable. It took a few days to get everything connected (scripting IIS Express to start and end in the background being the trickiest bit), but that was it.
> 5. Slowness is swept under the rug, not addressed
This is a UX challenge that is known and obvious up-front. Failing to address it is a design problem, not a technological one.
Overall, this seems the result of the ineptitude prevalent in inexperienced, "move fast, break things" teams. Rather than owning up to moving too fast and foregoing due analysis/research, they blame technology. Or, the article is a marketing ploy.
This a thousand times. I read the article and thought it a little weak on perspective. I've used Mustache+Backbone+Require.js stack like the one you're alluding to and it worked quite well together.
I am/was a big fan of Angular, but on a recent project we decided we didn't have the right team composition for a lot of the coding styles and complexities Angular introduces. We went with a jQuery/Knockout solution, too. Some of the devs still struggle to keep the viewmodels clean, but being able to integrate from Knockout to existing jQuery plugins has been a big win. Writing custom Angular directives, adding another layer to existing jQuery, is a pain for a lot of front end developers, and likely overkill for many projects.
(to give scope size, we're replacing a several hundred screen Adobe Flex app with our new KO/JQuery app)
This is really a case of picking the wrong tool for the job. __This is in no way a slight of the author__...b/c I have done worse on more than one occasion, so thanks for sharing.
To anyone reading, you really should understand your workload before picking tools. And, you need to understand the difference between Web Application vs. Web Site: Which are you building?
Server-side rendering is the winner for content sites (as mentioned by the author). Beyond initial rendering, a server-side solution allows for more caching. Depending on the site you could even push a good amount of file delivery to a CDN. In the end the author switched to Go, but Node.js + Express, RoR, PHP, Java with Play, etc. would all work just as well.
Next, are you CPU bound, network bound or I/O bound? If you're writing an application that requires heavy calculations that utilize massive amounts of CPU, then pick the appropriate framework (i.e. not Node). If you are I/O bound then Node may be a great solution.
Client-side rendering (such as Angular/Backbone/etc.) really shines when you need a web application (not a web site). These frameworks are best when the application code is significant relative to the data, such that many small JSON requests deliver better overall performance. Think of a traditional desktop application or native mobile app where the application code is in MB, but the amount of data required per request is in bytes. The same logic applies to web apps.
A few areas where problems such as what the author experienced emerge from blanket statements about technologies:
1. Gulp vs. Grunt: I use Grunt. I may switch to Gulp. But seriously, whether one is "more complex" or "faster" can be quantified. Lots of people pick the wrong technology because the web is littered with echoed opinion statements. Exchange "more complex" for "project A has a config file with X number of lines, while project B has a configuration of Y number of lines for the same task", or "project A uses JSON for its configuration while project B uses YAML".
2. "(Or we could have used a different framework)" - with a link to Meteor - No, please do NOT use Meteor for your site. I love Meteor and want it to succeed, but it is not the optimal choice for a content heavy site where each user views a large amount of data. As mentioned above, use a server-side rendering solution (like you did with Go), then cache, then push to a CDN. Problem solved. Meteor is awesome and is a great real-time framework. Use it when you need real-time capabilities...but not for a content heavy, static site.
> but they just weren’t the right tools for our site.
This could have been the title or first sentence and would have delivered 100% of the message if the reader read no further.
A lot of these articles about why we changed from technology A to B could be much improved if the original decision making was documented (not just the switch). As in we picked project A because we thought it would deliver A, B and C benefits based on our applications required capabilities. However, our application really needed capabilities M, N and O, which project A was not a good fit for. So, we switched to project B and experienced the following improvements. Therefore, it can be concluded that if your application needs M, N and O then project B will be a better fit.
> "Client-side rendering (such as Angular/Backbone/etc.) really shines when you need a web application (not a web site)."
This, 1000x over. I have static landing pages and about pages for search engines, but the app itself is a single page Angular app. The data does not have to be indexed.
Exactly. And if you're going to use it for a content-driven site, don't use it to serve up content, use it to make a nicer UI around the content. And if you do use it to serve content, make sure that content is also accessible through a direct URL of its own, so you can show that to Google.
Though Google is cheating here, of course. They use plenty of JS frameworks to serve content, yet those Google+ posts do show up in my search results. Though every G+ post does have its own URL, so I guess that's the way to do it.
Is anyone aware of a solution to allow clients to validate a version of an SPA website (cryptographically), in the sense that once downloaded a signature is checked and then further visit, if they require an update, have to be validated and verified by the user?
I'm thinking of a way to allow user to trust their applications in the same way you would trust a dist-upgrade on Debian via the packager's PGP signature and chain of trust.
This would solve the current problem that sites can change user side code at will anytime without them knowing and thus making it quite impossible to develop proper security solutions where the user actually owns and is responsible for his own security.
With such a solution in place, we might start seeing proper p2p/webrtc security related apps; we could even imagine an in-browser (read: JS) Tor-like service...
Maybe I'm missing something but I don't get "4. Slow, flaky tests". I understand that Selenium et al can be a pain, but how does server side generated code excuse you from using it? Are you only validating the html structure and not testing any of the interactive capabilities?
Until Web Developers stop treating the end-user as a consumer of unlimited computational and bandwidth resources, issues like this will continue to crop up.
I love AngularJS for internal desktop tools I write, but I would never use it or any other client rendering script in the wild where my applications could be consumed by unknown form factors. Specifically, you have absolutely no idea how much memory allocation you are getting when you are dealing with mobile devices, and any assumption on the developer's part is asinine.
AngularJS was not the problem in this case; and I'd wager we are going to continue to see articles like this as developers go through growing pains of learning that you should optimize for the end user first, not yourself.
Try prerender [1]. We use it in production with backbone. This combined with keeping the most content not-client-rendered has alleviated most of our issues.
In the long-term I'd love to see a web framework that uses react on the server-side, kind of like how rendr uses backbone on the server-side [2]. Seems to make sense because react works against a virtual DOM, so it would allow you to avoid the hacky ways of working with an actual DOM in node.
I just want to say thank you to the author for showing me backbone.analytics. Absolutely fantastic and everything I had been looking for recently. Funny how one thing teaches you about something else.
Since the author hints that they migrated to Go templates, this article is more about when you should render templates client side vs. server side than an opinion against AngularJS.
Concatenating your large collection of controllers, services and directives. Running internationalization jobs, running unit tests, running JSLint, running LESS compilation. Creating a versioned file so you have a fallback. And more!
Sometimes, we just create our own problem. I fell into that trap before, and my toolbox is more minimal now.
And btw, gulp is a better replacement for Grunt. But still, think twice before adding too many plugins to it, before it's too late.
Take a look at the prerender.io module. Also, it is not impossible to render some types of Angular pages on the server; it is just a bit of a pain. Google "angular on server". If you really don't need an interactive app then I would consider generating static pages when the data changes, or when called, and then not updating them more often than you actually need to.
It's funny because I find myself going the other direction from server-side page generation to angular.
One of the main reasons is angular forces you to separate your controller logic from DOM manipulation. Without directives I tend to see a pile of jQuery on every page.
The separation between DOM and logic is indeed a nice part about AngularJS. We took a look at how much client-side logic our particular site actually needed, and it was much less than we thought. Most of the page logic can be implemented using Go's template library's own AngularJS-like server-side templates. We implemented the remainder in jQuery, which you can see at https://sourcegraph.com/static/js/web.js. (If someone in the future comes along and that link is 404, email me at sqs@sourcegraph.com if you want to see it.) It's actually a surprisingly small amount of code.
So I would say you should ask yourself how much of your controller logic needs to be done on the client? If most of it can be done on the server, then the amount of jQuery you need will probably be quite manageable.
These are just difficulties that need to be addressed, not deal-breakers. All new technologies have a transitional period where the rough edges need to be sanded off (Node.js, HTML5, etc.). As for people commenting that SPAs should only be used behind a login and never for a content site, USAToday.com is an SPA, and they rank just fine in Google.
I think you just used Angular at the wrong place. For a content based site, Angular was just not the right choice. I've made similar decision on a content site which I did in BackboneJS and totally feel your pain. I've had exactly the same problems you mentioned.
Server side generated stuff would've been just great here or on the project I did!
This post's title should really read: "Why we left single-page apps." Saying that they left Angular makes it sound like it's somehow inferior when compared with similar technologies like Ember. As the article describes, Angular was not so much the problem as was the mis-application of the single-page app architecture.
> 5. Slowness is swept under the rug, not addressed
This is a joke, right? Async loading is somehow bad? If it's that much of a problem, hold off rendering until you have all your data back, or heaven forbid implement item one of Nielsen's list of heuristics, "Visibility of system status", and chuck in a loading gif.
Are there any good guides on how to organize JavaScript on non-SPAs? I have a weird web development background in that I've only ever worked on SPAs, so I haven't ever done JavaScript work on a server-side rendered app.
All these arguments seem like lowly excuses to not do proper work and instead blame the framework you are using.
If you have any kind of interactiveness, the feel of a fat client is so much better than template rendering.
Is that the best interface you can get?
In 2014, everything I click reloads the page? A tabbed interface that doesn't load the content into the window is noticeable by users these days.
No pop ups of any kind? Why do I need to reload the page to see a list of 4 contributors?
You've gained maintainability at the cost of user experience, a lot of user experience for very little maintainability.
There are sites that benefit little from client-side rendering - blogs and news sites for instance - but most will gain a lot.
1) Indexing with PhantomJS is a breeze, truly. Not only are there a ton of libraries that already do it, there are even SaaSes that will do it for you for a fee.
If you are really unable to come to terms with this, you can use react.js, which solves the SEO indexing issue completely.
2) If the only thing that you are doing on the site is measuring page loads, then your site either lacks interactiveness completely or you aren't measuring everything you should.
You aren't measuring where a user left your site (an incredibly important metric) or any action he does (assuming there is any he can do) that isn't navigation.
With Angularytics (and a thousand other libraries) adding analytics is maybe 5 lines of code, and you get declarative analytics on any link you want.
3) This site's JS is neither minimized nor concatenated, so I'm not sure what build tools you need for Angular other than the ability to serve static content?
But in any case it's JS; you are going to need to minimize and concatenate it at some point for performance, and it doesn't matter if you use a fat client or some custom jQuery plugin.
Even with Grunt, though I don't like it very much, the build file is maybe 10 lines long, and the build process takes milliseconds.
4) And the alternative is what? Using manual QA on every build? If you have a website with even minimal interactivity you are going to need to use a browser based testing solution.
Karma is a breeze, and with the new setup, the only thing you need to install is node and karma. Takes exactly 3 seconds, and you get one of the best isolated unit testing frameworks for client side code.
Angular is actually built around the ability to unit test it.
5) So you're saying that the solution to slowness is to have 43 unique resources loaded and rendered on every navigation? Page reload slowness is one of the major hassles that Ajax, and fat clients as a consequence, are trying to overcome.
Your site takes, for me, about 3 seconds to load from page to page (6 seconds to finish rendering), and there is obviously no wait time indicator that you can add and no tricks to minimize this.
Not to mention that rendering is slow, and server-side rendering is not only extremely slow, it can also cause parallel load which will make things worse.
If you don't care about your speed it doesn't matter what framework you use.
For the sake of this you are losing interactiveness, speed and lower bandwidth use, to name just a few.
Why not? it's not like google and other crawler bots crawl your site every second, we are talking about roughly once a day.
And even then, there are solutions to do PhantomJS rendering on the fly; it might be a bit slower, but shouldn't be drastically so - some of the SaaS solutions I mentioned already provide such an option.
It was a wrong choice of tool for that kind of a project, in the first place. The biggest problem that I've personally seen with AngularJS is the steep learning curve.
We've transitioned from Angular to ReactJS with great success. Much smaller learning curve. Using Backbone to handle the models and React for the view is a great combination.
One thing React by itself can do that React+Backbone.Model can't is handle oddly shaped state. Which is almost all of your state, once your application becomes nontrivial.