Hacker News

Rails is a tradeoff, quick dev time, slow execution.

It's not designed for low-profit-per-request applications. Recent benchmarks suggest about 400 req/sec on a $10 DigitalOcean server, which means that even maxing out the machine 24/7 you'd need to make at least one cent per million requests to break even. If your profit margin is below that, Rails would not be a good solution.
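A quick back-of-the-envelope check of that figure (assumptions: a $10/month server saturated at 400 req/sec, a 30-day month):

```ruby
# Requests served in a month at full saturation.
req_per_month = 400 * 60 * 60 * 24 * 30        # 1,036,800,000 requests

# Server cost spread over those requests, per million.
cost_per_million = 10.0 / (req_per_month / 1_000_000.0)
# roughly $0.0096, i.e. about one cent per million requests
```

So the "1 cent per million requests" break-even holds under those assumptions.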



There's no way you could serve 400 req/s with Rails on a $10 server for any sort of complex web app that requires data lookups, HTML rendering, partials, etc.

Maybe possible if you cache everything, but that can be quite complex.


http://blog.wiemann.name/rails-server

It was 200 req/sec on a $5 server, and most of the SQL issues are going to be HTTP-framework-independent. Russian doll caching is not that complex to set up at all on Rails 4, which removes most of the HTML/caching issues.


Yes, you can do 200 r/s on a site that doesn't do much.

If you are doing something like https://www.tanga.com/deals/watches where what's shown changes completely depending on who's looking at it, you will not get that sort of performance.

Unless you cache the heck out of everything, but that just works around Ruby being slow at generating HTML and Rails being slow at rendering partials, creating URLs, etc.


Slurping in a whole tonne of data from SQL and sorting it on the app side is going to be slow anyway. Better off leveraging SQL, or pre-computing the result sets in SQL.


Rails being slow to render templates/html/urls/etc has nothing to do with SQL.

Again, this likely only happens with larger code bases, lots of routes, views, controllers, etc.


Can you back up "complex"? I did extensive work on caching with 2u.fm on Rails and found it exceedingly simple, especially coming from Java or PHP.

Plug in the dalli gem, and pass in the updated_at timestamp as part of the key.
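A minimal sketch of that key scheme, outside of Rails entirely (the `cache_key_for` helper and the sample values are illustrative, not part of any gem's API; Rails' own `cache_key` method composes keys the same way):

```ruby
require "time"

# Build a cache key from the record's class, id, and updated_at timestamp.
# Because the timestamp is part of the key, saving the record automatically
# "expires" the old entry: the new key simply misses the cache.
def cache_key_for(model_name, id, updated_at)
  "#{model_name.downcase}/#{id}-#{updated_at.utc.strftime('%Y%m%d%H%M%S')}"
end

key = cache_key_for("Product", 42, Time.parse("2014-01-15 10:30:00 UTC"))
# => "product/42-20140115103000"
```

With dalli as the store, you just `fetch` against keys like that and never write explicit expiration logic.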

Edit: Perhaps if you had to re-write an app to support caching you'd run into trouble, but that's outside the scope of this article. If you build a Rails app, you plan for caching from the start.


If users are shown different content, you can't cache easily, content_for stops working, etc.

Caching in Rails is used to fix the symptom of Ruby/Rails having terrible performance for rendering templates/html/helpers/urls. It's fine if you have a small, simple site. For anything complicated/large, it can become a pain.

By "complex", I mean something with hundreds of routes (I have close to a thousand), hundreds of controllers, hundreds of views/partials, etc.

But if you have a small site that won't have lots of code, then sure, Rails is going to perform great.


What I don't get is how number of controllers or views has anything to do with this. Size of the app has no effect on the app speed at all so I'm not sure why you'd bring that up. Complexity of pages does.

But still, rendering even complex views is usually no more than 200ms on MRI, and less on others. With multi-core and multi-threading and (easy, nested) caching it's trivial.

How do you explain 37signals and Basecamp? You're spreading FUD. The real world disproves the claim that Rails can't scale, and like I said, it's only getting better, not worse.


Maybe I'm wrong, but I was under the impression that content_for works correctly with caching from Rails 3 onwards.


Russian doll caching makes it even easier, if that's possible.
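The russian-doll idea can be sketched with a plain hash standing in for the cache store (everything here is illustrative; in Rails you'd use nested `cache` blocks in the view instead): the outer key incorporates the newest child timestamp, so touching one child busts the parent entry while the unchanged siblings' inner entries are reused.

```ruby
# Hash-backed stand-in for a real cache store (e.g. memcached via dalli).
CACHE = {}

def fetch(key)
  CACHE[key] ||= yield
end

Product = Struct.new(:id, :updated_at, :name)

# Inner "doll": one cache entry per product, keyed by id + timestamp.
def render_product(product)
  fetch("product/#{product.id}-#{product.updated_at}") do
    "<li>#{product.name}</li>"
  end
end

# Outer "doll": keyed by the newest child's timestamp, so updating any
# product invalidates the list, but untouched products render from cache.
def render_list(products)
  newest = products.map(&:updated_at).max
  fetch("list-#{newest}") do
    "<ul>" + products.map { |p| render_product(p) }.join + "</ul>"
  end
end
```

Updating one product forces only the list and that product's fragment to re-render; the other fragments come straight out of the cache.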



