Ask HN: How to handle a potential sudden huge traffic load tonight?
23 points by dangrover on Jan 21, 2009 | 31 comments
Hi guys,

My little-known Mac app ShoveBox just got mentioned by Andy Ihnatko on MacBreak Weekly, according to this site:

http://www.mbwpicks.com/2009/01/20/picks-from-mbw-124-the-warmth-saturation-of-analog/

The podcast has not yet been posted, but when it is, I imagine I will be getting a lot of traffic that I'm NOT prepared for.

I'm worried that Dreamhost will not be able to handle it, and that I won't have time to make a DNS change to another host (I also have an MT account).

What should I do, both:

1) To make the site stand up to the load

2) To make the site not suck as much and make a better impression when it gets traffic (if it survives)?

I'm already moving the downloads to Amazon S3 to lessen the load.




Have an almost-static replacement index ready with just enough stuff for a PHP "give us your email address and we'll send you invites [and a discount or some other bennie]" form. No CSS or JS on that alternate-universe page. If things get nasty you'll want swapping it in to take as few keystrokes as possible, because heavy load can trash interactive response in the shell.
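Something like this minimal sketch could work as the emergency index.php (the file name, copy, and flat-file storage are just placeholder assumptions):

    <?php
    // Emergency fallback page: no CSS, no JS, no database.
    // Appends submitted addresses to a flat file to follow up on later.
    if (!empty($_POST['email'])) {
        file_put_contents('emails.txt', trim($_POST['email']) . "\n", FILE_APPEND);
        echo 'Thanks! We will be in touch with your invite and discount.';
        exit;
    }
    ?>
    <form method="post" action="">
      <p>We are getting hammered right now. Leave your email and we will send you an invite (and a discount):</p>
      <input type="text" name="email"> <input type="submit" value="Send">
    </form>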

Maybe make a pass through your normal site with YSlow if you haven't already, but your site is super fast right now (2009-01-20 17:20 PST), which is good news.

Also, if you're still concerned and you have another hosting option available, change your index.php to <?php header("Location: http://www.temphosting.example.com"); die(); ?>, which is probably the smallest amount of CPU you can involve without moving DNS.

Good luck.


YSlow only applies to browser rendering and sort of assumes you are able to get the page to the browser in the first place. Somehow I doubt client-side rendering is going to be the bottleneck. ;-)


YSlow's primary purpose is to help you make sure the browser renders each page in the minimum amount of time. However, YSlow does not only apply to browser rendering; following 6 of the 13 YSlow rules also reduces the total work the server must perform for each request.


I was under the impression it checked page retrieval times, which (after subtracting idle network latency) would reveal which pages were transferring slowly because of excess computation or disk use on the server. The idea being that if it's slow for one user it's going to be _really_ slow for 1000+.

I've done no more than install and try YSlow once, so I can't say too much about it.


It also checks whether you use a CDN, which may help him.


Along with S3 for large static content,

- Make sure your MySQL queries are optimized (use EXPLAIN SELECT to make sure indexes are being used as you intend; see the sketch after this list)

- If you end up needing a dedicated server, look at a service that can automatically provision servers in a matter of hours (such as http://www.theplanet.com). Or go with something like Slicehost (http://www.slicehost.com), which can set you up in minutes.

- If you need to change servers in a rush, you can leave your DNS with Dreamhost; just set up a subdomain like 's2.wonderwrap.com' that resolves to your new server, and redirect requests for wonderwrap.com to s2.wonderwrap.com

- I think you'll be fine! Good luck
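For the EXPLAIN SELECT point above, a minimal sketch of checking a query plan through the PDO connection the site already uses (the DSN, credentials, and query are placeholders):

    <?php
    // Run EXPLAIN on a suspect query and eyeball the plan.
    $pdo = new PDO('mysql:host=localhost;dbname=store', 'user', 'pass');
    $stmt = $pdo->query('EXPLAIN SELECT * FROM orders WHERE customer_id = 42');
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        print_r($row);  // 'key' should name an index; NULL means a full table scan
    }
    ?>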


I really don't think you're likely to have scaling issues. Your site appears to be static (if not, freeze it temporarily, since there is no dynamic content which adds customer value -- aside from the purchasing pathway, and honestly if your website gets brought down by too many people hitting that at once then you should just take your lumps like a man and then console yourself that as soon as you fix things you can retire to your own private island).

It is very, very difficult to knock over a site, even on shared hosting, with just static files that are not rich media. You'd need thousands of users a minute. I am not familiar with the podcast, but I am familiar with the concept of conversion rates, and based on the expected conversion rate from someone hearing a URL to actually typing it in, unless his podcast gets piped in over the loudspeaker at Macworld, I rather think you're unlikely to get more than four figures of visitors out of it. You'll get through four figures in a breeze.

(Candidly, I think I wouldn't be hoping to see four figures myself, but that might be natural pessimism.)


Thanks. I figured it wasn't too big a deal, but just want to be prepared.

I'm more focused now on making the site suck less when it does load. :-)


Being on Dreamhost without the option to move in the limited time, these are the options I can personally think of:

- Cache the hell out of everything.

- Fix links to broken JS includes. On http://store.wonderwarp.com you link to https://store.wonderwarp.com/mint/?js which does not exist and wastes valuable processing time only to error out.

- Pack and minify your CSS/JS files, even if just temporarily. If you are dead sure this is going to be a lot of people, every saved byte helps!

That's all I can think of since you've already moved the downloads to S3. Your site is mostly static content anyway, so there's not much more you can do unless you strip it down, but I would say keep it as it is because it looks great.

Regardless, if your site does go down think of it as a victory. You got so much traffic (and probably sales) to knock you down that you can only become better for the next time you get press. If it's a good product, it will happen again and again :)


If you can, also move all of your static content to S3, including every image, CSS, and JS file. You're currently serving ~47 requests per user to load the home page. If you move all of your static stuff to S3, each client makes one, maybe two, requests to your server instead of 47.

If you're using Apache, any content, including static files, carries a significant memory overhead to serve. Also, because you're not setting far-future Expires headers, users have to re-check every static item on every visit just to get a 304 Not Modified.
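If Dreamhost honors mod_expires directives in .htaccess (an assumption worth checking on your account), a sketch like this would add far-future Expires headers to the static files you keep serving yourself:

    # .htaccess sketch -- assumes mod_expires is available on the account
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/gif "access plus 1 month"
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/x-javascript "access plus 1 month"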

Last bit of advice: get Firebug (http://getfirebug.com/) and add YSlow (http://developer.yahoo.com/yslow/).


We have had decent luck with Amazon's CDN, CloudFront (http://aws.amazon.com/cloudfront/). If you already use S3 it is pretty easy to configure CloudFront and make sure your users have a speedy experience.


Another Mac dev just assured me no one listens to MBW. :)

I'm still putting the files on S3 just in case though.


I listen to MBW. But I often do it weeks after the fact. I wouldn't be surprised if the traffic increase is smooth enough that you don't have a problem. It's not quite like running a Super Bowl ad.


Use the Amazon S3 link for downloads, and have a stripped-down version at the ready in case your current site starts generating too much load. Dreamhost loves to suspend accounts which use too many resources, so to prepare I would move your site over to something more reliable like a small Slicehost slice. At the very least you should lower the TTL on the domain to 10 minutes, and as soon as you see your site go down, open up a new account at Slicehost and switch over quickly.


Is there any reason you are still on a shared hosting account? Shared is great for b.s. or personal stuff, but I just don't see how anyone can justify using it for anything real when dedicated servers and VPSes are dirt cheap.

You are actually selling a product; you can't afford any downtime.


Does Dreamhost let you configure Apache's mod_cache on your account? You might get some wins there. Watch out for artificial limits like database connection counts. But unless they're really awful you should be fine; a decent Linux box can throw out huge amounts of traffic. S3 is a good idea, not merely for traffic capability but to avoid getting your account suspended for being 'over-transfer'.


Congratulations. Your app looks good and I hope the influx of traffic results in sales. (I have no technical advice to offer)


Unless you can move to a host quickly where you'll know exactly how your app can scale, I recommend you cache the hell out of everything possible. What does the site run on, by the way? Maybe we can offer more specific advice.


It's running on PHP 5 in a shared hosting environment (Dreamhost). The framework is homegrown, but pretty small. PDO is used as an abstraction layer talking to MySQL. It uses Smarty for templating.


Smarty has caching built in that is easy to set up. Just use that, and it should be good enough. PHP 5 won't be your problem; the database will.
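A minimal sketch of wiring up Smarty's file cache, assuming the Smarty 2-era API; the cache path, lifetime, and fetch_products_from_mysql() are placeholders for whatever the site actually does:

    <?php
    require_once 'Smarty/Smarty.class.php';

    $smarty = new Smarty();
    $smarty->caching        = 1;     // serve cached copies when available
    $smarty->cache_lifetime = 300;   // regenerate at most every 5 minutes
    $smarty->cache_dir      = '/home/youruser/smarty_cache';  // must be writable

    // Only hit MySQL when there is no valid cached copy of the template
    if (!$smarty->is_cached('index.tpl')) {
        $smarty->assign('products', fetch_products_from_mysql());
    }
    $smarty->display('index.tpl');
    ?>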

If it's your WordPress site that you are worried about, WordPress has a caching plugin that is supposed to work if you can get it installed properly.

Both of those are file-based caches.

If you want to get super ghetto, save a copy of the loaded page that is going to be hit the most and rewrite the URL to that saved static page.
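A sketch of that ghetto approach done entirely in PHP, so no rewrite rules are needed (the cache path and 5-minute lifetime are made-up values):

    <?php
    // Top of the busiest page: serve a saved copy if it is fresh enough.
    $cache_file = dirname(__FILE__) . '/cache/index.html';
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < 300) {
        readfile($cache_file);  // cheap: no database, no templating
        exit;
    }

    ob_start();
    // ... existing page-rendering code runs here ...
    $html = ob_get_contents();
    ob_end_flush();
    @file_put_contents($cache_file, $html);  // save the rendered copy for the next visitor
    ?>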

Since we're talking about a podcast here, the traffic is most likely going to be distributed over time, so you really shouldn't have to worry that much.


I'm pretty sure he isn't using WordPress.


Cache and/or static pages. DH should be able to handle static pages fine, they're not that bad :p

Anyway, ab (Apache Bench) is your friend. Hammer away. It might be that you don't even need any optimization at all.
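For instance (the numbers here are arbitrary; start small on shared hosting):

    ab -n 500 -c 20 http://store.wonderwarp.com/

That fires 500 requests, 20 at a time, and reports requests per second and response times.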


What do you use to cache right now (besides the built-in Smarty caching)?


I unexpectedly got 20,000+ unique hits from StumbleUpon to a dynamic PHP page last year on a shared hosting account and there wasn't even a blip. How much traffic are you expecting, exactly?

I just wonder if you're unnecessarily wasting effort..


Can you share with us which ideas you implemented and which gave you the biggest benefit? Thanks.


Just remember that DNS change time isn't a big problem if you can put out an HTTP redirect.


A few thoughts:

S3 = awesome, Transmit makes it nice and easy to use too

Dreamhost = awful; seriously consider moving to another host. A Small Orange is awesome, and would probably happily help you out during traffic bursts.

Please do post back with how you got on! :)


You shouldn't have a problem with static file downloads, on Dreamhost or on S3.


Just cache.


I looked at your page. Replace the home page with an HTML page containing just two links: one pointing to a Coral Cache copy (http://www.coralcdn.org/) and the other to your regular page. That should let you handle the load you are expecting.




And if you're using a Mac, just use Transmit.



