SimpleCDN has been effectively kicked off the Internet by its ISPs w/o warning (simplecdn.com)
125 points by archon810 on Dec 12, 2010 | hide | past | favorite | 86 comments


Could this month get any better for Amazon?

-It's December

-Mastercard.com and Paypal.com went down making Amazon one of the few on-line shops that could still process payments

-Since no Amazon service went down under DDoS, big companies will come crawling to them to use AWS

-SimpleCDN goes down, no doubt driving lots of customers to bigger parties like AWS

-They are making boat loads of money http://www.wolframalpha.com/input/?i=AMZN+vs+GS+vs+GOOG

Boycott? Pfff. I predict record revenues.


"... making Amazon one of the few on-line shops that could still process payments" is a huge overstatement. Mastercard.com and Visa.com being down has no material effect on almost all eCommerce (I don't remember the last time I had visited either). PayPal does, but that affects Amazon as well since they accept PayPal at places like Zappos and Diapers.com.


Both Visa and Mastercard had their payment-processing servers down along with their websites.


Well I was able to process payments just fine for both during the DDOS...


I'll vouch for that. We had no issues at all. Ran a lot of transactions throughout the period as well.


Oh really? I spoke to someone who was unable to process payments, and I saw a lot of people reporting it. Maybe it was isolated by location? The reports I saw were from people in the UK.


Hearsay, but I read it was a Mastercard problem limited specifically to the UK.


It was a relatively isolated problem in some parts of the UK I think (I was able to buy stuff during the supposed outage over here).


"They are making boat loads of money http://www.wolframalpha.com/input/?i=AMZN+vs+GS+vs+GOOG "

I'm not sure what the point of comparing Amazon to Goldman and Google is, as the latter two are way more profitable than Amazon. Is the comparison to show that they're in the same league as these two companies, or are you just looking at the stock price alone?


Mostly revenue / employee, stock price.

Bezos' lack of interest in profit is well-known, e.g. http://www.businessweek.com/magazine/content/06_46/b4009001.... (2006).


Does mastercard.com process payments? I got the impression that it was just the marketing/corporate presence that was affected.


SecureCode was affected as well; it's the payment portal you need to go through for many on-line shops.


SecureCode is theoretically optional in most cases. Transactions that go through it generally cost less than non-SecureCode ones, but its being down shouldn't stop all transaction processing in most cases.


I guess it didn't affect everyone, since I was able to use it to make a Ticketmaster purchase on 12/8, the day in question.


The information is spotty but from what I gather it seems someone (SimpleCDN) took the "unlimited bandwidth" claims on 100tb.com a little bit too literally - and tried to build a business around them.

The question remains why a supposed infrastructure company thought it'd be a good idea to rely on a single other company to provide their... infrastructure.


I hope that's the case, honestly, and I hope that it does go to court. I'm frankly sick and tired of the 'unlimited' advertising model, wherein unlimited really means limited to some arbitrary amount that is not disclosed.

As a customer who has been kicked off of a number of unlimited hosting services for a site that only does ~30Gb or so in traffic in a month, I'd love to see hard limits advertised rather than 'unlimited'.

I know that Dreamhost oversells on purpose, and that's fine, but I think they can still do that (though perhaps not quite as effectively) by just stating their upper limit. Of course, this means that more people are likely to hit or approach that upper limit, but at least they'll know when they need to grow into another 'slice' as it were, or whether or not to relocate from Dreamhost altogether.


We (at DreamHost) don't state any explicit limits as they're pretty fluid in practice. The effective limits are basically:

* Don't use enough resources that you make it hard for us to provide good service to other customers. (Saturating the network / filling the filesystem = bad.)

* If you're legitimately using lots of resources, we'll move you around if you start getting near the limits on the hardware you're on, but we won't install new hardware just for you.

And, of course, you're also required to stay within the ToS, which exclude most of the really obvious ways of burning through lots of resources. (Public upload / mirror sites aren't permitted, pirated media is obviously a no-go, and you aren't allowed to use your "unlimited" disk space for content that isn't part of your site.) We've got some $10/month customers who are using insane amounts of resources; so long as they don't expand beyond what we can handle without building new infrastructure just for them, we're happy to keep them on.


What you're describing is exactly the problem. I don't want to worry about whether or not I'm exceeding some subjective limit, I want a hard number limit that I can compare to my actual usage.


What we're up against is that providing raw numbers for usage limits leads to several problems. Among them:

1. Competition. If you provide N gigabytes of storage/bandwidth, another provider will offer N * 2, and they instantly look more competitive, even if they aren't even actually capable of providing that. So, back when we provided limited plans, we were constantly being forced to increase our resource promises to unrealistic levels just to avoid looking stingy.

2. Expectations. If you provide N gigabytes of whatever, customers will expect (and demand!) that their site be able to use up all of their provided resources. This is both on a policy level ("what do you mean I saturated the interface?") and on a technical level ("why can't I serve 100 gigabytes of dynamic HTML a month?"). This becomes an increasing issue as competition drives the promised numbers up and the actual resources you're supposed to provide are absurdly high. (Consider for instance 100tb.com, which was mentioned earlier in this thread - good luck actually pushing 300+ MBit constantly.)

In practice, the policy we've got now actually works better in some ways for customers because they don't have to care whether they're exceeding resource limits, either subjective OR objective. So long as they're running a site which complies with our Terms of Service, we'll do our best to keep it going, even if it gets big.


I get that there's a strong disincentive to be the first company to impose caps, but would a court order not level the playing field? Or would everybody simply host overseas?


What you want is easily available all over the place, at any scale you want.

All of it is going to cost you more than DreamHost's incredibly cheap prices - which are only possible because of its policies, as the guy stated.

It seems unfair at first - especially to a technical person, it's misleading - but the reality of hosting is that you do need to actually pay for the resources you are going to use - and the internet isn't free. The more your business is worth, the more you should be spending on solid contracts, multiple sourcing and fault tolerance.


>What you want is easily available all over the place, at any scale you want.

Yes, but by not stating a limit, while still enforcing one at some point, you are effectively not allowing yourself to be compared with others.

It's cheating. Even a rough "approximate limit" would allow comparisons, but stating nothing is strictly cheating. Would Dreamhost allow me to run 100tb/day? 100pb/day? They don't say they won't... how do their prices compare against someone who would?


With those sorts of numbers (100 tb/day ≈ 10 Gbps, for instance), nobody can offer that amount of traffic under "unlimited" terms. One of the limitations I mentioned is that we won't upgrade infrastructure just to support individual customers, and this would definitely fall under those criteria.
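A quick back-of-envelope check of that conversion (assuming decimal terabytes and a perfectly flat load; the numbers are illustrative, not any provider's figures):

```python
# Sustained line rate implied by 100 TB of transfer per day.
# Assumes decimal units (1 TB = 10**12 bytes) and perfectly even traffic.
TB = 10**12               # bytes per terabyte
SECONDS_PER_DAY = 86_400

daily_bytes = 100 * TB
gbps = daily_bytes * 8 / SECONDS_PER_DAY / 10**9
print(f"{gbps:.1f} Gbps")  # about 9.3 Gbps, i.e. roughly 10 Gbps
```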


Well then... what's the current up/down internet connection, after subtracting the average use? That's the limit (unless limited further by something else). Why not advertise it? It's probably huge.


I'm not sure I can give out exact numbers, but it's far in excess of what any single machine can push out, either practically or technically. Advertising it would be just as misleading as any other specific number. :)


>That's the limit (unless limited further by something else).

>... what any single machine can push out ...

That number wouldn't be very misleading, and could actually be useful: it's effectively the limit for a dumb fileserver. If a customer's code results in a lower ceiling, that's their fault, not yours, and the advertised number is no less correct for being restricted by their own code.


I think what your parent and I are in support of is something that can be guaranteed, regardless of price. If I have contractual obligations to people regarding my webhost's uptime and bandwidth, I want the webhost to give me something more than "It changes a lot, don't use too much".

When the barrier to entry for an internet-based business is so low, I expect ISPs and other service providers to understand this requirement, and provide plans and prices accordingly.


Hmmm, I think the dreamhost guy gave a pretty good explanation. Thanks for that. But he seems to be getting downmodded (downmod if he's wrong; not if you disagree).

Yes, it's subjective ("doesn't require us to upgrade infra just for you"), but so what? If you want hard numbers, go with the ones which advertise hard limits.


Maybe it's just how I see it, but "unlimited" is a "hard limit". If you don't provide unlimited service, then say so, it's not rocket science.


DreamHost "solved" our problematic site by banning the search engine spiders from our site. Apparently we were being spidered too much so they changed the htaccess (without our knowledge) to ban them. But the site did stay up and I am sure we were using less resources after that. The site more or less dried up traffic wise.


I'm curious, why don't you state limits on your site? You state "unlimited", when in fact it's not. You don't even have small print to cover yourself.

What if one of your customers ends up taking you to court for breach of contract? Or does your contract explain what you have just pointed out?


Why are you treating a random developer at a hosting company like he's the CEO? If Dreamhost gets taken to court, it's not his problem. He is just explaining what he actually does to make "unlimited" as unlimited as possible.

Company legal policies are not his domain, so it's probably a waste of effort to complain to him about it.


I see your point. But I didn't realise he was a developer, from the way he was talking, it sounded like he was in a position of authority at the company.

I see how you figured out he was an employee now, I'll be sure to do profile checks on people in the future to get a better frame of reference.


Our limits are based on policy, not numbers. The policy is (roughly) outlined at http://www.dreamhost.com/unlimited.html if you're curious.


I beg to differ. I know that I've been well within the terms of service at all times, hosting only a semi-popular message board. I'll grant you that 30Gb of traffic in a month is doing well, however, that certainly comes nowhere close to saturating the network or filling the filesystem.

From your TOS, I should be fine so long as the intent of the site wasn't to do either.

For what it's worth, I generally recommend Dreamhost to people looking for small personal sites or new apps -- until they need to move, but it's my experience with DH that eventually they will need to move. As I've experienced on more than one occasion, the limits, ignoring the Unlimited + 50Gb claim, are enforced far more vigorously than you claim.


One thing I'll say for Amazon: the explicitness of the cost for storage and bandwidth, coupled with the simple fact that my (and your) startup is very unlikely to get anywhere close to a scale they haven't seen, makes AWS extremely attractive for any real deployment.

We had a Dreamhost account (have!), and briefly entertained using them for our rollout, but some simple number crunching and common sense made us see that Amazon was set up for the possibility that we would really have to scale, whereas Dreamhost was pretty much set up for casual work-at-home developers who were looking for absolute rock-bottom costs.

Seems like the SimpleCDN guys made the opposite decision, and went with a company that markets XYZ rather than really looking under the hood. It's incumbent on any business to vet their vendors and understand the risks, rather than just point a finger and say "they promised!"


"* If you're legitimately using lots of resources, we'll move you around if you start getting near the limits on the hardware you're on, but we won't install new hardware just for you."

Could you define an "illegitimate use of resources"?



Personally, I think pay-for-use models a la Tarsnap and NearlyFreeSpeech.net are quite fair. I suppose most people will be put off by "mental transaction costs", though.


That might be a part of it, but I think the other part of it is that the average guy looking to start a blog, or website, or whatever, has absolutely no idea how much traffic his thing will generate. For people like that, the claims that Dreamhost makes are a godsend, as he doesn't have to worry about "what if I get a 1000 subscribers?"

Generally, I'm pleased as punch that Slicehost, Linode and Amazon came along, as now I have fair prices with real limits. If I outgrow them, I don't have to move, I just increase memory allocation, or grow the slice, or add a node, whatever.

Dreamhost couldn't get away with this because likely, 80% of their users are paying $9.95 a month for the equivalent of 25 cents worth of usage.


They are fine for joe average who wants to play around and start a blog.

They aren't fine if you want to deploy a web-scale infrastructure on top of them and make tons of money. That much should be obvious to anyone architecting such services.


DreamHost, however, is very open about its `best effort, use what you like` model.

Compare the claims on http://vps.net/ to the reality of http://status.vps.net/


FYI, status.vps.net doesn't include most outages, which they deem "too small" because they only affect ~5% of customers. They only post there when an outage affects a large number.


If you're going to sell "unlimited" bandwidth, you'd damned well be ready to deliver it. It's arguably very naive of SimpleCDN to build a business on an "unlimited" provider (there's no free lunch, after all), but selling services as "unlimited (unless we decide you're actually using what we sold you, in which case we're going to kick you off)" is not cool.


If that's the case, maybe both parties are at fault. Claiming availability of 'unlimited bandwidth' is just as naive as believing in it.

In any case, I don't understand why the ISP wouldn't choose to send any communication about the termination. Seems like reckless behaviour.


They make you pay extra for an unmetered gigabit port, so it seems reasonable that you should be able to use more than the standard bandwidth allocation if you pay for more and if you don't you should be able to use the standard package.


Lots of companies have single points of failure like this, probably more of them than those that don't.

This only goes to show that if it seems too good to be true, it probably is.


It's ok if you want to stop providing service to SimpleCDN, but at least give people some notice, don't just start shutting down their servers.


Presumably because they could sell bandwidth below market price? This would also explain why they aren't scrambling to put up alternate data centers, but instead sending their customers over to MaxCDN.


Unlimited bandwidth is easy to provide. Just rate-limit the port once they exceed a certain amount of bits transferred. They still have unlimited bandwidth, but at 128kbps instead of 128000kbps.
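A toy model of that kind of throttling policy, with made-up names and numbers (real providers enforce this at the switch or with traffic shapers like `tc`, not in application code):

```python
# Toy model of "unlimited" bandwidth with a speed step-down:
# the port speed drops once transferred bytes pass a cap.
# All constants here are illustrative, not any provider's actual terms.

FULL_SPEED_KBPS = 128_000   # nominal port speed
THROTTLED_KBPS = 128        # speed after the cap is hit
CAP_BYTES = 1 * 10**12      # e.g. 1 TB of full-speed transfer

def port_speed_kbps(bytes_transferred: int) -> int:
    """Return the customer's current rate limit in kbps."""
    if bytes_transferred <= CAP_BYTES:
        return FULL_SPEED_KBPS
    return THROTTLED_KBPS

# Transfer is still "unlimited" in the marketing sense: it never
# stops, it just slows by a factor of 1000 past the cap.
```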


That is what Strato does: it drops the 100 Mbit port to 10 Mbit until you unlock it, and this game repeats every 300 GB (it starts at 1 TB).


On a side note, I haven't heard a single good thing from anyone about the ISP in question. One of their brands, VPS.NET, was an absolute nightmare and almost tanked the launch of one of our projects. A friend who made the mistake of going with them as well, had nothing but headaches. The main issues were reliability and poor customer care.


Seriously, as much as their team "care" their product is absolute shit. My experience with vps.net has put me off UK2 for life, some of the things I've seen in the customer forum are just plain scary. Like the admission they don't have the infrastructure to handle DDoS attacks because it's "too expensive".


Wow, really? I've been pondering whether I should get off linode and try out vps.net due to the elastic scaling they have (burstable memory, etc), but damn, they're this bad?


I was with them for 1 year, September 2009 - September 2010 and I've now moved to Linode. I would highly recommend you stay away from them. The longest I went without downtime was 90 days, in total I've had well over 40 hours of downtime within 1 year, I've never had that much elsewhere even on dedicated hardware that isn't as "redundant" as they claim to be.

Their product is amazing if you don't care for stability and then it beats the competition hands down, but if stability and uptime matter to you at all stay away. If vps.net had the stability and reliability of Linode they'd rule the world.

Sign up for a daily node ($1) and then log in to the customer forum; you'll see so many reports of problems, and this is after they recently "revamped" the forum and removed over a year's worth of posts. I can't provide links because they removed my access.


Just a few fun experiences I had with vps.net:

* A server reset itself to a 3 month old backup overnight, their support was unable to explain it.

* Random disk I/O issues bringing throughput to < 1 MB/sec

* My favorite: Some internal routing issue on their end would randomly route my ssh attempts to another machine on their network. Their solution? Grant me access to this machine which was rented by another customer!

Oh and of course, there was plenty of downtime thrown in for good measure.

That being said: I really like their offering in theory, I just wish they could pull it off in practice.


From 100TB's TOS (presumably just added):

We will not provide services to those that are using our services for:

A content delivery network or content distribution network (CDN) is prohibited from running on our network. Special requests to run CDN services may be approved on a case by case basis. Failure to comply with this policy will result in the disabling of all hosting services.

http://www.100tb.com/tos.php


That's a hilarious tactic.

Are they now going to blacklist bandwidth intensive business models one by one until only blogs and "under construction" pages are left on their "high bandwidth" plans?


It's amusing in context, but it's not an absurd contract term. Hosting a CDN on an "unlimited" hosting provider is basically bandwidth arbitrage. It makes total sense that the hosting provider would want to limit their hosting to bona fide hosting customers.


Yeah, never arbitrage the arbitrageurs; they won't stand for it.


Why should they? Companies have every right to choose their own business model.


In this particular case it comes across as more than a little bit shady.

When you advertise aggressively with high volume/high bandwidth, when you even rename your company from 10tb.com to 100tb.com, then one would hope you'd handle it a tad bit more professionally when someone calls your bluff.

I wonder if they considered that by pulling the plug on SimpleCDN like this they've also effectively terminated their own business. Nobody in their right mind will host at 100tb after this event.


Like I said, in context, it's amusing. That contract term, though, is totally reasonable.


I was wondering the same thing. If I were to start YouTube on 100tb, would I be blacklisted too?


I was a client of both SoftLayer and 100TB/UK2, whatever you want to call them. The two instances were separate occasions, and my experience with SoftLayer was spotless.

I had 2TB monthly bandwidth allotted, and the server was using up close to 80% of that. We were hosting 4 TF2 game servers, forums etc. Their customer service and services provided were top notch.

100TB on the other hand was not as pleasant. Even though we ended up not going with them in the long run, I have to say that the process of signing-up, managing the servers and canceling was ok. Nothing great. The factor that pushed us to another carrier was network speeds. There were moments when we were getting 10% of what we were paying for and that wasn't a fluctuation worth the risk.


It's all a little bit silly.

Everyone knows 100TB can't afford for you to actually use the included bandwidth.

When you start actually using your bandwidth then you cut into their profits and they'll want to get rid of you.

When you start actually using the bandwidth _and_ undercut their own CDN using their _resold_ infrastructure, then they're losing out twice.

It's important to remember that UK2 = OnApp/UK2/Midphase/VPS.NET/100TB/Hosting Services Inc/etc/etc. UK2 doesn't own much infrastructure, they're just a reseller and overseller.


The way I see it is, if UK2 wants to fuck customers out of the service that UK2 advertised and the customers paid for, the customers are welcome to fuck right back.


It's just like insurance: you are required to have it, but when you need it and have to use it, the insurance company jacks up the price from that point on.


It's a little bit worse than that. Most insurers will at least pay up and then jack up the price. 100tb.com is not paying up (as in: providing service); instead they canceled the contract, apparently on short notice.


While they may very well win in court - their business will be destroyed.

Why on earth was their CDN built around a single provider? Had they had multiple providers, this would have ended in reduced capacity, with buffer time to re-deploy elsewhere.

Sounds like a good lesson learned - when I need bandwidth X for Y years, I make sure I have solid contracts to that effect - not just relying on a simple clause about severability. If your business depends on your providers, you make sure your providers know what business you are in and make sure the contracts are solid.


It looks like they built it around two: UK2 (under multiple brands of theirs too) and Softlayer. But both colluded to shut down SimpleCDN at the same time.

The main problem is they went to the datacenter providers first, not the network providers. DCs make their money over the long term by selling hardware cheap up front (just a monthly fee) and hoping ongoing costs remain low. CDNs put a high operating cost on the system, so it is one place they end up losing out big time.

What they should have done is partnered directly with a network provider. I was looking over at the MaxCDN offer they linked to on the page and they appear to have partnered with Mzima. That's a smart move (Mzima's an awesome network; akin to Internap, but a lot cheaper). Maybe if SimpleCDN recovers from all of this, they'll start approaching things differently and stop trying to put stress on a business relationship where it hurts the most.


UK2 is a reseller of SoftLayer. SoftLayer have been pretty reputable up to this point and many large companies use them. SoftLayer also offers unlimited bandwidth but charges an extra $2,000 for it. It appears UK2/100TB are the shady ones here, although SoftLayer may have had a hand in shutting them down due to the competitive nature of their product.


That's odd - every datacenter (read: colocation facility) I've worked with had clear pricing for rack space, power, bandwidth, etc. I could use as much as I wanted, I knew what I'd pay for it, and I'd renegotiate if regular usage was going to go higher.

By "DCs", do you mean some kind of web-hosting provider instead?

Even the VM providers I use all have soft & hard limits on traffic stated clearly - in practice, anyone offering "unlimited" anything is automatically excluded from any kind of critical business hosting decision - because that's not realistic or maintainable.


It's quite refreshing to see them provide a migration path for their customers, even if it is to another company.


As you would expect unmetered bandwidth from 100TB is truly unmetered and unshared, with no limits and no small print. Unmetered servers use exactly the same SoftLayer network as their 100TB equivalents and are fitted with 1000Mbit ports.

Hosting plans page though has:

Unmetered Bandwidth (324TB) add $399 / month


100TB a month = ~300mbps of constant use. The Unmetered upgrade means that you could saturate the full 1gbps for the entire month.
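That works out as follows (a rough calculation, assuming a 30-day month and decimal terabytes):

```python
# Sustained rate implied by 100 TB of transfer in a 30-day month.
TB = 10**12                      # bytes per terabyte
SECONDS_PER_MONTH = 30 * 86_400

mbps = 100 * TB * 8 / SECONDS_PER_MONTH / 10**6
print(f"{mbps:.0f} Mbps")  # about 309 Mbps of constant use
```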


means that you could saturate the full 1gbps for the entire month.

Except for the minor inconvenience that they cancel your account when you actually do that.


My company had a server hosted with UK2. We bailed when they had no answer to why our server restarted randomly and made absolutely no attempt to look into the problem. The lack of respect from them, that our company depended on a reliable server, just astounded me.

After reading this I'm doubly glad we moved away from them.


Not really seeing why they're "naming" SoftLayer here, when it's 100TB.com they're dealing with.


UK2 claims it's SL forcing the disconnect.


Since UK2 doesn't really have a CDN product themselves (they resell HighWinds and Level3), their interest here is lower than SoftLayer's, which has million-dollar contracts signed with InterNAP for connectivity and DC space, and which also sells content delivery.

My guess is it's not really about kicking out the competition but mostly the fact that they can't realistically offer the said bandwidth for the advertised price, which is far too low for a quality network blend.


You do know they host YouPorn and other huge streaming services? YouPorn hosts around 800 machines with them and I would guess uses a lot of bandwidth. SoftLayer is not a small host; they're big.


It's not a question of capability, it's a question of profitability.


This is a disgusting action from UK2/SL.

If you promise 100TB of data transfer on 1Gbps, you have to keep your word (no matter what). Otherwise, don't venture into offering such a deal. Essentially, they're kicking out the tail end of the "distribution". A lot of SL clients have been reporting network latency and occasional packet drops over the last few months. SL is most likely seeing this in their MRTG graphs and feels it has no choice but to enforce the ToS, which does have a CDN clause by the way. The right approach would be to expand their network and purchase additional bandwidth. But I guess they're there to increase their margins.

I have to confess that, as a provider, one has to anticipate things like this for the sake of keeping the business afloat and the clients. The upstream could at any moment pull the plug on you (literally). There's so much one has to watch out for. These are things one loses sleep over.

It's unsettling to say the least. I'm reevaluating SL/UK2 as a possible partner (it used to be high up there in the list but no more).

Regards

Joe


The WebhostingTalk thread makes interesting reading: http://www.webhostingtalk.com/showthread.php?t=1005111


SimpleCDN provided fairly consistent service until now. Their customer service was always non-existent, but they were exceptionally cheap. We were forced to switch yesterday when SimpleCDN went down -- I actually wasn't aware what happened until I saw this.



