

The sortable animations are incredibly bad on iPad 2 (iOS 7) as well.


Read something non-coding-related, on paper, before you go to bed, typically for 1-1.5 hours. Almost always works.


Just curious, what does this offer that Cloudflare's Pro plan doesn't?


Well, this is $25/month for 100GB of traffic. CloudFlare Pro is $20/month for unlimited traffic plus the security features.


Right. If you're looking for the cheapest (by $/GB), then there are definitely cheaper options. This is about performance for lower traffic websites. To actually deliver on that requires more resources, which means it's never going to be the cheapest option out there.


"more resources" in the sense of "less aggressive cache expiry" ?


Yes, that's one way. But also, since there are fewer PoPs, more emphasis is placed on the quality of the connection (premium bandwidth). Maintaining a low TTFB and total transfer time means keeping more spare CPU power available, and it means high-performance storage (since it's storing many files that perhaps aren't accessed frequently).

As time goes on, it'll become a lot clearer what I mean. The goal here definitely isn't to reproduce Cloudflare.

I think the feature that's going live later tonight is a good example of something that's a big benefit to smaller sites and requires more resources: normally, each edge node in a CDN contacts the origin server and caches the file on that node. By tomorrow morning, you'll be able to tell Nuevo Cloud that if one edge node has a cache miss, it should also tell the other edge nodes about the file so they can cache it too. Essentially, one cache miss anywhere in the world primes the cache on the entire network.

What that means in practice is that a single request for a file gets it transferred to 8 different locations and stored at each of them for 30 days.
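To make that concrete, here's a rough Go sketch of the idea (EdgeNode, the peer list, and the /prime endpoint are all illustrative names, not the actual Nuevo Cloud implementation): on a miss, a node fetches from the origin once, stores the file, and fans the fill out to its peers in the background.

    // Rough sketch only: on a cache miss, fetch from the origin once and
    // fan the object out to peer PoPs so they cache it too.
    package edge

    import (
        "bytes"
        "io"
        "net/http"
        "net/url"
        "sync"
        "time"
    )

    type EdgeNode struct {
        mu    sync.RWMutex
        cache map[string][]byte // URL -> body; a real node persists to disk with a TTL
        peers []string          // base URLs of the other PoPs
    }

    // NewEdgeNode returns a node with an empty cache and the given peers.
    func NewEdgeNode(peers []string) *EdgeNode {
        return &EdgeNode{cache: make(map[string][]byte), peers: peers}
    }

    // Get serves from the local cache, or on a miss fetches from the origin
    // and primes every peer with the same object.
    func (n *EdgeNode) Get(originURL string) ([]byte, error) {
        n.mu.RLock()
        body, ok := n.cache[originURL]
        n.mu.RUnlock()
        if ok {
            return body, nil // cache hit
        }

        resp, err := http.Get(originURL) // the single origin request
        if err != nil {
            return nil, err
        }
        defer resp.Body.Close()
        body, err = io.ReadAll(resp.Body)
        if err != nil {
            return nil, err
        }

        n.mu.Lock()
        n.cache[originURL] = body
        n.mu.Unlock()

        // One miss primes the whole network: push the object to every other
        // location in the background (best effort; a real system would retry
        // and respect a 30-day TTL).
        for _, peer := range n.peers {
            go func(peer string) {
                client := &http.Client{Timeout: 10 * time.Second}
                req, err := http.NewRequest(http.MethodPut,
                    peer+"/prime?src="+url.QueryEscape(originURL),
                    bytes.NewReader(body))
                if err != nil {
                    return
                }
                client.Do(req)
            }(peer)
        }
        return body, nil
    }

The real system also has to handle eviction and retries, but the fan-out above is the core of "one miss primes the whole network".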


Nice project. My only advice as a potential customer is to sell the advantages over traditional (and cheaper) CDNs more clearly. Obviously, having a longer TTL is the killer feature, and what you mention above seems useful as well. But I'd like more clear-cut reasons (on the landing page) why your service is the better, albeit more expensive, option.


Thanks for the feedback. I think you're right, and that's definitely something I'm going to work on improving.


CPU is the last thing you'll bottleneck on in a CDN. You can saturate network IO at 20% CPU.


I think Cloudflare provides a great service, but when I used it (a couple of years ago for one of my other websites), I wasn't impressed with the performance of their CDN.

It's certainly not a replacement for their DDoS/security features. Nuevo Cloud is solely about improving the performance of low/medium traffic websites. And from my testing (on both nuevocloud.com and one of my other websites), the performance is the best I've seen from a CDN (I'm clearly biased, but numbers are numbers).


Firstly, from a non-biased, completely subjective point of view, that is one of the quickest-loading pages I've seen in a while. A great showcase for the service; congrats on getting it out there.

I had a few questions I don't see answers to on the landing page:

- Do you support pushing to edge nodes?
- Do you support the Vary header across things like Accept & Accept-Encoding?
- Is the $25/mo for 100GB across multiple domains?
- Any plans for higher tiers?

Out of interest, how big is the team behind this? What's your background? I'd be interested to hear how you went about setting this up to be quicker than the competitors, and what lessons you've learnt along the way.

All in all, it looks really compelling. Good luck with it.


Thanks. I've been working on this for quite a while, and I'm excited to finally launch. Still have a lot of work to do on it though.

Do you support pushing to edge nodes? Push zones won't be implemented until later this year.

Do you support the Vary header across things like Accept & Accept-Encoding? Accept-Encoding is used to decide if the reply is gzipped, but other than that, no. If you email me, I'll let you know when I've finished implementing Vary... it'll probably be this weekend.
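In case it helps picture what that involves, here's a tiny Go sketch of Vary-aware caching (illustrative only, not my actual code): the response's Vary header lists which request headers selected the representation, so those header values have to become part of the cache key.

    // cacheKey folds the request headers named in Vary into the cache key,
    // so a gzipped response is never served to a client that didn't send
    // Accept-Encoding: gzip. Sketch only; header normalization is more
    // involved in practice.
    package edge

    import (
        "net/http"
        "sort"
        "strings"
    )

    func cacheKey(req *http.Request, vary string) string {
        parts := []string{req.URL.String()}
        names := strings.Split(vary, ",")
        for i := range names {
            names[i] = strings.TrimSpace(names[i])
        }
        sort.Strings(names) // deterministic order regardless of how Vary is written
        for _, name := range names {
            if name == "" {
                continue
            }
            parts = append(parts, name+"="+req.Header.Get(name))
        }
        return strings.Join(parts, "|")
    }

With vary set to "Accept-Encoding, Accept", two requests for the same URL with different Accept-Encoding values get different keys, and therefore separate cached copies.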

Is the $25/mo for 100GB across multiple domains? Yes, there's no limit on the number of domains.

Any plans for higher tiers? Yes... there will be a couple of larger plans. In the meantime, you're welcome to email me if you need a larger plan.

The team is just me. I built this because it seems like every CDN focuses on enterprises instead of the startup-type websites I usually work on. I feel like most CDNs are a catch-22: to get good performance at the edge, your website has to be popular enough to cache... but if your website is slow, no one is going to use it, so the CDN doesn't cache it. So this actually caches your files at the edge, even in areas where your website is less popular (so you can actually build up an audience in that part of the world).

As far as the website being quicker, it uses a web app framework I've developed over the past decade, plus the CDN itself (which I wrote in Golang and C). I've worked on various startups over the past 14 years (nothing big, though).


How are you "clearly biased"?


rgbrenner is the poster of this link and seems to be developing the project.


Yes, this is my project. I probably could have made that clearer.


Thanks, I just saw that on your profile. Normal practice would be to include "Disclosure: this is my project" or similar in your comments.


How are you testing the performance?


Mostly real-browser synthetic monitoring from 24 locations. Obviously that's not exactly real world (since those servers are in a datacenter), but it's reasonably close.

I also use Nuevo Cloud (and have used other CDNs) on one of my ecommerce websites that gets a couple thousand people/day. So there are Google Analytics timing stats, but those aren't the most accurate.

Of course, this is very different from the performance monitoring that's done for Nuevo Cloud itself. It's much easier to track performance stats and work to improve them when you have access to the actual servers than it is to compare real-world performance against another CDN.
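For anyone curious, a probe like that boils down to a traced HTTP request. Here's a minimal Go sketch (my own illustration, not how any particular monitoring service works; the URL is just a placeholder) that times connection setup and time-to-first-byte from wherever it runs; run it from a bunch of locations and you get roughly the numbers a synthetic monitor reports.

    // Minimal synthetic probe: time connection setup and TTFB for one URL.
    package main

    import (
        "fmt"
        "net/http"
        "net/http/httptrace"
        "time"
    )

    func main() {
        target := "https://example.com/" // placeholder: the page under test

        var start, connected, firstByte time.Time
        trace := &httptrace.ClientTrace{
            ConnectDone:          func(network, addr string, err error) { connected = time.Now() },
            GotFirstResponseByte: func() { firstByte = time.Now() },
        }

        req, err := http.NewRequest(http.MethodGet, target, nil)
        if err != nil {
            panic(err)
        }
        req = req.WithContext(httptrace.WithClientTrace(req.Context(), trace))

        start = time.Now()
        resp, err := http.DefaultTransport.RoundTrip(req) // one round trip, no redirects
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        fmt.Printf("connect: %v\n", connected.Sub(start))
        fmt.Printf("ttfb:    %v\n", firstByte.Sub(start))
    }

A real monitor would also break out DNS and TLS timings and run this on a schedule from each location.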


Actually, this is not a good test. Synthetic monitors live in data centers, so you're testing performance from one data center to another. You need RUM.


Yes, I completely agree it's not the best test. Real user monitoring would be much better.


Google Drive? Add all the photos to a folder, set the sharing permission to "anyone with the link" (or your parent's Google accounts).


This is what Show HN is for - you should file this post under that.


>>Additionally, it's possible that a fix would make the game run significantly slower.

Why?


While there's no reason not to hire people from something like this, don't assume a candidate is competent just because they went to a coding academy. Assess these candidates the same way you would anybody else.


What's wrong with separate accounts? They strike the right balance: easy enough to switch between projects, but enough friction that you won't switch just because you're distracted.


(Mostly) agree. Google is out of control, and they'll do anything to sell more ads.

The best solution is simply to not use google products (which aren't that good anyway).

- DuckDuckGo for search
- Dropbox, Box, etc. for files
- OpenStreetMap for maps
- Firefox for web browsing (except Firefox is about to introduce advertising, too, so I'm not sure about this one)

This way, your data is decentralized, making it harder to track you. Also, these companies each focus on just one product, meaning their products are better anyway.


It's not about centralization, imo. An out-of-control AI will be able to harvest this data from wherever - Google just has the best starting set and the most robust data stream to make the initial leap, I think. But I agree with moving away from Google as much as possible, though FF is dubious - it's both a mediocre browser and will, as you say, introduce ads.

