Hacker News

Well, in any case, I've never heard of a government software project being overdue because their proxy cache contained spaghetti code :)

To be clear, I'm not trying to denigrate this project in any way. I'm just saying that the complexity of the problem should be taken into account here.



Maybe this will change your opinion. I recall helping rescue a government software project that was horribly behind schedule. One of the many issues was an e-learning system for a state department of education. Their CMS (Blackboard) was fronted by a forward HTTP proxy/cache/load-balancer that was misbehaving but nearly impossible to debug. The details are a bit hazy now, but I recall it being a hairball of poorly written Java. I ripped that out and replaced it with Squid and some rewrite code.

HTTP intermediate services are easy to write, but they operate in a hostile and chaotic environment; they are very, very hard to make reliable, performant, interoperable, secure, forgiving, and compliant. To achieve that and still have elegant code is really something, so yay Varnish.

I'm a bit of a fan of the Dovecot mail server source for similar reasons (but I'm biased, having made a small contribution and gotten into the authors file).


Nice. I'm pretty sure that for any "a government software project being overdue because X", there is such an X... and someone on HN was there.


The government should stop hiring HN programmers, that's clearly the problem!


I think the complexity of the problem actually speaks in favour of Varnish. The Unix philosophy of writing software is based on the principle of doing one, possibly small thing very well. And that's exactly what Varnish does: it is a reverse-proxy cache that can optionally do a bit of preprocessing on the incoming HTTP requests.

Besides, it's not trivial to write a high performance HTTP cache.


> Besides, it's not trivial to write a high performance HTTP cache.

Certainly true.

I guess that for me "beautiful code" invents some abstractions that transform a problem that initially seems dauntingly difficult into something that is easy to reason about. A proxy cache does not, for me, satisfy the first part of this premise, although I can imagine that the details of such a project take a lot of effort (hence "not trivial", but in a different way).


>I guess that for me "beautiful code" invents some abstractions that transform a problem that initially seems dauntingly difficult into something that is easy to reason about.

Well then, look no further than any RDBMS.

Seriously, for as much as people sometimes rag on the relational model, it is amazing for its power and relative simplicity.
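A small illustration of that leverage (table and column names are made up for the example): once data sits in normalized tables, a question that would otherwise need hand-written loops over nested structures becomes one declarative query.

```python
import sqlite3

# In-memory database with two normalized tables: authors and their books.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books (id INTEGER PRIMARY KEY,
                        author_id INTEGER REFERENCES authors(id),
                        title TEXT);
    INSERT INTO authors VALUES (1, 'Kernighan'), (2, 'Stevens');
    INSERT INTO books VALUES
        (1, 1, 'The Practice of Programming'),
        (2, 2, 'TCP/IP Illustrated'),
        (3, 2, 'Advanced Programming in the UNIX Environment');
""")

# "Which authors wrote more than one book?" -- one join, no loops:
rows = conn.execute("""
    SELECT a.name, COUNT(*) AS n
    FROM authors a JOIN books b ON b.author_id = a.id
    GROUP BY a.id HAVING n > 1
""").fetchall()
print(rows)  # [('Stevens', 2)]
```

The abstraction (relations plus a declarative algebra over them) is what turns the "dauntingly difficult" into something easy to reason about.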


The complexity of the problem is significant enough that there are only two verified HTTP servers I can recall, and specifying/verifying a web app was considered good work. The bugs, inconsistent performance, and CVEs in various caches also suggest they're not dead simple.

They're certainly easier to write than a lot of software. Their simplicity is deceptive, though, when real-world issues come into play. Especially if the result is to be beautiful.



