You're a Bad Programmer. Embrace It. (dzone.com)
75 points by Garbage on Sept 30, 2010 | 65 comments



Hi all. I'm a long-time lurker, but when I saw my article made the front page, I made an account. :)

I don't think that using tools makes us "bad"... I think pretending that we're "good" when we really need the tools makes it much more difficult for us to get past our egos and really look for what we need. There are so many great practices and tools available, but most developers I've encountered don't use them because things are "good enough". I was hoping to encourage people to try something new. Something a "good" programmer would never need, but a "bad" programmer would desperately want.

Regarding knowing the API versus using an IDE... spend some time with a Ruby developer, someone who's not used to having great IDE support. They've been forced to learn the language. The difference in their productivity level is amazing. Contrast them with a developer who only learned Java or .NET and was using Eclipse or Visual Studio from day one. Those developers tend not to learn the underlying APIs, and I think it hurts their ability to know what functionality and libraries are available to them.

Anyway, it's very cool seeing an article I wrote on HN!


Wait, what? OP claims that we are bad programmers because we use too many tools:

We can't remember all the methods our code needs to call, so we use autocompleting IDEs to remind us.

and then recommends using tools to become better programmers:

These tools flag common mistakes... really common mistakes.

I know I'm in the minority because I've had this debate with almost every programmer I know, but I do not use an IDE for exactly this reason. I want the "firmware" that knows how to assemble building blocks of code to be in my brain, not in my computer. To me, writing code is like driving a stick shift, for much of it I don't even think, it's unconscious competence. Can you imagine driving a car with auto-complete, color coding, or drop down menus? To this day, I get pissed off when something doesn't compile the first time, usually because of a typo.

There's a time and a place for every tool. An IDE or test suite is like a back brace or crutches; they are indispensable when you really need them, but they only weaken you when you're healthy. I guess I draw the line in a different place from OP (and almost everyone else I know).


Why is there such massive nerd-rage when it comes to people using IDEs versus what amounts to text editors with scripting toys?

IDEs are fantastic and I am a more productive programmer for using one. I work with a pretty big codebase, and excellent cross-file navigation and usage searches alone are big time-savers. Color coding allows me to establish a visual understanding of more code at a glance, which is especially useful when I didn't write it. Integrating with source control directly from your editor gives you fine-grained organization when you are working on multiple changes in parallel. Task managers (e.g., Mylyn or Atlassian), in conjunction with source control integration, significantly reduce the overhead time required to perform workflow-related management during the day. And despite my hacker pride, I love having good debugging tools around, as they are wonderful at aiding in maintenance.

Code completion saves you time and accelerates learning. Maybe I'm in the minority here, but I simply cannot understand why autocomplete is supposed to make you such a bad programmer. Once your team grows beyond 4-5 programmers, you're going to be seeing plenty of code you didn't write yourself. A knowledge of naming conventions and design patterns allows you to use autocomplete to help find the behavior you want before turning to documentation or duplicating it yourself. At the very least, it helps me type faster because I am typing less, and it helps me avoid spelling mistakes. Autocomplete is a search tool and a typing assistant; it does not create a black hole in your brain that sucks away any memory of what you have seen or done in the code.

If you find a function that is useful, stop and find out what it does. Why are you calling it if you don't know what it does? If you don't remember it after the first time, you probably will have it committed to memory by the third. And who cares if you don't remember it; do we really need to know every API of every library we use? No, we use them as tools to get the job done. We have precious few brain cells; why spend them on memorizing an ever-changing dictionary? Let the computers do what they do so well.

I use TextMate and occasionally vim when the situation arises. They are excellent tools for certain types of work. But when I'm working on a feature in a big codebase, there is no place like my IDE. It takes care of all of us average programmers at average companies.


The boring answer is that there is no particular nerd-rage directed against IDEs. There are many nerds who prefer IDEs and there are many nerds who prefer text editors. The modern culture encourages us to self-identify as rebels breaking away from uniformity. Thus your attention focuses on people putting down IDEs, while a vim user's attention focuses on people putting down text editors.

I don't use IDEs, because I find myself somewhat less productive in them. Syntax coloring, to me, is a fantastic feature, and I use it in a text editor. Integration with source control and usage searches are things I like to do in a separate window, because I want them to be deliberate enough - I think they'd break my concentration more easily if I didn't have to context-switch to do them. I dislike autocomplete for a similar reason - it creates the impression of my thought moving in tiny jerks all the time - even if it isn't really true, the impression itself is a nuisance. I don't think that the typing speed gains from autocomplete could really matter to me. The time spent actually typing lines of source code is a very small portion of my most productive days.

All of these things are not true for some programmers I know who feel at home in IDEs, some of them my betters.


Most if not all of the things you counter nerd-rage about here are available in vim and emacs if you want them.


Can you imagine driving a car with auto-complete, color coding, or drop down menus?

Have you been in a car built in the last few years?


ABS, traction control, stability control, multiple clutches, power steering, lane drift alert, automatic breaking, seatbelt pretensioning.

I can go on.


That's called progress.


automatic breaking

That's a surprisingly common feature.


> " I want the "firmware" that knows how to assemble building blocks of code to be in my brain, not in my computer. To me, writing code is like driving a stick shift, for much of it I don't even think, it's unconscious competence. Can you imagine driving a car with auto-complete, color coding, or drop down menus? To this day, I get pissed off when something doesn't compile the first time, usually because of a typo."

Well said. I've also had the argument a few times, but have given up trying now.

You could also compare it to a novelist using a "Writing IDE" to create a story. Click "John", click "Conversation with Kate", fill in the speech marks, click "Build suspense macro". It would be mental to write a novel that way, so why do some people write programs that way and then look surprised when they end up with rubbish code?


The novel-writing analogy is a bit off, because the novel elements you mention are higher level than a single API call would be. They're more like a subroutine or algorithm level. IDEs don't really do that.

An IDE's functionality is more like a screenwriting app's formatting of a script as the writer writes.


idk, I've seen a lot of IDEs that have scary high-level things like "Refactor this", "Remove unused code", etc.


No, a closer analogy would be when you're near the end of your novel you realize that you could add a twist by having Kate actually die in the first act and no one realizes it until the twist ending. Refactoring tools make sure that you don't have to comb through every line of the previous 1200 pages looking for inconsistencies.


>OP claims that we are bad programmers because we use too many tools

I took from the essay that the tools we use are evidence of our bad programming ability, not the cause of it.

Incidentally, I mainly code in gedit, for much the same reasons you cite.


> To me, writing code is like driving a stick shift, for much of it I don't even think, it's unconscious competence. Can you imagine driving a car with auto-complete, color coding, or drop down menus?

Can you imagine driving a car with power windows, turn signals that turn off after you complete the turn, etc?


I dislike IDEs for all their features, which for the most part I think just add clutter, since I've never found a use for most of them (debuggers, etc.), but I'll admit that they are useful if done right and can bring a productivity boost.

When I have to write things like:

    strstr($str, $delimiter)
and

    explode($delimiter, $str)
NetBeans' Ctrl+P comes in very handy in reinforcing my choice of argument order, because if I make a mistake, that's a very subtle bug waiting to happen (there's a quick sketch of the trap at the end of this comment).

or when I have to write things like:

    g_file_new_for_commandline_arg
who wants to type that out when they don't have to?

or this over and over:

    AStupidlyLongName::
when I can just type `ast` and press Ctrl+Space and get the name highlighted in a completion list every single time.

or:

    m_ui->toolBar->addWidget
when I can type `m_ui.t`... and have it automatically figure out the correct sequence. This comes in pretty handy when you're working with code written by others and they decide to use objects directly as opposed to the more common pointers.
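
A minimal PHP sketch of that argument-order trap (the example string is made up purely for illustration):

    <?php
    // strstr() takes (haystack, needle); explode() takes (delimiter, string).
    $str = "user@example.com";
    var_dump(strstr($str, "@"));    // "@example.com"           -- correct order
    var_dump(explode("@", $str));   // ["user", "example.com"]  -- correct order
    // Swap the arguments and neither call raises an error; you just get
    // quietly wrong results -- the subtle bug mentioned above.
    var_dump(strstr("@", $str));    // false (needle never found in "@")
    var_dump(explode($str, "@"));   // ["@"]  (delimiter never found)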


Alternatively, use languages that aren't PHP, that actually have standards and a designed API rather than something organically grown into a disgusting mess.

That said, I used PHP for years and am very familiar with how annoying the lack of standardization is. I took the opposite approach from you, I just used a text editor and hit php.net/$foo until I had the parameter orders burned into my head.


Well, I didn't see you offering to pay me to write in anything else, so excuse me if I don't take your advice, and going to php.net is just a waste of time if your IDE provides that info right there and then. Beyond that, you didn't address the C++ case, or the case of coding in GLib or C# with their overly verbose (though less ambiguous) naming. C is even worse when you have to use multiple libraries, as are predominantly binding-dependent languages like Vala or Lua.


You raise a good point -- when I was writing reasonably small programs, it was gvim all the way for me. However, the minute you have to interface with a library like wxWidgets, for example, the IDE really shines through with auto-completion. To answer previous posters, it just replaces an alt-tab to firefox and a couple of API lookups, and saves you about 5 seconds per obscure, unfamiliar API call.


I'm a good programmer. Embrace it!

I used to think I was a bad programmer, and I was. But ever since I've been doing my startup, I've been concentrating on one project in one language, instead of ten different things in four different languages. My co-founder and I found and agreed on a practical subset of the language (C++), and we write great code. I'm still a "bad programmer" in the sense that if I wrote JavaScript code, it'd not be great, but now I know that the key to success is to concentrate on one thing for a long time --- and it's actually possible to be a "good programmer" if you do that. Okay, I still don't do enough testing, but in our case (distributed DB) the long-term smart solution is to hire a test engineer.


You, sir, get a gold star. The big reason that projects fail is "technology soup": everyone has to get their pet project or technology into the project. It gets worse the bigger the projects get. When you get to the enterprise, where the big projects are, you get governance and a bunch of design by committee, where people who are not writing the code tell the people who are, "here is what you are going to use".

It is funny, but every time I walk into a project that is "fad" programming, I see failure. When I walk in and see a bunch of guys heads down, iterating and releasing the most crucial features first, the project always seems to succeed. Maybe the solution is less magic technology dust, not more.


"heads down, iterating and releasing the most crucial features first"

That's the key.

The "number" of technologies involved is moot. Every piece of software is build upon thousands of layers of technology. Adding one or more ingredients to the soup doesn't necessarily ruin it.

Often the simplest way to solve a problem is to duct-tape two different items from different technology stacks just well enough that it works for you. In that simplest case, you may be duct-taping 2 technologies together with a 3rd.


I disagree somewhat; one of the reasons the web is a battleground of corpses that were once projects is that it became a convoluted mess. The bar to be proficient required HTML, CSS, XML, (JSP, ASP, or PHP, pick one), generally some frameworks (taglibs, Struts, Tiles), SQL, web servers, application servers, database servers, CMS servers that double as bad application servers; the list sprawls out from here.

Developers had to master all of this just to put a page together. Technology soup can bloat to the point where it is unmaintainable. When you get into the enterprise you can add MQ, LDAP, SAP, PeopleSoft, and a slew of others that require integration. SOA and ESBs are doing a lot to clean it up and give a single implementation technology, but it is still a mess.

Jaguar once had the distinction of being the automobile with the most moving parts; they also had the distinction of being one of the most unreliable vehicles. Technology for technology's sake is always a project killer. I have seen it too many times.

When I take over a troubled project (one of my specializations is troubled-project rescue), one of my first efforts is to reduce the technology load the project carries. That, coupled with a focus on MVP, saves a good deal of projects.


While this doesn't necessarily negate the "we can't remember syntax" argument... I remember back before mobile phones were rampant, I was really great at remembering phone numbers. People besides my parents were regularly surprised at how I'd remember numbers I'd seen or read just once. Then I got a mobile phone.

I think I probably have less than 5 numbers memorized (those of my immediate family). In essence, using an IDE encourages us not to memorize syntax.


Am I the only one who feels the whole egoless programming thing has gone too far? Yeah, ego is a bad thing. But self-esteem and confidence aren't. I'm not afraid to say that I feel I'm good at what I do.

That doesn't mean I'm never wrong or that I don't make mistakes. It just means I'm proud of my abilities.


Great post! Some suggestions that were new to me, too, although you're mixing Java and Ruby in your examples.

I agree with everything except the DSL stuff at the end. People have mentioned on HN recently, and I believe it is true, that Cucumber, etc. may be something customers get involved with in the beginning, but then they lose interest in it. Sure, there are a number of other things that vary from place to place, like politics, conformance to standards, etc. But customers typically just want to see results (in a reasonable amount of time). If you spend time on good design/UI/UE (read "nice looking and easy to use"), it is fast, it doesn't break, and you've tweaked it to their needs, you're set.


I've spent more time in Ruby and Java than other languages. My background is leaking through. :) Feel free to post any other tools to the comments on the article.

Re: DSLs... I don't think the DSL specs always stay in front of the customer (they sometimes do), but they're a valuable tool. Whether it's a developer, a technical customer, or a non-technical customer, you've provided a much simpler way to show what the system does. That's a huge win.

I also find that developers who spend time in DSLs tend to write much cleaner code. Once you get used to that level of abstraction, you use it in other places.


I can't imagine using autocomplete or even an IDE. I remember when I first started I was writing C++ on Windows in Visual C++. My learning only accelerated once I moved off Windows and stopped using autocomplete. Once I started working on Linux and in vim, I didn't even consider looking back.

I like to think I'm a minimalist when it comes to my tools, so obviously my experiences and insights are very different from someone working in an IDE. I prefer the command line to a gui any day.


I've always believed that I am a pretty terrible programmer.

I'm cool with that because I know I am gradually becoming better and better each day.

I invoke 3 core principles/ideals:

* No matter how good you are, someone is always better than you. Yes. Always.

* Surround yourself with people smarter than you, rather than shun them egotistically. Intelligence is contagious.

* Admit fault, show humility to others who do the same.


Does anyone still believe there are really any groundbreaking insights to be made in these kinds of self-loathing rhetorical articles?

Shouldn't there be more "You know what, sometimes just having something work is pretty much awesome" articles instead of everyone fooling themselves that every programmer in the world needs to be a mini-genius?


A big theme in programming is to get as much complexity out of one's head and into the computer as possible. In this regard, I consider the IDE an extension of the high level language. Whether I use one depends on what I'm coding in. I'm usually fine with gedit when coding Python. In Java, the IDE helps cut through the verbosity.

There is some truth to this article - humans are much worse than machines at simple, regular processes. Manufacturing eventually figured out how to divide up labor along the assembly line and automate as much as possible to make the process regular. Software is comparatively in its infancy. Maybe we will see a similar progression as companies replace the artistry of software with more standardized, repeatable processes. I probably won't be coding then.

Kind of reminds me of http://xkcd.com/378/


The article starts off a little depressing, telling the reader that humans are bad at programming and we have no hope. At least it finishes by telling us we just need to use a bunch of cool toys to look smarter.


We can't remember all the methods our code needs to call, so we use autocompleting IDEs to remind us.

I thought memorizing all of the details of these was only important in school, and even then only at exam time.


I always thought it strange to consider it 'cheating' to rely on a powerful IDE. A programming language is a tool - nobody thinks it is cheating or a sign of weakness to use the most powerful/productive programming language suitable for a task. An IDE is similarly a tool, which complements the language - just as the code we write is basically a tool.


Not true. If you have the details in your head, you can manipulate and compare them far faster than if you have to rely on an external tool. That means you can rehearse more ideas, faster, which will lead to a better likelihood of picking the best one.


Remembering things is VERY important. It means you have an instant grasp of the code base, the APIs available to you. It means you can solve problems much quicker than someone who only has a limited memory, or uses auto-complete all the time.


Remembering the right things is very important. I know the code base far better now with an IDE than I ever did without one. Because it lets me focus on what is really important in the code.

I get to focus on code structure, algorithms, I can quickly drill into the code and back out. I can in a few seconds see all uses or all definitions.

The thing that makes IDEs great, IMO, is when I'm looking at code that isn't mine. The ability to quickly abstract structure from other people's code is a godsend.


Personally I don't use an IDE; I use gedit because it does what I want: colour coding, auto indentation, and a file browser pane that supports (S)FTP (but I suppose that's actually the OS). That's all I'll ever want or need.

If you want to use an IDE, then by all means go ahead, but if you're trying to hire me, don't make me use it, and don't make me use an OS I don't like either.

Everyone's different and they've got their own way of working; just let them do it their way and they'll be more productive, and they'll achieve what you're after.


I've posted a follow-up article: You're a Bad Manager. Embrace It. http://agile.dzone.com/articles/youre-bad-manager-embrace-it

I'd appreciate any help spreading the word. I'm involved with much more of the technical sites than the managerial ones.


A short bio at the bottom of the article says:

"Jared Richardson works at Logos Technologies As a recognized expert in the software industry..."

Take note of the word EXPERT. He claims developers are bad programmers, yet he is an expert. It's contradictory, or maybe the right word in the bio should be WORST? :P


Nice catch! It's just marketing though. Don't read too much into that...


I am struggling with that: how to market myself as a consultant, knowing that all code sucks?


Does anyone know why user access_denied, who has a [dead] sibling comment to this comment, seems to have been "hell banned"? Showdead needs to be set to true to see the comments or profile.

http://news.ycombinator.com/submitted?id=access_denied

It disturbs me that, looking over past submissions and comments, there doesn't seem to be any egregiously bad behaviour.


"I made X several million dollars." Why talk about code?


If I knew how to make X several million dollars, I would probably also know how to make myself several million dollars. I agree that if I had made X several million dollars, it would be a good hook.


So you know my whole shtick is engineering marketing outcomes, right? I'm working for a client right now, doing SEO/conversion optimization/metrics/etc. It is quite reasonable to expect that I will increase their business by 5%. (I quoted a much, much higher number than that.) I could do that for myself, too, no problem.

Step #1 to making myself a million doing it: grow a business to where it gets $20 million in sales annually. Step #2: a successful A/B test.

Or, alternatively, make $CLIENT a million and get paid handsomely, without the "build a business for a decade" step.


Good idea, and I'd actually like to get into that area of work. Will study your articles as a starting point :-)


Just remind your clients that your code sucks less than your competitors'.


Indeed, knowing that things suck and being able to document why they suck, and why your approach is somewhat better is a good skill. It's not all smoke and mirrors - it's providing a somewhat objective assessment of the situation.

I provide unit tests and documentation for projects - many others don't. Well... showing that to a client and saying "you're getting this, which will help you understand what I'm doing, and help future devs understand what I did" is just good sense. If the client values it, they will pick you over someone else. If they don't value it, they will after they choose the cowboy over you. :)


You know it sucks. Find the right tool set and practices to help your clients improve. You can't fix it overnight, but there's ~so~ much room for improvement in our industry.


You're a good programmer if you reuse code.


It reminds me of the saying (probably butchered here) that the more we learn about a topic, the more we realize how little we really know.


This simple and completely true statement never ceases to fascinate me:

> In short, as an industry, we all stink.

We really do stink, and as an industry we are nowhere near a state where there are any real standards and best practices in place, like, e.g., when building a skyscraper or a road. Those are fairly to pretty damn complicated tasks, and there is so much knowledge, experience, and best practice involved; everyone in the construction industry (should..) knows them, and when it is done you can have it checked and certified that it was well built and won't crumble the next day.

I don't think anything like this even remotely exists in IT. We have a lot of RFCs describing protocols and whatnot, but nobody can really objectively certify your serious-business software as well-built, or verify whether you applied even the smallest best practices or common-sense guidelines, because there are so many almost-religious wars being fought over completely minor advantages and disadvantages which simply do not matter on a global scale.

And far too many really, really actually bad programmers just get away with their mess of horrible, insane code.


We need to remember that computers are young. Really, really young. The invention of the first-ever electronic digital computer is still an event within living memory.

This can be hard to remember. Especially when you yourself are young, you're used to thinking of things that are even slightly older than you as eternal. (When you reach my age, and you are beginning to have colleagues who literally weren't alive during formative periods of your early career, you get a deeper appreciation for the fact that people -- including yourself -- have these horizons.) For example, I did graduate work on lasers, and I built my work on top of a great deal of earlier work, so I tend to think of lasers as things that have been around for a long time. But the laser isn't very old. It turned fifty this year. Many of my friends are older than the laser. Many of its early pioneers are still wandering around.

Look up the early years of engineering. Read certain chapters of the excellent Structures, or Why Things Don't Fall Down, or for a more personal view check out Chapter 20 of Mark Twain's Life on the Mississippi:

http://www.classicreader.com/book/2886/21/

The early history of tall buildings is the history of towers falling down. The early history of modern bridges is the history of bridges collapsing. The early history of steam-powered transportation is the history of boiler explosions, especially in the USA, which had a reputation for quick-and-dirty mechanical hacks as far back as the early 19th century. These situations took decades to change... or longer. Sometimes an order of magnitude longer.

And, crazy ideas about the Singularity notwithstanding, technology proceeds at a human pace. Right now, we're still at the point where substantial portions of the world population don't even have access to a computer (although the smart phone promises to change that). Very few people know how to program at any level. And that is the major issue facing programmers today, as it has been since the microcomputer was invented in the 1970s: There is more value in spreading the temporary hacks around more widely and cheaply than there is in inventing more permanent and solid stuff. Until we can meet the insatiable and growing demand for poor implementations of 1970s-era computing technologies, there's really no time or money for anything else. This condition is probably temporary, but when I say "temporary" I may still be talking on the scale of decades.


No. Although your point about the youth of the system involved being a critical determinant of its reliability is correct, you make an error in naming that system as "computers".

"Computers" are reliable these days. We rarely have design problems with the actual hardware (ok, sometimes we have higher than desired failure rates, but most of the time, computers these days are very reliable). They function pretty much as we expect them to. As proof, they are so reliable that when we have a bug in our programs, we no longer even think to point the finger at the hardware. Instead, we point the finger at the software (though in too many cases devs still tend to point the finger at other people's code, when they should look at their own code first - it is the youngest, and hence least reliable, as a general rule).

The problem is software. I seem to recall pg writing an essay about this - we aren't making copies of something that already exists, we're making new things, things that have never even been dreamt of before. Of course we're going to have problems. I've watched in amazement as Rails has gone from a not-particularly-reliable framework to something relatively solid and flexible in a matter of years. MacRuby has gone from a dream to one of the best ruby implementations out there over the same sort of time frame. The list could go on and on.

But getting reliability out of a system takes time. As a rough heuristic, I think you could say that you need about a year of catching and fixing bugs in a product from the time that you stop adding features for every year that the product took to develop. If you add features, the new parts won't be right until another year has gone by (although hopefully unit tests should prevent regressions in the existing code).

Of course, in a shop with sloppy coding standards, you may never get a stable product out the other end, but I think that most competent engineers these days know the techniques that will give a reliable system - verifying inputs to APIs, Unit Tests, documentation, specs.


> We rarely have design problems with the actual hardware

Hah. Spoken as someone that has clearly never written drivers. Take a peek through driver code some day - you'll see workarounds for bugs on different implementations of the same standard. You'll see heuristics to, say, filter out spurious monitors reported as connected by video cards. You'll see poking random values into registers with comments like /* this shouldn't fix things but it does */.

More or less, you'll see dozens of workarounds for hardware bugs. Hardware is buggy.


Actually, there's nothing 'clearly' about it. I spent three years of my life doing nothing but writing drivers for NT 4.0 and later on Linux. I'm not saying that hardware these days doesn't generate errors; I'm saying that the errors they produce are understood, and can be corrected/handled in such a way that the system doesn't fall down. You just don't see incorrect values being read in from memory or from disk, because we have error correction systems in place. You don't see the value in a register in a CPU changing for no reason. You don't see uncorrected cross-talk errors on the CPU's data buses. You don't see dodgy values being read off buses due to reflection off incorrect bus terminations. These are all problems that existed at various stages in the development of computers, but these days it just doesn't happen any more, because designers know the problems exist, and engineers are taught how to avoid them at university.


>We rarely have design problems with the actual hardware

Hard drives, still the main data storage hardware in most computers, are utterly unreliable.


No, you're missing my point. Consumer hard drives are unreliable because consumers would rather save a few dollars than pay for a reliable system. That's one end of the spectrum. The other end is, say, a bank, where databases are backed up remotely in conjunction with high-reliability RAID drives to achieve a very high level of reliability.

It's not that we aren't capable of designing systems that are reliable, but rather that we don't value the reliability as much as it costs. And that's because even a crappy consumer drive is 'good enough' for most people, unreliability and all. If it's not, we put in some sort of a back-up system to mitigate the risk.

In the early days of computers, that just wasn't so. If you're old enough, you probably remember having a C64 game on a cassette that you just couldn't get to load. Or a floppy disk that became corrupted because it got heated to over 50°C and some bits got jumbled. Go back even further when only prototypes of these devices existed in labs, and the situation was even worse. Well, most software is more akin to the prototype in the lab than a product on a shelf. It hasn't been thoroughly tested, it is still under development, we don't understand the limitations of the system, etc etc etc.


But not for the reasons you'd think -- mechanically they're incredibly reliable these days, even when violence is done to them.

What sucks now is the software running on the embedded drive controller, especially on the most recent highest-capacity drives available at any one time -- the most common bug is for the whole drive to just stop responding for several seconds.

There is still one type of hardware error causing problems -- single-bit corruption errors. They're much less likely than they used to be, and the drive will catch them with a checksum when read back, but capacities are so high that the probability of hitting one when reading back a full drive is nearly 100%. This makes RAID-5 and other parity-based striping systems almost completely unviable these days -- you'll never be able to rebuild the array from n-1 drives without hitting a single-bit error. And because of the software bugs in drives, they drop out for no reason all the time, forcing you to rebuild the array (which then fails because a cosmic ray caused a single-bit error).
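
A rough back-of-the-envelope PHP sketch of that rebuild risk (the URE spec, drive size, and drive count below are assumed figures, not taken from the comment; real drives vary):

    <?php
    // Assumed consumer-drive spec: ~1 unrecoverable read error (URE) per 1e14 bits.
    $ure_per_bit = 1e-14;
    $drive_bits  = 2e12 * 8;   // an assumed 2 TB drive, expressed in bits
    $drives_read = 7;          // e.g. rebuilding RAID-5 from 7 surviving drives
    // P(at least one URE over n bits) = 1 - (1 - p)^n, approximated as 1 - exp(-p * n)
    $p = 1 - exp(-$ure_per_bit * $drive_bits * $drives_read);
    printf("~%.0f%% chance of hitting a URE during the rebuild\n", 100 * $p);  // ~67%

With larger or more numerous drives that figure keeps climbing toward certainty, which is why parity-based rebuilds get so risky.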

SSDs won't save you either, they have even more classes of software bugs present in their embedded controllers.


I started the clock at the birth of the first computer because, obviously [1], one could not begin to solve the practical problem of developing methods for writing error-free software before one had hardware to test it on.

I'll concede that we seem to be better at controlling the error rates in hardware than we do in software. There are lots of reasons for this; the biggest, ironically, is that custom hardware is very expensive. Therefore, there is little market for highly unstable custom hardware built by high school students. The cost/benefit ratio is all out of whack. It pays, if you're going to build hardware, to take your time and do it very carefully.

---

[1] On the one hand, at this point I expect mathematicians to jump out of the woodwork and claim that their illustrious predecessors were able to theoretically predict the future existence of buggy software as early as the nineteenth century. On the other hand, I have a gut feeling that very, very few computer programmers realized how hard it was all going to be before they actually tried it. Nobody ever does, even today. Gedanken software never has bugs.


Comments with supporting book suggestions. This is the kind of comment I can get behind.


I'll second the recommendation for J.E. Gordon's Structures. The first and last chapters are wistful and heavily editorialized, but everything in between is excellent.


Many "real world" building projects far exceed the originally estimated cost; some of them are not very stable at all, and scale badly when put under strain that wasn't originally envisioned.

Just one link: http://en.wikipedia.org/wiki/Scottish_Parliament_Building#Ti...

Software development is not nearly perfect, but neither is engineering of "real" buildings. My father-in-law is a civil engineer, and when I told him that I believed that engineering in his discipline worked much better than in the software industry, he just gave me a disbelieving look and called me naive.


I think some of the strategies mentioned here should be best practice: code coverage checkers and code analysis tools, most specifically.





