Hacker News
12 (Really) Controversial Programming Opinions (billthelizard.com)
31 points by johndcook on Sept 2, 2012 | hide | past | favorite | 60 comments



Mostly opinions of an angry C programmer. I disagree with all of them; they're too context-specific. Take testing: it is generally a good idea, but you don't always want to invest the time if it will have a negative return. Small projects and throwaway work are two cases where that's true.


These are NOT his opinions but the opinion of several different people who answered the original StackOverflow question.

The first one, for example, is obviously a joke.


guys,

> The first language should NOT be the easy one, it should be one that sets up the student's mind and prepare it for serious computer science. C is perfect for that, it forces students to think about memory and all the low level stuff,

as one low-level programmer to another, I hate to break it to you, but "low level programming" is not "serious computer science". the two have very little to do with each other.


Ironically C is technically a high level language anyways.


Not if you live in 2012.


Originally, the term "high-level" meant that programs were portable between machine architectures. You can do this in C, but not in assembly. This is a precise definition.

I cannot find a precise definition that puts C and JavaScript on different levels. Garbage collection? A big standard library?


I think there's a spectrum based on how many different ways the machine could choose to execute what you wrote. E.g., V8 creates hidden classes rather than naïvely following the prototype chain, whereas C pretty much dictates how things are actually laid out in memory.


'ish.

C exists in the middle territory between "high-level" (as the term is used colloquially right now) and "original" low-level programming.

By all accounts, virtually any language that operates relatively close to the semantics of the machine in terms of operations and memory (pointers are a standard abstraction in assembler) is a low-level language.

The only things that detract from C's low-level credentials are that it is something of a hybrid attempt at grafting a Harvard machine onto von Neumann machines, and its functions. The functions/stack are a bit of an abstraction depending on which architecture you're using.

Calling C high level isn't accurate or descriptive, but I'd be willing to concede that it's not truly low-level either if you know enough C to explain why.

You don't, though.


"You don't, though"

Interesting, though I'm not sure how my knowledge of C is relevant to the conversation.


You made a false assertion about C.

          o
             
     o         o
Connect the dots.


"C is not high level because you don't know C" is not exactly the most cogent of arguments. The rest of it I pretty much agree with, nonetheless the actual definition of "high level" language doesn't change just because you (or I) feel it should, which is why I said it was "technically" high level.


That's pretty true. However, in C, you can actually write interesting data structures and algorithms that come up in computer science. I think that makes for good "theory meets practice"-based learning.

I'm not saying all algorithms classes should incorporate writing C code all the time. But probably every budding computer scientist or software engineer should take at least one such class.

I'm also not saying C is the only language for which this is true. But it's definitely not generally true for interpreted languages.

Also, C is pretty much the language everything higher-level is implemented in. So once again, at least some exposure to C can help someone conceptualize what's happening on the computer at every level.


Hmm, can you give an example of a typical "learning CS" level algorithm that you can't implement in, say, JavaScript?


Not an algorithm, but used in many. Pointers.


Emulating pointers in JS is pretty easy; you just wrap the values in objects or arrays.
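A minimal sketch of that idea, under the assumption that all you need is mutation through an indirection (the `Ptr` box type and `increment` helper are illustrative names, not from any library):

```typescript
// A "pointer" is just a one-slot box; passing the box around lets a
// callee mutate the caller's value, much like C code does through int*.
type Ptr<T> = { value: T };

function increment(p: Ptr<number>): void {
  p.value += 1; // dereference-and-assign, like (*p)++ in C
}

const counter: Ptr<number> = { value: 0 };
increment(counter);
increment(counter);
console.log(counter.value); // 2
```

Pointer arithmetic, of course, has no analogue here; this only emulates the "shared mutable cell" use of pointers that linked data structures rely on.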


B-Tree indexes in JavaScript


Googling around, I see a number of implementations. What's wrong with them, and why do you feel this is impossible?

(And hint: it's guaranteed not to be impossible since JS is Turing complete)


You could probably implement any of them in JavaScript, but it would be sort of silly, because the actual runtime complexity and memory usage aren't really going to be close to what they "should be" due to the interpreter.

I haven't given it a lot of thought, but I doubt it makes sense to implement "textbook" algorithms and data structures without pointers.


As I said in another thread, you can get the same functionality as pointers by using arrays or objects, which are always passed around by reference.

You're right that this is an inefficient way to implement those algorithms, but I disagree that that makes it silly. The purpose of writing those algorithms is to learn how they work, not to use them in practice. Generally, if you're using them in practice, you shouldn't be rolling your own in the first place.


I like this quote from Knuth: "People who are more than casually interested in computers should have at least some idea of what the underlying hardware is like. Otherwise the programs they write will be pretty weird."

C is only marginally closer to the hardware than other languages, and programmers really should be aware of even lower level concepts, but I think learning C is a hugely useful and relatively digestible step towards really understanding what's going on.


absolutely! C is so widespread that you should be conversant in it as a software practitioner, whether as a programmer shipping code or a scientist making tools and working with others tools.

however you can explore what computer science has to offer without subjecting yourself to C. I think that "computer science" has more to do with "optimizing algorithms for efficiency" and less to do with "staring at stack traces and wondering why gdb is so bad"


One that really stood out to me:

6. The use of try/catch exception handling is worse than the use of simple return codes and associated common messaging structures to ferry useful error messages.

I think this really depends on the language. In Erlang I absolutely hate exceptions because they don't really make sense in a functional language (which let's say Erlang is, for the sake of this argument). Threads run functions with only the function's parameters as state, so it doesn't make sense to have to worry about a function stopping halfway through, especially when most functions already pass up tuples that look like {error,E} or {success,S}. If you also have to worry about exceptions, you have to handle both the error tuple and the exception case, and usually both cases get the same code, so you end up making a new function to call for that case. This gets messy. If a thread crashes, it crashes; otherwise it should be passing up the error tuple. There's no reason for exceptions.
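For illustration only, the {error,E}/{success,S} convention described above might be sketched as a tagged result type (in TypeScript rather than Erlang; `safeDiv` is a made-up example, not a real API):

```typescript
// Tagged union mimicking Erlang's {success, Value} / {error, Reason}
// tuples: the caller branches on the tag instead of catching anything.
type Result<T> =
  | { tag: "success"; value: T }
  | { tag: "error"; reason: string };

function safeDiv(a: number, b: number): Result<number> {
  if (b === 0) return { tag: "error", reason: "division by zero" };
  return { tag: "success", value: a / b };
}

const r = safeDiv(10, 2);
if (r.tag === "success") {
  console.log(r.value); // 5
} else {
  console.log(r.reason);
}
```

The error path is just another value flowing up the call chain, which is the point being made: there is only one case to handle, not a tuple case plus an exception case.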

On the other hand, I think exceptions make a lot of sense in Python, and I think the Python community has incorporated them in a standard way which makes sense, so that handling exceptions in Python is actually easier than passing back different error codes and what-not (no atoms in Python).


I do sort of agree with the try/catch criticism. I know what it achieves and why we need it, but I don't agree that it's the best or even most intuitive error handling mechanism. I personally prefer returning error messages in place of, or alongside, the expected data, such as the common paradigm in JavaScript where async functions pass error info (or null) as the first argument to the supplied callback, before the actual data (if no error) in the subsequent arguments.
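That error-first callback convention can be sketched as follows (`parseJson` is a hypothetical helper written for illustration, not a library function):

```typescript
// Error-first callback: err is null on success; data is omitted on failure.
type Callback<T> = (err: Error | null, data?: T) => void;

// Hypothetical helper demonstrating the convention.
function parseJson(text: string, cb: Callback<unknown>): void {
  try {
    cb(null, JSON.parse(text));
  } catch (e) {
    cb(e as Error); // the failure travels as an argument, not a thrown exception
  }
}

parseJson('{"a": 1}', (err, data) => {
  if (err) {
    console.error(err.message);
    return;
  }
  console.log(data);
});
```

The `if (err) ... return` guard at the top of the callback is the idiom's cost: every caller must remember to write it, since nothing forces the check.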

I do somewhat agree with the OOP criticism as well. I find the best programming achievable is by mixing strategies and not adhering strictly to any one philosophy. Java's OOP is so inflexible and limiting that it takes significantly more effort to get basic work done. It may lead to a somewhat more manageable codebase at first, but I feel experience in your language of choice can bridge the gap between such a strict static-typed language and a more elegant but potentially messier dynamic-typed language in this respect.

Finally I do agree that C should be taught before you learn any other programming language...but not to such a degree that you need to be able to build the best Javascript engine in the world from scratch before you move onto another programming language. The goal of programming is to build software that saves you time, after all - you aren't saving much time with as high a barrier to entry as mastering the fundamentals of C. It is important for theory discussion as well as learning the best way to code a lot of logic in many other languages, but I feel an abstracted language has failed if it hasn't done enough of the work for you to refactor your code properly in the first place. Loop unrolling, seriously? I like that I learned why it's important...but I shouldn't ever have to do that in any decently programmed higher level language.


I don't. Exceptions in general FORCE a developer to deal with the problem. They can slap a developer in the face if they're abusing the API.

Error codes on the other hand can be ignored.

PutLionInCage();

PokeLionWithStick();

I'd rather PutLionInCage() throws if there is a problem.
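That contrast can be sketched like this (the cage functions are the parent comment's hypothetical names, fleshed out for illustration):

```typescript
// Return-code style: the caller is free to ignore the failure...
function putLionInCageWithCode(cageLocks: boolean): number {
  return cageLocks ? 0 : -1; // -1: the lion is NOT safely caged
}

// ...whereas the throwing style interrupts the caller unless handled.
function putLionInCage(cageLocks: boolean): void {
  if (!cageLocks) throw new Error("cage failed to lock");
}

putLionInCageWithCode(false); // compiles and runs; the error is silently lost
// pokeLionWithStick();       // ...and now someone gets mauled

try {
  putLionInCage(false); // this failure cannot be overlooked
} catch (e) {
  console.log((e as Error).message);
}
```

With the return-code version, nothing in the language stops execution from reaching the poke; with the throwing version, control never gets there unless someone deliberately wrote a catch.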


They don't necessarily force the developer to deal with the problem, unless you're talking about checked exceptions (which are a whole other controversial programming opinion). What they do accomplish, in my opinion, is more deterministic behavior and easier debugging.

If you rely solely on return codes to indicate an invalid state and the caller is free to ignore the error codes, it makes it much easier for the program to continue on doing invalid things - this is akin to wrapping everything with try/catch blocks and suppressing exceptions. Even more, when you do realize something is wrong, a system utilizing exceptions is generally much better at propagating information about the original problem.


Exceptions in general FORCE a developer to deal with the problem.

There are good arguments in favour of exceptions, but I don't think this is one of them. If Java has taught us anything, it is that trying to force someone to handle exceptions just causes a lot of code that says:

    catch (Exception e)
    {
        // silently ignore e
    }


Actually, the correct comment for an empty catch block is:

// This should never happen.

(Until it does)

I have mixed feelings about checked exceptions. They seem like a great idea in theory - if there are known error states, then making them explicitly known and forcing the consumer to acknowledge them in some way seems reasonable. In practice, a lot of the time the only response to a checked exception is "let it propagate up - there isn't anything useful I can do", which leads to extra code and can be annoying.

Even though I don't have quite the negative feelings toward them others do, if I were involved in designing a new language today, I'd probably vote against including them. It would possibly be nice if a method could explicitly declare which exceptions it was likely to throw, and then an IDE or static analysis tool could use those to aid consumers of the API, without forcing them to deal with every single one.

Interestingly enough, I view Scala's Option construct as being somewhat similar - forcing the consumer to acknowledge that a particular call could return a null value - but it doesn't seem to provoke the same negative visceral reactions that checked exceptions do.


Sure, you can do that, but the programmer is FORCED to make the choice to write that. Also, the code _looks_ wrong like that.

Furthermore if the dev just wraps the entire process in that silent catch then the lion doesn't get poked!


If you need an exception here, you're just a bad programmer. It's your fault you didn't explicitly check that the lock was functional before poking that lion! /s


Sure but are you telling me you always work on teams where everyone is shit hot?

If you write a library are you thinking that everyone is going to use it properly?

My point is that exceptions help people learn how to use APIs (it's kinda like documentation that appears through use!) and exceptions make things _safer_.

I appreciate you could write the above in two try catches but my point is that it would look more wrong. The code above _looks_ fine and that's my point.


Hah, I guess my post could have used a sarcasm tag :)

I totally agree that exceptions are far superior to checking return values.


But sometimes it's not just the programmer that gets bitten, it's all the other people at the zoo, too. Exceptions help protect everyone.


There can only be controversy when either the truth isn't clear, or it doesn't matter. If we knew the answer, and it mattered, everyone would do it that way (and those who didn't would lose to those who did). e.g. religious wars:

  vi vs emacs
  OO vs fp


I don't agree at all. I don't think Galileo would agree, either.


One opinion of mine is "code is also configuration". If you structure the code right, I see nothing wrong with putting stuff that many people see as configuration into the code. I do Java programming, and I think that frameworks like Spring are overrated. Stitching together dependency injection with XML is not very different from doing it with Java code. It is a pattern, not a framework. But the Spring trend is hard to walk against; it is so strong. I am getting tired of this occupation.


I err on the side of http://en.wikipedia.org/wiki/Rule_of_least_power. While it's true that code is data, arbitrary code is a form of data that's exceptionally hard to analyze or repurpose. Text config rules out stuff like dynamically-composed names of objects and properties (which you can't search-and-replace).


I disagree with most of them; they seemed to be biased towards a specific language the poster of the opinion liked/understood best, or against one they disliked.


Really surprised at the amount of hate for Python significant indentation. I've only used Python a little, and I expected to hate that too, but found it wasn't really so bad. I thought my experience was pretty common, and that by now most people either thought it was a great idea or at least found it tolerable. But "it was a bad idea" is currently favored 43 to 5. Maybe not many Pythonistas have seen this page yet.


In my opinion not many Python programmers care for places like Stack Overflow, as the language itself has excellent infrastructure for dealing with problems and getting one's questions answered quickly without bothering other people. Maybe that's why the vote is so high there.

Anyone who works with Python for a little while won't even think about indentation anymore; it just comes naturally. It's not a big deal.

Python is often made out in discussions like that to be for beginners only, something only amateurs would use because it's slow, not concurrent, etc. The truth, though, is that YouTube, Reddit, Dropbox, Spotify, Disqus, Google, NASA and many other "real" professionals use Python in a big way.

I tend to ignore such trolls.


13. Most programmers think they know what they're doing but actually are "incompetent". (There are Usenet posts going back to the '80s documenting this fact.) How am I defining "incompetent"? In my definition it means writing software that any reasonably skilled hacker can cause to "malfunction", i.e. not perform as expected by the programmer, which depending on the program may or may not present a security risk. The number of "competent" programmers, i.e. those whose programs are immune to the acts of reasonably skilled hackers, is very small. They are a rare commodity. In sum, there are lots of reasonably skilled hackers and very few competent programmers. There is no licensing body for programmers. Anyone can call themselves a "professional programmer".

Note: If a programmer only writes programs for her own use, "competency", as defined above, is all but irrelevant. Whatever works.

Controversial, but true.


I agree with the assertion that few (probably close to 0) programmers routinely write bug-free code, but I think "incompetent" is a bit harsh - "imperfect" is more apt, I think.

Very, very few non-trivial pieces of code are completely bug-free, and building anything interesting under normal time constraints is really difficult and expensive to prove 100% correct.

So, I guess that point is true, but ultimately not very useful.


Knuth's code has bugs. NASA's code has bugs. I don't think our species has produced even one nontrivial correct program, much less a programmer who's competent (which I take to mean a large portion of their work is 100% correct). Our profession is passing through the mercury-and-leeches phase medicine once went through. Someday our descendants will look back from their error monad formal proofs or something and cringe at everything we did.


Yes. This is why I defined "incompetence" quite carefully.

Finding bugs in Knuth's or NASA's code might be beyond "a reasonably skilled hacker". To find bugs in that code you would likely have to be "highly skilled", above average.

Everyone makes mistakes. Even professionals who are licensed. The idea is to minimise them to achieve a reasonable, expected level of "correctness". Competent does not mean "perfect". It means no stupid mistakes.

In my biased opinion, there's a high tolerance for stupid mistakes in software.


>I don't think our species has produced even one nontrivial correct program

Off the top of my head: seL4, a correct microkernel...pretty nontrivial!


Wow, I'm pleased to see that's no longer totally beyond us, though the price is still very high (LtU says fifty dev-years!)


It was actually about 28 person-years (py), but that number would come down to 10 if they were to do it again.

source: http://ertos.nicta.com.au/research/l4.verified/numbers.pml


This is nothing compared to the number of py spent writing crapware. Think of how much effort has been spent writing lousy software. It is enormous. (But then most consumers of software don't know any better, so from a sales perspective, maybe writing crapware makes perfect sense.)

IMHO, these guys are heroes merely for undertaking the task, let alone completing it.


This is a narrow view of what software is supposed to do. Are styrofoam cups inferior or are their designers "incompetent" because they break easily? No! That's the whole point--they are cheap and disposable.

The same is true about most software people write--it is also cheap and disposable. Often, having something that works well enough now is better than having something that works well tomorrow. I'm not even talking about being impervious to skilled hackers; even breaking under normal use is not necessarily bad.

Sure, if your software would kill people or destroy things, you have an issue. If your software is very critical and usually above suspicion (like a compiler, say) this would also be a problem. But if you're just writing a web app to share cat pictures or some internal tools or really most any other type of software? It's most often better to be cheap and fragile than good and solid.

My laptop is probably my single most important possession. (Yeah, I'm a student so I don't really have much else :P.) Do I have a Tough Book which is nigh unbreakable? Nope. I have a laptop which is actually fairly easy to mess up. And it is in every practical sense the better choice.

This is, coincidentally, why I think the cynical view of "planned obsolescence" is somewhat shortsighted. Sure, in a sense it's bad that electronic gadgets fall apart after a couple of years. But there's no cynical force behind all this: the simple fact that you wouldn't pay twice as much for a phone which is only more sturdy--but not more capable--is what drives the markets. If you were really worried about it, you would have bought a purpose-built device that would hold up better; instead you bought something cheaper which will break sooner. And this is a perfectly reasonable compromise to make!


A Toughbook is hardware, not software. Get an old ThinkPad. It's durable.

I take the opposite view. I like durable software. And that's how I build my systems. If they get trashed, they can be restored in minutes. I'd rather spend my effort preparing simple durable systems that can be restored easily than building complex, fragile systems that would be difficult to reconstruct if something goes wrong. I've heard that sometimes things can go wrong.


If you go through the answers on Stack Overflow, the point that "most programmers are incompetent" was made quite frequently. Those that made an effort to support their opinion were highly upvoted (so not that controversial), but many others didn't so they just sat at 0-1 upvotes.


I hate writing what I just wrote because I know it might offend some people. I'm OK with the "incompetence" (I'm sure I qualify myself), but it's the fact that many of these programmers think they know what they're doing that bugs me. And they are writing software that will be used by other people.

They don't want to admit their mistakes or that someone else might understand things they don't. They actually think they are being smart when they are really being stupid.

Certainly not looking for "votes" with this one. Although I did get an upvote immediately, then a downvote from, who would have guessed, a closed source developer.

I'm not a fan of StackOverflow. There is a lot of stupidity and herd mentality there. It's not where I would look for "competent programmers". Like finding a needle in a haystack.


I don't agree with many of the points, but having taught while loops to a lot of beginners, I can say they are often very confusing to people.

I usually end up explaining them like a repeat...until, which usually makes more sense to non-programmers.
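One way to show the difference (sketched in TypeScript; a do/while is roughly a repeat...until with the condition negated):

```typescript
// A while loop tests its condition before the first pass,
// so the body may run zero times...
let whileRuns = 0;
while (whileRuns > 0) {
  whileRuns -= 1; // never reached: the condition is false up front
}

// ...whereas do/while always runs the body at least once,
// like repeat...until with the condition inverted.
let doRuns = 0;
do {
  doRuns += 1;
} while (doRuns < 1);

console.log(whileRuns, doRuns); // 0 1
```

The "at least once" behavior is often what beginners intuitively expect a loop to do, which may be why the repeat...until framing lands better.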


Wow, mostly terrible advice.


Can you elaborate?


I find it odd that I have to. Other than 7 and 10 (which I'm not saying are necessarily always good advice), the rest is advice I think is self-evidently terrible. Experience has taught me nearly exactly the opposite; the advice strikes me as bad programmers who only know one language generalizing their poor procedural skills across domains they don't understand.

Of course, that's the point of the article, "Really" controversial. Seems most people voted those opinions down as poor advice, which they're correct in doing.


I would also like to hear an explanation. I found many of the topics thought provoking, such as Single Entry Single Exit.


Well, some of the suggestions have been fairly thoroughly examined by now, and often with quite clear results.

For example, anyone who advocates writing only extremely short functions because they leave less room for bugs appears to be wrong in about the same way that someone who thinks that 2+2=5 or that a chainsaw is a good implement for brain surgery is wrong.


A linkbait, "N Ys that X" content-farmy headline that seems just tailor-made for getting on HN.


Actually I think that it's a more ironic take on the meaningless links that made the front page (twice I believe) a while ago about some (not so) controversial opinions on software development. I certainly hope no one thought any of these were opinions to be taken seriously.


Was this article just one big troll?



