How can I teach a bright person with no programming experience how to program? (programmers.stackexchange.com)
120 points by ekm2 on Nov 7, 2011 | 108 comments



The top comment is bullshit.

I could not get through those three books but I get paid to program. My code is repetitive and shitty but I can produce a working web app as long as it doesn't have to support too many users, and it's fine.

My husband never read any books about formal programming and taught himself from those random "learn Java" or "learn python" books. He's a mathematician and programs to write simulations/solve equations, and he gets paid for it too.

If you can't get through those three books maybe you can't hope to be a computer scientist, fine. But you can certainly program, in the same way you don't have to be an English major to read or write.


Most programmers write code that other people have to read, maintain, or extend at some point. If you write repetitive, shitty code, you may be qualified to write some useful programs for yourself, or write a web app that doesn't have to support too many users -- which are valuable skills -- but it is not fine for the majority of programming jobs. Incidentally, given that you can do what you described, I'm also pretty sure you could get through those three books if you tried; they are not rocket surgery.


My point is that programming is useful in a variety of contexts and you don't have to be that great at it to accomplish something.

You're right, I probably wouldn't work out at most programming jobs. I do contract work; the web app I'm writing now had to be knocked out in about 3 weeks, will be used by 2,000 people, and will only be in use for about a month. But I'm still a "programmer," because I program.

We're not all writing mongodb or working at Google or whatnot.


Well, I agree with that, so I suppose I won't argue any further about it.


Cool!


I couldn't agree with you more, araneae. Personally, I am tired of seeing statements hinting that you need to have done xyz or you have no business calling yourself a (good) programmer. I know people who haven't touched those books and might be dismissed as so-called "bad" Java programmers, yet that is not the case: these people know the intricacies of J2EE, have read and understood the many JSRs, and have in-depth knowledge of the many Java libraries and frameworks, even though they are not traditional computer scientists with CS backgrounds.


There's programming, and then there's programming.

I'm sorry, but what you're talking about sounds like it doesn't qualify as "programming" so much as web design and code plumbing. Note that in large part I'm talking semantics here, in that my definition (and I suspect Joel's as well) differs from the way you're using the word.

I understand that you're likely paid well to do it -- as are plumbers. And it's a respectable job. I'm not trying to knock what you're doing.

But by your own admission your own code is less than ideal. And Joel is talking about what a good programmer would need to be able to do, and if you want to start someone off on the path to becoming a good programmer, it's not a bad suggestion.

If you just want someone who can copy and paste code off of Github and doesn't really understand it, well, then you don't want a "good programmer" -- and frankly anyone worth their paycheck in IT probably already knows how to do that.


Everyone here seems to be a big fan of the "no true Scotsman."

If you write code you're a programmer. End of story.

"Programming, and then there's programming" and "code plumbing" are, in fact, semantics.


  If you write code you're a programmer. End of story.
The 'no true Scotsman' fallacy does not apply when you are actually debating whether someone living in a border region with mixed parents can be considered 'a true Scotsman'.

We don't call everyone that uses Excel a programmer, even though the simplest expression of the form sum(A11:A22) is, for every reasonable definition, 'code'. That would dilute the term far beyond what people generally mean when they talk about 'programmers'.

You cannot call yourself a writer, just because you passed English with a C for your essays and now write blog posts for a limited audience. You are not a programmer just because you can whip out webapps that strictly do the limited things you require of them.


> If you write code you're a programmer. End of story.

If you do arithmetic, does that make you a mathematician? Some things are a matter of degree.


Your analogy doesn't work because "mathematician" is a research position. You should compare mathematicians to computer scientists, not programmers.

You could make the same analogy and prove any side of the debate depending on the job you choose:

If you do plumbing, does that make you a plumber?

If you do painting, does that make you a painter?

If you do physics, does that make you a physicist?

If you drive vehicles, does that make you a driver?

Really, the debate here is that some people consider the word programmer to mean "computer scientist" or "computer expert", while others consider it to mean "person who programs". It's a debate on the meaning of a word and this kind of debate cannot be settled, because the word has multiple meanings. Everyone's right in their own way.


> Everyone here seems to be a big fan of the "no true Scotsman."

This isn't an example of that, because nobody changed definitions partway through.


On the other hand, a lot of people here love claiming that an argument is "no true Scotsman." ;)


I don't think you need to get through those three books as your first order of business when learning to program. "Give up, you're never going to get it" is a ridiculous attitude to have when trying to learn anything.

Try something simple. Work your way up to more and more complicated things just like any other learning endeavour.

If you want to stick to plumbing-type programming then go for it. If you want to branch out into a more formal computer-science-type understanding, by all means grab some textbooks and learn some data structures, algorithms, abstractions, OOP patterns, etc.


So true: If you can program, you are a programmer. How you came about this knowledge is insignificant.

However, the post was about the fastest way to learn programming and books are a terrific way to do that. Not the only one. Maybe not the best one for you. But for many people, books work great!


How you came about this knowledge is insignificant.

Yes, but these books teach you more than writing something that compiles. Just read the ToC of SICP: http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-4.html#...


So what? That's not what the original question was about. It was about how to get someone to be able to program in C#, and giving that person some theoretical books on computer science is probably a very bad idea.

Some people learn very well just by reading a book, most people don't. It's impossible to answer the original question in a good way without knowing how that person learns best, what he knows beforehand, what he's done so far, and what he wants to use programming for.


Are you implying that self-taught programmers can only ever write 'something that compiles'? Because frankly, the best programmers I have met thus far have largely not studied computer science.


It is bullshit because you are borderline incompetent and still get paid?


This conversation seems to be entirely missing the point of the original article. The programmer-to-be in question is working at a .NET shop and needs to be able to get up to speed in a timely manner. It addresses a particular person in a specific instance.

Really he needs to understand what object orientation is, plus variables, arrays, loops, conditionals, debugging, and the actual .NET syntax. Understanding these things, the person will be able to do their job. That seems to be the most important thing. Certainly you may not consider the person a "real" programmer, but it is up to him to fill out the rest of his knowledge and grow as a programmer.

The "top" answer sounds suspiciously like today's XKCD.


K&R is awesome but... C would not be my first choice for a beginning language.

Things like pointer arithmetic, memory allocation, etc. distract from the basics and high-level concepts.

I would start with Python or Ruby these days. If Python is good enough for intros at places like MIT, it's good enough for me.

C is a masterpiece of elegance and simplicity and gives you a mental model of how a real computer works, and it gives you a foundation for understanding Unix, but I just think it overloads a beginner with stuff that gets in the way.

Came to the comments expecting a lot of reasons why Scheme or Lisp was a better choice, LOL.


It's tempting for us, as programmers, to consider Python an easier language to learn than C. I'm not so sure. Python is simple for programmers to learn because it uses a large number of common programmery things: lists, hash tables, classes, modules, etc. Things that beginners won't have a clue about. C doesn't have any of these.

Sure, it takes a while for you to wrap your head around pointers. But it also takes a while for you to wrap your head around inheritance. By virtue of being a smaller language, C has fewer of these things to wrap your head around.

If you need a dynamic array, you have to build it yourself from what you already know. Instead of having to learn yet another opaque concept, you're reinforcing your knowledge of pointers and dynamic memory allocation. Double benefit!
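For illustration, a first cut at that exercise might look something like this rough sketch (the doubling strategy is arbitrary and error checks are left out):

    #include <stdio.h>
    #include <stdlib.h>

    /* A tiny growable array of ints: the learner sees malloc/realloc and
       pointers doing the work that a built-in "list" type would hide. */
    typedef struct {
        int *data;
        size_t len;
        size_t cap;
    } IntVec;

    static void vec_push(IntVec *v, int value) {
        if (v->len == v->cap) {                   /* out of room: grow */
            v->cap = v->cap ? v->cap * 2 : 4;     /* double the capacity */
            v->data = realloc(v->data, v->cap * sizeof *v->data);
        }
        v->data[v->len++] = value;
    }

    int main(void) {
        IntVec v = {0};
        for (int i = 0; i < 10; i++)
            vec_push(&v, i * i);
        for (size_t i = 0; i < v.len; i++)
            printf("%d ", v.data[i]);
        printf("\n");
        free(v.data);
        return 0;
    }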

And once you know C, it can give you a boost towards learning other languages - most of them are implemented using C! If I don't understand something, I can always look at the source to see how it works. Since they're implemented in C, many languages tend to map easily to C concepts.

The same cannot be said about going the other way. A Ruby programmer learning C has to unlearn all the assumptions and niceties Ruby has spoiled them with. How much of Ruby is going to be applicable to C?


Python might not be an easier language to learn than C, but it is certainly an easier language to be productive in. Unless your student is highly motivated you will need more exciting examples than writing your own dynamic array to keep him going. Building things that actually do something useful is nearly impossible for a beginner using C, but definitely within reach if you leverage Python's huge libraries.

As long as you want to teach someone programming and not computer science I think a high level language is much better suited.


I just think that when you're learning how to use lists and hash tables, you're learning how to abstract, decompose, and solve problems, which is the 'science' part. When you're learning about pointers you're learning that too, but the emphasis is more on the 'computer' part.

If you want to do 'computer science' or engineering, eventually you have to learn both.

But for a beginner, who might end up just casually coding as part of another discipline, or wants to understand computers as part of a liberal arts education and never goes past the first language, I think the higher level abstractions are a better place to start. (And the languages are a little gentler)


> A Ruby programmer learning C has to unlearn all the assumptions and niceties Ruby has spoiled them with.

Only if he wants to learn C.

But why bother? For the overwhelming majority of programmers it's a complete waste of time.


There's so much existing C code in the world powering almost everything we do that you just have to know C in order to get a thorough understanding of computing.

C is to computer science as Latin is to medicine. All medical doctors need to know a little bit of Latin.


A month ago I went to advise the teacher of a high-school level video game design program about the curriculum of the program. I'm a C/C++ developer, but I too thought, as you did, that introducing a bunch of high school kids to programming using C or C++ would be too hard.

The school in question is mostly low-income kids looking for a vocational experience, so we expected a bunch of kids who would be out of their depth if thrown in without a garbage collector. But then the teacher mentioned that none of the kids had a hard time learning C or C++, pointers included.

So the group of game developers who had come to advise pretty much all agreed that learning C/C++ first was actually the best thing that could happen to those kids. Show them C first to teach them what the machine is REALLY doing (with only a minimal abstraction layer), give them OOP with C++, and THEN transition them to a scripting language and higher level environment so they can produce something interesting before the class is over.


I agree. C is a great language to learn at first. It's exacting and forces you to think like a programmer, but it isn't that hard to do very basic things. It's only when it comes to building larger, useful programs that it becomes difficult. The key though is you learn a lot of low-level stuff that's really useful later.


Our goal was to efficiently prepare a bright non-programmer for an entry-level .NET support job. Python and Ruby are great (I hack in both daily) but each is a diversion from the stated goal. By contrast, K&R is only 228 pages and is a good introduction to C#.


K&R isn't a good introduction to modern C programming. Even the ANSI edition of the book is outdated. There are better intros in the forms of books and web sites.


Yes but K&R is a classic. Its clear style paved the way for programming books ever since. You are correct that the standards have evolved over time but remember the employer in the discussion doesn't need a C programmer, they need a .NET developer. This dude will probably never write a single line of C. Spolsky recommends K&R because it's a good test and it teaches a methodology that serves a student well in .NET. And it's only 228 pages.


> C is a masterpiece of elegance and simplicity and gives you a mental model of how a real computer works

It used to. Now, with multicore systems and out-of-order execution and cache and nearly everything else that makes modern hardware fast enough to use, it is an over-simplified view of the abstraction the hardware wants you to see, but is not detailed enough to guide you when you need extra performance.

As an example, C says nothing about cache lines; its memory model is completely flat. Cache is ideally 'transparent', but we all know that cache hits are a lot faster than misses. C gives you no guidance on how to arrange your data structures to ensure as many hits as possible.
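For illustration, the two loops below are identical as far as the C language is concerned, yet the second strides across rows and will typically run far slower on real hardware because nearly every access misses the cache; nothing in C lets you express or even see that difference:

    #include <stdio.h>

    #define N 1024
    static double grid[N][N];

    /* Row-major traversal: consecutive accesses fall in the same cache line. */
    static double sum_by_rows(void) {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += grid[i][j];
        return s;
    }

    /* Column-major traversal: each access lands N * sizeof(double) bytes away,
       so nearly every one is a cache miss -- but the C semantics are identical. */
    static double sum_by_cols(void) {
        double s = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += grid[i][j];
        return s;
    }

    int main(void) {
        printf("%f %f\n", sum_by_rows(), sum_by_cols());
        return 0;
    }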


A beginner won't have to worry about that stuff.


So why draw the line there? Why is it ok to abstract away modern hardware, but not ok to abstract away memory management or pointers?


You can write a useful program that only uses one core. You can't write a useful program that doesn't use memory.


But you can write a great deal of useful code that never does any manual memory management.


You asked where you draw the line. My point is that there's a real difference between not using a resource (more than one core) or using it without understanding how it works, because it's hidden behind a paper curtain.


Because understanding pointers makes understanding references a lot easier.
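For illustration, a toy example of what that buys you: once you've watched a function change the caller's variables through their addresses, a "ref" parameter in C# or an object reference in Java stops looking like magic.

    #include <stdio.h>

    /* swap() can only affect the caller's variables because it receives
       their addresses; this is the mechanism that "references" in
       higher-level languages wrap up and hide. */
    static void swap(int *a, int *b) {
        int tmp = *a;
        *a = *b;
        *b = tmp;
    }

    int main(void) {
        int x = 1, y = 2;
        swap(&x, &y);
        printf("x=%d y=%d\n", x, y);   /* prints x=2 y=1 */
        return 0;
    }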


I suppose most of us would give a similar answer to the one Joel gave, but I found this nice pair of comments:

Is this really the only way to be a good programmer? I'm someone that wants to study programming in my spare time and that's a bit of a daunting list for someone new to the industry... – toleero Jul 28 at 14:54

No, it is not the only way to be a good programmer. But if three books is a "daunting list" then you might be underestimating what it is that programmers do. – Joel Spolsky Jul 28 at 15:05

Seriously.


I think by "daunting" he means that the books do not look accessible to someone outside programming.

I would say "Learn Python the Hard Way" if you're absolutely new to it. It moves quickly and will give a good sense of what it's like to actually push the bits around. Most people who really like programming seem to have a similar first experience (like using BASIC on their PC or calculator); replicating that seems like a good first step, no?


For a bright student with no programming experience their first go should be something like Learn Python the Hard Way; with a clueful person around to give guidance. (Not to answer questions, but to push the student to ask the right questions in search engines).

Once they've done that they're set up for K&R, etc. I get the impression that the development environment is invisible to programmers because it's just there. But imagine someone handed you K&R and asked you to write and compile a program. What compiler would you download? How would you install it? Where would you go for help?

---EDIT--- the child comment shows my next paragraph is pretty stupid. (In my defence I was remembering a time before Google would know what C or C++ meant; and before students had Internet access during lessons.)

The other concept that beginners have trouble with in C is "how do I know the names of the functions that I'm supposed to use to do stuff? How do I know what displays text on the screen? How do I know what reads keyboard input?".


If you are trying to teach someone the basics of programming, basic operators are enough. A great exercise is "write a function that takes an ASCII string in as input, parses it, and returns an integer. And write a program that takes the string input, parses it, and displays the output to the screen."

It's a great exercise. Writing your own atoi function was one of the first things I did when teaching myself C.
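For the curious, a bare-bones version of that exercise might look roughly like this (sign handling and overflow checks deliberately left minimal):

    #include <stdio.h>

    /* The classic first exercise: turn a string of digits into an int.
       It forces you to treat characters as numbers and to accumulate
       a result in a loop. */
    static int my_atoi(const char *s) {
        int sign = 1, n = 0;
        if (*s == '-') { sign = -1; s++; }
        while (*s >= '0' && *s <= '9') {
            n = n * 10 + (*s - '0');
            s++;
        }
        return sign * n;
    }

    int main(void) {
        char buf[64];
        if (fgets(buf, sizeof buf, stdin))       /* read a line of input */
            printf("%d\n", my_atoi(buf));        /* parse and display it */
        return 0;
    }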

This is the difference between learning the language as a training exercise, and learning the language with an eye to writing software in it. C is a very easy language to learn as a training exercise. It's only when you get into writing significantly complex programs that C becomes a language that takes a lot of time to master.

The goal here is to learn how to program, not to learn how to write programs in C, so these sorts of tasks are really pretty approachable.

Ok, if you really want something more advanced, write a program that, say, shuffles a deck of virtual playing cards. Again, something a beginner at programming would learn a lot from (pointers, data structures, memory management) but a professional C programmer would laugh at as trivial.
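Something along these lines, for instance (one possible sketch: a heap-allocated deck of structs plus a Fisher-Yates shuffle, which touches structs, pointers, malloc/free, and loops):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define DECK_SIZE 52

    typedef struct {
        int rank;   /* 1..13 */
        char suit;  /* 'C', 'D', 'H', 'S' */
    } Card;

    int main(void) {
        /* A heap-allocated deck, so the beginner meets malloc/free too. */
        Card *deck = malloc(DECK_SIZE * sizeof *deck);
        if (!deck) return 1;

        for (int i = 0; i < DECK_SIZE; i++) {
            deck[i].rank = i % 13 + 1;
            deck[i].suit = "CDHS"[i / 13];
        }

        srand((unsigned)time(NULL));
        for (int i = DECK_SIZE - 1; i > 0; i--) {  /* Fisher-Yates shuffle */
            int j = rand() % (i + 1);
            Card tmp = deck[i];
            deck[i] = deck[j];
            deck[j] = tmp;
        }

        for (int i = 0; i < DECK_SIZE; i++)
            printf("rank %2d of suit %c\n", deck[i].rank, deck[i].suit);

        free(deck);
        return 0;
    }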


What compiler would you download? How would you install it? Where would you go for help? (...) how do I know the names of the functions that I'm supposed to use to do stuff? How do I know what displays text on the screen? How do I know what reads keyboard input?

Search engines?

"How do I read keyboard input in C" → the first result is scanf() "How do I display text on the screen in C" → returns a tutorial on how to use puts and printf


The compiler thing is actually the major problem. There are lots of choices and it's easy to become lost in those woods. LPTHW has the benefit of "download this", which removes an entire realm of choosing that's much less important than stoking some fires while the spark burns.


Not all "programmers" are Joel. Not all programmers do something hard. For instance, writing an online survey is not hard. It is programming, but it is relatively easy, but boring and repetitive. It's also something that is commonly done by "programmers."

Not all programming is cutting edge computer science.


Most easy programming being done nowadays probably has a pre-built solution available that you'd just have to configure. Online surveys are a great example.


By this definition, virtually all Linux/Unix sysadmins are programmers, right?


There's an aspect of "I had to do this, so you should too" in recommending that list, but it's not fatal to the overall principle: if enough programmers get on the same canon of fundamentals, then they will all benefit from sharing a common paradigm.


I am firmly convinced that if you don't enjoy programming you'll never actually become a programmer.

It does not matter how smart you are.

It does not matter how much you WANT to become a programmer.

So the first thing I do when introducing someone to programming is point them at tools that will allow them to be productive quickly to see if they "get a kick" out of creating things with code.

I think Joel's advice to start with C is GREAT. But only if the question is "I think I love to program. But I'm a noob. I want to become a great programmer. How do I start to become one?"


I believe this is largely incorrect.

You can learn to love something, largely because you find yourself successful at it. On other threads you can hear from people who hated math until they learned logic and they were off to the races. I am sure that programming has the same thing; in fact, I know programming has the same dynamic. I know this because there is a steady stream of "how Python/Ruby/mobile apps/ditching my boss made me fall in love with programming again" which suggests strongly that one can change affect. Too strongly for it not to be true.


"how Python/Ruby/mobile apps/ditching my boss made me fall in love with programming again"

How many examples do we have without that qualifier (i.e. people who had never enjoyed in the past)?


On HN? None. But I personally know people who did not care about programming until they encountered a language or teacher or project which really clicked. LPTHW seems to do that an awful lot, in fact.

I take that back, on HN, at least one, in this very thread it seems: http://news.ycombinator.com/item?id=3205485


There's also a degree of selection bias to account for here. Any ideas on how to get around that?


>You can learn to love something

True. Nothing is fun until you're good at it.


I agree you can learn to love something, but to say that nothing is fun until you are good at it is simply Not True.

When I got my Apple ][ in 1980 I was not good at programming. But I had a freaking BLAST writing AppleSoft BASIC code till the cows came home. Weeks previously I probably couldn't have even SPELLED programmer.


Agreed. I've done plenty of things that I was bad at, but enjoyed anyhow.

Programming is first on the list, though you might say I had a natural aptitude for it. Despite the long road of learning (mostly) on my own, I enjoyed it all the way. (I say 'mostly' because there was a short programming class in 4th grade that introduced me to it.)

Bowling is another. I'm really not good at it. With practice I might be, but I don't practice enough. I enjoy it anyhow.


You can enjoy programming and not necessarily enjoy slogging through books.


Maybe Joel is right, maybe not. It might depend on the person. At 13 years old I tried learning C++, at 14 Java, at 16 C.

None of them stuck. Part of it was that the concepts were too difficult to understand. In which case Joel's list might help, as it is more about concepts than teaching programming. I think a bigger part of the problem was that it was too difficult to write anything useful in those languages. After a semester of C (Intro to Programming) I don't think I could write a single useful program. If it was a semester of Python I'm damn sure I could write useful code after it.

I wish a book like "Learn Python The Hard Way" existed back then. It would have been great to understand the basic concepts of programming, and see the code actually work. I remember a C class where the instructor was teaching about arrays. I don't think a single person in the class really understood what he was talking about. Compare that to Python lists and dictionaries, you can see them work, you understand what they do and how to use them immediately.

I'll take Joel's advice and read those books. But I'm not sure they are the best way for people to start.


Why not make them start in machine language, since they need to learn how everything works before they can program in the language they choose, right? I find it somewhat funny that C and C++ are the default "starting point" languages for learning programming when there are still lower-level languages than these. I see no problem with starting out with a higher-level language and going from there; part of programming is having enthusiasm and being able to play around. By starting with easier languages, a person's interest may be better captured, which can lead to more self-learning. The way I see it, the natural progression is to want to learn the harder, lower-level stuff as you move along in a development career anyway.


Code (the Petzold book) is pretty much an explanation of machine code.


I'm a bright person, and I learned to program in 6 months with my only "programming" experience being Excel VBA scripts.

Joel's method would have made me drop out and convinced me that programming was terribly boring -- much like how a kid who wants to play the piano would feel after starting out on 6 months of only scales.

I learned so quickly because I loved it, and I loved it because I could be instantly productive making apps with Ruby on Rails. There was still a lot to learn, but those things could be learned later "in the field" as they were relevant.


I completely agree. My initial foray into programming was writing sites in PHP, and I learned a lot about logic, functions, classes, and structuring projects because of it, which eased my transition into other languages. Had I said I wanted to learn to program and someone tossed K&R at me and told me to "work through it by sheer force", I doubt I would have found the subject to be so interesting.

Certainly K&R is a must-read, but Joel's advice...

"If you can't get through this sequence, you're not going to be able to program, so you might as well give up now."

...is terrible. There is more than one way to introduce someone to programming.


The first programs I wrote were complex BASH scripts that solved specific problems I needed solving. I had a friend who was there to help me along the way. But the main reasons I continued learning were that I had the immediate benefit of having my problems solved and also that I had tangible evidence of my progress. The rest snowballed from there.

I understand where Joel is coming from with his advice, but I don't think I agree with it. I had read quite a few programming books in the past (C & Java), muddled through the exercises, and thought I understood the concepts, but until I dived into solving real problems, I didn't "get it".


Exactly - that's how I learned programming, too. It didn't matter so much about the syntax, but what problem you could solve. For me it was a Perl CGI script for letter writing and earlier a Hypercard registration system that got me into programming, despite my father and teachers repeatedly trying to teach me BASIC. Today it might be adding javascript to a page, or customizing/coding a game using tools like Scratch or Gamemaker.

And there is decades worth of research backing this up (problem-based learning, situated cognition), but CS education has been slow to adapt.


I did this. It's not for everyone, but for a highly motivated person it's not that hard either.

A friend who's in Med School wanted better working conditions than working at crappy call centers. I gave him a high-level overview of what programming is about and threw him the Django tutorial and made myself available for questions. As I gave him practical requests for useful web code, I delved into the details of Python and web programming in general, while slowly introducing him to tools of the trade like the shell, debuggers, and data structures, trying to make every step of the way easier for him.

Six months later he got a job at a biomedical NGO. He began by documenting and testing existing code, but now does quite a bit of coding by himself.

I've found that teaching theory along with the tools that utilize it to be a very good strategy.


I'm surprised no one made any mention of what this bright person knows in the first place.

One of the best ways to teach is to map known concepts to new concepts. For instance, I might introduce methods to people who already know what functions are, but talk about messages to those who don't.


Yeah, Joel is dead on for how to learn to program. Unfortunately, I think we need to question that question more.

What folks are often asking when they ask "how do I program?" is something entirely different.

How do I learn how to make a website? Use wordpress or yola or squarespace.

How do I learn how to make that website look better? Read design blogs, some starter tutorials on html and css and start diving into the CSS. It's not that hard a standalone thing to learn.

How do I learn how to make that website do something fancy like fade things out? Start diving into the javascript in the same way.

How do I learn how to make that website do something functional like send an email, or save a session? Pick a modern framework (Ruby on Rails, Node on Express), read/watch the starter tutorials on it, and dive in.

How do I learn how to make that website do something functional and NOT have it be a horrendous hack job? Time to start reading those programming books :D

Just sayin' it's not always step 1. Depends on what you need to get done, and what you really want to learn how to do. Yeah?


To get some of the concepts of computer science and most basic features of programming, see:

http://csunplugged.org/

http://scratch.mit.edu/

For application development, I'd start with an environment that actually tried to make things simpler for the programmer and showed how you can use programming to tweak a design first before programming from scratch - an IDE (see Microsoft's beginner resources http://msdn.microsoft.com/en-us/beginner/default.aspx ) or game development environment (like gamemaker or XNA).

See also actual curricula for high school or non-computer science students (CS0 courses), like

http://csta.acm.org/

http://coweb.cc.gatech.edu/mediaComp-teach

I would not throw SICP at someone who didn't already know how to program.


Joel likes the sink or swim approach apparently.

The first book, Code - Charles Petzold, is only ten bucks on the Kindle right now (SOLD) and I own the other two.

I remember SICP being like 90 bucks for my hardcover copy, and I'm pretty sure I bought it after reading the Joel on Software article "The Perils of Java Schools"[2] while I was loving life as a bank IT guy, so this is workable advice. It only took five years to become a "real" programmer after that. Also, I sure don't remember K&R being $68[1]. (Yes, I know SICP and K&R are both online for free.)

[1] http://www.amazon.com/Programming-Language-2nd-Brian-Kernigh...

[2] http://www.joelonsoftware.com/articles/ThePerilsofJavaSchool...


Could you link to a free (and legal) version of K&R?

For anyone interested, I found SICP at http://mitpress.mit.edu/sicp/


For completeness, SICP in .mobi format for Kindle: https://github.com/twcamper/sicp-kindle


I'm pretty sure there was a link to it from Dennis Ritchie's home page:

http://www.cs.bell-labs.com/who/dmr

The site is down now though. Some forums say it's temporary but I don't know for sure.


Google: Kernighan Ritchie filetype:pdf


I think in addition to studying these books, the student should be put right to work coding simple tasks. The best way to learn is by doing, not by theory.

The student should get in and start hacking away at some non critical part of the code base. Perhaps the UI, or some reporting or tests. Something it's OK to break for a day or two.

Pair them up with a more experienced developer and make sure they do code reviews. Discuss the code and talk about what went right and what went wrong.


Agreed. I wish more books would adopt this strategy: "Ok, stop. To move forward with the application we will need code that does X, Y, and Z. Here is the spec that I want you to code for the next step. Work on it for a bit. If you're having trouble check out our forum for other people's attempts at a response and you can see 10 ways to solve the problem. On the next page you will see the code answer and the book will continue with an explanation of the code."

Even as a beginner you can tackle small snippets of code using the basic control structures.


>Even as a beginner you can tackle small snippets of code using the basic control structures.

Sadly, this is how programming books used to be written. Case in point: about a decade ago in high school we learned programming from a book called C by example. It had a handful of well thought out programming exercises after every section. A few years later I went looking for the book to give to a friend who was starting out. Lo and behold, in an updated version the exercises were all removed! It's mind-boggling that somehow the publisher thought removing the exercises was adding value to the book. Perhaps your average reader feels exercises makes a book seem too technical, so they skip it for the Learn X in Y hours style books? Nowadays you can't find good programming exercises outside of a dry textbook.


What the fuck?

Here is how I read this thread:

Q: How do I quickly get a complete coder newbie up to speed to support .NET apps?

A: Teach him Lisp and C.

What a silly, dogmatic, and impractical answer. No one even bothered to ask about what kind of work this person is going to be doing, what sort of apps he'll be writing, what industry he's working in, etc.

Obviously, the company where this guy works is giving him a shot at a super-entry-level programming job. If he has any hope of succeeding (which means being productive ASAP) in this job, learning C and Lisp is not going to help him.


The only well-known language worse than C to start with would be C++. We used it in high school AP Computer Science before they switched to Java. We spent about 80% of our time learning about pass-by-reference, templates, and segfaults. The other 20% was on algorithms. It made programming seem horribly complex.


Don't know about you, but I wouldn't consider someone a "programmer" unless they had a solid grounding in pass by value vs. reference, generic programming, and hardware memory protection mechanisms. Yeah, and algorithms are important too.


Ah yes, the "no true Scotsman."

A programmer is someone who programs for a living. I program for a living. I have never programmed in C, have no idea what generic programming is, and I really couldn't care less about hardware memory protection.


Yes, but not C++


I actually don't disagree with your opinion: I wouldn't pick C++ as an environment to introduce programming either. But your reasoning was frankly awful. If those are the reasons you think C++ is "too complicated", then you really need to spend some more time learning C++.


You're right, those aren't good reasons for why C++ is awful. I think... err, I give up. You're right, my whole rationale is pretty confusing. The class wasn't so bad in reality. We didn't really look deeply into the horrible "features" of C++.


My first "real" job was with a software company, in the QA department. I got lucky getting that job, as I had no requisite skills. Anyway, I wanted to learn how to program. I took a C class at a local university. I remember, vividly, being so frustrated that I was in tears when trying to learn about pointers and recursive functions. Very basic to me now, but at the time, extremely frustrating - I simply didn't understand the concepts. I reached out to a few of the developers at that company and they helped me. I'm not a stud programmer, not by any stretch of the imagination, but I've made a good living over the years - and I have not once programmed in C since that class. BUT --- the things I learned in that class have helped me tremendously. For example, it helped me when I was programming in VB and needed to tap into raw winapi functions. A combination of C programming basics, along with OS basics, will get you a LONG way when trying to understand and solve challenges you'll come across in "normal" programming challenges.


Ok, I give up. The class was great! Sorry Mr Khan!

Really I just hate C++. The class was pretty good, and bug hunting skills are not something to take for granted. That's for sure!


I wasn't trying to argue with you, rather I was trying to point out that some good can come from the C/C++ path. I've had a lot of people ask me which language to use as a starting point in programming, and not once have I said C/C++ -- maybe, upon reflection, I should have??


Then I blame the course curriculum and your teachers, not the language.


I am helping a friend learn to program. He was doing "Learn Python The Hard Way" but was losing steam. I created a project for him to do, a simple ASCII game: https://github.com/antoinehersen/ASCII-Dungeon-of-DOOM. Having a project with visual and interactive elements is far more motivating. It also pushes you to learn the real skills needed for programming: not only knowing what a loop is or how to open a file, but figuring out what to use when, and how to organize and connect it all. This experience will create a reference frame for everything he might subsequently read in a book.

I strongly believe that accomplishing a small-scale project that has some value is the best way to learn.


Wow, I disagree! I can't even imagine starting with C. That's like saying the only way to learn to build with Legos is to learn how to mold plastic first. Sure it's possible, but is it desirable?


What's the most important thing when learning to program?

My answer: Exacting logic.

C is a great language for teaching exacting logic. "Wax on, Wax off" kinds of things...


Learning how to write code is actually fairly easy. Kids can do it. The hard part is to write code that doesn't make your co-workers hate your guts because they have to maintain it. I'd recommend getting a copy of McConnell's "Code Complete".


I found a simple method that works really well: ask the student to invent their own language syntax on paper. We start with a simple language that can draw on screen, and proceed from there.

I'm noticing that students naturally discover many of the issues and tricks that they would otherwise have to "study" (such as flow of execution, nesting, and variables).

I occasionally teach at schools in NY, and this semester my students "invented" the Smalltalk syntax. It was heartwarming to witness.


Weird to recommend not starting on the most relevant language (C#) but instead with C and Scheme...

Sounds more like "old-hats" regurgitating their learning process.

Why the line is always drawn at C is also interesting. Not assembly, not Java or Python... but C.

An analogy: Go and learn to use a mouse and keyboard, before you get an iPhone, otherwise things like the "On-Screen Keyboard" and the idea of "Touching (Clicking)" things is too difficult to grasp.


If you want to get someone hireable as quickly as possible, tell them to learn C# or Java by all means.

But if you want them to "achieve enlightenment", you've got to get them on scheme. And if you want them to see through all the smoke and mirrors, and gaze upon the simplicity that is the machine, you've got to teach them C. Assembly will work too, with the proper arch (and x86 is not it).


I completely agree with you. I think Joel was answering the 'how do I achieve enlightenment' question. Maybe he only considers someone a programmer if they are enlightened; seems kind of narrow-minded to me.


I feel the same way: you can still write decent code without knowing all the nitty-gritty, deep-down details. I'm a firm believer in learning as much as possible about the lower levels, eventually, but it's often a case of Too Much Information for someone starting out. With Java, for example, garbage collection is done for you, so as long as you learn some basic patterns of use, the code shouldn't become unmaintainable. If they're using C# or Java, I'm pretty sure it's not some mission-critical app that has to be uber fast or the wheels will fall off of the company's entire operations.


You will learn things picking up C that you need when using those other languages, but which might not be immediately obvious if you start with them. Pointers, memory management, et cetera...


No one wants to learn to program.

The question is fundamentally flawed.

People want to build something, not learn to program.

To teach someone to build something, teach them how to use the tools of the trade. No carpenter ever learned the physics behind wood in order to build their first table.

The answers to this question all come from people with a perspective that this is what they would do if they wanted to take a bright person and put them into a developer role at their company. And, they all have the defeatist attitude that unless you read these three theory books, and spend four months of intense study, you cannot be a programmer. What hogwash.

Most people cannot take four months off from work and intensely study programming. But, anyone can learn to tinker on something they care about with the right mentorship and using the right tools. Anyone can learn to use jQuery, Ruby on Rails or HTML if-and-only-if the point of doing so is not to learn jQuery, RoR or HTML but is to build something that inspires them. All you need is a path with baby steps and inspiration.


Inspire them, forget the details. Show them how to get a computer to leap into life with their creation, and they will take it from there.

Showed my middle son how to animate a Java graphic on a website. He never stopped. Now he is a CS graduate student at CM designing/implementing the next internet routing protocols (and writing Kinect games).


Give them progressively harder programming problems to solve. Each new problem should be just baaaarely within their capability to solve, so they are forced to learn something new with each new problem. Each new problem should also 1) build on the foundations of the previous problem, and 2) introduce new fundamental concepts.

I know of no other way. You learn programming by programming. This, by the way, is exactly what a good CS curriculum will do.


I could not agree more with Joel. To be a good/proper programmer you need to know C inside out. I would never, ever hire anyone who doesn't understand pointers.


tl;dr: If a book were going to teach somebody how to program then you wouldn't be asking this question. A book requires a self-motivated person to pick it up and read it. Assuming you're trying to teach somebody who didn't have the desire to learn on their own, then you're going to have to relate to them and explain it in a way that they'll understand. Getting them interested and excited about it usually seems to be the best kick-starter.

You know what? I've actually unintentionally taught several people the basics of programming. I work at a small company with other bright people who are not programmers by study, but who have learned how to communicate programming with me. I truly believe that teaching somebody to program is as easy as figuring out how to best explain something sequential to them.

Usually, I begin my stealth teachings under the guise of just trying to help them communicate their program requirements or ideas to me. I quickly jump into the lower-level explanations of things, and somehow I manage to get them excited or interested enough that they actually feel like talking "programming" with me will be cool. I try to explain to them that programming is like giving the computer an ordered list of things to do, and it'll do them exactly in that order. As simple as 1, 2, 3. I then explain loops. I don't get into functions and classes because that's not necessarily important to them understanding the flow of logic. Most people seem to easily grasp the concepts of if-then and loops... this is all really simple stuff for most people to understand at a basic level. I don't even necessarily mean code, but more so logically what an if-else means. I find that most people generally give you the "yea yea I got it" type of response while explaining this.

Once I get them to this point, they're where I wanted them to be. I then try to include them in scenarios where I'm debugging something. I don't deliberately ask them to help me do my job, but I more so start talking out loud, and usually they'll come over to help brainstorm. I show them my code and explain the logic flow (if-else, loops, etc.). You'll be surprised at how easy it is for people to actually help out here. Sometimes it's the non-programmers who can give the most obvious advice. At this point in time, it's up to them. If they're interested enough they'll progress from there, otherwise it may just not be their thing.

I work with 8 people, and I have actually had one guy learn Python and he now writes Python tools for us - he knew zero programming. I also got our accountant into programming because he would always ask me the status of projects, and slowly over time I got him speaking "programmer-speak" with me, and now he basically understands all my programming yammer. I realize that these guys are not writing the next Redis or Twitter, but they were non-technical guys who are now on their way to writing those things. :)


Great post! You actually have an idea of how to get people to motivate themselves to learn something. The suggestion "come back after reading three books" looks to me like one of the best ways to make sure that they will never ever become interested in coding again. It is like telling a person interested in learning a language that they should come back after memorizing the dictionary. Just completely unable to relate to non-coding people.


On a generic level, the best way to make a smart non-programmer write programs is to describe how to do it but not tell them it's programming.


I'm pretty sure this is the only way: http://abstrusegoose.com/249


I would think that the Stanford open class CS106a with Prof. Sahami would be an excellent start.



http://www.avc.com/a_vc/2011/10/program-or-be-programmed.htm...

"To get 'skills' and expertise in the technical parts of computing, especially for Web 2.0, here is an outline in 8 steps:"

Also it happens that that post, like the OP, is aimed at .NET.

Also the post emphasizes just one book, which has introductions to HTML, CSS, Visual Basic .NET, ASP.NET, and ADO.NET, including relational database normal forms, and contains nothing on 'algorithms' such as in Knuth's TAOCP or subsequent, similar books. Also the post has nothing on C, C++, Java, Python, PHP, Ruby, or Linux.





